The list of extensions being scanned for is pretty clear and obvious. What is really interesting to me are the extensions _not_ being scanned for that should be.
The big one that comes to mind is "Contact Out", which is scannable, but LinkedIn seems to pretend it doesn't exist? Smells like a deal happened behind the scenes...
How is doing either going to keep you competitive in the market when everyone is coding faster than you using modern tools?
That stance honestly sounds like me not using a compiler and doing everything in assembly like I did 40 years ago in my bedroom in 6th grade on my Apple //e.
I might be an old guy at 51. But I’m not that old guy. I’m the old guy who didn’t have to worry about “ageism” in 2020 when I got my first (and hopefully last) job at BigTech, another in 2023 after looking for two weeks (with 3 offers), and another in 2024 just by responding to an internal recruiter - I’m a staff cloud consultant (full time) specializing in app dev.
Not claiming I’m special. But I like to eat, stay clothed, and stay housed. I do what it takes.
Are they coding faster? The METR 2025 study shows LLM users feel faster but are actually slower. And if LLMs really made programmers more productive, awesome LLM-written software would be available everywhere for a low price. So where is it?
Why would any company pass the savings on to you? Besides, "no one ever got fired for buying IBM." No company is going to replace Salesforce with a vibe-coded alternative they found out about on "Ask HN". Coding has never been the limiting factor in having a successful project. Look at all of the companies that were crushed by BigTech just throwing a few developers at the same problem and shipping a good-enough alternative.
LibreWolf disables WebGL out of the box to combat fingerprinting. You have to enable it by setting `webgl.disabled = false` in about:config, or it may work if you add an exception for the site in settings under the tracking-protection section.
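If you want the change to stick across sessions, the same pref can go in a `user.js` in your profile directory - just a sketch, assuming a standard Firefox-style profile layout:

```
// user.js in the LibreWolf profile directory
// Re-enables WebGL, which LibreWolf turns off by default to resist fingerprinting.
user_pref("webgl.disabled", false);
```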
The site works on my LibreWolf 146.0-2 installed via Flatpak.
Because it's a zero-maintenance solution that performs well out of the box, has excellent analysis tools that are free, and works with no issues.
Postgres is work. MySQL is not. I say this having used MySQL, Postgres, Oracle, Sybase, MSSQL, Ingres, Rdb, dBase, and other random data stores in production.
I was 17 at the time :) And FWIW, the whole joke there was that neither I nor the other guy being interviewed had anything at all to do with the attacks on PSN and XBL.
Yes, I'm sure my comments here are just full of terribly damaging stuff.
Not sure what the theory here is. Am I supposed to worry about the judges stalking me online and reading my HN comments professing innocence?
The prosecutors couldn't, and wouldn't even want to, use anything I've written here, especially considering the trial is over and they can't just file new evidence.
Someone else leaked a copy of a shared throwaway VM used for hacks. Akin to https://www.thc.org/segfault/, but longer-lived, with potentially tens of people having access.
The leaked home folder data doesn't really tie that VM to anyone, which is natural given that it seems to have mostly been used to run headless hacking tools and inspect their output.
The idea that I'm linked to this VM rests on the ridiculous notion that lazy hackers would never share SSH key files to control access to groups of virtual machines. I.e., if an SSH key fingerprint is at one point tied to me, that key must also still belong to me even when it's used from an internet connection belonging to another person in another country with a similar track record to mine.
In court we had long debates about whether hackers could actually be so lazy as to violate best practices by sharing private key material; the lower court rejected the idea as not credible and found me guilty.
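To put the fingerprint point in concrete terms: a fingerprint identifies the key file, not the person using it, so anyone holding a copy of the same key produces exactly the same fingerprint. A sketch with a hypothetical filename, assuming stock OpenSSH:

```
# The fingerprint is computed from the key material alone,
# so everyone who shares the same key file gets identical output.
ssh-keygen -lf shared_key.pub
```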
I have a better idea: why doesn't GitHub (that closed-source platform) donate 20% of all revenue to the open-source projects that enable the company to exist?
When IDEs entered the market, they were the talk of the town. Improve your programming speed. Blazingly fast refactoring. A graphical debugger and logging information. Intellisense, inline documentation and the whole nine yards.
Many developers never bothered with IDEs. We were happy using Vim or Emacs, and many of us continue to be today.
It's not surprising the first "innovation" was agentic programming with a modified IDE.
I'm sure many people will enjoy their new IDEs. I don't enjoy it. I enjoy doing things a different way.
Wow, so to prevent AI scrapers from harvesting my data I need to send all of my traffic through a third party company that gets to decide who gets to view my content. Great idea!
Yes, they could roll their own, but you have no issues with this being necessary? I think the attitude of "just deal with it" is far more negative than someone expressing they are upset with the state of the internet, its controllers, and its abusers.
This is like saying "let's just get rid of all the guns" to solve gun violence and gun crime in the USA. The cat is out of the bag and no one can put it back. We live in a different world now and we have to figure it out.
> Must everything in AI threads be so negative and condescending?
Because if I own a website or a service and it is being degraded or slowed by some third-party tool that wants to slurp its content for its own profit without sharing anything back, I tend to be irritated. And AI apologists/evangelists don't help when they try to justify the behavior.
I use iocaine[0] to generate a tarpit. Yesterday it served ~278k "pages" consisting of ~500MB of gibberish (and that's despite banning most AI scrapers in robots.txt).
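For reference, the robots.txt side of that is just the usual per-crawler disallow stanza. A sketch - these bot names are examples of published AI crawler user agents, not a complete list:

```
# Ask the big AI crawlers to stay away entirely
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: CCBot
Disallow: /
```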
It still fails with all of my extensions disabled (wipr, privacy redirect). I just get a download dialog. I don't know what the HTTP status code is, however.
I found a flagged HN submission about it, and others there report about the same result as me. My first tap failed in a weird way (it showed some text, then quickly redirected to its git repo), and all subsequent taps trigger a download.
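For anyone with a shell handy who wants to see the status code, something like this prints just the code - a sketch; swap the placeholder for the site's actual URL:

```
# -s: quiet, -o /dev/null: discard the body, -L: follow redirects,
# -w: print only the final HTTP status code
curl -s -o /dev/null -L -w '%{http_code}\n' 'https://example.org/'
```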
Unfortunately, you kind of have to count this as the cost of the Internet. You've wasted 500MB of bandwidth.
I've had colocation for eight-plus years. My monthly bandwidth given to scrapers is now around 20-30GB a month, where I was only using 1-2GB a month in years prior.
I pay for premium bandwidth (it's a thing) and only get 2TB of usable data. Do I go offline or let it continue?
I have no idea what this does because the site is rejecting my ordinary Firefox browser with "Error code: 418 I'm a teapot", even from a private window.
If I hit it with Chrome, I can see the site.
Seems pretty not ready for prime time, as a lot of my viewers use Firefox.
Anubis is the only tool that claims to have heuristics to identify a bot, but my understanding is that it does this by presenting obnoxious challenges to all users. Not really feasible. Old-school approaches like IP blocking or even ASN blocking are obsolete: these crawlers purposely spam from thousands of IPs, and if you block them on a common ASN, they come back a few days later from thousands of unique ASNs. So this is not really a "roll your own" situation, especially if you are running off-the-shelf software that doesn't have some straightforward way of building in approaches like endless page mazes (which I would still have to serve anyway).
I would argue most never did.
If you spend time in the startup world, you quickly realize how little the average developer cares about craftsmanship or quality.
The startup world is full of mantras like "move fast and break things" or "if you're not embarrassed by your MVP, it's not an MVP."