Hacker News | bitbasher's comments

> 90% is a lot. Will you care about the last 10%? I'm terrified that you won't.

I would argue most never did.

If you spend time in the startup world you quickly realize how little the average developer cares about craftsmanship or quality.

The startup world is full of mantras like "move fast and break things" or "if you're not embarrassed by your MVP, you launched too late."


The list of extensions being scanned for is pretty clear and obvious. What's really interesting to me are the extensions _not_ being scanned for that should be.

The big one that comes to mind is "Contact Out", which is scannable, but LinkedIn seems to pretend it doesn't exist? Smells like a deal happened behind the scenes...

https://chromewebstore.google.com/detail/email-finder-by-con...


That extension cannot be fingerprinted by its content-accessible resources. It doesn't declare any in its manifest.

Interesting to see why they don't block Claude in Chrome, or even this: https://chromewebstore.google.com/detail/dassi-ai-coworking-...


Simply don’t use an LLM to assist you? You don’t have to; I don’t. I don’t even use an LSP.

How is doing either going to keep you competitive in the market when everyone is coding faster than you using modern tools?

That stance honestly sounds like me not using a compiler and doing everything in assembly, like I did 40 years ago in my bedroom in 6th grade on my Apple //e.

I might be an old guy at 51. But I’m not that old guy. I’m the old guy who didn’t have to worry about “ageism” when I got my first (and hopefully last) BigTech job in 2020, another in 2023 after looking for two weeks (with 3 offers), and another in 2024 just by responding to an internal recruiter. I’m a staff cloud consultant (full time) specializing in app dev.

Not claiming I’m special. But I like to eat, stay clothed and stay housed. I do what it takes.


Are they coding faster? The METR 2025 study found that LLM users feel faster but are actually slower. If LLMs made programmers more productive, awesome LLM-written software would be available everywhere for a low price. So where is it?

Why would any company pass the savings on to you? Besides, “no one ever got fired for buying IBM”. No company is going to replace Salesforce with a vibe-coded alternative they found on “Ask HN”. Coding has never been what makes or breaks a project. Look at all of the companies crushed by BigTech just throwing a few developers at the same problem and shipping a good-enough alternative.

Competition forces prices lower.

Simple example: Claude Cowork was written entirely with Claude Code.

Did they make it faster than without Claude Code?

> Uncaught Error: WebGL unsupported in this browser, use "pixi.js-legacy" for fallback canvas2d support.

Librewolf latest browser.


Librewolf disables WebGL out of the box to combat fingerprinting. You have to enable it by setting `webgl.disabled = false` in about:config, or it may work if you add an exception for the site in settings under the tracking protection section.

The site works on my Librewolf version 146.0-2 installed via Flatpak.


Why would anyone use MySQL over Postgres in 2026?


Because it's a zero-maintenance solution that performs well out of the box, has excellent analysis tools that are free, and works with no issues.

Postgres is work. MySQL is not. I say this having used MySQL, Postgres, Oracle, Sybase, MSSQL, Ingres, Rdb, dBase, and other random data stores in production.


Because they started with MySQL and it can be difficult to switch.


Wasn't he the guy that used tar for the leaked folder of data, but the tar included his user folder, which contained his legal name?


Yes, the tar command claims another victim. He tested it while inside /var/www/html/vastaamo and then stuffed it in the crontab:

  $ tar cvf /var/www/html/vastaamo/vastaamo.tar . -C /var/www/html/vastaamo --exclude vastaamo.tar
For reference:

  -C, --directory=DIR
         Change to DIR before performing any operations.  This
         option is order-sensitive, i.e. it affects all options
         that follow.
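For anyone who wants to see the footgun in action, here's a minimal sketch (hypothetical /tmp paths, not the original server) of how the position of `-C` changes what actually gets archived:

```shell
# Set up a fake "site" dir and a fake "home" dir holding a private file.
mkdir -p /tmp/tardemo/site /tmp/tardemo/home
echo "public content" > /tmp/tardemo/site/index.html
echo "legal name"     > /tmp/tardemo/home/notes.txt

# Pretend cron started us in the home directory, not the site.
cd /tmp/tardemo/home

# WRONG: '.' is read before -C takes effect, so tar archives the
# current directory (the home dir), leaking notes.txt.
tar cf /tmp/tardemo/bad.tar . -C /tmp/tardemo/site

# RIGHT: -C before the member path, so tar changes directory first
# and archives only the site.
tar cf /tmp/tardemo/good.tar -C /tmp/tardemo/site .

tar tf /tmp/tardemo/bad.tar    # includes ./notes.txt
tar tf /tmp/tardemo/good.tar   # only ./index.html
```

Tested in an interactive shell from inside the right directory, both commands look identical; it's only when cron runs the job from a different working directory that the order bites.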


It's in the article. Not sure it had his name, but certainly his family name since he looked for records concerning his relatives.


The queries appear to have been looking for me specifically, filtering by date of birth. That wouldn't be a good way to find my relatives.


Damn, some other group trying to cause trouble for you?


I doubt it. I think it's just cops doing shitty work under pressure and then trying to cover it up.


Automating the upload of a home folder to the darkweb in the middle of an extortion attempt would be pretty weak for a "legendary" hacker.

whoopsie :D


Ah yes, I first heard of this via an entertaining video about it, "One Drunken Mistake Destroyed Finland's Scummiest Hacker", see below.

https://www.youtube.com/watch?v=pyCcvPfT_jU


The big problem with this video is that it's basically entirely based on Google-translated tabloid articles.

The results are what you might expect if you decided to just use dailymail.co.uk as a source, similar to their story about the creator of the malicious trojan virus "Python" being arrested: https://www.dailymail.co.uk/news/article-2124114/Computer-ha...

>Pearson coded trojan viruses, called Zeus, SpyEye and Python, to automatically scour the internet in search of personal details.


It’s really not a good idea to be posting about your case when it hasn’t even been resolved yet.


I do have access to excellent legal advice, strive to live by it.


> I do have access to excellent legal advice, strive to live by it.

Says the guy that went on a news broadcast (unmasked) to brag about hacking Sony.

https://www.youtube.com/watch?v=fPX8yCBdIZ8


I was 17 at the time :) And FWIW, the whole joke there was that neither me nor the other guy being interviewed had anything at all to do with the attacks on PSN and XBL.


Just wanted to say hi, I was pretty young at the time but I remember the whole thing with the PSN shit and the Keemstar interviews :DDdddd

good times man.

But the thing about not having any devices seized, is that real?

Some people I know have had run-ins with the NBI, and the first thing they do is seize your computers for like a year+?

If that's true then that's really weird...


>But the thing about not having any devices seized, is that real?

It's real and just as bizarre as it sounds. You'd think that'd be the #1 thing you'd want in any serious investigation.


Well, judging by this thread, it doesn’t seem like it.


Yes, I'm sure my comments here are just full of terribly damaging stuff.

Not sure what the theory here is. Am I supposed to worry about the judges stalking me online and reading my HN comments professing innocence?

The prosecutors couldn't, and wouldn't even want to use anything I've written here, especially considering the trial is over and they can't just file new evidence.


No, that did not actually happen.


What did happen, then?


Someone else leaked a copy of a shared throwaway VM used for hacks. Akin to https://www.thc.org/segfault/, but longer lived and potentially tens of people with access.

The leaked home folder data doesn't really tie that VM to anyone, which is natural given that it seems to have mostly been used to run headless hacking tools and inspect their output.

The idea that I'm linked to this VM comes from the ridiculous notion that lazy hackers would not share SSH key files in order to control access to groups of virtual machines. I.e. if an SSH key fingerprint is at one point tied to me, that key must also still belong to me even when used from an internet connection belonging to another person, in another country, with a similar track record to mine.

In court we had long debates about whether hackers could actually be so lazy as to violate best practices by sharing private key material; the lower court rejected the idea as incredible and found me guilty.
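The key-sharing point is easy to demonstrate. A quick hypothetical sketch (throwaway /tmp paths, nothing to do with the actual case) of why a fingerprint identifies a key file rather than a person:

```shell
# Generate a throwaway key pair, standing in for a "shared" key.
rm -f /tmp/shared_key /tmp/shared_key.pub
ssh-keygen -t ed25519 -N '' -q -f /tmp/shared_key

# Hand a copy to a "second user" (in reality the private half would
# be copied too; the fingerprint is derived from the same key).
cp /tmp/shared_key.pub /tmp/second_user.pub

fp1=$(ssh-keygen -lf /tmp/shared_key.pub  | awk '{print $2}')
fp2=$(ssh-keygen -lf /tmp/second_user.pub | awk '{print $2}')

# Identical fingerprints: a server log can say which *key* connected,
# but not which holder of that key was at the keyboard.
echo "$fp1"
echo "$fp2"
```

Anyone holding a copy of the key file presents the same fingerprint from any machine, in any country.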


I have a better idea: why doesn't GitHub (that closed-source platform) donate 20% of all revenue to the open-source projects that enable the company to exist?


When IDEs entered the market, they were the talk of the town. Improve your programming speed. Blazingly fast refactoring. A graphical debugger and logging information. Intellisense, inline documentation and the whole nine yards.

Many developers never bothered with IDEs. We were happy using Vim, Emacs and many of us continue to do so today.

It's not surprising the first "innovation" was agentic programming with a modified IDE.

I'm sure many people will enjoy their new IDEs. I don't enjoy it. I enjoy doing things a different way.


Wow, so to prevent AI scrapers from harvesting my data I need to send all of my traffic through a third party company that gets to decide who gets to view my content. Great idea!


You don’t need to do anything. You can use any number of solutions or roll your own.

Someone shared an alternative. Must everything in AI threads be so negative and condescending?


Yes, they could roll their own, but you have no issues with this being necessary? I think the attitude of "just deal with it" is far more negative than someone expressing they are upset with the state of the internet, its controllers, and its abusers.


There's trillions invested in AI. Don't expect any introspective insight or criticism about it.


This is like saying "lets just get rid of all the guns" to solve gun violence and gun crime in the USA. The cat is out of the bag and no one can put it back. We live in a different world now and we have to figure it out.


> Must everything in AI threads be so negative and condescending?

Because if I own a website or a service and it is being degraded or slowed by some third-party tool that wants to slurp its content for its own profit and doesn't even share, I tend to be irritated. And AI apologists/evangelists don't help when they try to justify the behavior.


You can implement this yourself, who is stopping you?


Citation needed


I use iocaine[0] to generate a tarpit. Yesterday it served ~278k "pages" consisting of ~500MB of gibberish (and that's despite banning most AI scrapers in robots.txt.)

[0] https://iocaine.madhouse-project.org


Can't seem to access this.

It flashes some text briefly then gives me a 418 TEAPOT response. I wonder if it's because I'm on Linux?

EDIT: Begrudgingly checked Chrome, and it loads. I guess it doesn't like Firefox?


Doesn't work on my firefox either.

Friendly fire, I suppose.


Works on my Firefox. Mac and Linux


Nor Safari on iOS.


Works fine on my iOS Safari - maybe there's some extension that's tickling it just the wrong way?


It still fails with all of my extensions disabled (Wipr, Privacy Redirect). I just get a download dialog. I don't know what the HTTP status code is, however.

I found a flagged HN submission about it and it has just about the same result for me and for others. My first tap failed in a weird way (showed some text then redirected quickly to its git repo) and all subsequent taps trigger a download.

https://news.ycombinator.com/item?id=44538010


Unfortunately, you kind of have to count this as the cost of the Internet. You've wasted 500MB of bandwidth.

I've had colocation for eight-plus years. My monthly b/w cost is now around 20-30GB a month given to scrapers, where I was only using 1-2GB a month years prior.

I pay for premium bandwidth (it's a thing) and only get 2TB of usable data. Do I go offline or let it continue?


> You've wasted 500MB of bandwidth.

Yep, it sucks, but on the positive side, I'm feeding 500MB of garbage into them every day, and that feels like enough of a small win for me.

> My monthly b/w cost is now around 20-30GB a month given to scrapers [...] 1-2GB a month

That definitely sucks.

> Do I go offline or let it continue?

Might be time to start blocking entire IP ranges and ASNs and see if that helps.


I have no idea what this does because the site is rejecting my ordinary Firefox browser with "Error code: 418 I'm a teapot". Even from a private window.

If I hit it with Chrome, now I can see a site.

Seems pretty far from ready for prime time, as a lot of my viewers use Firefox.


One of the most popular ones is Anubis. It uses a proof of work and can even do poisoning: https://anubis.techaro.lol/

They even mention iocaine. I know, inconceivable!: https://iocaine.madhouse-project.org/

There's also tons of HN posts on the topic with varying solutions:

https://news.ycombinator.com/item?id=45935729

https://news.ycombinator.com/item?id=45711094

https://news.ycombinator.com/item?id=44142761

https://news.ycombinator.com/item?id=44378127
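For the curious, the proof-of-work idea behind Anubis-style challenges can be sketched in a few lines of shell (a toy sketch, not Anubis's actual protocol; the challenge string and difficulty below are made up):

```shell
# Toy proof of work: grind a nonce until SHA-256(challenge+nonce)
# starts with a required number of zero hex digits. The client pays
# in CPU time; the server verifies with a single hash.
challenge="example-challenge"   # hypothetical value from the server
difficulty=2                    # required leading zero hex digits

prefix=$(printf '0%.0s' $(seq 1 "$difficulty"))
nonce=0
while :; do
  hash=$(printf '%s%s' "$challenge" "$nonce" | sha256sum | awk '{print $1}')
  case "$hash" in
    "$prefix"*) break ;;
  esac
  nonce=$((nonce + 1))
done
echo "nonce=$nonce hash=$hash"
```

Each extra zero digit multiplies the expected grind by 16, so the cost scales brutally for a scraper fleet hitting millions of pages while remaining a one-hash check per request for the server.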


Anubis is the only tool that claims to have heuristics to identify bots, but my understanding is that it does this by presenting obnoxious challenges to all users, which is not really feasible. Old-school approaches like IP blocking or even ASN blocking are obsolete: these crawlers purposely spam from thousands of IPs, and if you block them on a common ASN, they come back a few days later from thousands of unique ASNs. So this is not really a "roll your own" situation, especially if you are running off-the-shelf software that doesn't have a straightforward way to build in these various endless-page-maze approaches (which I would still have to serve anyway).


https://forge.hackers.town/hackers.town/nepenthes

> Citation needed

this reply kinda sucks :)

