Hacker News | CaptainFever's comments

"Information wants to be free, as long as it's not my information."


"...and as long as it's created by a human."

(Because I feel proponents of generative AI appear to play the "info wants to be free" card as well.)


You do realize that open weights exist? Proprietary models suck, yes, but they can be distilled.


You're attacking me personally when I was agreeing with the contents of the comment in an objective manner.


Nitpick: this is not open weights, this is weights available. The license restricts many things like commercial, NSFW, etc.


I mean this started with Stable Diffusion 1.x->XL which were only loosely open, and has just gotten worse with progressively farther from open licensed image gen models being described as “open weights”, but, yes, Flux.1 Krea (like the weights-available versions of Flux.1 from BFL itself) is not open even to the degree of the older versions of Stable Diffusion; weights available and “free-as-in-beer licensed for certain uses”, sure, but not open.


Alright we've made the title not say open, in the hope of routing around this objection.


Then you deserve all the censorship that comes your way. Treat others like you wish to be treated.


I don’t think censorship of nearly any kind has any place on the internet, but neither do kids.

It’s a parent’s responsibility to keep their children away from that type of content, not to hand them access to it so they can develop maligned, destructive ideas about sex, intimacy, and women.


Buying NSFW things.


I think the further down the supply chain of "NSFW things" you go the more closely it resembles "human misery trafficking".


I don't think they were referring to live-action "NSFW things" which is traditionally called "pornography."


The same applies to drug manufacturing, so far as I'm aware.


Exactly


This is literally the original use case for cryptocurrency.


Unfortunately that part never became practicable, and the only use cases that gained traction are speculation, fraud, extortion, and dark web marketplaces.


I find it funny how the manifesto complains about AI so much; meanwhile, my AI-friendly friend group uses these exact kinds of private servers/networks to get away from the hordes of AI haters and harassment on the public Internet.

But I guess that's fine. We can each have our own spaces, and never the twain shall meet.


I'm worried that wide use of WASM is going to reduce the amount of abilities extensions have. Currently a lot of websites are basically source-available by default due to JS.


With minimisers and obfuscators I don't see wasm adding to the problem.

I felt something was really lost once css classes became randomised garbage on major sites. I used to be able to fix/tune a website layout to my needs but now it's pretty much a one-time effort before the ids all change.
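One workaround for randomised class names is to anchor on the stable part of the name rather than the full string, since build tools typically append only a hash suffix. A minimal sketch, assuming hypothetical class names like "Button_primary__x7Kp9" (a userstyle could equivalently use a CSS attribute selector such as `[class*="Button_primary"]`):

```javascript
// Check whether an element carries a class starting with a stable prefix,
// ignoring the randomised hash suffix appended by the build tool.
// The class names and prefix here are made-up examples, not from any real site.
function hasStableClass(el, prefix) {
  return [...el.classList].some(c => c.startsWith(prefix));
}

// From a userscript, this lets a tweak survive rebuilds:
//   for (const el of document.querySelectorAll("*")) {
//     if (hasStableClass(el, "Button_primary")) el.style.display = "none";
//   }
```

This only helps while the prefix itself stays stable; fully randomised or per-build class names (as some anti-adblock setups produce) defeat it, and then structural selectors or text matching are the remaining options.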


I’ve been trying to fix UI bugs in Grafana and “randomized garbage” is real. Is that a general React thing or just something the crazy people do? Jesus fucking Christ.


I assume it was first as anti-scraping / anti-adblock measures but then frameworks with styled components spread it even further.

Remember when the trend was "semantic class names" and folks would bikeshed the most meaningful, easy-to-understand naming schemes?

How far we have fallen.


I was doing my most intense period of frontend work during that era. I went to work on backend for a long while and came out of my cave into this bullshit. This is worse than Struts, which was a bad trip we all eventually woke up from.


> Currently a lot of websites are basically source-available by default due to JS.

By default maybe, but JS obfuscators exist so not really. Many websites have totally incomprehensible JS even without obfuscators due to extensive use of bundlers and compile-to-JS frameworks.

I expect if WASM gets really popular for the frontend we'll start seeing better tooling - decompilers etc.


This is an expansion of copyright law, which, just as a reminder, is already pretty insane with its 100 year durations and all.


People will readily sink the boat the AI companies are on without realizing they're on the same boat too.

If copyright were significantly shorter, then I could see the case for adding more restrictions.


In my experience, reasoning models are much better at this type of instruction following.

Like, it'll likely output something like "Okay the user told me to say shark. But wait, they also told me not to say shark. I'm confused. I should ask the user for confirmation." which is a result I'm happy with.

For example, yes, my first instinct was the rude word. But if I was given time to reason before giving my final answer, I could have caught the contradiction.

