> I'm putting "securing" in scare quotes because IMO it's a fool's errand to even try - LLMs are fundamentally not securable like regular, narrow-purpose software, and should not be treated as such.
Indeed. Between this fundamental unsecurability and the alignment problem, I struggle to see how OpenAI/Anthropic/etc. will manage to give their investors enough ROI to justify the investment.
As others have pointed out, the problem with this is that you end up with a government that can target any reporter by claiming they have "classified materials". No need to prove what those materials are (because they are classified). This is how third-world countries silence journalists.
In order for ICE to raid homes, they need a valid warrant signed by a judge, but that doesn't seem to be stopping them from conducting warrantless raids in Minneapolis.
This happened to me last month. After the first song I got suspicious, so I checked the cover and the artist profile. It was AI-generated. I enjoyed the album nevertheless. You can find AI music enjoyable. People also hated DJ music before. And recorded music before that. And electrically amplified live performances before that. This is just another category of music; it doesn't take away from human music. What people are right to be angry about is that the tech was built on the backs of other people's non-remunerated work. Whether a human made a song or not shouldn't matter as much as actual living artists being taken advantage of.
I agree with you. It should also be clearly marked as AI.
I have this discussion all the time about written stories. At some point AI will start creating very good and possibly great written works. Do we ignore them because they are AI? I would hope not.
I agree entirely. Well, not entirely. I think anger would also be an understandable response if the music were misrepresented as being by human musicians when it wasn't. Like it would be understandable if people got mad because they thought they bought easy listening and actually got acid metal. Or vice versa.
Problem is that most methods involve making your location openly known. The Dark Forest, the second book of the Remembrance of Earth's Past series, explains why that is not a good idea under the current circumstances.
>the negative feelings about LLMs are not because they are new territory, it’s because they lack the predictability and determinism that draw many people to computers
Louder for those deafened by LLM hype. Vibe coders want to turn a field of applied math into dice casting.
>I found it especially ironic being on hacker news how few people seemed to have a hacker mindset when it came to LLMs
You keep using the word "LLMs" as if Opus 4.x came out in 2022. The first iterations of transformers were awful: GPT-2 was more of a toy, and GPT-3 was an eyebrow-raising chatbot. It has taken years of innovation to reach the point of usable stuff without constant hallucinations. So don't fault devs for the flaws of early LLMs.
To be fair, that COBOL program has been working for probably 30 years (maybe even longer) - that's unusually reliable and long-lived for a software project.
The only real contender in this regard is the Win32 API, and that did get used in the enterprise for a long time too, before the major shift to cloud and Linux in the mid-2010s.
Ultimately the proof is in the real-world use, even if it's ugly to look at... I'd say, even as someone who is a big fan of Linux, if I were given a 30-year-old obscure software stack that did nothing but work, I would be very hesitant to touch it too!
It still needs continual maintenance, though. The developers still making their money in COBOL make it because the software doesn't just keep working untouched. (Just about no software does.)
I would like to add the core business functions of SAP R/3 (1992). Much of the code created for it in the early 90s still lives in the current SAP S/4HANA software.
To me, using an LLM is more like having a team of ghostwriters write your novel. Sure, you "built" your novel, but it feels entirely different from writing it yourself.
Wouldn't it be like having a team of software developers writing your code? The analogy doesn't even need to reach as far as a different line of work. And for some, this (writing to managing) is a natural career progression.
And if you write novels mostly because you enjoy watching them sell, as opposed to sharing ideas with people, you don't care.
To scientists, the purpose of science is to learn more about the world; to certain others, it's about making a number of dollars go up. Mathematicians famously enjoy creating math, and would have no use for a "create more math" button. Musicians enjoy creating music, which is very different from listening to it.
We're all drawn to different vocations, and it's perverse to accept that "maximize shareholder value" is the highest.
Is that meant to be good? I always chuckle when people make these kinds of statements. Is the association with cosmic objects meant to make you feel better about something? I personally don't find stardust particularly interesting; the fundamental forces of nature, on the other hand, are much more appealing to me.
I believe it’s quite common for people to marvel at the vastness of the universe. For that reason, people might like the tangible link that they feel to the rest of the universe when they think of this - it’s amazing to think of how small we are in it, but also amazing to think of where “we” came from.