
I've used it a bit. I've done some very useful stuff with it, and I've given up on other stuff and just done it manually.

What it excels at is translation. This is what LLMs were originally designed for, after all.

It could be between programming languages, like "translate this helm chart into a controller in Go". It will happily spit out all the structs and basic reconciliation logic. It gets some of it wrong, but even after correcting those bits it still saves a lot of time.
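To give a feel for what "the structs and basic reconciliation logic" means in practice, here's roughly the shape of what comes out (a minimal sketch assuming controller-runtime; MyApp and its fields are made-up names, not from any real chart):

    package controllers

    import (
        "context"

        ctrl "sigs.k8s.io/controller-runtime"
        "sigs.k8s.io/controller-runtime/pkg/client"
    )

    // MyAppSpec mirrors the values the chart used to template (hypothetical fields).
    type MyAppSpec struct {
        Replicas int32  `json:"replicas"`
        Image    string `json:"image"`
    }

    // MyAppReconciler owns the reconcile loop for MyApp objects.
    type MyAppReconciler struct {
        client.Client
    }

    // Reconcile would fetch the object, build the Deployment and Service the
    // chart used to render, apply them, and requeue as needed.
    func (r *MyAppReconciler) Reconcile(ctx context.Context, req ctrl.Request) (ctrl.Result, error) {
        // ...fetch req.NamespacedName, build the desired objects, apply them...
        return ctrl.Result{}, nil
    }

Even when it gets a field type or a bit of the update logic wrong, fixing that is much faster than typing it all out from scratch.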

And of course, if you write precise specs in English, it will translate them into code. Whether this really saves time I'm not so convinced: I still have to type those specs in English, but now what I typed is lost and what I get back is not my own words.

Of course it's good at generating boilerplate, but I never wrote much boilerplate by hand anyway.

I've found it's quite over-eager to generate swathes of code when you want to go step by step and write tests for each new bit. It doesn't really "get" test-driven development and just wants to write untested code.
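What I mean by going step by step: in that loop you write one small failing test, then only enough code to make it pass, and repeat. A rough sketch in Go (parseDuration and the whole package are made-up illustrations, kept in a single _test.go file for brevity):

    package parser

    import (
        "fmt"
        "strconv"
        "strings"
        "testing"
    )

    // Step 1: write one small failing test for the next bit of behaviour.
    func TestParseDurationSeconds(t *testing.T) {
        got, err := parseDuration("30s")
        if err != nil {
            t.Fatalf("unexpected error: %v", err)
        }
        if got != 30 {
            t.Errorf(`parseDuration("30s") = %d, want 30`, got)
        }
    }

    // Step 2: write only enough code to make it pass, then repeat for "5m", "2h", ...
    func parseDuration(s string) (int, error) {
        if !strings.HasSuffix(s, "s") {
            return 0, fmt.Errorf("unsupported duration: %q", s)
        }
        return strconv.Atoi(strings.TrimSuffix(s, "s"))
    }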

Overall I think it's without doubt amazing. But then so is a clown at a children's birthday party. Have you seen those balloon animals?! I think it's useful to remain sceptical and not be amazed by something just because you can't do it. Amazing doesn't mean useful.

I worry a lot about what's happening in our industry. Already developers get away with incredibly shoddy practices. In other industries such practices would get you struck off, licences stripped, or even sent to prison. Now we have to contend with juniors and people who don't even understand programming generating software that runs.

I can really see LLMs becoming outlawed in software development for software that matters, like medical equipment or anything that puts the public in danger. But maybe I'm being overly optimistic. I think generally people understand the dangers of an electrician mislabelling a fusebox or something, but don't understand the dangers of shoddy software.



  > people understand the dangers of an electrician mislabelling a fusebox or something, but don't understand the dangers of shoddy software
I mean, most software is covered by those big all-caps NO WARRANTY EXPRESSED OR IMPLIED / USE AT YOUR OWN RISK disclaimers.

If there were legal recourse for most software issues, you can bet the current frenzy around AI agentic coding would be approached much more carefully.

I think, as with many things, laws won't appear until a major disaster happens (or you get a president on your side *).

* https://www.onlinesafetytrainer.com/president-theodore-roose...


And there is indeed software that is not covered by those disclaimers, plenty of it in fact. It just so happens that it's a few orders of magnitude more expensive, and so you never hear about it until you're actually designing, e.g., provably safe firmware for pacemakers and the like.



