
And that is the best thing about AI: it allows you to do and try so much more in the limited time you have. If you have an idea, build it with AI, test it, see where it breaks. AI is going to be a big boost for education, because it allows for so much more experimentation and hands-on learning.


By using AI, you learn how to use AI, not necessarily how to build architecturally sound and maintainable software. Being able to do much more in a limited amount of time will not necessarily make you a more knowledgeable programmer; at best that knowledge will most likely be surface-level pattern recognition. It still needs to be combined with hands-on building of your own thing to truly understand the nuts and bolts of such projects.


If you end up with a working project where you understand all the moving parts, I think AI is great for learning, and the ultimate proof of whether the learning was successful is whether you can actually build (and ship) things.

So human teachers are good to have as well, but I remember they were of limited use to me when I was learning programming without AI. They tried to teach me so many concepts without having understood them themselves first. AI would likely have given me better answers than "because that is how you do it" when I asked why something should be done a certain way.

So obviously I would have preferred competent teachers all the time, and even now I'd prefer competent teachers with unlimited time over faulty AIs for the students, but in reality human time is limited and humans are flawed as well. So I don't see the doomsday expectations for the new generation of programmers. The ultimate goal, building something that works to spec, has not changed, and horrible unmaintainable code was also shipped 20 years ago.


I don't agree. To me, switching from hand-coded source code to AI-coded source code is like going from a hand saw to an electric saw for your woodworking projects. In the end you still have to know woodworking, but you experiment much more, so you learn more.

Or maybe it's more like going from analog photography to digital photography. Whatever it is, you get more programming done.

Just like when you went from assembly to C to a memory-managed language like Java. I did some 6502 and 68000 assembly over 35 years ago; now nobody knows assembly.


> to me

Key words there. To you, it's an electric saw, because you already know how to program, and that's the other person's point: it doesn't necessarily empower people to build software. You? Yes. Generally, though, when you hand the public an electric saw and say "have at it, build stuff," you end up with a lot of lost appendages.

Sadly, in this case the "lost appendages" are going to be man-decades of time spent undoing all the landmines vibecoders are going to plant around the digital commons. Which means AI even fails as a metaphorical "electric saw", because a good electric saw should strike fear into the user by promising mortal damage through misuse. AI has no such misuse deterrent, so people will freely misuse it until consequences swing back wildly, and the blast radius is community-scale.

> more like going from analog photography to digital photography. Whatever it is, you get more programming done.

By volume, the primary outcome of digital photography has been a deluge of pointless photographs to the extent we've had to invent new words to categorize them. "selfies". "sexts". "foodstagramming". Sure, AI will increase the actual programming being done, the same way digital photography gave us more photography art. But much more than that, AI will bring the equivalent of "foodstagramming" but for programs. Kind of like how the Apple App Store brought us some good apps, but at the same time 9 bajillion travel guides and flashlight apps. When you lower the bar you also open the flood gates.


Being able to do it quicker and cheaper will often mean more people learn the basics. Electric tools open up woodworking to more people, and the same goes for digital photography: more people take the effort to learn the basics. There will also be many more people making rubbish, but is that really a problem?

With ai it’s cheap and fast for a professional to ask the AI: what does this rubbish software do, and can you create me a more robust version following these guidelines.


> With ai it’s cheap and fast for a professional to ask the AI: what does this rubbish software do, and can you create me a more robust version following these guidelines.

This falls apart today with sufficiently complex software and also seems to require source availability (or perfect specifications).

One of the things I keep an eye out for, in terms of whether LLMs have actually cracked large-product complexity yet (vs. human-overseen patches or greenfield demos), is exactly the sort of re-implementation-and-improvement you describe. Like a greenfield Photoshop substitute.


Your last point is also something that happened when the big game engines such as Unity became free to use. All of a sudden, Steam Greenlight was getting flooded with gems such as "potato peeling simulator" et al. I suppose it is just a natural side effect of making things more accessible.


> Sadly, in this case the "lost appendages" are going to be man-decades of time spent undoing all the landmines vibecoders are going to plant around the digital commons.

Aren't you being overly optimistic that these would even get traction?


Pessimistic, but yeah. It's just my whole life has been a string of the absolute worst ideas being implemented at scale, so I don't see why this would buck the trend.


> By using AI, you learn how to use AI, not necessarily how to build architecturally sound and maintainable software

> will not necessarily make you a more knowledgeable programmer

I think we'd better start separating "building software" from programming, because the act of programming is going to continue to get less and less valuable.

I would argue that programming has been overvalued for a while, even before AI, and the industry believes its own hype with a healthy dose of elitism mixed in.

But now AI is removing the facade and showing that the idea and the architecture are actually the important part, not the coding of it.


I find it super ironic that you talk about "the industry believing its own hype" and then continue with a love letter for AI.


Ok. But most developers aren't building AI tech. Instead, they're coding a SPA or CRUD app or something else that's been done 10000 times before, but just doing it slightly differently. That's exactly why LLMs are so good at this kind of (programming) work.


I would say most people are dealing with tickets, and meetings about the tickets, more than they are actually spending time in their editor. The work may be similar, but that 1 percent difference needs to be nailed down right, as that's where the business's lifeline lies.

Also, not all dev jobs are web tech or AI tech.



