
we've never seen a profession drive themselves so aggressively to irrelevance. software engineering will always exist, but it's amazing the pace at which pressure against the profession is rising. 2026 will be a very happy new year indeed for those paying the salaries. :)




We've been giving our work away to each other for free as open source to help improve each other's productivity for 30+ years now and that's only made our profession more valuable.

I see little proof that open source has resulted in higher wages, as opposed to the fact that everything is being digitized and the resulting demand for people to assist with that.

I'm not sure how I can prove it, but ~25 years ago building software without open source sucked. You had to build everything from scratch! It took months to get even the most basic things up and running.

I think open source is the single most important productivity boost to our industry that's ever existed. Automated testing is a close second.

Google, Facebook, many others would not have existed without open source to build on.

And those giants and others like them that were enabled by open source employed a TON of people, at competitive rates that greatly increased our salaries.


25 years ago, I was slinging apps together super fast using VB6. It was awesome. It was a level of productivity few modern stacks can approach.

I'm too young to have used VB in the workforce, but I did use it in school, and honestly off that alone I'm inclined to agree.

I've seen VB namedropped frequently, but I feel like I've yet to see a proper discussion of why it seems like nothing can match its productivity and ease of use for simple desktop apps. Like, what even is the modern approach for a simple GUI program? Is Electron really the best we can do?

MS Access is another retro classic of sorts: despite having a lot of flaws, nothing seems to have risen to fill its niche other than SaaS webapps like Airtable.


You can add Macromedia Flash to that list - nothing has really replaced it, and as a result the world no longer has an approachable tool for building interactive animations.

https://www.youtube.com/watch?v=hnaGZHe8wws

This is a nice video on why Electron is the best you might be able to do.


Thanks for the link - this is a cool video. Though it seems like it's mostly focusing on the performance/"bloat" side of things. I do agree that's an annoying aspect of Electron, and I do think his justifications for it are totally fair, but I was thinking more about ease of use, especially for nontechnical people / beginners.

My memory of it is very fuzzy, but I recall VB being literally drag-and-drop, and yet still being able to make... well, acceptable UIs. I was able to figure it out just fine in middle school.

In comparison, here's Electron's getting started page: https://www.electronjs.org/docs/latest/ The "quick start" is two different languages across three different files. The number of technologies and buzzwords flying around is crazy: HTML, JS, CSS, Electron, Node, DOM, Chromium, random `charset` and `http-equiv` boilerplate... I have to imagine it'd be rather demoralizing as a beginner. I think there's a large group of "nontechnical" users out there (usually derided by us tech bros as "Excel programmers" or such) who can perfectly understand the actual logic of programming but are put off by the number of buzzwords and moving parts involved, and I don't blame them at all.
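For illustration, here's roughly the minimal Electron main process (a sketch based on their quick start; file names are whatever you pick) - and this is before index.html, package.json, and any actual UI:

    // main.js - the Node side of the app; the UI lives in a separate HTML file
    const { app, BrowserWindow } = require('electron');

    app.whenReady().then(() => {
      const win = new BrowserWindow({ width: 800, height: 600 });
      win.loadFile('index.html'); // a second file, in a second language
    });

Three files and two languages just to get a blank window on screen.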

(And sure, don't want to go in too hard on the nostalgia. 2000s software was full of buzzwords and insane syntax too, we've improved a lot. But it had some upsides.)

It just feels like we lost the plot at some point when we're all using GUI-based computers, but there's no simple, singular, default path to making a desktop GUI app anymore on... any, I think, of the popular desktop OSes?


You are totally right. Going even further back, in the days of Turbo Pascal, you could include graphics.h and get a very cool snake game going within half an hour. Today, doing anything like that takes a week of advanced stuff. Someone wanted to recreate that experience today and came up with this: https://github.com/dascandy/pixel

But as you can see, a lot of boilerplate had to be written for them to make this possible.

https://github.com/dascandy/pixel/blob/master/examples/simpl...

See the user example, then look at src for the boilerplate.

In the old days, you could easily write a full operating system from scratch on an 8051 while using PS/2 peripherals. Today, all peripherals are USB, and the USB 2.0 standard alone is 500 pages long.

I also agree that we have left behind the idea of teaching it properly, or at least removed it from the mainstream.


> 25 years ago, I was slinging apps together super fast using VB6. It was awesome. It was a level of productivity few modern stacks can approach.

If that were true, wouldn't we all be using VB today?


Ever try to maintain a bunch of specialized one-off thrown-together things like that? I inherited a bunch of MS Access apps once ...

everything old is new again


Excel (and spreadsheets in general) is not quite the same as VB but is similar in that it solves practical problems and normal people can work with it.

How are you measuring productivity?

What one can make with VB6 (final release in 1998) is very far from what one can make with modern stacks. (My efficiency at building LEGO structures is unbelievable! I put the real civil engineers to shame.)

Perhaps you mean that you can go from idea to working (in the world and expectations of 1998) very quickly. If so, that probably felt awesome. But we live in 2025. Would you reach for VB6 now? How much credit does VB6 deserve? Also think about how 1998 was a simpler time, with lower expectations in many ways.

Will I grant advantages to certain aspects of VB6? Sure. Could some lessons be applicable today? Probably. But just like historians say, don't make the mistake of ignoring context when you compare things from different eras.


Agentic coding is just another rhyme of the 25-year-old "let's outsource everything to India" frenzy. The new generation thinks that this time it's really special. Let's check again in 25 years.

Indeed it did; I remember those times. All else being equal, I still think SWE salaries on average would have been higher if we had kept it like that, given basic economics: there would have been a lot fewer people capable of doing it, but the high-ROI automation opportunities would still have been there. The fact that "it sucked" usually creates more scarcity on the supply side, which, all else being equal, means higher wages and, in our capitalist society, status. The older professions the parent comment mentions already know this and don't see SWEs as very "street smart" for disrupting themselves. I've seen articles recently along the lines of "at least we aren't in coding" from law, accounting, etc. as an anecdote to this.

With AI, at least locally, I'm seeing the opposite now: less hiring, less wage pressure, and in social circles a lot less status when I mention I'm a SWE (almost sympathy for my lot vs respect only 5 years ago). While I don't care about the status aspect (although I do care about my ability to earn money), some do.

At least locally, inflation-adjusted SWE wages in my city bought more and were generally higher compared to other professions in the 90s-2000s than onwards (ex big tech). Partly because that difficulty and low-level knowledge meant only very skilled people could participate.


Monopolizing the work doesn't work unless you have the power to suppress anyone else joining the competition, i.e. "certified developers only".

Otherwise people would have realized they can charge 3x as much by being 5x as productive with better tools while you're writing your code in notepad for maximum ROI, and you would have either adjusted or gone out of business.

Increased productivity isn't a choice, it's a result of competition. And that's a good thing overall, even if it sucks for some developers who now have to actually work for the first time in decades. But it's good for society at large, because more things can be done.


Sure - I agree with that, and I agree it's good for society, but as you state it's probably not as good for the SWE who has to work harder for the same, which was my point, and I think you agree. Other professions have done what you describe (i.e. certification) and seen higher wages than they otherwise would have, which also proves my point. They see this as the "street smart" thing to do, and generally society respects them for it, putting their profession on a higher pedestal as a result. People generally respect people who take care of themselves first, I find. Personally I think there should be a balance between the two (i.e. a fair go for all parties: a fair day's work with some job security over a standard career lifetime, but nothing extortionate).

Also, your notion of "better tools" might not have happened, or might have happened more slowly, without open source, AI, etc., which most probably would have meant higher salaries for longer. That's where I disagree with the parent poster's claim of higher salaries: AI seems to be a great recent example of "better tools" disrupting the premium SWEs enjoy rather than improving their salaries. Whether that's fair or not is a different debate.

I was just doubting the notion in the parent comment that "open source software" and "automated testing" create higher salaries. Economically, efficiency usually (with some exceptional cases) creates lower salaries for the people who are made more efficient, all else being equal; the value shifts from them to either consumers or employers.


> Other professions have done what you have stated (i.e. certification) and seen higher wages than otherwise which also proves my point.

I'd generally agree with that where safety is concerned (e.g. industrial control systems), but we manage that by certifying the manufacturer, not the individual developer. Otherwise I think it's harmful to society, even if beneficial to the individuals - but there's a lot of things falling into that bucket, and it's usually not the things we strive for at a societal level.

In my experience, getting better and faster has always translated into being paid more. I don't know that there's a direct relationship to specific tools, but I'm pretty sure that the mainstreaming of software development has caused the huge inflation of total comp that you see at many companies. If it were slow and only a handful of people could do it, but they weren't really adding a huge amount of value, you wouldn't be seeing that kind of multiplier vs the average job.


> But otherwise I think it's harmful to society, even if beneficial to the individuals

I disagree a little, in that stability/predictability for people also adds some benefit to society - constant disruption/change for the sake of efficiency at extreme levels would, I believe, be bad for mental health at the very least, and probably cause some level of outrage and dysfunction. I know that as an SWE, tbh, I'm feeling a bit of it - can't imagine if it was everyone.

I personally think there is a tradeoff; people on average have limits to adaptability in their lifetimes and so it needs to be worth it for people to invest and enter in a given profession (some level of economic profit that makes their limited time worth spending in it). It shouldn't be excessive though - it should be where both client and producer get fair/equal value for the time/effort they both need to put in.


> ex big tech

I mean, this seems like a pretty big thing to leave out, no? That's where all the crazy high salaries were!

Also, there are still legacy places that more or less build software like it's 1999. I get the impression that embedded, automotive, and such still rely a lot on proprietary tools, finicky manual processes, low level languages (obviously), etc. But those are notorious for being annoying and not very well paid.


I'm talking about what I perceive to be the median salary/conditions, with big tech being only a part of that. My point is more that I remember back in that period good salaries could be had outside big tech too, even in the boring standard companies that you mention. I remember banks, insurance, etc. paying very well for an SWE/tech worker compared to today - the good opportunities seemed more distributed. For example, contract rates for some of the developers we hire haven't really changed in 10 years. Now at best they are on par with other professional white-collar workers, and the competition seems fiercer (e.g. 5 interviews for a similar salary, with leetcode games rather than experience-based interviews).

Making software easier and more abstract has allowed less technical people into the profession, allowed easier outsourcing, meant more competition/interview prep to filter out people (even if the skills are not used in the job at all), more material for AI to train on, etc. To the parent comment's point I don't think it has boosted salaries and/or conditions on average for the SWE - in the long run (10 years +) it could be argued that economically the opposite has occurred.


even if that's true, it's clear enough AI will reduce the demand for SWEs

I don't think that's certain. I'm hoping for a Jevons paradox situation where AI drives down the cost of producing software to the point that companies that previously weren't in the market for custom software start hiring software engineers. I think we could see demand go up.

This makes sense. Imagine PHP or NodeJS without a framework, or front end development without React. Your projects would take much longer to build. The time saved with the open source frameworks and libraries is more than what an AI agent can save you.
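To make that concrete, here's a toy sketch (endpoint and ports invented) of the same trivial API endpoint in raw Node versus with an open source framework like Express:

    // raw Node http: routing, headers, and 404 handling are all on you
    const http = require('http');
    http.createServer((req, res) => {
      if (req.method === 'GET' && req.url === '/users') {
        res.setHeader('Content-Type', 'application/json');
        res.end('[]');
      } else {
        res.statusCode = 404;
        res.end();
      }
    }).listen(3000);

    // with Express, the same endpoint is a couple of lines
    const express = require('express');
    const app = express();
    app.get('/users', (req, res) => res.json([]));
    app.listen(3001);

Multiply that gap across auth, sessions, templating, and validation, and the time saved dwarfs what an agent saves you.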

> we've never seen a profession drive themselves so aggressively to irrelevance.

Should we be trying to put the genie back in the bottle? If not, what exactly are you suggesting?

Even if we all agreed to stop using AI tools today, what about the rest of world? Will everybody agree to stop using it? Do you think that is even a remote possibility?


Does the rest of the world want to make money in a way not involving digging ditches? I feel like people from developing countries who spend 18 hours a day studying, giving their entire childhood to some standardized test, may not want to be rewarded with no job prospects. Maybe that's a crazy position.

Don't care have too much to do must automate away my today responsibilities so I can do more tomorrow trvst the plqn

Software Engineers will still exist.

Software Devs not so much.

There is a huge difference between the two and they are not interchangeable.


Good luck convincing the new overlords.

Your take is this meme https://knowyourmeme.com/memes/dig-the-fucking-hole.


sorry i don't speak meme

Also, it really baffles me how many are actually on the hype train. It's a lot more than the crypto bros back in the day. Good thing AI still can't reason and innovate. Also, leaking credentials is a felony in my country, so I also won't ever attach it to my codebases.

I think the issue is folks talk past each other. People who find coding agents useful or enjoyable are labeled "on the hype train," and folks for whom coding agents don't work, or don't fit their workflow, are considered luddites. There are an incredible number of contradictory claims and predictions out there as well, and I believe what we see is folks projecting their reaction to some amalgamation of them onto others. I see a lot of "they" language, and a lot of viral articles about business leadership "shoving AI down our throats," and it becomes a divisive issue, like the American political scene, with really no one having a real conversation.

I think the reason for the varying claims and predictions is that developers have wildly different standards for what constitutes working code. For the developers with a lower threshold, AI is like crack to them, because gen AI's output is similar to what they would produce, and it really is a 10x speedup. For others, especially those who have to fix and maintain that code, it's more like a 10x slowdown.

Hence, in the same thread, you have some developer who claims that Claude writes 99% of their code and another developer who finds it totally useless. And of course others who are somewhere in the middle.


Have you considered that it's a bit dismissive to assume that developers who find use out of AI tools necessarily approve of worse code than you do, or have lower standards?

It's fine to be a skeptic. Or to have tried out these tools and found that they do not work well for your particular use case at this moment in time. But you shouldn't assume that people who do get value out of them are not as good at the job as you are, or are dumber than you are, or slower than you are. That's just not a good practice and is also rude.


I never said anything about anyone being worse or dumber, and definitely not slower. And keep in mind "worse" is subjective - if something doesn't require edge-case handling or correctness, and bugs can be tolerated, then something with those properties isn't worse, is it?

I'm just saying that since there is such a wide range of experiences with the same tools, it's probably likely that developers vary on their evaluations of the output.


Okay, I certainly agree with you that different use cases can dictate different outcomes when using AI tooling. I would just encourage everyone who thinks similar to you to be cautious about assuming that someone who experiences a different result with these tools is less skilled or dealing with a less difficult use case - like one that has no edge cases or has greater tolerance for bugs. It's possible that this is the case, but it is just as possible that they have found a way to work with these tools that produces excellent output.

Yeah I agree, it doesn't really have to do with skill or different use cases, it's just what your threshold is for "working" or "good".

There's also the effect of different models. Until the most recent models, especially for concise algorithms, I felt it was still easier to sometimes do it myself (i.e. a good algo can be concise/more concise than a lossy prompt) and leave the "expansion/repetitive" boilerplate code to the LLM. At least for me the latest models do feel like a "step change" in that the problems can be bigger and/or require less supervision on each problem depending on the tradeoff you want.

It's all a hype train though. People still believe the "AI is gonna bring utopia" bullshit while the current infra is being built on debt. The only reason it still exists is that all these AI companies believe in some kind of revenue beyond subscriptions. So it's all about:

Owning the infrastructure and enshittifying (ads) once enough products are based on AI.

It's the same chokehold Amazon has on its vendors.


Hard to have a conversation when often the critics of LLM output receive replies like "What, you used last week's model?! No, no, no, this one is a generational leap"

Too many people are invested in AI's success to have a balanced conversation. Things will return to normal after a market shakeout of a few larger AI companies.


On HN I think you overestimate the number of optimists who are optimists because they have some vested interest. Everyone everywhere arguably has a vested interest. I would also argue that all of the folks on HN who are hostile and dismissive of coding agents also have a vested interest (just for the sake of contrasting your argument). If coding agents were really crappy I wouldn't be using them, just like I didn't use them until the end of 2025.

What conversation is hard to have? If you mean trying to convince people coding agents can or cannot do a specific thing then that may never go away. If you take an overall theme or capability, in some cases it will “just work” and in other cases it needs some serious steering or scaffolding, and in other cases it will just waste as much time as you will let it. It’s an imperfect tool and it may always be, and two people insisting it can do something and it cannot do that same thing may both be right.

What is troubling to me is the attitude of folks who are heavily hostile towards these models and the people who use them. People routinely conflate market promises with actually delivered tools and capabilities, and lump people who enjoy and get lots of mileage out of these tools into what appears to be a big strawman camp of fawning fans who don't understand or appreciate Real Software Engineering; people who would write bad code anyway and not know it. It's quite insulting, but also wrong. Not saying you are part of this camp! But as one lonely optimist in a sea of negativity, that's certainly the perspective I've developed from the "conversations" I've seen on HN.


your credentials shouldn't be in your codebase to begin with!

.env files are a thing in tons of codebases

but that's at runtime; secrets are going to be deployed in a secure manner after the code is released

.env files are used during development as well; for some things like PayPal you don't have to change the credentials, you just enable sandbox mode. If I had some LLM attached to my codebase, it would be able to read those credentials from the .env file.

This has nothing to do with deployment. I never talked about deployment.


If you have your PayPal creds in your repository, you are doing it wrong.

.gitignore is a thing
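A minimal sketch of the usual pattern (variable name hypothetical): list `.env` in `.gitignore` so it never lands in the repo, and have the code read secrets from the environment at runtime:

    // config.js - secrets come from the environment, never the repo
    require('dotenv').config(); // the 'dotenv' package loads .env if present
    const paypalClientId = process.env.PAYPAL_CLIENT_ID; // hypothetical name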

Which every AI tool I’m aware of respects and ignores by default.

Why is it that they can add new env variables then?

It is trivial to append to files without reading them. Also, no AI provider even wants your secrets, they are a liability. Do whatever you want though, I'm not here to convince you of anything.
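For example, in Node a tool can add a variable with a blind append, never reading what's already in the file (sketch; the variable name is made up):

    const fs = require('fs');
    // appends a new line without ever reading the existing contents
    fs.appendFileSync('.env', '\nNEW_SERVICE_URL=https://example.test\n');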

If your secrets are in your repo, you've probably already leaked them.


