
A philosophical question. Will software in the future be executed entirely by an LLM-like architecture? For example, the control loop of an aircraft control system processed purely from prompt inputs (sensors, state, history, etc.). No dedicated software, just 99.999% deterministic, ultra-fast, reliable LLM output.
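Purely as a thought experiment, the shape might be something like this. callModel is a hypothetical stand-in for such an ultra-fast, near-deterministic model; nothing here is a real API.

    // Thought experiment only: callModel stands in for a hypothetical ultra-fast, near-deterministic LLM.
    declare function callModel(prompt: string): Promise<string>;

    interface SensorFrame { altitude: number; pitch: number; roll: number; airspeed: number; }
    interface ControlOutput { elevator: number; aileron: number; throttle: number; }

    async function controlTick(frame: SensorFrame, history: SensorFrame[]): Promise<ControlOutput> {
      // Sensor state and recent history are serialized straight into the prompt...
      const prompt = JSON.stringify({ frame, history: history.slice(-10) });
      // ...and the model's text output is parsed back into actuator commands.
      return JSON.parse(await callModel(prompt)) as ControlOutput;
    }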

It's not a pattern engine. It's an association prediction engine.

Running them in a loop with context, summaries, memory files, or whatever you like to call them creates a different story, right?
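Something like this, as a rough sketch. callModel is a hypothetical stand-in, not a real SDK call, and "memory" is just a running summary kept between turns.

    // Rough sketch only: callModel is a hypothetical stand-in, not a real SDK call.
    declare function callModel(prompt: string): Promise<string>;

    async function agentLoop(task: string, maxTurns = 20): Promise<string> {
      let memory = "";      // the running summary / "memory file"
      let lastOutput = "";
      for (let turn = 0; turn < maxTurns; turn++) {
        lastOutput = await callModel(`Task: ${task}\nMemory:\n${memory}\nLast output:\n${lastOutput}`);
        if (lastOutput.includes("DONE")) break;
        // Compress everything seen so far into a fresh summary for the next turn.
        memory = await callModel(`Summarize for future turns:\n${memory}\n${lastOutput}`);
      }
      return lastOutput;
    }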

what kind of question is that

What about the client? I find the Claude client better at planning, making the right decision steps, etc. It seems a lot of the work is also in the CLI tool itself, especially in feedback loop processing (reading logs, browsers, consoles, etc.).

What about companies in general? I mean US companies. Aren't they all Google-like or worse?

Some are more evil than others.

Pure client-side rendering is the only way to get max speed with the lowest latency possible. With SSR you always have bigger payloads or double network round trips.

You're literally downloading a bunch of stuff first just to do a first paint, versus just sending back already built and styled HTML/CSS. Not only is client-only technically slower, it's also perceptibly slower.

That’s a laughable claim. SSR is objectively faster, since the client does nearly zero work other than downloading some assets. If the responses are pre-computed and sitting in server memory waiting for a request to come along, no client side rendering technique can possibly beat that.
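What I mean by pre-computed responses sitting in server memory, as a rough Node-flavored sketch; the routes and page contents are purely illustrative.

    import { createServer } from "node:http";

    // Pages are rendered once (or on a schedule) and kept in memory as finished HTML.
    const cache = new Map<string, string>();
    cache.set("/", "<!doctype html><html><body><h1>Home</h1></body></html>");

    createServer((req, res) => {
      const html = cache.get(req.url ?? "/");
      if (html) {
        // Nothing is computed per request: bytes already sitting in RAM are written out.
        res.writeHead(200, { "content-type": "text/html" });
        res.end(html);
      } else {
        res.writeHead(404);
        res.end();
      }
    }).listen(8080);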

Of course there are cases where SSR makes sense, but servers are slow; the network is slow; going back and forth is slow. The browser on modern hardware, however, is very fast. Much faster than the "CPU"s you can get for a reasonable price from data centers/colos. And they're mostly idle and have a ton of memory. Letting them do the work beats SSR. And since the logic must necessarily be the same in both cases, there's no advantage to be gotten there.

If your argument is that having the client do all the work to assemble the DOM is cheaper for you under the constraints you outlined then that is a good argument.

My argument is that I can always get a faster time to paint than you if I have a good cluster of back-end services doing all that work, instead of offloading it all to the client (which will then round-trip back to your “slow servers over a slow network” anyway to get all the data).

If you don’t care about time to paint under already high client-side load, then just ship another JS app, absolutely. But what you’re describing is how you deliver something as unsatisfying as the current GitHub.com experience.


Idk. My applications are editor-like. So they fetch a bit of data, but rendering the edit options in HTML is much larger in size than the data, especially since there are multiple views on the same data (see the sketch below). So that would put a larger burden on the server and make network transfer slower. Since generating the DOM in the browser is quite fast (there's no high client-side load; I don't know where you get that from), I've got good reason to suppose it beats SSR in my case.

Mind you, I've got one server with 4 CPUs and 8 GB of memory that can run 2 production and 10 test services (and the database for all of them), and the average load is 0.25 or so. That means it responds quickly to requests, which also has its advantages.
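To make the "small data, multiple views" point concrete, a rough sketch; the record shape and field names are made up.

    // Illustrative only: one small record rendered into two different edit views in the browser.
    interface Item { id: number; title: string; done: boolean; }

    const renderRow = (it: Item): string =>
      `<tr><td>${it.title}</td><td><input type="checkbox" ${it.done ? "checked" : ""}></td></tr>`;

    const renderForm = (it: Item): string =>
      `<form data-id="${it.id}">
         <label>Title <input name="title" value="${it.title}"></label>
         <label>Done <input name="done" type="checkbox" ${it.done ? "checked" : ""}></label>
       </form>`;

    // The JSON payload is tiny; the markup generated from it is several times larger,
    // and both views come from the same fetch with no extra round trip.
    const item: Item = { id: 1, title: "Write report", done: false };
    document.querySelector("#table-view")!.innerHTML = `<table>${renderRow(item)}</table>`;
    document.querySelector("#form-view")!.innerHTML = renderForm(item);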


That makes sense. And btw when I say “already high client load”, my assumption is that most users have 50 other tabs open :)

> objectively faster

> provides zero evidence


Some pretty compelling evidence is history: we had dynamic and interactive web pages 20 years ago that were faster on computers that were an order of magnitude slower.

I don’t really need to provide “evidence”. I told you why SSR is faster and tbh idc if your time to paint is trash.

Still living in the early 2000s, eh? Pretty much all interactive, responsive apps are 100% client-side rendered. Your claim about SSR being objectively faster looks like a personal vendetta against client-side rendered apps. Or JavaScript. Happy days!

It was faster then and it’s still faster now. Of course, you’d have to learn how a computer works to know that I’m right, but that would be a bridge too far for JavaScript casuals! Just add one more library bro! Then you’ll be able to tell if a number is even or odd!

At least my prediction is accurate after all :)

Confidently wrong, I like it!

Nothing gets worse in computers. Name me one thing. And if the current output quality of LLMs stays the same but speed goes up 1000x, the quality of the generated code can be higher.

Hot keys. Used to be, you could drive a program from the keyboard with hotkeys and macros. No mouse. The function keys did functions. You could drive the interface blindfolded, once you learned it. Speed is another one. Why does VSCode take so long to open, and use so much memory and CPU? It's got a lot of features for a text editor, but it's worse than vim/emacs in a lot of ways.

Boot time.

Understandability. A Z80 processor was a lot more understandable than today's modern CPUs. That's worse.

Complexity. It's great that I can run Python on a microcontroller and all, but boring old C was a lot easier to reason about.

Wtf is a typescript. CSS is the fucking worst. Native GUI libraries are so much better but we decided those aren't cool anymore.

Touchscreens. I want physical buttons that my muscle memory can take over and get ingrained in and on. Like an old stick shift car that you have mechanical empathy with. Smartphones are convenient as all hell, but I can't drive mine after a decade like you can a car you know and feel, that has physical levers and knobs and buttons.

Jabber/Pidgin/XMPP. There was a brief moment around 2010? when you didn't have to care what platform someone else was using, you could just text with them on one app. Now I've got a dozen different apps I need to use to talk to all of my friends. Beeper gets it, but they're hamstrung. This is a thing that got worse with computers!

Ever hear of Wirth's law? https://en.wikipedia.org/wiki/Wirth%27s_law

Computers are stupid fast these days! Why does it take so long to do everything on my laptop? My Mac's Spotlight index is broken, so it takes roughly 4 seconds to query the SQLite database or whatever just so I can open Preview.app. I can open a terminal and launch it myself in that time!

And yes, these are personal problems, but I have these problems. How did the software get into such a state that it's possible for me to have this problem?


> Wtf is a typescript.

A godsend.

> Native GUI libraries are so much better but we decided those aren't cool anymore.

Lolno.


Native GUI libs look like shit out of the box, and are terrible to work with when you want to make something that doesn't look like out-of-the-box tkinter/Swing/Qt/WinForms Windows 95-looking crap.

Software has gotten considerably worse with time. Windows and macOS are basically in senescence from my point of view. They haven't added a feature I've wanted in years, but manage to make my experience worse year to year anyway.

CPU vulnerability mitigations make my computer slower than when I bought it.

Computers and laptops are increasingly not repairable. So much ewaste is forced on us for profit.

The internet is a corporate controlled prison now. Political actors create fake online accounts to astroturf, manipulate, and influence us.

The increasing cost of memory and GPUs makes computers no longer affordable.


Windows

Model improvement. But certainly also the CLI tool itself. That's where all the planning takes place.

BBS. Downloaded the first shareware version of Doom. Was it 4 MB or something? I remember I had like 5 kB/s and paid 5 cents a minute. My parents weren't happy in those days. Now they are :)

Doom Shareware was originally just 2 MB or so, enough to fit on two floppies: https://doomwiki.org/wiki/Shareware

BBSes were a little off on a tangent if you were actually using the internet itself, which I was (or at least, I was pretty shortly after a few months on JANET (UK) and Bitnet (US & Israel)).

The end of traditional news sites is coming, at least for the newspaper websites. Future MCP-like systems will generate news sites on the fly in your desired style and with your desired content. Journalists will have some kind of pay-per-view model provided by these GPT-like platforms, which of course will take too big a chunk. I can't imagine the WSJ being able to survive.

