Agree. I want rock-solid Linux compatibility with Mac-like hardware quality and battery life, plus ThinkPad-like toughness and keyboard. I don't really need it to be upgradable as long as it lasts 8 years.


IMHO I don't think people are considering what you lose when you can't upgrade. You get locked into an artificially created device life cycle that's dictated by the manufacturer.

I understand where you are coming from; I guess it just makes me sad to see more and more people moving toward tech that is less in their control. And I consider upgradability and modularity an important aspect of that.


We never had anything different, though. Computers always became so obsolete after a while that there was no longer any point in trying to upgrade them. I think I got eight years out of my 1997 Power Mac G3, including a CPU upgrade to a G4, RAM upgrades, hard disk upgrades, a video card, and USB expansion, but then the new machines coming out were just so much better that throwing money into more upgrades was just tossing it into a black hole.


Maybe in the late 90s and early 2000s. These days, hardware from over a decade ago works fine. I am typing this comment on a 2011 Dell E6410. Install Debian or Arch Linux and the machine is surprisingly capable: with htop running right now I am using 2.5 GB of RAM (out of 8 GB) and the CPU is at 2%.
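
For anyone curious, you can pull the same numbers htop shows from a script. A minimal Python sketch, assuming the third-party psutil package is available (an assumption on my part, not something mentioned above):

    import psutil

    # Snapshot of system memory, reported in bytes
    mem = psutil.virtual_memory()
    print(f"RAM used: {mem.used / 2**30:.1f} GiB of {mem.total / 2**30:.1f} GiB")

    # Average CPU utilisation sampled over one second
    print(f"CPU load: {psutil.cpu_percent(interval=1.0):.0f}%")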

TBH, I have a Ryzen 5950X-based tower, and while it is faster than my previous desktop, which was an i7-4790K (or whatever it is), the previous machine would have been fine. I am not even sure why I upgraded.


I guess it's a byproduct of a faster-moving curve with improved technology. 20 years ago you didn't need to replace the entire platform for at least 10 years.


20 years ago I was hopping from Intel to AMD and then back to Intel. After that, practically every decent jump in CPU performance on the Intel side of things meant a new socket (LGA775, 1156, 1155, 1150, 1151...). AMD typically kept sockets around for a bit longer but wasn't as competitive until Ryzen, which had a few jumps in chipset compatibility of its own.

In the last 20 or so years, if I wanted a CPU a few years newer for whatever reason, it usually meant I needed a whole new motherboard, and that often (but not always) also meant new RAM.


Is it artificial, though, really? You buy whatever is available now, it eventually becomes obsolete, and you have to buy a new one. Maybe there is some kind of multi-vendor collusion going on, but it doesn't seem that likely.

Where I think repairability really makes sense is in things that don't materially improve and should last 30 years (e.g. appliances).


If there's an ability to upgrade my GPU 3 years in but I can't, then yes, it's artificial. We just got way too comfortable with the mentality of throwing everything out and getting new cheap tech over time.

I guess the one good thing AI is doing for this scene is making people value what they have more.


Who is "we"?


I'm pretty sure part of the reason for integrating everything onto the board is nefarious, at least in laptops. Louis Rossmann talked about a design flaw in Apple MacBooks where, if the SSD fails, in many cases your system will fail to power up, because the mainboard is designed to fail when the SSD fails (if I am interpreting that correctly) [0]. Remember, this flaw is in the MacBooks where the SSDs are soldered onto the board. IMHO there are ways to design integrated hardware such that failures minimize damage, and I think many companies decide it's not in their best interest to design hardware that way. IMHO this is done in bad faith.

[0]: https://www.youtube.com/watch?v=0qbrLiGY4Cg


My partner works on AI with a MacBook and has told me how great Apple silicon is, and how their MacBook would run so many things so well... except it doesn't have enough RAM and there's no way to upgrade it.


I'm in the same boat as your partner, except that I generally max out the RAM in my laptop when buying it.

The thing is, it would probably be the same issue with a Framework or any other brand of laptop, as they all have some final limit on RAM or GPU RAM.

If you upgrade the GPU or motherboard you have to ask what will happen to the old one. You can reuse some of them but most probably will just be e-waste.

There's a chance that when upgrading a whole laptop, the old one will find a new use somewhere.


I'm a hoarder, so I'd just keep it around. I still have my PlayStation 2, after all.

Every laptop except my first college one is also somewhere around my house, even my $300 high school laptop that could really only run Microsoft Word (I remember running Fallout 3 on it at lowest settings at a brisk 10 fps). Even from that college laptop I salvaged the storage, RAM, and disk drive.


This is exactly what I want.



