> A company of 100 engineers should probably have 10-20% of the team allocated to just internal tools and making things go faster.
But beware of Jevons paradox.
Say that, e.g., a software project has 10 developers and each build takes ~15 minutes. Most developers would take at least some care to check their patches, understand how they work, etc., before submitting. And then discuss follow-on steps with their team over a coffee.
Now imagine near-instant builds. A developer could copy-paste a fix & hit "submit", see "oh, that doesn't work, let's try something else", and repeat. That probably wouldn't improve codebase quality; it would just increase the number of non-functional patches that can be tested & rejected in a given time span. In other words: be careful what you wish for.
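A back-of-the-envelope sketch in Python (all numbers hypothetical) of how the churn scales as build times shrink:

```python
# Back-of-the-envelope: how many submit/reject cycles fit in an hour
# as build times shrink. Numbers are made up for illustration.

BUILD_MINUTES = [15, 5, 1, 0.1]   # assumed build durations
THINK_MINUTES = 2                 # assumed time actually spent thinking per attempt

for build in BUILD_MINUTES:
    cycles_per_hour = 60 / (build + THINK_MINUTES)
    print(f"{build:>4} min build -> ~{cycles_per_hour:.0f} cycles/hour")
```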
Nothing new in the article imho. But it's a nice overview of what content creators are facing, and what to look for when carving out a niche.
The #1 point really: have access to data / experiences / expert knowledge that's unique & can't be distilled from public sources and/or scraped from the internet. This has always been the case. It just holds more weight when AI agents are everywhere.
That's what is expected to finally kill Moore's law: the economics. At some point it'll still be technically possible to fabricate smaller IC structures, stack more layers etc, but the tech to do so (and fabs to do it at scale) will be costly enough that it's just not worth it.
The other point is of course that a next-gen fab first needs to be built and its yields brought up, while the previous-gen fab already exists, with all the fine-tuning done & kinks ironed out. Not to mention that many applications simply don't need complex ICs (the typical 32-bit uC comes to mind, but even 8-bit ones are still around).
> Still amazes me how much we did with only 512MB.
Or the other way around: still amazes me how many GBs today's machines need to do conceptually simple things. Things that were done ages ago (successfully, if not better) on much, much, much lower-powered kit. Never mind CPU speed.
E.g. GUIs have been done on machines with a few hundred KB of RAM. 'Only 512MB' is already >1000x that.
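For the arithmetic (picking a hypothetical 384 KB machine as "a few hundred KB"):

```python
# Sanity-check the ">1000x" claim: 512 MB vs. "a few hundred KB".
gui_era_ram_kb = 384          # hypothetical figure for an old GUI machine
modern_ram_kb = 512 * 1024    # 512 MB expressed in KB
print(f"512 MB is ~{modern_ram_kb / gui_era_ram_kb:.0f}x {gui_era_ram_kb} KB")
# -> 512 MB is ~1365x 384 KB
```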
> Imo there's an identifiable core common to all of these kinds of package managers (..)
Indeed. It's hard to see why, e.g., a programming language would need its own package management system.
Separate mechanics from policy. Different groups of software components in a system could have different policies for when to update what, what repositories are allowed, etc. But then use the same (one, the system's) package manager to do the work.
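A minimal sketch of what that separation could look like (all names here are hypothetical, not any real package manager's API):

```python
# Minimal sketch of "mechanics vs. policy": one shared install mechanism,
# with per-group policies deciding which repos it may use and when to update.
from dataclasses import dataclass

@dataclass
class Policy:
    group: str                # e.g. "base-system", "dev-deps"
    allowed_repos: list[str]  # repositories this group may pull from
    auto_update: bool         # whether unattended updates are allowed

class PackageManager:
    """The single, system-wide mechanism: resolve, fetch, install."""
    def install(self, package: str, policy: Policy) -> None:
        repo = policy.allowed_repos[0]  # repo selection elided in this sketch
        print(f"[{policy.group}] installing {package} from {repo} "
              f"(auto-update {'on' if policy.auto_update else 'off'})")

pm = PackageManager()
pm.install("openssl", Policy("base-system", ["distro-main"], auto_update=True))
pm.install("left-pad", Policy("dev-deps", ["lang-mirror", "distro-main"], auto_update=False))
```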
It is easy to see why the system package managers that are in use are not sufficient. The distro packages are too bureaucratic and too opinionated about all the wrong things.
I don't disagree that it would be nice if there was more interoperability, but so far I haven't seen anyone even try to tackle the different needs that exist for different users. Heck, we really haven't seen anyone trying to build a cross-language package system yet, and that should be a lot easier than trying to bridge the development-distro chasm.
Randall's license text is really chill; normally, to give attribution to an image from Commons, there would be more to do, but currently the article does hotlink to the page on xkcd.com, and that is clearly "stated in the LICENSE file".
> Then for another 3 day hike where I used it to boil water for two persons it took so long that other people were done eating where we started.
There's quite a spread between the heating output of alcohol stoves: number of holes, alu vs. Ti vs. brass, filler materials (if any). Some have simmer rings, some don't. Outside temps matter too.
The trick is to use it a number of times before you go out camping / backpacking, so that you're familiar with its behavior.
Disclaimer: I cook daily on a deluxe model (Origo 3000). It's the safest method to cook on a boat.
Propane/butane burners are easier to regulate, but these gases have the nasty habit of settling on the floor (they're heavier than air). So a leak could cause a deadly explosion (which happens semi-regularly).
That Origo: I could flip the whole thing over while burning & it wouldn't start a fire. Can't remove the burner from the stove while it's on due to a safety catch.
Also have a Trangia stashed somewhere (just the burner, not the pots/holder). Also used many times.
Both are Swedish designs & highly recommended. Cheap/ubiquitous fuel is a big plus too. Sadly the Origo isn't made anymore, afaik.
Most of the stoves used for backpacking have the same basic design. I have two, an Esbit and a Toaks; the latter is lighter (it's titanium), but both have the same design as a small Trangia or the one in the article.
It takes 7-10 min to boil the water I need for food + a coffee, while with my gas burner it takes 2-3 min.
One issue I forgot to mention is that it's almost impossible to get the remaining fuel in the burner back into the container.