I’m surprised. They were massive names in the OS scene until recently. Their lineage traces back to Solaris, and they share many of the cool technologies that Sun developers pioneered. So much so that for the first few years of SmartOS, Linux felt like a hobbyist platform in comparison due to its lack of dtrace, containerisation, ZFS, etc.
Linux has come a long way since, which is a large part of the reason why SmartOS has become less relevant. The latter is a great shame, because competition breeds innovation and we are losing a lot of interesting Unixes from the public consciousness.
Edit: oh come on. I post this and it almost immediately gets negative karma despite being both factual and informative. A perfect example of the rife abuse of peer moderation on this site. I honestly don’t think I’ll bother wasting my time on here any more.
The downvotes might be for the slightly bombastic tone. SmartOS and illumos never felt “massive” to me, particularly compared to Linux. They might have had some nice tooling inherited from Solaris, but they were never particularly appealing to people who weren’t invested in the Solaris ecosystem – which had already been effectively wiped out by Linux by the time illumos and SmartOS appeared.
Or maybe the downvotes are related to the rather fanatical behaviour of some readers of this thread – I got downvoted for mentioning I hadn’t heard of those OSes, only to be then voted up; and downvoted elsewhere when I told somebody to cheer up because they had been downvoted for essentially making the same comment. It’s pretty frustrating, to be honest: “try to cheer somebody up, get punished for it” isn’t the way things are meant to be.
+1 for your reply and to cheer you up. I also got down-voted to hell for posting that rather innocuous comment — don’t let it get to you.
As for the element of surprise, not everybody moves in the same circles. You might be deep into server-grade operating systems, or DevOps, or other branches of information technology that are far from my daily experience. I haunt the areas associated with theoretical computer science, applied mathematics, finance, and the ins-and-outs of the (European) payments system. Each of our haunts is so vast that it’s easy to be erudite in general and yet still pretty ignorant of things others take for granted.
Your comment reminds me a bit of the relationship between Clojure and JavaScript.
Back in 2014, Clojure-based web development had all kinds of groundbreaking ideas that massively improved quality of life: immutable data structures, hot code reloading (figwheel), decoupling of state and view (devcards), with all the benefits that come with functional programming.
The JavaScript ecosystem has since closed the gap and has a lot of great libraries or tools that do these things, and it's just _so popular_ that it's hard to make the business case for Clojure(Script).
Personally, I still think Clojure has a lot of really smart people contributing to it. It's exciting to see tools like libpython-clj enable Carin Meier's work around leveraging bleeding-edge Python Machine Learning libraries in Clojure, and something in my gut tells me that Rich and Co. are on the right track with spec while the JS community pursues TypeScript instead.
It's true that every project needs a community if it's going to survive, and because the internet makes it so easy to share ideas, being the birthplace of good ideas is not enough to ensure your survival.
I'll be interested to see what the landscape looks like in another half a decade!
I did a lot with SmartOS and, eventually, Linux got good enough. But also, Kubernetes won everything, ZFS ended up in everything, and dtrace ended up in most things. It was a thing of beauty but I can't say I ever felt like I properly understood it.
I have nothing against buying cases but let’s be honest, if the answer to a genuine solvable hardware problem is to buy more hardware then you have to sympathise with those who complain about the insanity of the current market.
Why commit to small variations across an entire inventory/supply chain for a very complex and expensive device, when a thin wrapper solves the problem?
Phones are thin! The case makes them reasonably sized. Treat a naked phone the same as you would a naked human: unsuitable and unprepared to go outside.
But if phones were built to be more durable, would they still end up smaller than a phone in a thick case? Or could they use the extra space a case takes up for larger batteries, etc.?
I never get a phone case. I’ve dropped my iPhone XS I don’t know how many times. Not a single scratch or crack on it. Pro tip: needing a case is a marketing gimmick (“it’s made of glass!”).
KDE 4 was released in 2008 and has been stable and consistent for more than a decade. That predates Windows 7. So in the time KDE has been consistent, you’ve had the migration from XP or Vista to Win7, then Windows 8, and now Windows 10. They’ve all brought massive changes to the UI experience.
If you don’t like KDE then use LXDE or Enlightenment or any of the other Linux desktop environments that have been pretty static (and have been for even longer than KDE).
So yeah, there actually is a lot of choice on Linux and not all of it looks dated.
MATE desktop has been fairly stable (the default green-tinted icon set is gross but fixable). I switched to it after Ubuntu LTS, and Debian on my other computer, dropped GNOME 2. I just didn’t like GNOME 3 or Unity. Cinnamon was okay, but at the time it was still coupled to the brain-dead ideas the GNOME team had been pursuing. I haven’t checked it out recently enough to know whether it has improved.
You have to do a graceful restart after rotation, according to the docs, which then leads to scripts that try a couple times and then reboot. (It's been a while, but I did have an install where graceful restart would work once, but consistently fail on the second rotation).
If you don’t change the inode then no restart should be required. There are definitely a few tools out there that solve this problem without needing to restart httpd.
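As a minimal sketch of the copy-then-truncate approach (roughly what logrotate’s `copytruncate` option does; `access.log` here is a hypothetical path): the live log keeps its inode, so the daemon’s open file descriptor stays valid and no restart is needed.

```shell
echo "old entries" > access.log   # stand-in for a daemon writing a log
cp access.log access.log.1        # snapshot the current contents
: > access.log                    # truncate in place -- same inode, now empty
```

The caveat is a small race: any lines written between the copy and the truncate are lost, which is why rename-then-graceful-restart is usually preferred when the daemon supports it.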
If the authors are reading this: The home page has some weird width issues going on for me where content starts somewhere off the left of the screen and ends somewhere off the other end. To compound the issue, I cannot scroll to the left.
I’d go further than you and say they should be removed by default on all fields.
Want to know if a Boolean field is unset? Well it’s no longer Boolean because you now have 3 states for that field. So why not use a char, U/Y/N with the default being U?
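A minimal sketch of that idea, using Python’s built-in sqlite3 with a made-up `tasks` table (the column and value names are illustrative): a NOT NULL char defaulting to 'U' (unset), constrained to the three meaningful values.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Hypothetical schema: 'U' = unset, 'Y' = yes, 'N' = no.
conn.execute("""
    CREATE TABLE tasks (
        name TEXT NOT NULL,
        done CHAR(1) NOT NULL DEFAULT 'U' CHECK (done IN ('U', 'Y', 'N'))
    )
""")
conn.execute("INSERT INTO tasks (name) VALUES ('write docs')")
row = conn.execute("SELECT done FROM tasks").fetchone()
# row == ('U',) -- plain equality works on it, no special NULL syntax needed
```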
NULL might have made more sense 30+ years ago when systems were more resource constrained but it doesn’t make sense now for most of the problems people are trying to solve day to day. If anything, it creates more problems.
Just to be clear, I’m not saying they should be removed entirely but rather that they shouldn’t have to be explicitly disabled on every CREATE.
I will say one use case for NULL that is hugely helpful is outer joins and nested queries. However these don’t generate high performance queries so if you’re having to rely on them then you might need to rethink your database schema anyway.
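For example (a sketch with made-up tables, using Python’s built-in sqlite3): a LEFT JOIN has to produce something for rows with no match on the other side, and NULL is what fills the gap.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER, name TEXT);
    CREATE TABLE orders (customer_id INTEGER, total REAL);
    INSERT INTO customers VALUES (1, 'alice'), (2, 'bob');
    INSERT INTO orders VALUES (1, 9.99);
""")
rows = conn.execute("""
    SELECT c.name, o.total
    FROM customers AS c
    LEFT JOIN orders AS o ON o.customer_id = c.id
    ORDER BY c.id
""").fetchall()
# bob has no orders, so his total comes back as NULL (None in Python):
# rows == [('alice', 9.99), ('bob', None)]
```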
So essentially I don’t disagree with you, I just think you’re being too nice limiting your complaint to string fields.
> Well it’s no longer Boolean because you now have 3 states for that field. So why not use a char, U/Y/N with the default being U?
Well, because instead of using a type that exactly encodes the concept of "yes/no/unset" (a nullable boolean), you'd be using a type that encodes "any 1-character text", with arbitrary meaning and most of the possible values being nonsensical.
The problem is you need a boat load of additional code to cover unset. Not just in SQL (syntax is different for NULL than it is for comparing any other type) but often also in your importing language too (eg some languages will cast NULL to a nil value that can actually raise exceptions or even crash your application if not handled correctly).
Capturing those edge cases is non-trivial compared to checking the value of a char.
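As a sketch of that trap, assuming a made-up `users` table with a nullable boolean column (using Python’s built-in sqlite3, though the same three-valued logic applies in any SQL database):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, active BOOLEAN)")  # nullable by default
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [("alice", 1), ("bob", 0), ("carol", None)])

# Ordinary comparison silently skips NULL rows: this matches nobody,
# because NULL = NULL evaluates to NULL, not true.
rows = conn.execute("SELECT name FROM users WHERE active = NULL").fetchall()

# The special IS NULL syntax is required instead.
unset = conn.execute("SELECT name FROM users WHERE active IS NULL").fetchall()
# unset == [('carol',)]

# And the naive negation quietly drops carol too.
inactive = conn.execute("SELECT name FROM users WHERE active != 1").fetchall()
# inactive == [('bob',)]
```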
In an ideal world your unit tests and CI pipelines would catch all of those, but that depends on well-written tests. Thus, in my experience, having fewer hidden traps from the outset is automatically a better design than one that perfectly fits an academic theory but is harder to support in practice.
It'd probably be more sane than trying to stuff 3VL into a bunch of 2VL operations because you refuse to acknowledge that you don't actually have a 2VL type.
You can move freely amongst countries in the Schengen Area, which is a subset of EU countries (which itself is a subset of European countries).
> The free movement of persons is a fundamental right guaranteed by the EU to its citizens. It entitles every EU citizen to travel, work and live in any EU country without special formalities.
You can move freely among EU countries, not just Schengen countries. Schengen isn't just a subset of EU countries -- it's many EU countries and some non-EU countries, and it's to do with a lack of border checks rather than the basic freedom of movement right.
Without wanting to sound dismissive of the massive contribution Tim Berners-Lee, and CERN as a whole, made to IT: we’d probably have ended up with the same user experience we have today but with Gopher as the base tech. Or maybe something entirely different from a markup perspective, such as something RTF-like, or s-expressions...?
Great technological advancements are rarely the work of one genius in isolation. Usually they require capturing the imagination of a great many people, by which point the world is already ready for such an invention, so it becomes more of an evolutionary breakthrough than one idea that couldn’t have been recreated.
In theory, yes. But there are a few caveats to that:
1. CERN wouldn’t have been able to patent anything too generalised because there was prior art (Gopher)
2. Thus if the patent was too expensive, developers would have just used a similar technology (bear in mind it took quite a few years for the web to evolve from a toy, to something that companies “needed” but which wasn’t contributing massively to their bottom line, to what it is today, where a great many businesses’ sole market is web-based).
3. Even if it had somehow become part of an industry standard, or CERN had achieved a vague patent that prevented similar implementations from coexisting, Europe has this thing called (if I remember the acronym correctly) FRAND, under which patents required as part of a standard have to be fairly licensed.
None of this means CERN couldn’t have potentially made a lot from HTML & HTTP. But I also think part of the reason it was the success it was, was because it was a royalty free open specification.
Yep, I agree that any or all of those would apply (note that regarding F/RAND, a holder can refuse, and then the standard must exclude their IP), but at the same time we have seen over and over that corporations (and the nation-states that back them up) can get into really aggressive patent wars if they see fit, even over the concept of black, glossy rectangular cuboids...
As I said elsewhere when that argument was made: it’s a very tenuous connection. Having things named after him isn’t the same as things he was directly involved in.
I love Monty Python but it’s disingenuous to claim that they’re any more “on topic” than a basketball player (and I say this as someone who’s not generally a sports fan).
The only real difference between the two is one is more popular on HN than the other. Which is a real shitty reason to flag an obituary in my opinion.