LibreOffice is catastrophically bad. It is slow, buggy, and everything it does is either pointlessly emulating a bad product, or pointlessly going against expectations.
It exists for one reason only, which is OSS fervor. Great, but that doesn’t lead to great design.
I'm with wolvoleo. I'm forced to use MS Office at work but install only LO on my personal machines. It may lack features or pizzazz but as a reliable, unfussy authoring tool, it serves my needs very well.
> pointlessly going against expectations
If you're referring to the ribbon, I'm not sold on its superiority. The vast majority of other software still uses the familiar menu structure, which is what LO uses too.
Granted, well-meaning educational programs expose students to MS Office and its paradigms from an early age. For their sake, I eagerly await a coding-assistant AI powerful enough to reskin LibreOffice to look just like MS Office, ribbon and all.
I started my wife on LibreOffice, putting it on her Mac when her 365 subscription lapsed. She loves it. Her needs aren't fancy, though, and she can create her own or open others' documents and spreadsheets just fine.
I don't agree, I use it all the time. I never use the 'real' office at home, though I do at work. And I'm really happy with it. It works fine, it's pretty light and it runs on every OS without me having to use a substandard web version.
I understand their copying the MS Office look and feel because that muscle memory is key to converting users. I like the way they didn't go all-in on those ribbons which have always been pretty terrible.
In that sense I think the biggest issue with the product is that it takes so many cues from MS Office, which is itself pretty terrible but has grown to be ubiquitous.
I think the whole office workflow is grossly outdated anyway. Excel is mostly misused as a piss-poor database, which it deeply sucks at because it doesn't offer any way to safeguard data integrity. What MS should do is overhaul Access completely to make users grok it better. But they don't care.
Word docs are still full of weird template issues, PowerPoint still uses the old overhead projector transparent slide paradigm.
What it really needs is someone to look at this without any of the 1980s baggage and come up with tools for workflow problems from this century with techniques that fit this century. Adding an AI clippy like MS has done does not cut it at all.
But it does mean having to chip away at the entrenched market position of Office; that's the problem. Microsoft stops innovating once they've cornered a market, just like they did with Internet Explorer. Someone has to do a Chrome on Office, but that will need someone with a big bag of money, not an open source project run on a shoestring.
So yeah I think LibreOffice is not great but the not great bits are copied from MS Office because they simply have no alternative.
I recently began using markdown readers/writers like Typora and they’ve blown me away— what LibreOffice Writer could have been. Competing directly with MS Word was a trap.
You have to consider the origins, going back to StarOffice in the days when most people were on really slow dial-up if they had internet at all. Even a lot of businesses were hardly better off, sharing a single dial-up or ISDN connection.
I suppose so, but I think it's relatively rare for languages to just drop support for older features and force developers to rewrite code using newer mechanisms.
Python 2 to 3 is a good example of what can be expected to happen - very slow adoption of the new version since companies may just not have the resources or desire to rewrite existing code that is running without problems.
Late in the C++20 timeline, P1881 Epochs was proposed. Similar to Rust's editions, which at that time had been tried in anger only once (the 2018 Edition), it proposed that there should be a way for C++ to evolve, forbidding obsolete syntax in newer projects rather than just growing forever like a cancer.
Epochs was given the usual WG21 treatment and that's the end of that. Rust shipped 2021 Edition, and 2024 Edition, and I see no reason to think 2027 Edition won't happen.
The current iteration of Bjarne's "Profiles" idea is in a similar ballpark, though it got there via a very different route: this time the rationale is that it will aid safety to outlaw things now considered a bad idea. If this goes anywhere, its nearest ship vehicle is C++29.
Now, Python 2 to Python 3 did take a few years, maybe a decade. But just shipping the mechanism to reform C++ looks likely to take at least nine years. Not the reform, just the mechanism to enable it.
In general, semantic changes haven't been a thing in editions so far, but consider this scenario:
Crate A uses edition X.
Crate B uses edition Y + N, which has changed Rust language semantics between those editions, or changed a standard library type that Crate A uses in its public API in a backwards-incompatible way.
Now an application on edition Y + M, with M >= N, would like to depend on both Crate A and Crate B, and calling that incompatible API is a requirement for its implementation goal.
So far editions haven't bothered covering such cases; they were deemed not worth thinking about.
Those in such scenarios get to hunt for another crate to replace Crate A, or maintain a local fork.
I think it would be useful to give an example of the kind of semantics that would be desirable to change, and where people might think that it should be possible to change at an edition boundary.
I’m not very sure, but maybe something like unforgettable types? Or linear types? Maybe negative trait bounds, or specialization?
That was the example. Now, if you'd rather play defensive instead of engaging in a meaningful discussion about the Rust approach to language versioning and its constraints, well, it is what it is.
> Python 2 to 3 is a good example of what can be expected to happen
People keep bringing this up when discussing backwards compatibility breaks, but I think the conclusion should be a bit more nuanced than just "backwards compatibility break <=> years (decades?) of pain".
IMHO, the problem was the backwards compatibility break coupled with the inability to use Python 2 code from 3 and vice versa. This meant that not only did you need to migrate your own code, but you also needed everything you depend on to also support Python 3. This applied in reverse as well - if you as a library developer naively upgraded to Python 3, that left your Python 2 consumers behind.
Obviously the migration story got better over time with improvements to 2to3, six, etc., that allowed a single codebase to work under both Python 2 and 3, but I think the big takeaway here is that backwards compatibility breaks can be made much more friendly as long as upgrades can be performed independently of each other.
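In practice the single-codebase approach usually boiled down to small version shims like the following. This is a minimal sketch in the spirit of what six provides (its real helpers are `six.PY2`, `six.text_type` and `six.iteritems`); the shim names here are illustrative, and everything below runs unchanged on Python 3:

```python
import sys

PY2 = sys.version_info[0] == 2

if PY2:
    text_type = unicode  # noqa: F821 -- this name only exists on Python 2
    def iteritems(d):
        return d.iteritems()
else:
    text_type = str
    def iteritems(d):
        return iter(d.items())

# The rest of the codebase imports these shims instead of branching inline.
counts = {"a": 1, "b": 2}
total = sum(v for _, v in iteritems(counts))
```

The point is that the version check happens once, in one module, and the rest of the code is written against the shim rather than against either Python version directly.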
Python 3 drops support for older versions of Python 3 all the time. Every single release comes with a bunch of deprecated and removed features. There is a very strong chance that a Python 3.7 codebase is totally broken on 3.14. Almost no language takes backwards compatibility as seriously as C++ does.
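A concrete instance of this: the ABC aliases that lived directly in `collections` were deprecated since Python 3.3 and finally removed in 3.10, so an import that worked on 3.7 fails outright on modern interpreters:

```python
# Worked on Python 3.7 (with a DeprecationWarning); raises ImportError on 3.10+.
try:
    from collections import Iterable  # alias removed in Python 3.10
except ImportError:
    from collections.abc import Iterable  # required spelling on modern Python

print(isinstance([1, 2, 3], Iterable))  # prints True
```

The try/except dance is the usual way codebases straddle the removal, but code that never added it simply breaks on upgrade.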
It's an interesting policy decision for a language - whether to guarantee (or at least practice) ongoing backwards compatibility or not.
The benefit of NOT maintaining backwards compatibility is that hopefully you end up with a cleaner language without all the baggage of the past, and without "impedance mismatches" between new features and old ones that don't play well together.
However, as a developer, I think I prefer the backwards compatible approach, knowing that:
1) Upgrading to latest version of the compiler hopefully isn't a big deal, and I can take advantage of any benefits that brings, including new language features, without having to sign up for rewriting existing tested code to remove deprecated features. In a large code base any such forced rewrites could be pretty onerous, and may involve a lot of retesting and potential for introducing bugs into code that was previously working.
2) It's nice to be able to take older projects, that you haven't worked on in a while (whether hobby projects, or legacy corporate code) and have the code still compile with the latest installed tool set, rather than to have developed "bit rot" due to using language features that have since been deprecated. Of course bit rot may also occur due to library changes, so language backwards compatibility is no guarantee of smooth sailing.
Maybe going forwards, with AI coding tools, this will become less of an issue, if they become capable of this sort of codebase update without introducing errors. In fact, going forwards it may well be that the choice of programming language itself becomes something the tooling takes care of. Nowadays we write in high-level languages and don't really think or care about what the generated machine code looks like. Maybe in the future we can write high-level instructions to the AI without even having to care what the implementation language is?
I don't think I've ever worked at a company where slacking off was the problem. The vast majority of people want to do good work.
What I _have_ seen is several companies afflicted by this really strange characteristic of the software development industry: We appear to be the only industry on the planet where it is common to pick leaders (executives) that know nothing about the product or how it's made.
You can't run a bridge building company without knowing how to build a bridge. You can't run a law firm without knowing law.
You don't need to know all the nitty gritty - big picture is important - but understanding the product _in depth_ is a requirement in any business.
Are you in a completely different world than me? Because even the CEO of Boeing is not an engineer, and neither is Larry Ellison. The CEO of the biggest bank in my country holds a master's degree in business economics, but nothing related to finance, econometrics or risk management. The CEO of US Steel is an accountant. Don't even get me started on the (non-)education of some politicians.
Understanding the product is often important, but equally often it is something you can delegate to others. It's only the younglings that think intimate knowledge of the product is the hallmark of a great leader, because that is the only thing they themselves bring to the table.
I agree fully with your comment, but I wish to point out that Larry Ellison was a programmer at the time the company that became Oracle was founded by him and his co-founders.
I get the point of his comment but it’s just nonsense… plenty of good CEOs aren’t SMEs in the field and plenty of bad ones are. The CEO of Boeing is absolutely an engineer - and so was the CEO during most of the years people consider the worst in Boeing’s quality history with the 737 Max (Muilenburg).
> Are you in a completely different world than me? Because even the CEO of Boeing is not an engineer, and neither is Larry Ellison. The CEO of the biggest bank in my country holds a master's degree in business economics, but nothing related to finance, econometrics or risk management. The CEO of US Steel is an accountant.
Specifically, the CEO is more like the figurehead of the company; the role is to present and "sell the value" of the company to investors, important customers and partners. So it is often not too worrisome if the CEO has a different background; sometimes it can even make sense.
What should worry one much more is if the leadership layers below come from a very different background than what the company's industry is.
It's not always consciously slacking off. For example when I was at Google most of the team was simply incompetent. They thought they were smart (PhDs!) and working hard. But they refused to work together. They estimated tasks in weeks and months and at the end of my time there after I'd done very little due to obstruction by other teams I was praised for my high productivity.
I never saw anyone just sitting around or really slacking. But they couldn't execute anything. It was depressing.
You raise a good point here. When I think about writing multi-threaded code, three things come to mind about why it is so easy in Java and C#: (1) The standard library has lots of support for concurrency. (2) Garbage collection. (3) Debuggers have excellent support for multi-threaded code.
Not really, especially as garbage collection doesn't achieve memory safety. Safety-wise, it only helps avoid use-after-free (UAF) due to lifecycle errors.
Garbage collection is primarily just a way to handle non-trivial object lifecycles without manual effort. Parallelism happens to often bring non-trivial object lifecycles, but this is not a major problem in parallelism.
In plain C, the common pattern is trying to keep lifecycles trivial, and the moment this either doesn't make sense or isn't possible, you usually just add a reference count member to the struct, with matching ref/unref helpers that free the object when the count hits zero.
In both Go and C, all types used in concurrent code need to be reviewed for thread safety and have appropriate serialization applied - in the C case, that also includes the refcount itself. And yes, you could have a UAF or a leak if you don't call ref/unref correctly, but that's unrelated to parallelism - it's just everyday life in manual-memory-management land.
The issues with parallelism are the same in Go and C: you might end up with invalid application states, whether due to missing serialization - e.g., forgetting to lock things appropriately, or accidentally using types that are not thread safe at all - or due to business logic flaws (say, two threads both sleeping, each waiting for the other to trigger an event and wake it up).
That might be convenient if your language has semantics that map well-ish to C99 semantics. But C is a really messy language with lots of little quirks. For example, Rust code would compile to something slower if it had to use C as an intermediate representation.
Also, compiled languages want accurate and rich debug info. All of that information would be lost.
I think in C# the way to solve this is to have two separate types, `Ok<TValue>` and `Err<TError>`, and provide implicit conversions for both to `Result<TValue, TError>`.
The static method approach showcased in the article is really long-winded.
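The C# trick relies on user-defined implicit conversions, which don't exist in Python, but the shape of the two-type approach is easy to sketch. Here the `Ok`/`Err`/`Result` names and `parse_int` are hypothetical illustrations (not any particular library), with a `Union` alias standing in for the implicit conversion:

```python
from dataclasses import dataclass
from typing import Generic, TypeVar, Union

T = TypeVar("T")
E = TypeVar("E")

@dataclass(frozen=True)
class Ok(Generic[T]):
    value: T

@dataclass(frozen=True)
class Err(Generic[E]):
    error: E

# Python has no implicit conversions, so the "conversion" is just a Union:
Result = Union[Ok[T], Err[E]]

def parse_int(s: str) -> "Result[int, str]":
    try:
        return Ok(int(s))
    except ValueError:
        return Err("not an integer: " + repr(s))
```

The upside, as in the C# version, is that success and failure paths can construct a bare `Ok(...)` or `Err(...)` without spelling out both type parameters at every return site.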
Interesting conversations and connections are with those people who are different from yourself, not with those who share the same experience of everything.