
"I believe there are two main things holding it back."

He really science’d the heck out of that one. I’m getting tired of seeing opinions dressed up as insight—especially when they’re this detached from how real systems actually work.

I worked on the Cell processor and I can tell you it was a nightmare. It demanded an unrealistic amount of micromanagement and gave developers rope to hang themselves with. There’s a reason it didn’t survive.

What amazes me more is the comment section—full of people waxing nostalgic for architectures they clearly never had to ship stable software on. They forget why we moved on. Modern systems are built with constraints like memory protection, isolation, and stability in mind. You can’t just “flatten address spaces” and ignore the consequences. That’s how you end up with security holes, random crashes, and broken multi-tasking. There's a whole generation of engineers that don't seem to realize why we architected things this way in the first place.

I will take how things are today over how things used to be in a heartbeat. I really believe I should spend two weeks requiring students to write code on an Amiga, with all the programs running at the same time. If any one of them crashes, they all fail my course. A newfound appreciation may flourish.


One of the most important steps of my career was being forced to write code for an 8051 microcontroller. Then writing firmware for an ARM microcontroller to make it pretend it was that same 8051 microcontroller.

I was made to witness the horrors of archaic computer architecture in such depth that I could reproduce them on totally unrelated hardware.


I tell students today that the best way to learn is by studying the mistakes others have already made. Dismissing the solutions they found isn’t being independent or smart; it’s arrogance that sets you up to repeat the same failures.

Sounds like you had a good mentor. Buy them lunch one day.


I had a similar experience. Our professor in high school would have us program a Z80 system entirely by hand: flow chart, assembly code, computing jump offsets by hand, writing the hex code by hand (looking up opcodes from the Z80 data sheet), then loading the opcodes one byte at a time on a hex keypad.

It took three hours and four of us to code an integer division start to finish (we were like 17, though).

The amount of understanding it gave has been unrivalled so far.


> I worked on the Cell processor and I can tell you it was a nightmare. It demanded an unrealistic amount of micromanagement and gave developers rope to hang themselves with.

So the designers of the Cell processor made some mistakes and therefore the entire concept is bunk? Because you've seen a concept done badly, you can't imagine it done well?

To be clear, I'm not criticising those designers, they probably did a great job with what they had, but technology has moved on a long way from then... The theoretical foundations for memory models, etc. are much more advanced. We've figured out how to design languages to be memory safe without significantly compromising on performance or usability. We have decades of tooling for running and debugging programs on GPUs and we've figured out how to securely isolate "users" of the same GPU from each other. Programmers are as abstracted from the hardware as they've ever been with emulation of different architectures so fast that it's practical on most consumer hardware.

None of the things you mentioned are inherently at odds with more parallel computation. Whether something is a good idea can change. At one point in time electric cars were a bad idea. Decades of incremental improvements to battery and motor technology means they're now pretty practical. At one point landing and reusing a rocket was a bad idea. Then we had improvements to materials science, control systems, etc. that collectively changed the equation. You can't just apply the same old equation and come to the same conclusion.


> and we've figured out how to securely isolate "users" of the same GPU from each other

That's the problem, isn't it.

I don't want my programs to act independently; they need to exchange data with each other (copy-paste, drag and drop). Also, I cannot do many things in parallel. Some things must be done sequentially.


> There's a whole generation of engineers that don't seem to realize why we architected things this way in the first place.

Nobody teaches it, and nobody writes books about it (not that anyone reads anymore)


So, there are books out there. I use Computer Architecture: A Quantitative Approach by Hennessy and Patterson. Recent revisions have removed historical information, and I understand why they removed it. I wanted to use Stallings's book, but the department had already made arrangements with the publisher.

The biggest reason we don't write books is that people don't buy them. They take the PDF and stick it on GitHub. Publishers don't respond to authors' takedown requests, and GitHub doesn't care about authors, so why spend the time publishing a book? We can chase grant money instead. I'm fortunate enough to not have to chase grant money.


While financial incentives are important to some, a lot of people write books to share their knowledge and give them out for free. I think more people are doing this now, and there are also open collaborative textbook projects.

And I personally think that it is weird to write books during your working hours and also get money from selling that book.


"financial incentives"

This is the most ignorant response I've seen yet. We don't expect monetary gain from publishing a book. We expect our costs to be covered.

This is about the consumer, not the publisher. If we lived in a socialist system, they would still pirate our publications and we would still be in debt over it.


That's a financial incentive, I'm not sure what your rejection is exactly.


> What amazes me more is the comment section—full of people waxing nostalgic for architectures they clearly never had to ship stable software on.

Isn't it much more plausible that the people who love to play with exotic (or retro), complicated architectures (in this case with high-performance opportunities) are different people from those who love to set up, or work in, an assembly line for shipping stable software?

> I really believe I need to spend 2-weeks requiring students write code on an Amiga, and the programs have to run at the same time. If anyone of them crashes, they all will fail my course. A new found appreciation may flourish.

I rather believe that among those who love this kind of programming, hatred for the incompetent fellow students will flourish (including wishes that they be weeded out by brutal exams).


The problem is that the exotic complexity enthusiasts cluster in places like HN and sometimes they overwhelm the voices of reason.


Those students would all drop out and start meditating. That would be a fun course. Speed run developing for all the prickly architectures of the 80s and 90s.


I see what you did there.


Guru meditation, for the uninitiated.


> They forget why we moved on. Modern systems are built with constraints like memory protection, isolation, and stability in mind. You can’t just “flatten address spaces” and ignore the consequences.

Is there any reason why GPU-style parallelism couldn't have memory protection?


It does. GPUs have full MMUs.


They do? Then how do I do the forbidden stuff by accessing neighboring pixel data?


Do you mean accessing data outside of your app's framebuffer, or just accessing neighboring pixels during a shader pass? Because those are _very_ different things. GPU MMUs mean that you can't access a buffer that doesn't belong to your app. That's it; it's not about restricting pixel access within your own buffers.
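For the curious, the mechanism is easy to sketch. Below is a toy model of per-context page tables (Python purely for illustration; real GPU MMUs are hardware, and the page sizes and mappings here are invented) showing why the same virtual address in two contexts touches different physical memory, and why an unmapped address faults:

```python
# Toy model of per-context page tables, the mechanism an MMU provides:
# each "app" has its own virtual-to-physical mapping, so an address that
# is valid in one context has no translation at all in another.
PAGE = 4096
physical = bytearray(16 * PAGE)   # pretend this is all of VRAM

page_tables = {
    "app_a": {0: 2},   # app A's virtual page 0 -> physical page 2
    "app_b": {0: 7},   # app B's virtual page 0 -> physical page 7
}

def translate(ctx: str, vaddr: int) -> int:
    """Translate a virtual address for one context, faulting if unmapped."""
    vpage, offset = divmod(vaddr, PAGE)
    table = page_tables[ctx]
    if vpage not in table:
        raise PermissionError(f"{ctx}: page fault at {vaddr:#x}")
    return table[vpage] * PAGE + offset

# The same virtual address lands in a different physical page per context.
physical[translate("app_a", 0x10)] = 0xAA
physical[translate("app_b", 0x10)] = 0xBB
```

Each app can still do whatever it likes within its own mapped buffers; only addresses with no mapping fault.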


TIL. Thank you.


Have you actually done that recently?


I loved and really miss the cell. It did take quite a bit of work to shuffle things in and out of the SPUs correctly (so yeah, it took much longer to write code and greater care), but it really churned through data.

We had a generic job mechanism with the same restrictions on all platforms. This usually meant if it ran at all on Cell it would run great on PC because the data would generally be cache friendly. But it was tough getting the PowerPC to perform.

I understand why the PS4 was basically a PC after that: because it's easier. But I wish there were still SPUs off to the side to take advantage of. I'd be happy to have them off-die, like GPUs are.


On flattening address spaces: the road not taken here is to run everything in something akin to the JVM, CLR, or WASM. Do that stuff in software not hardware.

You could also do things like having the JIT optimize the entire running system dynamically like one program, eliminating syscall and context switch overhead not to mention most MMU overhead.

Would it be faster? Maybe. The JIT would have to generate its own safety and bounds checking stuff. I’m sure some work loads would benefit a lot and others not so much.

What it would do is allow CPUs to be simpler, potentially resulting in cheaper lower power chips or more cores on a die with the same transistor budget. It would also make portability trivial. Port the core kernel and JIT and software doesn’t care.
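As a rough illustration of the kind of check such a JIT would have to emit in place of hardware protection, here is a toy WASM-style linear memory with explicit software bounds checks (Python for illustration only; the memory size and trap behavior are invented for the sketch):

```python
# Toy software fault isolation: every load/store into a flat "linear
# memory" goes through an explicit bounds check, the way a WASM-style
# runtime or JIT emits it instead of relying on the MMU.
MEM_SIZE = 1 << 16          # 64 KiB linear memory for this sandbox
memory = bytearray(MEM_SIZE)

def checked_store(addr: int, value: int) -> None:
    """Store one byte; trap (raise) instead of corrupting a neighbor."""
    if not 0 <= addr < MEM_SIZE:
        raise MemoryError(f"out-of-bounds store at {addr:#x}")
    memory[addr] = value & 0xFF

def checked_load(addr: int) -> int:
    """Load one byte with the same software bounds check."""
    if not 0 <= addr < MEM_SIZE:
        raise MemoryError(f"out-of-bounds load at {addr:#x}")
    return memory[addr]

checked_store(0x100, 42)
assert checked_load(0x100) == 42
```

A real JIT would fold most of these checks away (or replace them with pointer masking), which is where the "maybe faster, maybe not" trade-off comes from.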


> On flattening address spaces: the road not taken here is to run everything in something akin to the JVM, CLR, or WASM.

GPU drivers take SPIR-V code (either "kernels" for OpenCL/SYCL drivers, or "shaders" for Vulkan Compute) which is not that different at least in principle. There is also a LLVM-based soft-implementation that will just compile your SPIR-V code to run directly on the CPU.


We end up relying on software for this so much anyway. Your examples plus the use of containers and the like at OS level.


"The birth and death of JavaScript"


[flagged]


What the ever loving hell, it was a perfectly reasonable idea in response to another idea.

They weren't saying it should be done, and went out of their way to make it explicit that they are not claiming it would be better.

It was a thought exploration, and a valid one, even if it would not pan out if carried all the way to execution at scale. Yes it was handwaving. So what? All ideas start as mere thoughts, and it is useful, productive, and interesting to trade them back and forth in these things called conversations. Even "fantasy" and "handwavy" ones. Hell especially those. It's an early stage in the pollination and generation of new ideas that later become real engineering. Or not, either way the conversation and thought was entertaining. It's a thing humans do, in case you never met any or aren't one yourself.

The brainstorming was a hell of a lot more valid, interesting, and valuable than this shit. "Just go away" indeed.


It wasn't handwaving or brainstorming. Microsoft even built a research OS like this:

https://www.microsoft.com/en-us/research/project/singularity...

Have people really never used a higher level execution environment?

The JVM and the CLR are the most popular ones. Have people never looked at their internals? Then there's the LISP machines, Erlang, Smalltalk, etc., not to mention a lot of research work on abstract machines that just don't have the problems you get with direct access to pointers and hardware.


Some folks in the graphics programming community are allergic to these kind of modern ideas.

They are now putting up with JITs in GPGPU, thanks to market pressure from folks using languages like Julia and Python, who would rather keep using those languages than rewrite their algorithms in C or C++.

These are communities where even adopting C over assembly, and C++ over C, has been an uphill battle; something like a JIT is like calling for the pitchforks and torches.

By the way, one of the key languages used in the Connection Machine mentioned on the article was StarLisp.

https://en.wikipedia.org/wiki/*Lisp


I'm going to call this out. The entire post obviously has bucketloads of aggression, which could be taken as just communication style, but the last line was uncalled for.

I have seen you make high quality responses to crazy posts.

Do better.


Don't worry, with LLMs, we're moving away from anything that remotely looks like "stable software" :)

Also, yeah, I recall the dreaded days of cooperative multitasking between apps. Moving from Windows 3.x to Linux was a revelation.


With LLMs it is just more visible. When the age of "updates" began, the age of stable software died.


True. The quality of code yielded by LLMs would have been deemed entirely unacceptable 30 years ago.


> I really believe I need to spend 2-weeks requiring students write code on an Amiga, and the programs have to run at the same time. If anyone of them crashes, they all will fail my course.

Fortran is memory-safe, right? ;-)


Yep, transputers failed miserably. I wrote a ton of code for them. Everything had to be funneled through a serial bus, which defeated the purpose of the transputer.


Quite fascinating. Did you write about your experiences in that area? Would love to read it!


Not in those terms, but an autobiography is coming, and bits and pieces are being explained. I expect about 10 people to buy the book, as all of the socialists will want it for free. I am negotiating with a publisher as we speak on the terms.


Could you elaborate on the “serial bus” bit?


Please explain how these "worker cores" should operate.


Unified memory doesn't mean unified address space. It frustrates me when no one understands unified memory.


If you fix the page tables (a partial tutorial is online) you can have a contiguous unified address space on Apple Silicon.


Let’s be honest, saying “just fix the page tables” is like telling someone they can fly if they “just rewrite gravity.”

Yes, on Apple Silicon, the hardware supports shared physical memory, and with enough “convincing”, you can rig up a contiguous virtual address space for both the CPU and GPU. Apple’s unified memory architecture makes that possible, but Apple’s APIs and memory managers don’t expose this easily or safely for a reason. You’re messing with MMU-level mappings on a tightly integrated system that treats memory as a first-class citizen of the security model.

I can tell you never programmed on an Amiga.


Oh yes, I programmed all the Amiga models, mostly at the assembly level. I reprogrammed the ROMs. I also published a magazine on all the Commodore computers' internals and built lots of hardware for these machines.

We had the parallel Inmos Transputer systems during the heyday of the Amiga; they were much better designed than any of the custom Amiga chips.


Inmos was a disaster. No application ever shipped on one. EVER. It used a serial bus to resolve the problems that should have never been problems. Clearly you never wrote code for one. Each oslink couldn't reach more than 3 feet. What a disaster that entire architecture was.


I shipped 5 applications on an 800 Inmos Transputer supercomputer. I sold my parallel C compilers and macro assembler, plus an OS, a Macintosh NuBus interface card, Transputer graphics cards, a full paper copier, and a laser printer. I know of dozens of successful products.


Sure you did. What were they? The only successful transputer was the T414 and it never made it outside academia.


Well, I believe there were military radar projects that shipped in reasonable quantities and served for reasonable lifetimes.

I think I remember some medical imaging products as well?

I don’t dispute that the Transputer was ultimately unsuccessful, but it wasn’t completely unused in real-world products.

See also TI’s C40, which was quite similar and similarly successful.


Hey don't shit on my retro alternative timeline nostalgia. We were all writing Lisp programs on 64 CPU Transputer systems with FPGA coprocessors, dynamically reconfigured in realtime with APL.


s/LISP/Prolog/ and you've basically described the old "Fifth Generation" research project. Unfortunately it turns out that trying to parallelize Prolog is quite a nightmare; the language is really, really not built for it. So the whole thing was a dead end in practice. Arguably we didn't have a real "fifth-gen" programming language prior to Rust, given how it manages to uniquely combine ease of writing parallel and concurrent code with bare-metal, C-like efficiency. (And Rust is now being used to parallelize database queries, which comfortably addresses the actual requirement Prolog had been intended for back then: performing "search" tasks on large and complex knowledge bases.)
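That "search over a large knowledge base" workload is easy to sketch: partition the facts, scan partitions concurrently, merge matches. A minimal illustration (Python rather than Rust or Prolog, with made-up facts; a real system would use processes or native threads, since Python's GIL limits thread parallelism):

```python
# Sketch of parallelizing a search over a knowledge base by partitioning
# facts across workers and merging the matches afterwards.
from concurrent.futures import ThreadPoolExecutor

FACTS = [("socrates", "man"), ("plato", "man"), ("fido", "dog"),
         ("aristotle", "man"), ("rex", "dog")]

def search_chunk(chunk, predicate):
    # Each worker scans its own partition independently; no shared state.
    return [subj for subj, pred in chunk if pred == predicate]

def parallel_search(facts, predicate, workers=2):
    size = (len(facts) + workers - 1) // workers    # ceil division
    chunks = [facts[i:i + size] for i in range(0, len(facts), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        parts = pool.map(search_chunk, chunks, [predicate] * len(chunks))
    return [s for part in parts for s in part]

parallel_search(FACTS, "man")   # ['socrates', 'plato', 'aristotle']
```

The scan itself parallelizes trivially; what made Prolog hard was the sequential, backtracking resolution strategy, not this kind of data-parallel filtering.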


You probably already know about https://github.com/mthom/scryer-prolog. Are you saying we should accelerate scryer-prolog with a Grayskull board?


Also, the Fifth Generation computer project subscribed to the old AI paradigm, not based on statistics.


Not true. "just fix the page tables" took me 4 hours. And only 15 minutes with the Linux kernel on Apple Silicon.


Obviously you missed the sarcasm.


I know the APIs don't make it easy, that's precisely why I want different APIs.


Wait until they get to code generated by AI. All of this Rust code that people are using and sticking in various Linux services. It won't be covered under the GPL. It will flat out be public domain.


This is a terrible opinion piece. Layoffs do work. Cutting hours does not. I am always amazed at how few people understand how a business operates. They think it's a community-organized event with unlimited revenue.

There are a few basic accounting principles employees need to understand. It all revolves around Assets = Liabilities + Owner's Equity. If you think otherwise, take a "Cost Accounting" course. This is a pure numbers game. Everyone wants to wrap a psychoanalysis into something that has ZERO relevance.

If a company is bleeding revenue, it cannot sustain the overhead of employees. They have to go. Cutting hours does nothing for those on salary, and for those who are hourly, benefit costs are far more expensive than their wages. If a company has stagnant growth, that means leadership has made bad decisions, and things have to change.

Employees' feelings don't matter on a Balance Sheet, Income Statement, or Cash Flow Statement. There is no "Employee Feelings" column on the ledger. Everyone can be replaced. No one is special, unless you're a majority shareholder.
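For what it's worth, the identity the comment leans on is easy to check with toy numbers (all figures below are invented purely for illustration):

```python
# Balance sheet identity: Assets = Liabilities + Owner's Equity.
# All numbers are made up just to illustrate the arithmetic.
assets = 1_000_000       # cash, equipment, receivables
liabilities = 600_000    # loans, accounts payable
equity = assets - liabilities     # what is left over for the owners
assert assets == liabilities + equity   # the identity always balances

# The layoff logic in cash terms: cutting monthly burn extends runway.
cash, monthly_burn = 300_000, 50_000
runway_before = cash / monthly_burn                   # 6.0 months
payroll_cut = 20_000
runway_after = cash / (monthly_burn - payroll_cut)    # 10.0 months
```

Which is exactly the arithmetic the replies below push back on: the numbers balance either way; the disagreement is over what is lost that never appears on the ledger.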


> If a company is bleeding revenue, it cannot sustain the overhead of employees. They have to go.

No, they don't. During the dotcom bubble my company gave us 2 choices, layoffs or 20% pay cut. We took the latter. Everyone stayed and we had our pay back to previous levels in a couple years and the company remained profitable.

You can, in fact, treat people as people and still run a company.

> Employees feelings don't matter on a Balance Sheet, Income Statement and Cash Flow Statement.

This is only true of companies of a certain (large) size, when all semblance of employees being people has been abstracted away. In companies of more reasonable sizes you must take employee morale into account or you will lose critical employees, which could kill the company.


Surely there are some situations where a pay cut is insufficient because you would like to reorganize the company, for example to wind down an unprofitable initiative or division and there's not a reasonable way to maintain the same headcount?


I wasn't saying that there is never a need for layoffs, but these aren't the sorts of layoffs being discussed. Where that line is drawn can be a bit fuzzy in a few cases, but for the majority it is clear cut.


My grandparent used to consider global bubbles the norm, likely because his own parents never told him that local, company-limited bubbles are actually the norm. Fortunately my own parents are more agreeable with how I manage my finances, and my whole family – including the 380-minute-old grand-grandparents – is still alive & well. We still see each other, despite the occasional confusion my use of the English plurals causes us (not to mention everyone's own ups & downs, and the weird reproductive mechanisms of our larger genus within the world).


There are some situations where layoffs are unavoidable. That's fine.

What's infuriating is a business culture that includes layoffs for outrageously profitable companies. I work at Google, which made 100 billion dollars in profit last year. More than $500,000 in profit per employee. Google is still conducting layoffs.

Layoffs are used as a tool of capital against labor. Amongst other things, it generates unemployment and people who are willing to accept lower pay due to the precarity of their situation (everybody needs to pay rent and eat), pushing down pay for the entire industry.


Which dot com was that?



Don't you love it when people make claims and then can't substantiate them? They just downvote your comment.


Which company?


> If a company is bleeding revenue

That's not what happened with some of the recent layoffs. Some big tech companies generate huge revenues, laid off 5-10% of people (sometimes under a false pretext of performance), and kept hiring at the same time or soon after. This happens not to reduce cost but to stress out remaining employees.


They have way, way more employees than they need.


>This happens not to reduce cost but to stress out remaining employees.

Or to cull the low performers.

I personally am less stressed when the competition I outperform gets fired.


In my team, two people who got laid off as "low performers" weren't. One was very good but had bad luck with a very bad project and manager the previous half. The other picked the wrong topic to work on and wasn't well rewarded for it.

The low performers are usually managed out anyway. The last layoff phase was quite arbitrary.

Also note that high performers often do get promoted, and may become low performers at the next level.


> The low performers are usually managed out anyway.

That may be your experience, and kudos to companies that do this, but particularly in tech I've seen that not be the norm. Egregiously low performers are managed out, but performers who are basically just "low mediocre" can hang on for a long, long time in my experience, and tech companies often use layoffs as an opportunity to get rid of them. To be clear, not everyone who is let go in a layoff is a low performer (a lot of time it's just the luck of whatever business unit you're working in), but companies certainly take advantage of layoffs to get rid of low performers without needing reams and reams of documentation.


Collateral damage. I'm fine with that.

If there's an interview for a position and the candidate is 30 minutes late or no-shows, maybe there is a legitimate excuse and it's a rare event, but the system works by both parties giving up on that relationship, which penalizes candidates who do that constantly.

Just look for another job and in the long run you'll be successful. Variance is part of life.


It is still to reduce cost. Hiring would often be in cheaper geographies. The reason to reduce cost is not, e.g., to save a business from collapsing; it is to improve financial results in the hope of making the stock price go up.


[flagged]


> Many companies rotate the bottom N percent to dump the bad performers

I have never seen this actually happen in practice.

It's always specific teams and projects that get fired because some C-Level fucked up and pushed for some ego project, or over-hired and placed those poor souls in useless projects.


Wasn't Microsoft under Ballmer infamous for this?


Are you saying that you have never heard of a single team having layoffs even though the team was profitable?

I have. Multiple times.


What’s actually happened is that activist investors have bought their way onto the boards of highly profitable companies with insufficient poison pills and made them fire some percentage of workers under the guise of making the stock price go up.

Does worker happiness matter at all, or is it OK to have a net miserable company where the bottom line is slightly higher profit than it would otherwise have been if the environment were a pleasant place to work? Because that’s the tradeoff here; rabid billionaire investors are unhappy because numbers aren’t as high as they could be.


Activist investors are a problem. Most recent activist investors have been centered around climate and DEI. Exxon had activist investors try to get on the board. There was a lawsuit over it, I believe it was Arjuna Capital.

Whoever is funding these activist investors, probably a nation-state, is doing it on purpose. Natasha Lamb has to be the dumbest investor to ever live, next to Cathie Wood, IMO. You could have invested in an S&P index fund and performed better, at 264% over the past 11 years compared to her 132% realized, while taking far less risk.

I get that everyone has their idealistic views of how the world should work, but capitalism dominates every other system, providing better living standards and security. It is unfortunate we have had leftists in the Democratic party take control of it, pushing some very strange agendas that don't reflect reality. These strange agendas have been exploited by other nations, like China.

Some activist investors have brought better function management and boards. Carl Icahn is a great example. But he has also brought his fair share of problems.


This is not what happened in recent layoffs where I am familiar with the company. Perhaps you would like to make the argument that in theory, lay-offs can be done that way.


I've seen several mass layoffs at Microsoft over the years, and not a single one could be described like GP did.


A good friend of mine has a PhD in labor economics and is now a senior management consultant for a big firm. I've been told that, in general, just cutting people is bad for businesses, and most businesses will be worse off for it, especially without changing any of the underlying systems that got the company to the position it's in. Next time I'll have to ask him about the studies on it, but there definitely isn't a consensus that general layoffs help businesses in the medium/long term. He was saying the times they seem to work best are in the context of a broader restructuring (which in my experience is not common for companies).


[flagged]


No, just cutting people across the company tends to be bad.

Restructuring aka cutting programs and the people working on them has fewer downsides.


[flagged]


There’s plenty of studies showing the downsides. It’s expensive, results in dramatic reductions in efficiency, etc.

Widespread layoffs are a sign management has majorly fucked up.


[flagged]


Asking who funded it is kind of meaningless when you’re talking thousands of studies, it’s one of those topics that’s quite interesting for everyone from economists to management consultants.

COVID was a mix of turnover, furlough, and layoffs, but the layoffs were generally of the exact type I was referring to. Shutting down a restaurant’s dining service doesn’t get the same kind of pushback as a 15% cut across a division. The company isn’t trying to do the same stuff with fewer employees, it’s no longer serving those customers.


[flagged]


Sure, the general consensus of experts is just, like, my opinion, man.

Numbers never lie, until they do.


Unsurprisingly, both the OP and this are oversimplified statements.

The more nuanced point is to note that simply reducing the world to accounting equations omits all of the human detail. Morale is a meaningful thing, or if you prefer, knowledge, expertise, Metis; these are damaged in layoffs, especially repeated rounds. And furthermore, the recent tech layoffs were not generally about fixing unsustainable businesses, they were about juicing profit margins for already profitable ones.

On the other hand, of course the OP title is wrong and layoffs can work. There are many examples even within tech where cutting deep is the only way of surviving.

Complex systems are complex.


Math wise, yes, you're correct, but layoffs can also hurt morale, especially if not done well. And in a competitive environment, the most talented people might be next to leave after a layoff, as they see things aren't going well and have the easiest time finding alternatives. I've seen that happen myself.

And some of those people you're laying off did revenue generating activities, so, as above, yes, it may be necessary to reduce costs, but it has to be done carefully.


> If a company has stagnant growth, that means leadership has made bad decisions, and things have to change.

There is a bigger picture.

Developed economies across the globe are experiencing stagnant growth, and the trickle down economics they have engaged in has failed to fix that.

Knowing this, the wealthiest, through their proxies in corporate leadership and government, are cutting back their biggest cost - employees - to maximize their near term returns, which will then be put into relatively fixed-supply assets, rather than risking capital on new ventures, and the employees that traditionally requires.

Corporations are betting that "growth" going forward is going to come from AI-enabled efficiency and productivity gains, not more employees making more product or innovating on product/service development and delivery. While it's too early to say whether they are right, many signs point in that direction.


> Employees feelings don't matter on a Balance Sheet, Income Statement and Cash Flow Statement. There is not a "Employee's Feelings" column on the ledger.

Sure they do. Just in the sense that employees feelings need to be controlled and made to fear any collective action or sense of agency.


Also, I'd argue that a lack of layoffs at a company probably causes stagnation and inefficiency. I don't think the only good argument for layoffs is that a company has no choice because of cash flow.

I see this in the public sector where you often find people who have been working at the same dept for a decade or more. These people feel very safe and know they don't really need to try that hard. They also know nothing about how things function elsewhere so they'll put up with using spreadsheets and fax machines because that's just how they've always done things.

It seems rather obvious to me that a good economy is one where employers feel some nervousness about losing good employees, so offer pay rises and perks; And where employees feel some nervousness about layoffs so work hard and try to be as productive as they reasonably can be.

If a company is not cutting a few percent of its least productive workers each year, it's probably doing something wrong, IMO. I think it's fair to argue that big tech companies built up a lot of these underproductive workers over the years.


The article cited actual statistics and data comparing companies that enacted layoffs vs. those that didn't. It showed very clear evidence, even within the same industry, that those that enacted layoffs fared MUCH worse.

Also, many of these layoffs are NOT coming when companies are "bleeding revenue". E.g., Meta and Twitter enacted massive layoffs after posting their most profitable quarters yet.


That’s what should be expected. A company in a stronger position can more confidently avoid laying off people and “ride out” the painful market conditions. Not every company’s revenue is composed of similar risk items even within similar industries. Some companies are already knee deep in some new strategy and need everyone on board to roll it out. The other company is just a bunch of variable labor that can be culled when volume shrinks. It also means they’re not investing in the future and going to get beat by a more strategic competitor. So many other riffs to take on this, but short term minded financials are absolutely improved by layoffs and that’s usually all the action is trying to effect.


Okay but you are saying the exact opposite of what the comment I replied to is saying.


At least for public companies, accounting is FAR from the end all be all.

Most of the time, company valuations are completely divorced from fundamentals. And much of what public companies do these days is try to game their stock price.

Government employment is, 95% of the time, completely divorced from reality.

That being said, private companies employ a lot of people and this is very relevant.


> Government employment is 95% the time completely divorced from reality

This is a pretty big claim to make without any evidence to support it.

The government, whether it be state, local, county, federal, etc., does indeed have a different pace, and certainly, like any big organization, can have issues such as waste. But to say it's completely divorced from reality, as if private (as in not government or NGO) companies don't also act completely divorced from reality, is a really big claim.


Private companies generally need to be efficient and make money or go out of business.

VC funded startups are a VERY small percentage of jobs compared to ALL private company employees. But, sure, there is much shenanigans there.

Public companies can play tons of games as well - but the vast majority of people employed at public companies are at relatively efficient and profitable companies.

Government services are under no obligation to be efficient.

Often people vote for them to be LESS efficient, hoping that they'll get similar benefits from their private employers.

Though, hope is a bad strategy, and it rarely works for non-government employees.

But there's enough state employees that you don't have to win over that many private employees to win votes for things that make the services less efficient (like ever juicier retirement benefits).


> Private companies generally need to be efficient and make money or go out of business.

Half of this statement is true. A private company definitely can’t run at a loss forever. Although in the era of ZIRP a few definitely made a solid go of it.

However, nothing requires any company, and especially a private one, to be efficient. If an otherwise profitable and privately-held company wants to swell its middle management ranks or spend lots of cash employing the owners' dubiously capable relatives, there's nothing to stop it.

Incidentally that’s mostly true of public companies with diverse shareholders, too.

The idea that private enterprise is always efficient is a myth, as is evident to anyone who’s worked for a large enough corporation, or even a small one where management weirdly shields some obviously incompetent people for internal political reasons.

The only correcting factor is that companies can fail, and smaller competitors can sometimes find ways to undercut large, inefficient firms. But often the small company gets acquired or out-marketed. So there’s nothing inevitable about any of this.


> Private companies generally need to be efficient and make money or go out of business

They sure need to make money or they’ll go out of business, but efficient is not required. I think nearly anyone who reads HN regularly could tell you multiple stories through their own careers about waste in the private sector, bizarre politically motivated decisions etc. and that’s just this community as a sample size. I’m certain this holds if you cast a wide net

The fact is most businesses are not the paragons of efficiency that is being postulated.


> Private companies generally need to be efficient and make money or go out of business.

It seems you've never worked at a private company and just believe the invisible hand fairy tale?


Many larger companies do behave as if revenue is unlimited.

Too many times, I’ve seen layoffs followed by acquisitions costing 10x the savings from the layoffs, often even in the same quarter. And when the parent company has a track record of driving acquisitions into the ground through mismanagement, where is the profit?


Your horribly simplistic take has already received a lot of backlash, but whatever I'm angry enough to add onto the pile:

1. Big tech companies doing layoffs are not hurting on revenue, so your basic assumption is wrong.

2. "Cutting hours" needs to be steelmanned if you're going to be anything but a charlatan. That means you have to interpret it as taking a pay cut, as in a salary cut. This has actually been done before in worker-focused companies to survive covid, and in one instance I'm aware of, the company gave workers back pay after surviving covid.

You also ignore all the intangibles that are not easy to measure, because of course the business acumen of "make number go up this quarter" is too short-sighted to care about institutional knowledge or long-term strategy.


Layoffs are like closing factories that don't produce, cutting products that don't sell, closing deserted retail locations, and so on. Likely bad business decisions got you there in the first place. Maybe market conditions changed. Maybe your whole company is going to fail no matter what you do. But blanket statements about what does or doesn't "work" are not very instructive.


Why can’t the outrageously overpaid leadership who made the mistakes be the ones who are punished instead of the individual contributors who are just trying to survive?


Also, cutting hours sounds terribly out of touch with the gig economy dilemma: I'd rather be let go than be put on half my salary.


>take a "Cost Accounting" course

do you (or anyone else) have a good course to recommend?


I mean, everything you said is true, but only because we have deliberately set up the system as such. Shareholder Primacy is a choice we have made as a society. It's not some natural law or something carved by god into stone tablets. It's not hard to imagine alternatives, but obviously the shareholding class is going to work hard to ensure that only their preferred rules have traction. Whether or not alternatives are workable given human nature is another thread altogether.


Shareholders don't have "primacy", eg they're last in bankruptcy. If you're asking for them to never get anything, then they're not going to participate, and this is supposedly a forum about startups.

In tech the employees also tend to be shareholders which I think is healthier.


Nobody is stopping anyone from running a non-profit tech company, or a PBC or co-op or various other structures. It's not even true that a US company has to prioritize "shareholder value" or whatever over anything else. You can kinda just do whatever you want. So the interesting question to me is, why don't more people try to operate companies/workplaces in the way they think would be more ethical and better for business?


> only because we have deliberately set up the system as such

No one can sell at a loss all the time. That's just stupidity. Either I get a return on money invested later or I get a return on my balance sheet now in the form of profit. Shareholder primacy exists because shareholders can only be cut out through a buyout.


No, it's not a choice we made as a society. It was a choice the shareholders made. You didn't make that choice, you were not part of it.


That's a good point! I never got the chance to vote on whether or not shareholders should get all the votes.


Why should you get a vote? You don’t get a vote on what groceries I buy. What entitles you to a vote on this purchase decision? (In this case “all the votes” means votes on actions a particular corporation is considering, not a vote on anything and everything.)


Lots of places take votes on what you can buy, e.g. weed, alcohol, which additives are allowed, and so on.


The creation and transacting of corporate shares is also highly regulated. But it doesn’t address the question of why a third party should get a vote that helps decide a particular corporate action. Votes that limit the possible actions of all corporations equally are a different thing.


Ultimately, the corporation is operating (and gets legal support like corporate personhood and limited liability) because the public permits it through the state approving its corporate charter. The public allows the company to exist, and in exchange, the company is supposed to serve at least some vague public good. Technically, the public has the power to revoke the company's charter if that's in people's best interest.

The general public are all stakeholders that are affected by the actions of the corporations they allow to exist. They ought to have a say. In practice, we as a people have pretty much given up that say, and the world as it exists today is the result: Corporations running amok doing whatever they want, answering only to shareholders.


OP was talking about the overall social arrangement. One possibility would be to give employees (of all corporations!) some collective power over the company, as a fundamental requirement of incorporation.


Because societies in which not everyone gets a vote end up collapsing, as those with the vote are eaten by those without.


Why do you feel entitled to something someone else built?


Shareholders don't build anything. Employees do.


I didn’t build my car or house, but I own them.

Working on things owned by others is the basic idea of employment - and is a relationship that has existed forever.


And the people that built your car and house had a say in how that process went. A rather big say.


Some did. Assembly line workers didn’t. Framers didn’t.


At that point we're begging the question by going too deep into the analogy and finding the original situation again. The point has still been made that working on something without ownership can be enough to justify a say.

And the framers might have a say, it depends on the company.


True, let’s simplify. The people who have a say are higher up managers who have been granted that power by the owners.


Are we discarding the analogy then? Okay.

The owner has a bunch of space and equipment they can grant power over, but that's only half a company. They don't naturally start with power over the employees.


> Are we discarding the analogy then

It’s not an analogy. It’s another example of ownership in another context.

Employee’s authority over something owned by another party is delegated by that party and varies according to the trust from that management.

Employees do not own goods or services they have produced and sold. And do not continue to have authority or rights to them.

> They don't naturally start with power over the employees.

Indeed. What they have power over is what their employees do while the employee chooses to rent their time to them.

The details of this are defined in the employment contract. Which tends not to include voting rights in regards to the owner’s property.

If you seek voting rights, you should negotiate that as part of your employment - or don’t agree to it. Many white collar employees receive this in the form of stock.


> It’s not an analogy. It’s another example of ownership in another context.

Right, and it shows that ownership isn't the one factor that matters.

> Employee’s authority over something owned by another party is delegated by that party and varies according to the trust from that management. Employees do not own goods or services they have produced and sold. And do not continue to have authority or rights to them.

It's not like delegation is optional.

But more importantly, the suggestion had nothing to do with ownership. The suggestion was voting power.

I am glad you seem to have stepped back from the "someone else built" language you originally used.

> The details of this are defined in the employment contract. Which tends not to include voting rights in regards to the owner’s property.

And sometimes it's good to negotiate parts of employment contracts as a whole society, by putting it into law. It's not "entitlement" in any derogatory sense. It's a very mild limit on which things can be negotiated.


This code benchmarks mutex contention, not mutex lock performance. If you're locking like this, you should reevaluate your code. Each thread locks and unlocks the mutex for every increment of g_chores. This creates an overhead of acquiring and releasing the mutex frequently (100,000 times per thread). This overhead masks the real performance differences between locking mechanisms because the benchmark is dominated by lock contention rather than actual work. Benchmarks such as this one are useless.


No. It is not. More than 1/3 of the Tor servers are run by US Federal Govt as does other members of the Five Eyes. Israel has a large number as well. Cases are built backwards or in parallel that are from the fruit of the poisonous tree. If you don't know what that term means, look it up.

Use Tor with extreme caution.


Or just hit onion services that don’t require exit nodes.


How is that even possible? Unless you stick to hidden services, you do need an exit point to talk to the regular internet.


The comment is saying: never use the regular internet, ONLY use hidden services, so you never need to exit the network through an exit node.


>More than 1/3 of the Tor servers are run by US Federal Govt

Source? People repeat this claim and nobody ever provides evidence.


He isn't a hydrologist?


RIP.

I enjoyed City of Glass, but Ghosts and The Locked Room I found dull. Leviathan was good; it made me curious whether he was the Unabomber.

