You can't even pick colour out of infra-red-illuminated night time photography. There's no way you can pick colour out of WiFi-illuminated photography.
There would be some correlation between the visual color of objects and their spectrum in other EM frequencies, since many objects' colors come from the same dyes or pigment materials. But it seems pretty unlikely that this would be reliable across a range of different objects, materials, and dyes, because there is no universal RGB dye or pigment set we all rely on. You can make the same red many different ways, but each material will have different spectral "colors" outside the visual range. Even something as simple as black plastic can be completely transparent in other bands, the way the PS3's casing was to infrared. Structural colors would probably be impossible to discern, though I don't think many household objects have structural color unless you've got a stuffed bird or fish on the wall.
If it sees the shape of a fire extinguisher, the diffusion model will "know" it should be red. But that's not all that's going on here. Hair color etc. seems impossible to guess, right? To be fair, I haven't actually read the paper, so maybe they explain this.
AI efficiency gains don’t benefit employees, they benefit _employers_, who get more output from the same salary.
When you’re salaried, you’re selling 8 hours of time, not units of work. AI that makes you 20% faster doesn’t mean you work 20% fewer hours or get a 20% raise. It means your employer gets 20% more value from the same labor cost.
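To make the claim concrete, here's a toy calculation with made-up numbers (the salary and output figures are hypothetical, purely for illustration): the employee's pay is fixed, so a 20% speedup shows up entirely as a lower cost per unit of output for the employer.

```python
# Toy illustration (invented numbers): a salaried worker's output
# before and after a 20% productivity gain from AI tooling.

salary = 100_000        # fixed annual salary (hypothetical)
units_before = 1_000    # units of work produced per year (hypothetical)

units_after = units_before * 1.20  # 20% faster => 20% more output

cost_per_unit_before = salary / units_before  # 100.0 per unit
cost_per_unit_after = salary / units_after    # ~83.33 per unit

# The salary is unchanged; the employer's cost per unit of output
# falls by 1/1.2, i.e. about 16.7%.
```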
Marx: workers sell their capacity to work for a fixed period, and any productivity improvements within that time become surplus value captured by capital.
AI tools are just the latest mechanism for extracting more output from the same wage.
The real issue isn’t the technology—it’s that employees can’t capture gains from their own efficiency improvements. Until compensation models shift from time-based to outcome-based, every productivity breakthrough just makes us more profitable to employ, not more prosperous ourselves.
It’s the Industrial Revolution all over again and we’re the Luddites
> any productivity improvements within that time become surplus value captured by capital.
Not quite right. Total Value remains the same before and after an increase in productivity, assuming the labor force remains constant. But more use-value is created in the same period of time.
At the beginning, this is good for the employer, because the new socially necessary labor time has not yet been internalized by the market, so the output can be sold at a price corresponding to its old Value (maybe a bit less, to undercut competitors).
Eventually though, as competition adopts the new technique, everyone attempts to undercut each other’s prices, adjusting until prices correspond to the new Value.
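A toy version of this arithmetic, with invented numbers (one worker, one 8-hour day, widget counts chosen purely for illustration): the Value added per day stays fixed at the labor time expended, while the per-widget Value falls as the technique generalizes.

```python
# Toy Marxian arithmetic (invented numbers): one worker, one 8-hour day.
# Value here is measured in labor-hours (socially necessary labor time).

hours_per_day = 8
widgets_before = 8   # old technique: 1 widget per hour
widgets_after = 10   # new technique: more use-values in the same time

# Total Value added per day is unchanged: 8 hours of labor either way.
value_per_day = hours_per_day

# Per-widget Value before and after the new technique generalizes:
value_per_widget_old = value_per_day / widgets_before  # 1.0 labor-hour
value_per_widget_new = value_per_day / widgets_after   # 0.8 labor-hour

# Early adopter's temporary edge: sell 10 widgets at (near) the old
# per-widget Value while only expending 8 hours of labor. Competition
# then pushes prices down toward the new per-widget Value, and the
# edge disappears.
early_adopter_edge = widgets_after * value_per_widget_old - value_per_day
```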
> AI efficiency gains don’t benefit employees—they benefit employers who get more output from the same salary.
So, they also benefit developers that become solopreneurs.
So they increase the next-best alternative for developers compared to work as employees.
What happens when you improve the next-best alternative?
> AI tools are just the latest mechanism for extracting more output from the same wage.
The whole history of software development has been the rapid introduction of additional automation (no field has been more the focus of software development than software development itself). Looking at the history of developer salaries, that has not been a process of "extracting more output from the same wage": yes, output per $ of wage has gone up, but real wages per hour or day worked for developers have also gone up, and faster than wages across the economy generally.

It is true, and problematic, that the degree of capitalism in the structure of the modern mixed economy means the gains of productivity go disproportionately to capital. But it is simply false that they go exclusively to capital across the board, and it is particularly easy to see that this has been false specifically for productivity gains from further automation in software development.
Oh come on… if you’re running a shell executing MCP, then this is fair game.
It’s like saying the PC is vulnerable because people leave their passwords on Post-it notes stuck to the screen.

OK wait, Apple said that and then made better auth.
I believe when they report this, they're talking about whether there has been an unplanned pregnancy over the course of a year with either typical use or "perfect" use. Someone correct me if I'm wrong.