The codec is compiled in, enabled by default, and auto-detected through file magic, so the fact that it is an obscure 1990s hobby codec does not make the vulnerability any less exploitable. At this point I think FFmpeg is being intentionally deceptive by constantly mentioning only the codec's ancient, obscure hobby status and never the fact that it's on by default and auto-detected. They have also rejected suggestions to turn obscure hobby codecs off by default, prioritizing their goal of playing every media format ever over security.
Yeah, ffmpeg's responses are really giving me a disingenuous vibe, since their argument is completely misleading (and it seems to be working on a decent number of people who don't read further into it). It really damages their reputation in my eyes. If they had handled it maturely, I would have had a bit more respect for them.
As a user this is making me wary of running it tbh.
In this world and the alternate universe both, attackers can also use _un_published vulnerabilities, because they have a high incentive to do research. Keeping a bug secret does not prevent it from existing or from being exploited.
It doesn’t scale well to content that changes dynamically on the client side. Dynamic manipulation of the post-transform XSL-FO is confusing and difficult, and retransforming the whole document from source is too slow and loses state. This is a big part of why CSS won.
The Fetch API is a pretty recent addition to the web platform. Back in the day, you could absolutely embed images or stylesheets from ftp: URLs. You could even use them with XMLHttpRequest (the predecessor of Fetch). Even further back, gopher: was integrated with the web. URL schemes were invented for the web with the idea that http: would not be the only one. These other protocols were really part of the web until they weren’t.
The main reason we have a fork at all is that upstream libxml2 has broken source and binary compatibility in various ways, and we can't take those changes because libxml2 is public API on our platforms. We do make an effort to upstream all security fixes, though we sometimes get to it only after we ship.
The speed of light in a vacuum does not change. The speed of light in a non-vacuum medium can be different from the speed of light in a vacuum, however, and light passing from one medium to another changes speed (and is refracted). See https://en.wikipedia.org/wiki/Refractive_index
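The relationship the linked article describes is just v = c/n, where n is the medium's refractive index. A minimal sketch (the index values below are approximate textbook round numbers, not exact measurements):

```python
# Phase velocity of light in a medium: v = c / n.
# c in a vacuum is exact by definition; the n values are illustrative.

C_VACUUM = 299_792_458  # speed of light in a vacuum, m/s

def speed_in_medium(n: float) -> float:
    """Phase velocity of light in a medium with refractive index n."""
    return C_VACUUM / n

for medium, n in [("vacuum", 1.0), ("water", 1.33), ("glass", 1.5)]:
    print(f"{medium:>6}: n = {n:<4}  v = {speed_in_medium(n):,.0f} m/s")
```

Light entering water from air slows to roughly 75% of its vacuum speed, which is exactly the speed change that produces refraction at the boundary.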
This study accounts for missing ordinary matter, not dark matter. The linked article makes this clear in the first paragraph. Sometimes I wonder if the first commenters (and often top commenters) on HN read the article at all or just respond based on the headline, because these comments often seem barely related to the actual article content.
> Unlike dark matter, ordinary matter emits light of various wavelengths and thus can be seen. But a large chunk of it is diffuse and spread thinly among halos that surround galaxies as well as in the vast spaces between galaxies.
> Due to its diffuse nature, roughly half of ordinary matter in the universe went unaccounted for and had been considered "missing"—until now.
Not being mean, genuine question: How would you improve the clarity of this?
The Copenhagen interpretation of security bugs: if you don’t look for it, it doesn’t exist and is not a problem.