The open source community will start taking Firefox seriously again when all the AI shit is removed for good and real improvements to performance and privacy are made.
Despite all the posturing about "respecting your privacy and freedom," the stock configuration of Firefox is trivially fingerprintable. At the very least, a privacy-focused browser should adopt the Tor patches, report standardized spoofed values for hardware components, and disable by default privacy-invasive anti-features like WebGL. None of this is difficult to do; the fact that it hasn't been done illustrates the gap between empty promises and what is actually delivered.
I'd definitely love to see them take some steps, but there are points where a minor increase in privacy leads to a much worse UX for average people (i.e., "why doesn't this site work in FF? FF sucks!" because they don't know they have to enable something). If Firefox becomes a browser that is harder to use, then it will only ever be used by the extremely small niche of people who care about that. That will only lead to more "not tested on Firefox" web development. I already have to keep Chrome on my machine because of sites like Ramp.com and Mailgun that don't work in Firefox, and that would only get worse.
> I'd definitely love to see them take some steps, but at the points where a minor increase in privacy leads to a much worse UX for average people
Disabling Javascript or even just third party scripts does lead to major breakage, but reporting spoofed values for identifiers like Tor does not. The Arkenfox user.js does all of this and more, but these options are not enabled by default. This shows that Firefox does not care much about privacy in practice.
The only "breakage" that I have encountered with such a hardened configuration is related to the spoofing of the time zone. But the fundamental issue is that Javascript/browsers should not have been designed to allow websites to extract this kind of personal information in the first place. And even that hardening is not enough: users are still fingerprintable. In an ideal world, the only thing a website should see is the originating IP and nothing else.
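To make the "spoofed values" idea concrete, here is a minimal TypeScript sketch. The property names mirror real `navigator` fields, but the objects and the `spoof` function are purely illustrative — this is not Firefox or Tor code, just the principle: every user reports the same standardized answers, so nobody stands out.

```typescript
// Hypothetical sketch of Tor-style fingerprint spoofing.
// Names are illustrative, not Firefox internals.

interface NavigatorLike {
  hardwareConcurrency: number;
  deviceMemory: number;
  language: string;
}

// What the real browser would expose for one particular user.
const realNavigator: NavigatorLike = {
  hardwareConcurrency: 16,
  deviceMemory: 32,
  language: "de-DE",
};

// Spoof: ignore the real values and report fixed, common ones
// (similar in spirit to privacy.resistFingerprinting).
function spoof(_nav: NavigatorLike): NavigatorLike {
  return {
    hardwareConcurrency: 2, // a very common value
    deviceMemory: 8,
    language: "en-US",
  };
}

const seen = spoof(realNavigator);
console.log(seen.hardwareConcurrency, seen.language); // 2 en-US
```

Every user who spoofs to the same standardized profile becomes indistinguishable from every other such user, which is exactly why the values have to be standardized rather than randomized per machine.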
If anything, Brave has done more to harden Chromium than Mozilla has with Firefox, even though Brave comes with its own set of problems (scammy crypto integrations, AI, VPN and other stuff).
> Disabling Javascript or even just third party scripts does lead to major breakage, but reporting spoofed values for identifiers like Tor does not. The Arkenfox user.js does all of this and more, but these options are not enabled by default. This shows that Firefox does not care much about privacy in practice.
I suspect it rather shows that Firefox developers do a good job of making Firefox work, and that this work is what enables forks like these to exist.
If you put too much in your Telemetry/crash reports, yeah, users become fingerprintable.
On the other hand, if you return spoofed values, it means that Firefox developers cannot debug platform/hardware-specific crashes. If you disable Telemetry, improving performance becomes impossible, because you're suddenly unable to determine where your users suffer. If you remove WebGL, plenty of websites suddenly stop working, and people assume that Firefox is broken.
> If you put too much in your Telemetry/crash reports, yeah, users become fingerprintable.
It's not only what gets sent to Mozilla as telemetry or crash reports that is a problem. That can be turned off (many Linux distros do) or firewalled.
The main issue is that websites can more or less accurately identify users uniquely by extracting information that they should not have access to if the browser was designed with privacy in mind.
This includes, but is not limited to: installed fonts, system language, time zone, window size, browser version, hardware information (number of cores, device memory), canvas fingerprint, and many other attributes. When you combine all of that with the originating IP address, you can reliably determine who visited a website, because that information is shared and correlated with services where people identify themselves (Google accounts, Facebook, Amazon, etc.). Even masking your IP may not be enough, because there is typically enough information in the other data points to track you already.
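To illustrate why combining these attributes is so effective, here is a toy TypeScript sketch (not any real tracker's code): each attribute alone is shared by millions of people, but their canonicalized concatenation, hashed, yields a near-unique visitor ID.

```typescript
// Toy illustration of fingerprint combination; not real tracker code.

type Attributes = Record<string, string | number>;

// Simple FNV-1a hash, used here only to condense the attribute
// string into a short ID.
function fnv1a(s: string): string {
  let h = 0x811c9dc5;
  for (let i = 0; i < s.length; i++) {
    h ^= s.charCodeAt(i);
    h = Math.imul(h, 0x01000193) >>> 0;
  }
  return h.toString(16);
}

// Canonicalize (sort keys so order doesn't matter) and hash.
function fingerprint(attrs: Attributes): string {
  const canonical = Object.keys(attrs)
    .sort()
    .map((k) => `${k}=${attrs[k]}`)
    .join(";");
  return fnv1a(canonical);
}

// Two visitors who differ only in a few "innocuous" attributes
// still get distinct IDs (with overwhelming probability):
const a = fingerprint({ tz: "Europe/Berlin", cores: 8, lang: "de-DE" });
const b = fingerprint({ tz: "America/Chicago", cores: 4, lang: "en-US" });
console.log(a, b);
```

The same visitor always hashes to the same ID across visits, which is what makes this usable for tracking even without cookies.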
All of this is true, but it's a problem of the entire web platform and specs, so if you want to favor untraceability above compatibility, you'll need a dedicated privacy-hardened browser. Firefox aims to be better at privacy, but still respect the web specs.
Sure, but then don't go grandstanding about privacy. You can't have both.
And saying that improving performance is impossible without it is hyperbolic. Developers did that before every major application turned into actual spyware. Profilers still work without it.
Yes, the stock configuration is meant to not be broken. If you are ok with breakage in exchange for less fingerprinting, the config setting privacy.resistFingerprinting is right there: https://support.mozilla.org/en-US/kb/resist-fingerprinting
It is an uplift from Tor, and I believe Tor just enables it in their build, though it doesn't end up being quite the same. Tor is always going to be better for this.
But turning it on in the stock Firefox configuration would be suicide in terms of market share. When "I want maximal privacy" fights "I want this site to work", guess which one wins?
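For what it's worth, anyone willing to accept the breakage can flip the switch themselves: the pref from the support article above can be toggled in about:config, or persisted via a user.js file in the profile directory (it's the same pref that Arkenfox sets):

```javascript
// user.js in the Firefox profile directory
user_pref("privacy.resistFingerprinting", true);
```

Firefox reads user.js at startup and applies its prefs over the profile's saved settings, so this survives upgrades and profile changes.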
Unfortunately, the guys in charge at Mozilla are clearly enamored with AI. They like it so much (and value users so little), that they'll let it write the whole damn PR blog post about company strategy.