I use Firefox on the desktop and Firefox Mobile on Android, Chromium with Bromite patches on Android, and occasionally Brave on the desktop for sites that only work properly with Chromium (which happens more and more often, and is a whole separate can of worms in itself). I always make a point of disabling google.com and gstatic.com in NoScript and uBlock Origin whenever possible.
I've noticed something quite striking: when I hit sites that use Google's hateful CAPTCHAs (reCAPTCHA, which I know are Google's because they force me to temporarily re-enable google.com and gstatic.com), Google quite consistently marks the test as passed with the green checkmark without asking me to identify a single fire hydrant or bicycle. Or it asks once, but the test passes even if I deliberately leave certain images unselected. And it almost never serves me those especially heinous "rolling" CAPTCHAs that keep loading more and more images to identify as you click, until they've apparently annoyed you enough and let you through.
When I use Firefox, however, the CAPTCHAs never pass without at least one test, sometimes several in a row, and very often the rolling kind. And if I deliberately skip certain images for the sake of experimentation, the CAPTCHAs keep coming and coming, and if I keep it up long enough they simply never stop and the site becomes impossible to access.
Only with Firefox. Never with Chromium-based browsers.
I've been experimenting with this informally for months now, and it's quite clear to me that Google has a dark pattern in place in its reCAPTCHA system to make Chrome and Chromium-based browsers the path of least resistance.
It’s really disgusting…
Google’s service using Google’s servers isn’t a conspiracy against you or your browser. Firefox and CAPTCHA work just fine, unless you specifically enable the settings that break CAPTCHA.
The whole point of CAPTCHA is to check whether you're a robot. That's done in two ways: challenges designed to be hard for machines, and behavioural analysis, because most bots are extremely crude in their clicking behaviour.
Back in the day, clicking fire hydrants and typing unreadable letters were all we had, and you had to do it every time you logged in or submitted a form. Google then decided to make the experience less annoying by trying to determine your bot status through behavioural analysis: fields filled instantly, no scrolling, clicks on elements that are off screen, and a million other signals known to Google alone.
You’ve opted out of this behavioural analysis by enabling a wide variety of privacy measures. That’s good for you, and good for your privacy. It also makes you indistinguishable from a bot. That means you’ll have to offer some other proof that you’re human.
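To make the behavioural-analysis idea concrete, here's a toy scoring function of my own invention (the signal names and weights are made up for illustration; Google's real model is proprietary and far more elaborate). It just tallies the kinds of tells mentioned above:

```python
# Toy sketch of behavioural bot scoring. The signals and weights here are
# hypothetical illustrations, not Google's actual model.

def bot_score(events: dict) -> int:
    """Return a suspicion score for one form submission; higher = more bot-like."""
    score = 0
    # Humans take at least a second or two to fill a form.
    if events.get("fill_time_seconds", 0) < 1.0:
        score += 2
    # Humans usually scroll before reaching fields below the fold.
    if not events.get("scrolled", False):
        score += 1
    # Clicking an element that was never visible is a classic bot tell.
    if events.get("clicked_offscreen_element", False):
        score += 3
    # Zero mouse movement before submitting is suspicious too.
    if events.get("mouse_moves", 0) == 0:
        score += 2
    return score

# A headless script fills instantly with no interaction at all...
print(bot_score({"fill_time_seconds": 0.05, "mouse_moves": 0}))  # → 5
# ...while a person takes time, scrolls, and moves the mouse.
print(bot_score({"fill_time_seconds": 12.0, "scrolled": True, "mouse_moves": 40}))  # → 0
```

With resistFingerprinting and script blocking enabled, most of those signals simply never arrive, so a scorer like this defaults you to the bot-like end and the challenges begin.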
Google is trying to push its Web Environment Integrity framework as an automated way to prove your browser's state, but everyone except Google hates it. Cloudflare has an automated CAPTCHA bypass based on anonymous tokens (Privacy Pass) that you install as an addon, but obviously nobody wants that either. Apple has built the same technology right into Safari, and nobody has even noticed.
You can pick between passive privacy infractions, solving CAPTCHAs, or avoiding websites with Google’s CAPTCHA. It’s not some big Chrome conspiracy, it’s a result of how the technology works.
Why does nobody want the Cloudflare solution? Sounds neat.
The Apple/Cloudflare solution solves some problems (no fingerprinting) while introducing others (Apple or Cloudflare can simply decide you can't access the internet anymore, even for servers not hosted behind Cloudflare's network). It also comes with a privacy risk: Cloudflare can see how many Cloudflare-based CAPTCHAs you're solving, which means they can basically monitor when you're at your computer.
I do believe that the Apple/Cloudflare solution is the most privacy friendly option currently on the table, but it’s still far from perfect. I don’t like the idea of Apple going “that’s enough internet for today” and locking you out until the servers trust you again.
I disagree. reCAPTCHA requires running non-free JavaScript that is pretty much spyware. Such software should never be forced on a user.
The other issue is that you're forcing users to do work. If I'm going to improve Google Maps, then pay me.
How often do you go to a site that has a reCAPTCHA but doesn't use JavaScript?
The issue for me isn't the JavaScript itself but its black-box nature. I want code to be libre so I can study it and modify it to my needs.
You have to do something to stop the bots. Any website allowing user generated content without CAPTCHAs in either submission or account creation is absolutely full of spam.
There are a few open source CAPTCHAs. Those are simple enough that anyone with a GPU can train a network against them and defeat every website using them.
The ones that are actually difficult for trivial bots are Google's and Cloudflare's. Both work by observing the user, doing some kind of behavioural analysis, and making you click boxes. Between Google and Cloudflare, I'm not sure which one is worse, to be honest. At least the Cloudflare one is easy to bypass with their Privacy Pass addon, I suppose.
I tried running a website without a CAPTCHA of some sort, but bots ruin everything. They're indistinguishable from real people with real browsers, use real consumer IP addresses (through botnets and shady VPN addons), and are rented out for pennies per spam post. No website is safe.
Twitch has found an alternative solution against bots: fingerprinting the browser. That's why you can't log in to Twitch with resistFingerprinting enabled. Honestly, I prefer CAPTCHAs in that case.
There is progress within the IETF on a somewhat privacy-preserving standard based on Apple's and Cloudflare's work (much less intrusive than Google's attempt), but it will require signatures from a validated root of trust: either online, with the device/OS vendor handing out a limited number of tokens per device, or through local hardware (secure boot plus a TPM), which would make browsing the web from Linux incredibly hard.
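The token idea those standards share can be sketched very roughly like this. Note the huge simplification: real Privacy Pass / Private Access Tokens use blind signatures or VOPRFs so the issuer can't link issuance to redemption, and the names below (`issue_tokens`, `redeem`) are my own, not from any spec. This only shows the single-use-token shape: solve one challenge, get a handful of tokens, spend them later without being re-challenged.

```python
import hashlib
import hmac
import secrets

# Simplified single-issuer sketch of CAPTCHA-bypass tokens. Real deployments
# use blinded cryptography for unlinkability; this HMAC version skips that.

ISSUER_KEY = secrets.token_bytes(32)  # hypothetical issuer secret

def issue_tokens(n: int) -> list[tuple[bytes, bytes]]:
    """After the user proves humanity once, mint n single-use tokens."""
    tokens = []
    for _ in range(n):
        nonce = secrets.token_bytes(16)
        tag = hmac.new(ISSUER_KEY, nonce, hashlib.sha256).digest()
        tokens.append((nonce, tag))
    return tokens

spent: set[bytes] = set()  # origin-side double-spend tracking

def redeem(nonce: bytes, tag: bytes) -> bool:
    """Origin verifies the token and burns it, skipping the CAPTCHA."""
    expected = hmac.new(ISSUER_KEY, nonce, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected) or nonce in spent:
        return False
    spent.add(nonce)
    return True

wallet = issue_tokens(5)
nonce, tag = wallet.pop()
print(redeem(nonce, tag))   # → True: first use passes
print(redeem(nonce, tag))   # → False: double-spend is rejected
```

The "limited tokens per device" concern above falls out of exactly this design: whoever controls issuance controls how much CAPTCHA-free browsing you get.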
I’m pessimistic about the future of bot detection. If you think your privacy is being violated now, prepare for things to get worse.
You can try to avoid Google's CAPTCHAs by simply not using websites that deploy them, and maybe contacting the site owners with suggestions for alternatives. I doubt they'll bother, but it's worth a shot for the few websites that do care.
What we need is a better internet…
Mega based