I use Firefox and Firefox Mobile on the desktop and Android respectively, Chromium with Bromite patches on Android, and occasionally Brave on the desktop to get to sites that only work properly with Chromium (which happens more and more often - a whole separate can of worms in itself…). And I always make sure to disable google.com and gstatic.com in NoScript and uBlock Origin whenever possible.
I noticed something quite striking: when I hit sites that use those hateful captchas from Google - aka “reCAPTCHA”, which I know are from Google because they force me to temporarily re-enable google.com and gstatic.com - Google quite consistently marks the captcha as passed with the green checkmark without asking me to identify fire hydrants or bicycles even once. Or it asks once, but the test passes even if I purposely don’t select certain images. And it almost never serves me those especially heinous “rolling captchas” that keep coming up with more and more images to identify (or not) as you click on them, until it has apparently annoyed you enough and lets you through.
When I use Firefox however, the captchas never pass without at least one test, sometimes several in a row, and very often rolling captchas. And if I purposely don’t select certain images for the sake of experimentation, the captchas keep on coming and coming and coming forever - and if I keep doing it long enough, they plain never stop and the site becomes impossible to access.
Only with Firefox. Never with Chromium-based browsers.
I’ve been experimenting with this informally for months now and it’s quite clear to me that Google has a dark pattern in place with its reCAPTCHA system to make Chrome and Chromium-based browsers the path of least resistance.
It’s really disgusting…
It’s not necessarily targeted like that. Remember, Chrome sends a lot of information about the user, allowing them to more easily gauge whether it’s a bot. Firefox hides a lot of information, blocks a lot of third-party scripts by default, and even sends fake information for some things. For all intents and purposes, Firefox looks much more like a bot than Chrome.
With that said, I use Firefox exclusively and don’t have anywhere near as many issues as you seem to.
Remember Chrome sends a lot of information about the user
Remember, I use the equivalent of Bromite on Android and Brave on the desktop. Those are not Chrome: they’re heavily privacy-enhanced. By your theory, those browsers too should get the more annoying reCAPTCHAs more often, just like Firefox. But they don’t: even on those privacy-respecting Chromium forks, you get past reCAPTCHA much more easily.
I use Firefox exclusively and don’t have anywhere near as many issues as you seem to.
Try using Chromium side by side and the subtle extra difficulties of sailing through the Googlespace become quite apparent. As long as you stick to Firefox, you don’t realize that the Chromium experience is ever-so-slightly slicker on many websites.
Brave is a Chromium-based browser, so maybe Chromium sends out something that lets reCAPTCHA know what’s going on.
maybe Chromium sends out something that lets reCAPTCHA know what’s going on.
Maybe. But in that case, that’s not a great sign that Brave respects your privacy. Then again, I wouldn’t put it past Brave: they too are a for-profit company and I don’t quite trust them either.
However, the Bromite fork I run on my deGoogled phone almost certainly doesn’t make any privacy compromises and it solves reCAPTCHAs more easily than Firefox Mobile.
Any web browser that claims privacy and security while using Chromium as its base isn’t worth the risk. They may have implemented fixes and added their own proprietary code, but it’s still Chromium, and Google most likely hides a bunch of stuff from devs so they can’t mess with it.
it’s still Chromium, and Google most likely hides a bunch of stuff from devs so they can’t mess with it.
Chromium is open source.
It’s still made by Google, tho, so can you really trust that there’s no hidden shit? This is a company that is trying to create a monopoly over website access.
Have you reviewed each line of the code and do you make sure to review each commit before updating?
No, but some people do. And even if Google could get away with hiding spyware in open source code for a while - even a long while - the moment they got found out, it would hurt their reputation so badly it would never fully recover. Not that their reputation is great right now, but I’m pretty sure that’s a line even they won’t cross.
Bromite is not proprietary. But yeah, the Chromium codebase is huge; it’s possible that certain bad parts were not found by the fork maintainer.
They can; the Vivaldi devs have been doing this for more than 7 years, and others do the same, even Edge - which is certainly a privacy nightmare, sending a lot of stuff to M$ and plenty of others, even to TowerData, but zero to Google. Because of its continued failure to control Chromium users, Google is now trying it through its apps, services and the webpages that use them, introducing this WEI DRM crap for webpages, which allows blocking any browser that doesn’t include the Google token “to prove that it is a secure browser”. That affects all browsers equally, independent of the engine they use, even Firefox. In other words: add this Google token or forget the internet. The one ring to rule them all. Either we all work together against this outrage and prevent Google from introducing this shit, or it’s game over for a lot of small browsers and forks, and the future is only Chrome, Edge and nothing more.
I know Google sites (especially Google Search) are a much more polished experience on Chrome, but I haven’t had an unusable experience on Firefox, I don’t notice a problem.
I think I missed that that isn’t your point: you’re saying Google streamlines things for people on Chromium to make it a nicer experience, making it harder to switch away. And I think you’re right about that.
but I haven’t had an unusable experience on Firefox, I don’t notice a problem.
There are quite a few online stores I patronize where the shopping cart or the checkout is broken in Firefox and there’s no way to pay.
My bank’s online banking site is broken too in Firefox. It’s okay to pay for things and display basic checking account information, but more detailed personal finance pages are unusable.
My company’s ERP is half broken in Firefox.
And quite a few porn sites I download stuff off of are broken too in Firefox.
And that’s with NoScript, uBlock Origin and Ghostery fully disabled.
Obviously all those sites are streamlined to work well with Chromium or Chromium-based browsers because - surprise surprise - it’s the most common browser type, which is exactly the position Google wanted to place itself in. It was the very same problem when websites were designed to work primarily with Internet Explorer, back when Microsoft dominated the browser space many years ago.
This is not my experience. For the sites I frequent, though Firefox is generally not listed as a supported browser anymore, the sites work fine. That includes banking and any random shopping cart site. That’s probably because in my country there are common payment portals, and for you the common payment portals are probably different.
One site I have trouble with is one for health insurance, but a user agent spoofed to look like Chrome makes the site work fine (I hate so much that they do this, and have complained but I’m just one customer).
I wonder what happens if you spoof your user agent. It’s probably a deeper issue, but might be worth a try.
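For anyone who wants to try: in Firefox the user agent can be overridden globally with a single preference. A minimal sketch, assuming you edit about:config by hand or keep a user.js in your profile; the Chrome-on-Windows string below is just an illustrative value, not a recommendation:

```js
// user.js in the Firefox profile (or create the same string pref manually in about:config).
// The UA string is only an example of what a Chrome-on-Windows browser reports.
user_pref("general.useragent.override",
  "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/115.0.0.0 Safari/537.36");
```

It’s a blunt instrument though: it changes the UA for every site, and anything that fingerprints more deeply than the UA string will still notice.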
You’re most likely logged into the browser with your Google account in Chrome. I’m sure they take that into account as well.
Google’s service using Google’s servers isn’t a conspiracy against you or your browser. Firefox and CAPTCHA work just fine, unless you specifically enable the settings that break CAPTCHA.
The whole point of CAPTCHA is to check whether you’re a robot or not. That’s done in two ways: solving challenges designed to elude robots, and behavioural analysis, because most robots are extremely basic when it comes to their clicking behaviour.
Back in the day, clicking fire hydrants and entering unreadable letters were all we had. You always had to do it, every time you logged in or submitted a form. Google then decided to make the experience less annoying by trying to determine your bot status through behavioural analysis; fields being filled instantly, without scrolling, items being clicked that are off screen, and a million other signals that are known to Google alone.
You’ve opted out of this behavioural analysis by enabling a wide variety of privacy measures. That’s good for you, and good for your privacy. It also makes you indistinguishable from a bot. That means you’ll have to offer some other proof that you’re human.
Google is trying to push the Web Environment Integrity framework as an automated way to prove your state, but everyone except Google hates that. Cloudflare has an automated CAPTCHA bypass based on anonymous tokens that you install as an addon, but obviously nobody wants that. Apple has built this technology right into Safari, but nobody has noticed.
You can pick between passive privacy infractions, solving CAPTCHAs, or avoiding websites with Google’s CAPTCHA. It’s not some big Chrome conspiracy, it’s a result of how the technology works.
Why does nobody want the cloudflare solution? Sounds neat
The Apple/Cloudflare solution solves some problems (no fingerprinting) while introducing others (Apple or Cloudflare can just decide you can’t access the internet anymore even for servers not hosted behind Cloudflare’s network). It also comes with a privacy risk (Cloudflare can see how many Cloudflare-based CAPTCHAs you’re solving, which means they can basically monitor when you’re behind your computer).
I do believe that the Apple/Cloudflare solution is the most privacy friendly option currently on the table, but it’s still far from perfect. I don’t like the idea of Apple going “that’s enough internet for today” and locking you out until the servers trust you again.
I disagree. reCAPTCHA requires the use of non-free JavaScript that is pretty much spyware. Such software should never be forced on a user.
The other issue is that you are forcing users to do work. If I’m going to improve Google Maps, then pay me.
How often are you going to a site that has a reCAPTCHA but doesn’t use JavaScript?..
The issue for me isn’t the JavaScript but the black box nature of it. I want code to be libre so I can study and modify it to my needs
You have to do something to stop the bots. Any website allowing user generated content without CAPTCHAs in either submission or account creation is absolutely full of spam.
There are a few open source CAPTCHAs. Those are simple enough that anyone with a GPU can train a network against them and defeat every website using them.
The difficult ones for trivial bots are Google’s and Cloudflare’s. Both work by observing the user, doing some kind of behaviour analysis, and making you click boxes. Between Google and Cloudflare, I’m not sure which one is worse, to be honest. At least the Cloudflare one is easy to bypass with their Privacy Pass addon, I suppose.
I tried running a website without a CAPTCHA of some sort, but bots ruin everything. They’re indistinguishable from real people with real browsers, use real consumer IP addresses (through botnets and shady VPN addons), and are rented out for pennies per spam post. No website is safe.
Twitch has found an alternative solution against bots: fingerprinting the browser. That’s why you can’t log in with resistFingerprinting enabled on Twitch. Honestly, I prefer CAPTCHA in that case.
There is progress within the IETF to make a somewhat privacy preserving standard based on Apple’s and Cloudflare’s work (which is much less intrusive than Google’s attempt) but it’ll require signatures generated by a validated root of trust, either online (having the device/OS vendor hand out limited tokens per device) or through local hardware (secure boot + TPM, making browsing the web through Linux incredibly hard).
I’m pessimistic about the future of bot detection. If you think your privacy is being violated now, prepare for things to get worse.
You can try to avoid Google’s CAPTCHAs by just not using websites that use them, and maybe contacting the website owners with suggestions for alternatives. I doubt they’ll bother, but it’s worth a shot for the few websites that do care.
What we need is a better internet…
Mega based
Keep in mind that basic bots don’t render or process certain page elements - like JavaScript. So VPN plus NoScript/uBlock plus obscured data plus no preexisting cookies and possibly a unique fingerprint from all your previous interactions (depending on your privacy settings)… it all adds up to possible bot behavior. In my mind, getting captcha’d is a good thing. It may mean Google has low confidence that it knows who I am.
In my mind, getting captcha’d is a good thing. It may mean Google has low confidence that it knows who I am.
That is possibly the most unique outlook I’ve read about today.
There’s nothing good about captchas: they’re an insult to human intelligence, they’re forced unpaid labor, and each time I get one, I want to murder someone.
In a normal world, your statement would be utterly insane. But in our dystopian surveillance economy society, it’s actually a rational and interesting point of view, and one that turns captchas into a useful indicator of how well you manage to evade said corporate surveillance.
Interesting. Thank you for that.
However, if you’re right and Google serves fewer captchas to those it can track better - and not just to those who run Chromium, as I suspect - it also means privacy-enhanced Chromium-based browsers don’t hold a candle to Firefox. That’s not great news considering Chromium is the new de facto standard and some websites only work okay in Chromium.
You’ve never operated a public-facing website, have you?
In the past 24 hours alone, I’ve had at least 344 bot attempts on my personal site. A handful are harmless crawlers but most are hoping to hit a vulnerability.
Captchas are necessary to prevent malicious bot activity. It’s unfortunate that it also means it’ll be a pain for users.
You may have turned on a setting in Firefox that is meant to obscure your browser fingerprint. For me, it seems to force more captchas.
I kept the feature on though, because when I signed into Google and got the notification of a new sign-in on my phone, it thought my OS was Windows NT (it’s Linux), so it seems to at least kind of work.
I forget what the setting was off the top of my head (in about:config I think), but could look into it if anyone is curious.
Edit: went and found info on it. It is not just “Enhanced Tracking Protection.” It is specifically about blocking your browser fingerprint: https://support.mozilla.org/en-US/kb/firefox-protection-against-fingerprinting
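If I remember right, the about:config preference behind that page is privacy.resistFingerprinting (the same mechanism Tor Browser builds on). A minimal user.js sketch, in case anyone wants to toggle it directly:

```js
// user.js — turns on Firefox's fingerprinting resistance (RFP).
// Side effects include a UTC timezone, a generic Windows user agent (which is
// probably why Google reported the OS as Windows NT), and prompts for canvas access.
user_pref("privacy.resistFingerprinting", true);
```

Expect more captchas with it on, as discussed above.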
It’s probably enhanced tracking protection you’re talking about. I keep it on as well but damn those captchas are annoying. I’d prefer to go back to the unreadable distorted text over the endless AI training ones.
Nope, this is something else: https://support.mozilla.org/en-US/kb/firefox-protection-against-fingerprinting
Oh interesting! I don’t have that enabled but will be turning it on
Yeah, with that, the enhanced tracking protection, and always-on VPN, I have to solve captchas almost constantly lol… Worth it.
“Select the picture with a keyboard” (all pictures have weird AI shit that is absolutely not a keyboard) - captcha failed.
RFP, arkenfox, Mull, Tor Browser, LibreWolf
I just use the captcha buster extension in Firefox; captchas are just stupid and they make more problems for humans than for robots.
especially the newer ones that look like trying to see nipples on scrambled cable in the 90s.
My eyes are already shit that I can barely make out the normal images, how the fuck do you expect me to make out this god damn LSD fever dream shit?
Damn, that’s a thing? Nice!
Tell that to anyone running a website with a public-facing form - including register and login forms.
My experience is that when solving captchas where you select pics on a grid and other pics load to replace the selected ones within the same round, Firefox tends to play those fade-in/fade-out animations very slowly, while on Chrome they appear instantly.
Unfortunately I can’t extrapolate beyond my own anecdotal experience. Have you noticed the same behaviour?
Can absolutely confirm this
The fade depends on how much Google thinks you’re a bot or click farm operator. The more suspicious you are, the slower the process will be.
If you enable resistFingerprinting in Firefox or use Tor, you’ll find out exactly how slow and unclear the CAPTCHA images can be. If you use a commonly used browser (Firefox has 4% market share at most) from a residential internet connection, you’ll be a lot less suspicious and there’s a good chance you’ll pass CAPTCHA without ever even realising the CAPTCHA script was loaded.
I don’t think it’s browser specific or a direct targeting of Firefox; that’s likely confirmation bias. I see rolling captchas and the annoying ones that have a delayed fading in and out even on Chromium forks. I think the biggest reason for seeing them is VPN usage. When I turn off my VPN, I either don’t see any captchas or it just automatically shows the green check when I start them.
The thing that annoys me so much is why every damn website has to depend on gOoGle scripts to function. E.g. most websites depend on googleapis or ajax.googleapis. Why don’t you just stop hotlinking everything to 3rd-party shit? This basically spreads Google’s domination of the web. Remember, those 3rd-party libraries are not Free. They take visitors’ data and make you dependent on their services. So Google has become the gatekeeper of many websites.
I have a website and I coded everything by hand. No 3rd-party JavaScript and other 3rd-party BS. It makes my website run so damn fast.
What you’re referring to is in fact Google Analytics, which allows a lot of apps to collect intrusive insights on their customers.
If you want to create an app today, you will use JS; Lemmy uses it, everyone uses it. It’s not dominated by Google, it’s just the standard for building web apps today!
I am not against JavaScript. I sometimes use JavaScript and I don’t see anything wrong with that. What concerns me is why so many websites use 3rd-party JavaScript. This is disgusting because you sell your visitors out. Besides, you can’t control the content of 3rd-party scripts, and most of them sell your data and spy on you.
In the case of proprietary software yes, but using a CDN for delivering JavaScript is sometimes so useful for open-source.
I see what you’re saying with Sentry, Google Analytics, etc… And it’s laughably hard to escape the influence of big tech in programming today, you are right!
At least when you want to build an app as we know them now… I’m currently working with some other folks on making the web more decentralized, through a database that shares its data across peers.
Those peers are the users of the app. Let me tell you, the seeds are planted… just need to grow the tree!
Are you talking about IPFS?
No, I’m talking about https://gun.eco — and it’s really interoperable because you could back up the DB files inside IPFS, if you need another layer of decentralisation lol.
Join us! Maybe you’ll like it here!
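To give a rough idea of what that looks like in practice, here’s a minimal sketch based on gun’s public docs; the relay URL and key names are made up for illustration:

```js
// npm install gun — the peer URL and keys below are just examples.
const Gun = require("gun");
const gun = Gun({ peers: ["https://your-relay.example/gun"] });

// Write a record; it gets replicated to whatever peers are connected.
gun.get("profiles").get("alice").put({ name: "Alice", bio: "hello" });

// Subscribe to updates for that record from any peer, now or in the future.
gun.get("profiles").get("alice").on((data) => console.log(data));
```

Every user of the app is also a peer, which is the decentralisation part.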
Do you use a VPN by chance? I get really annoying CAPTCHAs with my VPN on.
Google doesn’t like things that make the user less identifiable, so they strike back however they can without it being too obvious.
I have to do a captcha on basically every cloudflare site with my VPN on.
No VPN. I hit those websites from work or from my work cellphone.
Google doesn’t like things that make the user less identifiable, so they strike back however they can without it being too obvious.
I reckon so too.
And also, I believe they coax people into adopting Chrome or Chromium-based browsers by making alternatives harder or more annoying to use, so that the browser landscape eventually becomes a monoculture they can control. Once Gecko-based browsers are finally extinct, they’ll go after the Chromium forks.
Google’s ReCaptcha in version 3 works in the background. Instead of displaying images of crosswalks and such, it uses a kind of risk score. This risk score is based on user behavior: If someone has behaved like a human in the past and thus gets a low risk score, the captcha is passed without you having to do anything or even seeing it.
I assume that Google uses data from its own services, web analytics applications, and usage data from Android devices and Chrome for this. Of course, this is not without its privacy issues, but it’s convenient.
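For what it’s worth, this is roughly what the website operator does on their side: the page sends a token to the backend, and the backend asks Google’s siteverify endpoint for the verdict. A minimal Node sketch; the secret handling, the handler shape and the 0.5 threshold are my own placeholder choices:

```js
// Node 18+ (global fetch). Names and the threshold are illustrative.
const SECRET_KEY = process.env.RECAPTCHA_SECRET; // server-side secret, never shipped to the page

async function verifyRecaptcha(token, remoteIp) {
  const params = new URLSearchParams({
    secret: SECRET_KEY,
    response: token,   // the token the reCAPTCHA script produced in the browser
    remoteip: remoteIp // optional
  });
  const res = await fetch("https://www.google.com/recaptcha/api/siteverify", {
    method: "POST",
    body: params,
  });
  const data = await res.json();
  // For v3, Google returns a score between 0.0 (almost certainly a bot) and
  // 1.0 (almost certainly human); the cut-off is whatever the site picks.
  return data.success && data.score >= 0.5;
}
```

So the website never sees why you scored low; it only gets the number Google computed from whatever it already knows about you.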
So, if I want to access a website, Google has to collect a record about me, and only if the fucking company approves my past behavior can I access the site. Of course, if I don’t have any past records I can’t (easily?) access the website. Simply awesome.
Yes, pretty much. But unlike Cloudflare DDoS protection and such, Google reCAPTCHA is mainly used to secure forms of all kinds (e.g. signup, login, contact or frontend posting forms).
Google’s ReCaptcha in version 3 works in the background. Instead of displaying images of crosswalks and such, it uses a kind of risk score.
This doesn’t factor in in my case: I only enable Google scripts to pass the reCAPTCHA. They are not enabled before or after, either in Firefox or Chromium. So in theory, regardless of the browser, Google should have no way of tracking my behavior in the background - or if they do, the amount of tracking should be identical.
Chrome probably still collects some usage data.
That’s weird. I use Waterfox and I occasionally get some kind of “puzzle”, but other times I just need to click the reCAPTCHA and it confirms itself (with the green check).
Ironically, when I use Vivaldi, the captcha doesn’t even load, and when it does load, it says my answer is wrong regardless of what I give it, so I’m always locked out - that’s quite literally the only reason I stopped using Vivaldi.
On Edge I need to fill in puzzles ALL THE TIME; that’s also why I stopped using Edge (apart from the bloatware and uBlock not working there).
Yeah, it’s true, but with https://github.com/dessant/buster I don’t give a fuck about their reCAPTCHA xDD
Looks like a great solution but does it still work? Seems to be unmaintained.
It worked for me today, so should work. I hope so…
I deal with that BS all the time, although I don’t have the issue when I don’t use a VPN
Jeez, I just faced the forever recaptcha a couple of days back. I was using Firefox on the web and the recaptcha was a solid 5+ rounds (select cars, buses, motorcycles, traffic signals…). I kinda half thought it was some sort of gag after seeing it go on for what seemed like forever. Thankfully I made it through, and it will not change my decision to stick with Firefox.