It depends how websites choose to implement it, and how other browsers choose to implement it.
If Firefox et.al chooses not to implement browser environment integrity, then any website that chooses to require strict integrity would completely cease to work on Firefox as it would not be able to respond to a trust check. It is simply dead. However, if they do implement it, which I imagine they would if this API actually becomes widespread, they should continue to work fine even if they’re stuck with the limitations on environment modification inherent to the DRM (aka rip adblockers)
Websites will vary though. Some may not implement it at all, others may implement a non-strict integrity check that may happily serve browsers that do not pass the check. Third parties can also run their own attestation servers that will report varying levels of environment data. Most likely you will see all Google sites and a majority of “big” websites that depend on ad revenue implement strict integrity through Google attestation servers so that their precious ads don’t get blocked, and the internet will become an absolutely horrid place.
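To make the strict vs. non-strict distinction concrete, here’s roughly what a site-side check could look like. This is only a sketch: the header name, the verifyWithAttester helper, and the STRICT_MODE flag are invented for illustration and are not part of the actual WEI proposal.

```typescript
// Hypothetical Express-style middleware illustrating "strict" vs. lenient handling
// of an environment-integrity attestation. All names here are made up.
import express from "express";

const app = express();
const STRICT_MODE = true; // a "strict" site refuses to serve unattested clients

// Placeholder: a real check would cryptographically verify the token
// against the attestation service's keys.
async function verifyWithAttester(token: string | undefined): Promise<boolean> {
  if (!token) return false;
  return token.length > 0; // stand-in for actual verification
}

app.use(async (req, res, next) => {
  const token = req.header("x-environment-integrity"); // hypothetical header
  const trusted = await verifyWithAttester(token);

  if (!trusted && STRICT_MODE) {
    // Browsers that can't (or won't) attest get nothing at all.
    res.status(403).send("Environment integrity check failed");
    return;
  }
  // A non-strict site might instead just log the result, serve more CAPTCHAs,
  // or show different ads to unattested clients, but still serve the page.
  next();
});

app.get("/", (_req, res) => res.send("content"));
app.listen(3000);
```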
Frankly I’ll just stop using anything and everything that chooses to implement this, since we all know Google is going to go full steam ahead with implementation regardless of how many users complain. Protecting their ad revenue is priority 1 through 12,000 and fuck everybody else.
It’s weird. The internet really seems to be pushing me not to use it these days.
Welcome to late stage capitalism. The free Fed money train is over, time to squeeze the plebeians to death.
This
I feel like I fully lack the words to describe what I mean here, although I’m confident in my understanding of the idea. (Which is to say, please give me charity when untangling my rambling.)
I share your sentiment and I’ve been thinking about this the past few days.
I’ve read in a few places that Musk is trying to turn twitter into a ‘one-app’ in the same way that WeChat is. The common pushback against that is that we already have that - it’s the web browser. The web browser isn’t going anywhere.
But turning the browser into a closed ecosystem that Google gets to set the standard for, harvest the data from, advertise through, and ensure that users are locked into their version of the experience and the data they collect essentially makes Chrome the one-app. In much the same way that Google killed XMPP and Microsoft used its weight to hamstring open document formats, this seems like an effort to thread a rope around the neck of the open internet and use Google’s considerable market share to close it off.
Somewhat ironically, we may find ourselves in search of a ‘new, open internet’ if corporations continue to define our current internet.
Maybe we’ll call it “Web 1.0.”
Let’s call it the fediverse. :)
Not the Internet, that is neutral. It is only one large corp that is trying to control the Internet. If everyone boycotts them, then they will fail.
Nah not just one company. Reddit, Twitter, basically every social media, streaming services, every site adding stupid ads and auto playing videos, etc. It all adds up
Fully agree, I was just trying to keep it relevant to Google. All that shit needs to be dropped. As people realize that rather than free, all that shit is really expensive, maybe they’ll make a move.
The vast majority of internet traffic these days goes through a few different portals. Pretty much the few biggest sites. Google/YouTube, Facebook/Instagram/Whatsapp, Tiktok, Reddit, Twitter(“x”)
Most people connect to these through some type of application on a mobile device. Most of these users couldn’t tell you what DRM is or what web standards are. They don’t care, they just wanna look at funny videos and get updates through clickbait headlines.
These people aren’t going to boycott anything. The same thing that Reddit is in the process of doing - killing off the old users, counting on its power over the average apathetic user - is essentially what Google is going to try to do.
It’s a scary time. The internet we all grew up with is irreversibly changing.
The asleep will continue to feed those big corps with their blood. The rest of us will move to other solutions. That is life.
Where did you get this idea that the internet is neutral?
https://en.wikipedia.org/wiki/Net_neutrality_in_the_United_States
Neutral like electricity. It is a force that can be used for good or bad. Google is trying to harness that energy for its own profit and control. I wasn’t referring to the structures created to administer it. That is another can of worms.
This. Like, for real. I might be in a minority here but I’m not going to just accept this crap and deal with it. If you implement these changes and your site is not absolutely essential for me, then I’m going elsewhere. If 90% of big websites become unusable with my browser, then I’m going to hang out in the remaining 10% with my like-minded folks. I don’t care that it’s quiet and much more slow paced, it’s still better than the shit everyone else is serving and frankly better for my mental health as well.
I spent like 2 to 3 hours on reddit every single day for 10 years. Then they killed my favourite app and I just quit then and there and haven’t looked back. I have no problem doing that again.
I look forward to this implementation, as it will make it easier for me to see which sites are truly not worth visiting, and which sites are.
Hopefully Google does not implement it, seeing the huge backlash.
I have a weak grasp of this, but a developer working on this responded to some criticism.
If the developers working to implement this are to be believed, they are intentionally setting it up so that websites would have an incentive to still allow untrusted (for lack of a better term) clients to access their sites. They do this by intentionally ignoring a trust check request 5% - 10% of the time and behaving as if the client is untrusted, even when it is actually trusted. This means that if a website decides to only allow trusted clients, it will also be refusing trusted clients 5% - 10% of the time (a rough sketch of that arithmetic follows the quoted response below).
The relevant part of the response is quoted here:
WEI prevents ecosystem lock-in through hold-backs
We had proposed a hold-back to prevent lock-in at the platform level. Essentially, some percentage of the time, say 5% or 10%, the WEI attestation would intentionally be omitted, and would look the same as if the user opted-out of WEI or the device is not supported.
This is designed to prevent WEI from becoming “DRM for the web”. Any sites that attempted to restrict browser access based on WEI signals alone would have also restricted access to a significant enough proportion of attestable devices to disincentivize this behavior.
Additionally, and this could be clarified in the explainer more, WEI is an opportunity for developers to use hardware-backed attestation as alternatives to captchas and other privacy-invasive integrity checks.
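To put rough numbers on why that hold-back is meant to be a disincentive, here’s a toy sketch. Only the 5-10% figure comes from the quote above; the attest function and the token it returns are invented for illustration.

```typescript
// Toy simulation of the proposed hold-back: even fully trusted browsers are
// randomly denied an attestation some fraction of the time.
const HOLD_BACK_RATE = 0.1; // the explainer suggests 5% - 10%

function attest(clientIsTrusted: boolean): string | null {
  if (!clientIsTrusted) return null;               // untrusted clients never get a token
  if (Math.random() < HOLD_BACK_RATE) return null; // hold-back: pretend we can't attest
  return "attestation-token";                      // placeholder token
}

// A site that hard-requires attestation, visited only by real, trusted users:
const visits = 100_000;
let served = 0;
for (let i = 0; i < visits; i++) {
  if (attest(true) !== null) served++;
}
console.log(`Turned away ${(100 * (1 - served / visits)).toFixed(1)}% of legitimate users`);
// Prints roughly 10%, which is the loss the hold-back is meant to threaten sites with.
```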
And what happens when they decide to revoke that 5-10% after they got everyone onboard?
I mean, the same thing that is happening right now, right? The point would be that websites would not be built to only allow trusted clients-- they would still have to allow all clients. If they wanted to remove this 10% thing, it’s not like the entire web would instantly stop being built to allow untrusted clients.
the 10% sounds like bait. Once they’ve got everyone on board and things are running smoothly (for them), it will be muuuch harder to resist.
I’m not sure this is true (keep in mind: weak grasp). This 10% would keep websites from specifically blocking untrusted clients-- but if they got rid of the 5%, it would not magically change all the websites to block untrusted clients. They’d still need to update to do this.
I don’t want to come off like I’m defending this though-- I really just don’t know enough to say.
The vast majority of them would not change the default, and a simple mandatory update would change that to 0% without them having to do anything.
Do you think Google will implement this in the end?
That’s such a weird clause to include and is likely just a honeypot. Why even bother allowing unverified browsers to connect, since it invalidates the entire purpose of the trust system? If any bad actor can simply choose not to use the trust system while still having full access, then the system is less than useless for its stated purpose (catch bots/bad-faith traffic, ensure no malware) and serves only to degrade the speed and experience of legitimate users.
That opt-out clause won’t last a year once it’s mandatory in Chromium.
An attestation method that randomly fails in 5-10% of cases makes no sense. It’s not attestation anymore, it’s a game of dice. This is blatant rhetoric in response to the DRM criticism. Nobody sane would ever use such a method.
I confess I don’t really understand how it is supposed to work if it’s designed to randomly not work haha. I really hope I’ve made it clear that I lack knowledge in this.
The developers working on this should not be believed and anyone who sees their resume for the rest of time should put it directly in the trash.
If this is the case then what’s actually the point of it?
Yeah but that can be removed at any time. It’s a bit optimistic to think those safeguards would remain when they stand in the way of profit…
The purpose is to make it so websites don’t require a trusted client. If they took that away after the fact, the websites wouldn’t magically switch to requiring trusted clients, would they? They would still need to be updated for this. So we’d be pretty much where we are now, with a software change and public outcry about it.
That sounds nice but there’s no guarantee they’ll implement it, or if they do, that they won’t just remove it someday down the road. This could just be a way for them to avoid criticism for now, and when criticism has died down a bit, they can just remove it.
This is a very plausible concern, true.
Tho hopefully Google is forced to stop this, seeing how much backlash there is.
Maybe this thing will evolve into two webs. One where the majority using Chrome will be, mostly busy watching ads and reading the shitty sites Google has picked for them.
But another where browsers that don’t support this can be. Stuff like Lemmy and forums and other things run by individuals with an interest and passion.
We would still need to use Chrome for the official stuff like our banks’ or office websites, but there would be another world out there for people who refuse to accept being subjected to this shit. Alternative websites would shoot up and become popular.
Thanks for such a detailed explanation. I really hope firefox doesn’t follow through with this.
Mozilla has already said they oppose the idea https://github.com/mozilla/standards-positions/issues/852
However, if they do implement it, which I imagine they would if this API actually becomes widespread,
The problem is, it’s not really possible to implement it in a truly open-source browser, since anyone compiling it themselves (including distro maintainers) would fail the check unless their build perfectly matches one approved by the attestor. A build that differs from the approved version is specifically what WEI is intended to reject.
Hasn’t there been this thing that told you you are opening a sketchy website for years?
Why do we need more policing, who asked for that?
If you fall for some scam website then it’s on you. Don’t use the internet. The end.
This isn’t about browsers blocking users from scam websites. This is about websites blocking users with browsers that remove ads.
Wanna talk about web safety? Yeah, et.al now comes up as a website link.
Thanks Google! Thanks for letting pretty much any .bullshit top level domain…
calling other countries’ TLDs bullshit is quite a take
The original top level domains were only .com, .net, .org, and .gov. Your fancy country top level domains were never part of the original internet plan.
Is that origin.al or not?
Whoops, my bad, I must have made a typo somewhere…
One accidental dot, which happens to be near the letter N on the keyboard, can be the difference between a word and a link.
Do you really wanna see the effects of someone registering origin.al …?
You’re missing the point. Lemme test yet another thing (do not click if this pops up as a link)…
google.bullshit
^ See, I don’t know what dot nonsense they do and don’t accept anymore, but I’m gonna make an educated guess before I post that for at least some users that’ll display as a link.
I think it was a typo, the phrase is usually written “et al.” which cannot be confused with a domain.
That et is exclusively for Albanian use.
Perhaps it was.
goggle.com was once a typo as well.
Also, what’s the difference between a typo and an unnoticed autocorrect glitch?
If one single dot is the difference between legit words vs a janky link, the internet is doomed.
attachment.zip
You do realize another way to write et al is…
et. al.
Miss one space, bam, your typo turns into a link these days.
That’s on you, not the internet or google. As has been pointed out, dot al is a TLD for a country. Just because you can’t type properly and didn’t spell check yourself, doesn’t mean the internet is doomed.
You make typo, send it to friend, friend clicks link…
Is that origin.al or not?
“et” doesn’t need to be abbreviated, it’s a full word. “al.” is short for “alia”.
You could argue that typos shouldn’t get turned into links, but there’s simply no good way of stopping that from happening.
Yes, yes there most certainly is a way to completely prevent that from ever happening.
Get rid of this whole automatic link detecting shit altogether and require the use of https:// before every single link.
Believe it or not, that’s how the internet used to work, and we didn’t have stupid shit like attachment.zip
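For what it’s worth, the difference is small in code. Here’s a sketch of scheme-required linking versus the TLD-based matching that turns et.al and attachment.zip into links (illustrative regexes, not what any particular client actually uses):

```typescript
// Two ways a comment renderer could decide what becomes a link.
// Scheme-required: only strings explicitly written as URLs are linkified.
const schemeOnly = /\bhttps?:\/\/\S+/g;
// TLD-based: any "word.word" whose suffix is a known TLD is linkified,
// which is how "et.al" or "attachment.zip" end up clickable.
const bareDomains = /\b[\w-]+\.(?:al|zip|com|net|org)\b/g;

const text = "See et.al and attachment.zip, or https://example.com";
console.log(text.match(schemeOnly));  // [ "https://example.com" ]
console.log(text.match(bareDomains)); // [ "et.al", "attachment.zip", "example.com" ]
```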
That’s the TLD of Albania though
And? et.al is used in practically all USA legal documents.
So what, all our legal documents are supposed to link to Albania now?
Cuz that’s how this shit tries to work now.
As others have pointed out, it’s actually: et al.
You’re mad about nothing.
Congrats, you must not make any typos. I guess nobody else makes any tpyos either according to your statistics.
One wrong dot, one wrong space, suddenly legit text becomes an unexpected, unintended link.
Of course I make typos :) But .al is the top-level domain of a country. This is the original purpose of the system. If you type something that looks like a valid domain, and this is a valid domain, why not make it a link? Maybe I mistook your point all along. Why don’t you think this should be a link? I would agree that we have too many useless TLDs, and Google did help in spreading more domains, but I just don’t think this is a case where it applies.
You can tell me what’s incorrect all day long. Doesn’t matter. Many people can’t spell to save their life, plus autocorrect likes to screw with people as well.
If one accidental character is the difference between a legal term and a link, the world is soon to be fucked.
Just wait until someone registers et.al…
The grandparent commenter should’ve written “et al.” instead. The “alii” is the abbreviated part, not the “et”.
Agreed about the bullshit TLDs, by the way.
How is that Google’s fault?
Cuz Google somehow or another managed to get .zip passed as a top level domain.
attachment.zip
^ That’s not even written out as a markdown link, that’s literally just the letters I typed.
See, clearly Google is getting a bit careless about their top level domains. Lemme try a thing…
^ If this shows up as a link, please don’t click. Or at least proceed with caution.
Probably just depends on how websites or apps handle it? I’m using Liftoff and that’s not a link for me
Strangely, I’ve discovered that for whatever reason, Liftoff isn’t working right (as per current standards).
Hmm interesting, it probably just needs to be updated
Does show up as a link for me. Using Jerboa for Android.
Mozilla has already posted their protest to this…
Sure, they’re against it, but if it gets implemented by Chrome and by many major websites, they won’t have a choice but to implement it as well. Otherwise, their browser just won’t work and people will have to use Chromium browsers or nothing at all.
Honestly, they could have good grounds for an antitrust lawsuit if this API comes to pass and everyone uses Google attestation servers. It’s walled-gardening the browser space just like Microsoft did.
Honestly, they could have good grounds for an antitrust lawsuit
And what was the last successful antitrust suit? It wasn’t Microsoft. They just dragged out the trial until they had a favorable administration settle with them.
That would be a great antitrust suit if the US actually enforced antitrust laws, but they don’t. If you’re not already a dominant semi-monopoly, you can buy and do whatever, honestly.
Then don’t use Google. I’m slowly but surely working towards degoogling myself. Not there quite yet, but I’m working on it.
^ Free anonymous email, for the B/S that asks for an email when they got no business with one.
The whole point is that non-Chromium browsers might lose functionality on a significant portion of major websites. Imagine if Amazon, Netflix, and Youtube suddenly stopped working in Firefox. How many Firefox users would tolerate that?
You are not limited to using one browser at a time. Use firefox as much as you please. You can use google if you must.
Sure, because the average user won’t think his Firefox is broken and just switch to Chrome altogether. Chrome has no issue with that site, after all. Once enough pages have it, even most technically inclined people probably won’t want to constantly juggle between browsers just to use their banking site or whatever.
I don’t use Amazon or Netflix in the first place. Plus the FTC is going after Amazon anyways…
https://chat.maiion.com/post/179544
Act like I care…
Ok, but you can see how perhaps other people might care, right? Like you’re not a complete psychopath, right?
Not my problem you people fell into the corporate trap. I saw it coming as far back as 2011.
I like your tude dude!
Everyone is whining while still holding on to big corps balls.
I don’t care about x thing, it doesn’t affect me
Then next year
Why are they killing “y thing that affects me”
Attitudes like that are a big factor in our current culture war.
I’m using actual solid services established in the late 90’s that still work today. I have terabytes of data stored, and it’s not through Google or iCloud.
Don’t care, I already predicted half of this crap over a decade ago. Nobody wants to listen to me though.
Oh well, not my problem.
Your recommendation isn’t wrong, but it’s a mistake to think problems like this can be solved with a mere boycott. This absolutely requires consumer protection legislation.
Funny, note that that website uses DRM content. I have DRM disabled on Firefox and when I visit that site I get two DRM warnings.
I noticed that too after I posted that comment. Must be a recent change. ☹️
Mozilla has been bullied exactly this way in the past into implementing DRM measures I believe.
I already use ff and if there’s a site that requires drm to work, i don’t care for that site. They need visitors not the other way around.
I don’t understand why others like Brave, Opera, Vivaldi, etc. are silent on this big of a threat.
Vivaldi has a post on their blog regarding this.
Glad to see them joining the protest. I hope this doesn’t get implemented.
Most of those are Chrome with a different front end…
https://lemmy.today/post/158549
It hasn’t even been a week. They’re responding
You just listed a bunch of Chromium browsers. They don’t care.
Yes. If it becomes a success on Chrome, other interested parties will pressure Firefox to adopt the standard as well.
I doubt Firefox will give in. Much more likely is that websites start blocking it until you cannot use the internet without Chromium
Firefox will be in a tight corner assuming every other browser vendor picks this up. They can decide to go against it but Firefox does not live in isolation.
https://hacks.mozilla.org/2014/05/reconciling-mozillas-mission-and-w3c-eme/
What everybody seems to be forgetting is that there is a ton of web-content fetching being done right now which is not done by browsers.
For example, all the webcrawlers doing indexing for search engines.
Consider the possibility that any major website that does this either becomes inaccessible to any webcrawler which does not implement it (say, those indexing sites for search engines other than Google’s), or carves out exceptions for webcrawlers, which are one big backdoor for browsers to also come in through (in fact, a number of paywall-bypassing solutions rely on sending the right HTTP headers to use exactly those existing exceptions for webcrawlers).
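To illustrate why a User-Agent-based crawler exception is such a backdoor, here’s a minimal sketch. The URL is a placeholder and the exception itself is assumed; nothing here is specific to any real site.

```typescript
// Illustrative only: many crawler exceptions key off nothing but request headers,
// so anything that claims to be a search-engine bot gets the crawler treatment.
async function fetchAsCrawler(url: string): Promise<string> {
  const response = await fetch(url, {
    headers: {
      // Claiming to be Googlebot; a User-Agent-based exception cannot tell
      // this apart from the real crawler without extra verification.
      "User-Agent":
        "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    },
  });
  return response.text();
}

// Placeholder URL; the point is only that the "exception" is one header away.
fetchAsCrawler("https://news.example.com/some-article").then(console.log);
```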
Even webcrawlers implementing this would be relying on “integrity validation” servers from a 3rd party (I bet that’s going to be Google), so think about how Google could interfere here with 3rd-party webcrawlers by merely throttling down integrity-validation responses for those.
Oh, and open source webcrawler implementations can forget all about being validated.
(By the way, this would also impact any AI-data gathering webcrawlers that don’t use APIs to get the data but rather go in via the web interface)
This is quite possibly a far bigger play by Google than just for browser and Ad dominance.
Only if they proceed AND websites enforce it. The last reply I read from the Googler who was part of the draft spec said they were building in a guardrail that prevents sites from outright blocking non-compliant clients without also blocking a not-insignificant portion of their desired userbase.
To me, it sounded like they’d just randomly not send the DRM information sometimes. So, the fix for web sites would be to tell the user to reload until the information is passed along.
That’s pretty terrible UX, though. I think it’s more likely that websites will continue integrating a CAPTCHA service and that service will simply try to short-circuit its decision by asking for attestation. If none is given the user gets to click on pictures of street lights.
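Something like this, presumably. A sketch of the decision flow only; the names and the token check are assumptions, not any real CAPTCHA provider’s API:

```typescript
// Hypothetical CAPTCHA-service logic: use attestation as a fast path and fall
// back to an interactive challenge when no (valid) attestation is presented.
type Verdict = "allow" | "challenge";

function decide(
  attestationToken: string | null,
  tokenIsValid: (token: string) => boolean
): Verdict {
  if (attestationToken !== null && tokenIsValid(attestationToken)) {
    return "allow"; // attested environment: skip the street-light pictures
  }
  // No attestation (unsupported browser, opted out, or hold-back):
  // the user gets the usual interactive CAPTCHA instead of an outright block.
  return "challenge";
}
```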
You seem to be more worried about UX than those sites. At least in the EU, the user has to click through a multi-step wizard about cookie stuff to get to any content on every site these days. This wizard is not mandated by law, but these sites choose to use it anyways, just to squeeze a bit more money out of their visitors.
Why wouldn’t they have no guardrails at all so they can just block non-compliant browsers? Isn’t that their goal?
The devs responsible for this say their goal is to detect bots, but to make sure it doesn’t harm people not using this tech. I’m actually inclined to believe them. The problem is that those guardrails could turn out to be ineffective, or Google could decide to just disable them at some point.
Yes, because it will implicitly discourage the use of any other type of browser
Sites that implement it will lose traffic, which will lower ad revenue, which will mean less money.
If browsers that do not implement it gain a high enough market share, sites might stand to lose so much money that they don’t block those browsers or don’t implement the DRM at all.
It’s a battle of browsers now, and I am happy to stand on the side of Firefox.
Seems like it’s much more than a discouragement
You know those movies where aliens attack earth and we always win? I think these outcomes are mostly true because I’ve said it before and I’ll say it again, there’s nothing humans can’t ruin. Whether it’s meeting your family at the arrival gate or alien societies we’ll destroy it. The internet is just the next thing.
It will affect some sites, not others. You’ll no longer be able to bypass paywalls to read news, for example, because those sites will most likely adopt the DRM. Some streaming services may do the same, maybe even some social networks. But places like Lemmy will still be generally unaffected.
Why would you not be able to bypass news paywalls? As long as one user pays for the service they can then crawl the site and host the content on a separate site.
I really hope they cancel this
All Lemmy links to other sites wouldn’t work then… lol
How likely is Google to implement this?
I’m sure they will use it on their own products, so maybe it’s time to replace google products?
I will probably switch to Protonmail when Gmail is no longer accessible from Firefox.
Maybe we go back to Peer to Peer?
Centrally hosted websites are perfectly fine, they just need to not opt in to this tech.