Make Privacy Sandbox API unavailable for users that have opted out #335
Unfortunately, we have seen many historical cases where web sites react to user choices like "turn off third-party cookies" by pressuring the user to change their settings, sometimes including blocking website access unless they do. So revealing the user's setting in the way you suggest turns out to be harmful to people's ability to choose and to browser privacy goals.
Yes, and a detectable opt in/out would also add to the browser's fingerprinting surface.
@michaelkleber sorry, very late to this thread, but I'd like to understand better why it isn't acceptable for the publisher to choose to block access to their content if a particular monetization path isn't available. If the publisher wants to make a choice to deny access to their content, and they lose all their viewers, they can course-correct and try a different monetization path (which may include a walled garden that has its own impact on consumers), but on the face of it that seems a valid choice for the publisher to ask the consumer to make w/r/t content they own.
Web sites are welcome to allow or deny access to their content, including charging for it. But the browser's job is to be the user's agent. So browsers often let the user make choices, and certainly should prefer those choices to be free from coercion.
Thanks for the fast response! I'm still trying to connect some dots here: when we say "user's agent" and "free from coercion", what do we mean?
I think when we say "harmful to people's ability to choose and to browser privacy goals" it sounds like we're implicitly saying "harmful to people's ability to choose and to browser privacy goals while keeping content available under certain terms". If a publisher believes the monetization through FLEDGE isn't sufficient, is their only option to require a paywall on any page view from a Chrome browser? Providing user configuration options with privacy-centric defaults, even aggressively privacy-centric defaults, would balance the interests and allow for an "evolving conversation" between users and publishers to determine the direction(s) people want to go. The dot I can't yet connect is why allowing that would be coercive.
This is not a specific choice about one API; it's the general way browsers prefer to implement choices for the user. When a user takes a browser action like "use incognito mode", the browser could of course expose some signal of "This User Is In Incognito Mode", broadcasting the user's choice to anyone who cares. Browsers have instead chosen to interpret the user's choice as a preference expressed to the browser but not revealed to every website they visit, as best as possible: a user in incognito mode should look like a regular user who has just started using this browser and does not have any prior history or stored data. The browser is acting on the user's behalf here, and that generally should involve revealing user settings to websites only when necessary.

Specifically with regard to the Protected Audience API (now renamed from FLEDGE), many visitors to a website might well have zero Interest Groups stored on their browser. This could be because they have not visited many commercially relevant websites recently, or because they have cleared some data, or because they are in incognito mode, or because they have chosen to turn off the API using a browser-provided control. There is no reason the browser needs to reveal any of these situations to websites the user visits. Indeed, one goal of these new ads-related APIs is explicitly to not reveal ad-related information to the sites you visit, so the argument against doing so here is even stronger than it is in the general case.
Had to meditate on this one quite a bit. The Incognito Mode point made me think a lot, as I certainly agree that having a signal for Incognito Mode would seem odd; I don't think I'd like a world where the publisher can say "add yourself to at least 4 Interest Groups to see this content!!!" It has caused my brain to segfault trying to put words to my reactions. First, one point of clarity, when you say

(Also, as to why I'm pushing for "conversation" and what I mean: I think we're generally framing this as improving data protections, which is true but I think incomplete; it would be more precise, nerdy, grandiose, and fun to say that what we're actually doing is (a) improving society's Privacy-ContentQuality-Monetization-Competitiveness Pareto curve, and (b) choosing a point on it for people. I think (a), the work to push the curve out and enable more of each, is great and why I'm excited to work on this; I think if we choose (b), the point on the curve, we're making a mistake. Giving consumers and publishers tools to let the market choose (b) is more "pro-ecosystem", and will let the balance of power in the industry move in a more organic, less directed way, which I think has value for all participants.)

Specifically on the Incognito Mode case, I think there's daylight between the two. In the case of Incognito Mode, the user makes a choice by clicking a button, giving her user agent an instruction that has certain behaviors associated with it, different from the default behaviors. With Privacy Sandbox (PAAPI et al.) we're changing the default behaviors.

Finally, on the original ask, which is about preserving resources when they aren't useful due to opt-out: could you also maintain the API, but have the code inside Chromium just know to do nothing, so it "silently saves resources"?
What you are asking for sounds like moving the balance of power away from the person using the browser, which is pretty contrary to the goal of the browser as a user agent.
Yes indeed — that is what we do when it doesn't cause an undesirable leak of private information.
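The "silently do nothing" behavior discussed above can be sketched as a hypothetical shim: the API surface stays present (so feature detection still sees it), but calls resolve without doing any work. The method shapes mirror the Protected Audience API; the shim object itself is illustrative, not Chromium's actual implementation.

```javascript
// Hypothetical sketch: keep the API surface present while the user has
// opted out, but make every call a cheap no-op.
const noOpProtectedAudience = {
  // Mirrors navigator.runAdAuction's shape: resolves with no winning ad,
  // exactly as an auction with zero interest groups would.
  async runAdAuction(auctionConfig) {
    return null;
  },
  // Mirrors navigator.joinAdInterestGroup: accept the request, store nothing.
  async joinAdInterestGroup(group, durationSeconds) {
    return undefined;
  },
};
```

From the page's point of view, this is indistinguishable from a user who simply has zero interest groups, which is the property the thread argues for.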
Chrome advises that ad tech checks whether the relevant API is available before using it, using the approaches documented here.
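The recommended availability check amounts to ordinary feature detection; a minimal sketch (the function name is mine) might look like this:

```javascript
// Feature-detect the Protected Audience (FLEDGE) entry points before
// spending any resources preparing an auction. Guarding on `navigator`
// keeps the check safe outside a browser as well.
function protectedAudienceAvailable() {
  return (
    typeof navigator !== 'undefined' &&
    'runAdAuction' in navigator &&
    'joinAdInterestGroup' in navigator
  );
}

if (protectedAudienceAvailable()) {
  // Only now fetch bidding signals and call navigator.runAdAuction(...).
}
```

The problem described below is that this check still passes for an opted-out user, because the entry points remain exposed.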
However, if a user opts out of the Privacy Sandbox trials (e.g., by visiting chrome://settings/privacySandbox), the sandbox APIs still appear exposed to JavaScript.
This presents a challenge, particularly for FLEDGE, as significant resources and latency can be incurred when attempting to call FLEDGE's runAdAuction. For example, if sellers believe FLEDGE is eligible to run, they may:
In contrast, if access to an API is disabled via a permissions policy, callers can check whether the feature is blocked before calling the relevant API.
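The permissions-policy case can be probed from script. A sketch, assuming Chrome's `document.featurePolicy` introspection interface and the feature names Chrome uses for the Protected Audience API:

```javascript
// Check whether a permissions policy has blocked the FLEDGE features
// before calling them. `document.featurePolicy.allowsFeature()` is
// Chrome's JS interface for policy introspection.
function fledgeAllowedByPolicy() {
  if (typeof document === 'undefined' || !document.featurePolicy) {
    return false; // no introspection available; treat as blocked
  }
  return (
    document.featurePolicy.allowsFeature('run-ad-auction') &&
    document.featurePolicy.allowsFeature('join-ad-interest-group')
  );
}
```

A policy check like this catches the permissions-policy case, but not the user-settings opt-out that this issue is about.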
Ask: