Align our Duty of Loyalty with Richards/Hartzog's #241
Comments
This looks like a misreading of the text we have. If that's indeed the case, I'll work on a tweak to make it clearer. The text doesn't criticise "any processing that benefits someone other than the user" (that would be very broad; there would be almost no processing left), rather it talks about "processing that is not in the person's interest but instead benefits another actor." If the processing is in the person's interest AND in someone else's too, then that's fine.

I don't see how that intersects with group privacy. There can be articulations of privacy with other principles, as we regularly discuss, that would justify sharing data in ways that are detrimental to a person but good for the group (e.g. revealing a politician's misdeeds). But I think that's separate (and also not about group privacy).

This isn't to say that I disagree with your point that paying browser vendors to shape the search market is a real problem. It is, you're right in concluding that it's a violation of the duty of loyalty, and you're right that it has a significantly negative impact on people's privacy. The W3C should indeed be working on fixing that, but I don't think that this document is the place to solve it.

You seem to be implying that Mozilla would get a pass for this violation of its duty of loyalty because providing Firefox is in the public's interest. Again, I think that someone who wanted to justify this practice would have to rely on other principles. I don't want to put this group in a position of having to decide what's in the public interest and gets a pass on this violation, and what isn't. We'd also risk landing in the strange place where Firefox gets a pass because it's public-interest and needs the money, but Safari and Chrome don't. I would also find it hard to justify that being disloyal is somehow excusable for a public-interest benefit but other violations of privacy aren't.
A lot of entities today are funded by advertising methods that violate people's privacy, and many are arguably doing public-interest work (at the very least group-valuable). They might then be justified in asking for exemptions. I think it's easier to keep it conceptually simple by sticking to what is widely understood as agent disloyalty: if your insurance agent takes payments to systematically recommend one specific insurance company despite doubting that it's in their clients' interest, that's still self-dealing even if they use the money to do good things!

Again, I really don't want to dismiss the concern that browsers are generally not acting as user agents when it comes to search. It's one of the Web's biggest issues and it has a lot of negative impact. I just don't think that we should either 1) solve that here or 2) carve out an exemption for it.
The core of the issue here is that we're misrepresenting what our citation ([Taking-Trust-Seriously]) says. We're allowed to describe principles that go beyond the academics, of course, but we need to say if that's what we're doing, instead of our current text of "Its user agent duties include ([Taking-Trust-Seriously]): [not what [Taking-Trust-Seriously] says]".

The issues around group privacy (where an individual's interest can conflict with a group's privacy interest) and Mozilla's business model are, I think, an indication of the danger of going beyond the academics. They had some reason to write the limited principle they did, and if we don't consider that reason, our principles are likely to be bad in some way. We wouldn't add carve-outs to our principle to deal with them; instead we'd use the academics' principle as they wrote it, which doesn't need carve-outs.
Do you have an example of a case that would run afoul of the principle as we formulated it but wouldn't under the cited formulation? I think that I understand your point when I consider it in the abstract, but the examples you give don't connect for me. (If the concern is direct mapping to the literature, we can expand or modify the references list.)

Keep in mind that [Taking-Trust-Seriously] is written for data fiduciaries in general, whereas this section is about user agents. It's pretty well established on the web that browsers have stronger obligations than arbitrary sites.

For group privacy, I'm not sure that I can think of an example that wouldn't involve balancing with another principle in a generic way. For default search, it's hard to imagine this not running afoul of even the weakest possible definition of fiduciary: this is literally a trusted party acting, for pay, in a way that is detrimental to its users (as browser developers aren't shy about stating).

Again, this isn't about pointing fingers. I spent five years working for an ad-supported media company; I totally get why people can choose to do things they feel aren't right to keep the lights on until they figure out a better way. But it's hard to see how we could tell people with a straight face that sites sharing data with third parties, which they might have reviewed with great care, is bad, but browsers selling traffic to the highest bidder is somehow fine. I mean, if someone were looking for evidence that privacy is being used as an excuse for ulterior motives, they'd have a field day with this kind of arbitrary exemption :-/
@darobin to rephrase:
Ask Richards/Hartzog for input.
* 'Make it clearer that this is detrimental to people, fixes #241'
* Update index.html (Co-authored-by: Jeffrey Yasskin <jyasskin@google.com>)
* 'more disloyalty less self-dealing'
* Update index.html
* Update index.html (Co-authored-by: Wendy Seltzer <wseltzer@w3.org>)
* Update index.html (Co-authored-by: Jeffrey Yasskin <jyasskin@google.com>)

Co-authored-by: Robin Berjon <robin.berjon@nytimes.com>
Co-authored-by: Jeffrey Yasskin <jyasskin@google.com>
Co-authored-by: Wendy Seltzer <wseltzer@w3.org>
@michaelkleber noticed that our duty of loyalty conflicts with notions of group privacy because it criticizes any processing that benefits someone other than the user. This led me to look for the definition of loyalty in Taking Trust Seriously in Privacy Law, and it turns out that we've exaggerated what our citation said:
Aligning with the academic duty of loyalty would allow practices like the one that funds Mozilla: selling the default search engine placement.