Needs provisions for encrypting content for privacy #225
So there are two ways to encrypt things. You can encrypt things between servers, which is done via SSL/TLS. You could also give each user a public/private keypair on the server, but at that point you're nearly in as good/bad a situation as HTTPS, because a server administrator can still watch things using your private key. (Though maybe you've gotten rid of some certificate authority attacks.) What I suspect you really want is encryption from user to user, where a server administrator can't observe or interfere with the contents.

It's a desirable feature, but it has consequences: side effects performed by the server upon federated activities are pretty much impossible in the client-server model ActivityPub currently operates under. How can a server update the followers list when it doesn't know there is another follower? Assuming you're using HTTPS from end to end, it's probably not a very compatible mechanism for ActivityPub's federated side-effect model, since we're trying to capture federation as it works between servers.

However, some activities might actually work fine over end-to-end encryption. If you encrypt the object, it can still be retrieved in client-to-server operation, and the user's client could still display the activity, but no server-side side effects could be performed. This is probably fine for "just showing messages"; it would be a lot like the way encrypted email works. So, we could maybe have a wrapper object, which you could imagine looking like:
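A rough sketch of what such a wrapper could look like. The `encryptedPayload` property is the one discussed below; the `EncryptedMessage` type, the `mediaType` value, and the exact envelope shape are only illustrative here, not a defined vocabulary:

```json
{
  "@context": "https://www.w3.org/ns/activitystreams",
  "type": "Create",
  "actor": "https://social.example/users/alice",
  "to": ["https://other.example/users/bob"],
  "object": {
    "type": "EncryptedMessage",
    "mediaType": "application/pgp-encrypted",
    "encryptedPayload": "base64-encoded ciphertext of the real object"
  }
}
```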
You can still pass that around between servers. The encryptedPayload can be decrypted and transformed into its own appropriate object. However, if we're going to explore this, it's too late in the incubation game for this to get into ActivityPub itself. It would have to go in as an extension. However, I think such an extension is possible, and it may be worth exploring in the Social Web Incubator Community Group.
I've captured this as a possible extension to explore in the community group.
Note that there's one more way to do this, which "already is possible" in ActivityPub with no extensions, sneakily! ActivityPub does not mandate
@brianolson What do you think of the above? Does exploring this as an extension, in the way I described above, make sense to you?
End-to-end encryption would have similar requirements and problems to true peer-to-peer, right? (Currently, ActivityPub is federated like email and XMPP, not peer-to-peer like BitTorrent.) @cwebber ActivityPub over HTTP/Onion is transparent, it's just that you never get an IP address for the provider, and any federators that want to talk to it have to have the Tor software installed and configured.
@astronouth7303 Similar requirements and problems, yes... the suggestion I gave above for the encryptedPayload is not really true peer-to-peer, it's closer to GPG-wrapped email content. (Note, as we discussed on the call, the easiest way to avoid having to trust your administrator is to administrate your own server!)
The working group members discussed this today btw, and while there wasn't a formal resolution, people generally agreed that this should be handled as an extension in the Community Group and not in the ActivityPub spec itself. @brianolson Would you like to join the Community Group to discuss this? I've put it on tomorrow's agenda: https://www.w3.org/wiki/SocialCG/2017-05-31#Topics
Given the troubles with getting people to adopt end-to-end keypair encryption (recently done a little better by Open Whisper Systems' "Signal" app), I think I'd be happy with a compromise of having the server hold the keypair, with the end user picking a service provider of their choice that's hopefully trustworthy and competent enough to run a secure server and not snoop on their users. e.g. I use GMail and I basically trust Google a lot; it wouldn't be a stretch for them to hold an encrypted email key pair on their server, show the data to me over HTTPS, and move the data through other email systems in a secure way to other users and their preferred system.
@brianolson Such a thing could be done in an extension, especially assuming you already have the user's public key linked up with Linked Data Signatures, but what practical security improvement does it give you over communications done with a TLS/SSL connection? Let's do an analysis:

- End-to-end encryption: the server administrator never holds the private key and can't read message contents.
- TLS encrypted, no private-key-per-user: content is protected in transit, but the administrator (and a misbehaving certificate authority) can read everything.
- TLS encrypted, private-key-per-user but the server manages it: the administrator still holds the key and can read everything.
In the latter scenario, the only advantage you get over SSL/TLS is protection against a possible badly acting certificate authority, which is hopefully rare. The administrator can still snoop on all your communication. Am I wrong?
Note that if you're vulnerable to CA attacks, you may well be screwed anyway if you have any sort of password authentication or are vulnerable to other forms of session hijacking.
One thing that came up on the CG call today is that the biggest challenge isn't the encryption, it's key management (which is what Signal, Telegram, etc. make easier).
Note also that when we discussed this topic both in the Working Group and then in the Community Group, one point that came up was that end-to-end encryption is less important in a federated context. In other words, one answer to the problem that E2E encryption solves is "run your own server." To be clear, that doesn't mean we don't want to explore this. But it does mean that it's... less urgent. One other thing brought up in the CG call that @cwebber referenced is that we're not sure whether we have enough expertise to properly design a robust cryptographic system, and we want to make sure we don't do this in an insecure way.
While I do agree with this, one caveat I'd like to give is that it's very difficult for people to fully trust their servers these days. Most are hosted in some large datacenter by some provider that could either have a mass breach or give in to some investigative authority without the user ever knowing. So I definitely think "run your own server" alleviates much of it, but not everyone has the skills to do so, and it's not a panacea either. :)
Is peer-to-peer communication completely insecure, or is it mainly meant for certain kinds of websites? The peer-to-peer system I'm mainly familiar with is ZeroNet, which I used to make an author website. It seems like even in peer-to-peer, there is still the issue of trusting that the peer is who they say they are. In a programming language like Ruby, it's easier for me to imagine a solution, but for stuff like CSS, PHP, JavaScript, or HTML I'm completely at a loss. The solution involves something like: public keys are picked randomly from the get-go, and the public key is given to the correspondent. Because of the nature of "Key Dice", the end user has to try all possible keys in order to decrypt the communication. But the keys themselves are not transmitted, as each user has all possible keys. If they want to find out the key specifically, they answer a specific question only that end user could possibly know, that someone just breaking into the system wouldn't. I'm not really certain there could be an equivalent on a website. But this is a little outside my area.
An example of a decent (but slightly prone to malware if you use Windows or Macs) system would be something like Aram Bartholl's "Dead Drops" offline file-sharing network. But the issue is still determining that the end user is who they say they are. I was considering soldering a thumb drive to a toy drone, though it may not work for everyone. My point being that it seems like the only secure method these days would be something like offline networks, or at least Patchwork/Scuttlebutt.
Effectively this launches into a "how to build a web of trust" conversation. I've talked about that in a Rebooting Web of Trust paper, but it's a bit of a rabbit hole, in what's already a rabbit hole thread. Anyway, it's been a while since @brianolson wrote the original post and we never got commenter feedback on what they thought here, so I think that means commenter timeout?
I think one advantage of "server-managed key-pair per user" not listed way back is that data for that user can be encrypted at rest on the server's storage and only decrypted in memory while the user has a valid login session. Each user's data is encrypted separately, so if the server is compromised there is still a per-user decryption attack needed to actually get the data back. Some will fall to weak passwords, but most of the data could be fine.
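As a rough illustration of that flow (nothing here is specified by ActivityPub, and all names are hypothetical), a minimal sketch in Python using the `cryptography` package: the user's password derives a key-encryption key at login, which unwraps a per-user data key that only lives in memory for the session.

```python
import os
import base64
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives.kdf.scrypt import Scrypt

def derive_kek(password: bytes, salt: bytes) -> bytes:
    """Derive a key-encryption key (KEK) from the user's login password."""
    kdf = Scrypt(salt=salt, length=32, n=2**15, r=8, p=1)
    return base64.urlsafe_b64encode(kdf.derive(password))

def create_user_keys(password: bytes):
    """At signup: generate a random per-user data key, store it only in wrapped form."""
    salt = os.urandom(16)
    data_key = Fernet.generate_key()                      # per-user key for data at rest
    wrapped_key = Fernet(derive_kek(password, salt)).encrypt(data_key)
    return salt, wrapped_key                              # persisted; the raw data_key is not

def unlock_session(password: bytes, salt: bytes, wrapped_key: bytes) -> Fernet:
    """At login: unwrap the data key in memory for the duration of the session."""
    data_key = Fernet(derive_kek(password, salt)).decrypt(wrapped_key)
    return Fernet(data_key)

# Usage sketch: each user's stored objects are encrypted with their own cipher.
salt, wrapped = create_user_keys(b"correct horse battery staple")
cipher = unlock_session(b"correct horse battery staple", salt, wrapped)
ciphertext = cipher.encrypt(b'{"type": "Note", "content": "hello"}')
```

A compromise of the server's disk then yields only wrapped keys and ciphertext; each account still has to be attacked through its password.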
Sorry to necropost, but I haven't found anywhere that this discussion has continued.
If we restrict to direct messages (with a single recipient), one may just encrypt the message (Note) payload using the public key of the recipient. So, if the recipient holds her own private key, she can decrypt the message payload. The main obstacle I see is the common practice of using HTTP Signatures, which means private keys must reside on the server and not on clients.
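For what it's worth, a minimal sketch of that single-recipient case, using PyNaCl sealed boxes as one possible primitive (nothing here is prescribed by ActivityPub, and how the recipient's public key gets published and discovered is left out entirely):

```python
import json
from nacl.public import PrivateKey, SealedBox

# The recipient holds this keypair on her client; only the public half is published,
# e.g. alongside the actor profile (hypothetical -- the spec defines no such property).
recipient_key = PrivateKey.generate()

# Sender side: encrypt the real Note against the recipient's public key.
note = {"type": "Note", "content": "for your eyes only"}
ciphertext = SealedBox(recipient_key.public_key).encrypt(json.dumps(note).encode())

# The ciphertext (base64 or similar) is what would travel in the federated activity,
# e.g. as an encryptedPayload; intermediate servers only ever see this opaque blob.
plaintext = SealedBox(recipient_key).decrypt(ciphertext)
assert json.loads(plaintext)["content"] == "for your eyes only"
```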
Given the modern environment of state and commercial actors and their pressures to censor, silence, or exploit social networks, a new federated social network protocol should have a strong focus on private communications.
See also "The Moral Character of Cryptographic Work"
http://web.cs.ucdavis.edu/~rogaway/papers/moral-fn.pdf