Media content caching strategy #1847

Closed
Gargron opened this Issue Apr 15, 2017 · 121 comments

Gargron (Member) commented Apr 15, 2017

Right now, Mastodon downloads local copies of:

  • avatar
  • header
  • status media attachments

On these local copies, Mastodon can perform operations like resizing, optimizing, and creating thumbnails that fit Mastodon's UI, because the origin of the content can provide media in very large sizes that would severely impact end users' bandwidth and browser performance if displayed verbatim.

Moreover, bandwidth is not always cheap: it is capped to something like 1 TB/mo on DigitalOcean, and is very expensive on Amazon S3, so hotlinking images and videos would severely impact owners of small instances when lots of users of large instances view their content from public timelines (or even just home timelines through boosts). It does feel fair that an instance's admin is responsible for serving content to their own users, rather than also to users of other instances; the latter should be those instances' admins' responsibility.

However, this has storage and legal implications. I would like to hear your thoughts on how this can be improved.

BjarniRunar commented Apr 15, 2017

Potentially low-hanging fruit: you could reduce (but not eliminate) exposure if caching were disabled (or TTLs set very low, or caching limited to RAM only on a swapless machine) for content tagged #nsfw.

More complicated is to feed the blocking/reporting flows back into the cache layer, to quickly purge data users have flagged as objectionable. This is a rabbit hole of complexity, but probably worth doing if you intend to keep the cache (a sketch follows below).
...

Another benefit to an instance loading media on behalf of its users is that it slightly improves the privacy of the instance's users: browsing the federated timeline or boosted toots won't automatically leak your IP to an instance you have no pre-existing relationship with.

Yet another benefit: resizing images may disable/thwart exploits based on corrupt data (the instance itself is at higher risk of this though, and browsers are arguably better hardened/tested than the image conversion libraries used server-side).

I see a lot of benefits to what you are currently doing and I think it is the right thing for both users and the health of the network. However, the risk to admins is real and serious. Just my 2c, hope this is helpful. :-)
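
A minimal sketch of that report-to-cache feedback loop, assuming a hypothetical MediaCache keyed by remote URL (none of these names exist in Mastodon; this is only an illustration of the idea):

# Hypothetical sketch: evict cached copies as soon as an attachment is reported.
class MediaCache:
    def __init__(self):
        self._store = {}  # remote URL -> path of the local copy

    def put(self, url, local_path):
        self._store[url] = local_path

    def purge(self, url):
        # Drop the local copy immediately; the remote original is untouched.
        self._store.pop(url, None)

def on_report_created(attachment_urls, cache):
    # Called from the reporting flow: flagged media is purged right away
    # instead of waiting for a TTL to expire.
    for url in attachment_urls:
        cache.purge(url)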

tyrosinase commented Apr 15, 2017

I'm seeing people hotlink to images from Twitter, so that's also a consideration: if you freely allow hotlinking, you run into the potential for people to use your instance as free image caching. Kind of a separate-but-related issue.

spikewilliams commented Apr 15, 2017

I like the principle that an instance should be responsible for serving content to its own users, but I wonder if there should be a distinction between short-term and long-term storage.

Most of the traffic for any given piece of media will occur within 24 hours. After a certain period (7 days? 30 days?) it's mostly just being kept around for archival purposes. At that point, it may make sense to revert to hosting by the original instance (problematic if that instance goes offline) or by some third-party host or federation of hosts, and the instance can negotiate retrieval if the image gets requested again.

I am intrigued by Swarm as a peer-to-peer means of long-term data caching.

ghost commented Apr 15, 2017

Maybe you can add another class of trusted instances. Right now you can silence or suspend instances; if you add the option to trust certain instances as well, you can act on that classification, like this (see the sketch after the list):

Trusted instances: cache for a longer time and locally keep all media.
Normal instances: cache for a shorter time and locally keep all media for a limited time (e.g. 1 month, and hotlink after that).
Silenced instances: cache for a shorter time and hotlink all media. Silence on the federated timeline.
Suspended instances: no caching, no hotlinking. Block all communication.
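
A rough sketch of how those four tiers could map to a cache policy. The tier names come from the list above; the TTL numbers are placeholders, not anything Mastodon implements:

# Placeholder policy table: days to keep local copies, and whether to
# fall back to hotlinking once the local copy expires.
CACHE_POLICY = {
    "trusted":   {"ttl_days": 90,   "hotlink_after_expiry": False},
    "normal":    {"ttl_days": 30,   "hotlink_after_expiry": True},   # e.g. 1 month
    "silenced":  {"ttl_days": 0,    "hotlink_after_expiry": True},   # never store
    "suspended": {"ttl_days": None, "hotlink_after_expiry": False},  # no cache, no hotlink
}

def should_store_locally(instance_tier, age_days):
    policy = CACHE_POLICY[instance_tier]
    return policy["ttl_days"] is not None and age_days < policy["ttl_days"]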

vielmetti commented Apr 15, 2017

If people are storing media on an expensive network, they might look at self-hosting with Minio as their storage back end; then they can manage that storage themselves.

Minio will also federate across servers, so conceptually n servers could set up 2n+1 spindles, connect them all together, and have a shared cached file system.
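
For illustration, pointing an S3-compatible client at a self-hosted Minio endpoint looks roughly like this (the endpoint, bucket, and credentials are placeholders):

import boto3

# Minio speaks the S3 API, so a stock S3 client works once you override
# the endpoint; media uploads then land on hardware you control.
s3 = boto3.client(
    "s3",
    endpoint_url="https://minio.example.com:9000",
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)
s3.upload_file("avatar.png", "mastodon-media", "cache/avatar.png")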

Technowix (Contributor) commented Apr 15, 2017

I would prefer "no cache at all" for "Normal" ones, since the law doesn't take into account how long we host it...

maethor commented Apr 15, 2017

@Gargron Today, what is the TTL of the cache? It seems to be 14 days, but we are not sure about it.

I believe the current system is good because it protects users, and this should be the absolute priority (technically, but also privacy-wise). I really think hotlinking media would be a bad idea, because of privacy. It would be even worse if you hotlinked « bad instances ».

Maybe you could allow admins to configure the TTL? If I don't want any risk, I put a TTL of 1 minute, accepting that it will be a little CPU-intensive. Maybe I do 1 day, or 1 week if I am careful. Maybe 1 month if I don't care (a personal instance, for example).
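
A minimal sketch of such an admin-configurable TTL, assuming a hypothetical REMOTE_MEDIA_TTL_DAYS setting and a plain on-disk cache (Mastodon's actual storage layout will differ):

import os
import time

# Hypothetical knob: each admin picks their own risk/cost trade-off.
TTL_SECONDS = int(os.environ.get("REMOTE_MEDIA_TTL_DAYS", "7")) * 86400

def sweep_remote_media(cache_dir):
    # Periodic job: delete local copies older than the TTL. They can be
    # re-fetched from the origin instance on demand.
    now = time.time()
    for root, _dirs, files in os.walk(cache_dir):
        for name in files:
            path = os.path.join(root, name)
            if now - os.path.getmtime(path) > TTL_SECONDS:
                os.remove(path)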

Another idea could be to separate the instance's own media from the local copies. Admins could then have a sense of what is consuming storage.

patf (Contributor) commented Apr 15, 2017

Extending the existing domain blocking feature to allow admins to choose not to cache media content from certain instances (without having to suspend them) could be a viable (and relatively easy-to-implement) approach.

// Edit: Turns out there already is a hidden reject_media domain block type, so that's great news.
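
Illustration only (the dictionary below is a stand-in, not Mastodon's actual domain-block model): the idea is to consult the per-domain block list before ever downloading remote media:

from urllib.parse import urlparse

# Stand-in for the hidden reject_media domain block mentioned above.
REJECT_MEDIA_DOMAINS = {"blocked.example"}

def should_download_media(remote_url):
    # Media from a reject_media domain is never fetched or stored;
    # the status itself still federates.
    domain = urlparse(remote_url).hostname
    return domain not in REJECT_MEDIA_DOMAINS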

ldidry (Contributor) commented Apr 15, 2017

We could use a Camo instance: the URL of the image is still yourinstance.tld… but Camo proxies the request to the actual server that has the image. For caching, we could put Varnish between Nginx and Camo. It works for images, but I don't know if it will work for mp4.

This way, your instance would never download content from other instances.

Technowix (Contributor) commented Apr 15, 2017

But it's still distributing it...

maethor commented Apr 15, 2017

Which is what we want, I think.

Technowix (Contributor) commented Apr 15, 2017

@maethor I'm worried about "illegal" content coming "from" my instance, tbh; proxying stuff still makes it appear "on" my instance in a legal sense :/

Gargron (Member) commented Apr 15, 2017

@maethor Current TTL is "forever". I think this is part of the things that need to be adjusted.

@ldidry With Camo it seems like you would be doing the same as we're doing now, but with more effort, since you'd still need to implement the actual caching on top of it. Perhaps just adding TTLs to the current system would be better?

gellenburg commented Apr 15, 2017

I'm unorigmoniker@mastodon.social. Moving my arguments over here because I feel they're worth considering.

The solution to address @Technowix's concern is to not cache remote images. That's the only way you're going to address that concern, both now and the next time an instance admin posts loli or shota or outright CP.

The solution to address @BjarniRunar's concern about user privacy is to place the onus for user privacy on the user.

Hear me out.

It is my opinion that instance admins don't have a responsibility to protect users' privacy; that responsibility should fall squarely on the user. Here's why.

Each user has a different threat model that they're concerned about. Some might be located in repressive regimes, others might be sharing their family's computer in the living room. Is it appropriate for an instance admin to try to provide protection for users posting from China or that gay teen posting from Iran or Saudi Arabia? What about that home-school kid with strict parents who only want their kid to be exposed to ideas that they approve of? What about that kid who's using a school laptop with spyware installed that they can't shut off? What about the political dissident under surveillance by their Government for views that are contrary to accepted norms?

Each of those threat vectors can be addressed through separate means.

As I replied to Bjarni on .social, I feel that it is up to me and every other user to take privacy into our own hands. Tor is free, and if you can afford it, VPNs aren't that expensive.

I mentioned CloudFlare as a viable option for instance admins as it's the most well-known and accessible CDN. It also has the benefit of protecting instances from the "slashdot effect" or if something should go viral, or from DDoS attacks if somebody posts a toot that pisses some group or person off.

In my personal opinion those should definitely be considered as viable solutions and alternatives.

(Edit: a word)

ldidry (Contributor) commented Apr 15, 2017

@Gargron Nope, it's not the same. You said:

Right now, Mastodon downloads local copies of:

Camo downloads the images but doesn't store them anywhere. It's just an image proxy. The Varnish I suggested is there to cache the images, but only in memory (well, you can make it cache them on disk, but that's not the default behavior, at least on Debian). If you restart Varnish, you wipe the whole cache. And the cache has a limited size, so new images replace old ones.
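
A toy version of that behavior, for illustration: a fixed-size, memory-only cache where new entries evict the oldest ones and a restart wipes everything, which is roughly what a default malloc-backed Varnish gives you:

from collections import OrderedDict

class MemoryImageCache:
    def __init__(self, max_items=1024):
        self.max_items = max_items
        self._items = OrderedDict()  # lives only in this process's memory

    def get(self, url):
        if url in self._items:
            self._items.move_to_end(url)  # mark as recently used
            return self._items[url]
        return None  # miss: fetch from the origin, then put()

    def put(self, url, image_bytes):
        self._items[url] = image_bytes
        self._items.move_to_end(url)
        while len(self._items) > self.max_items:
            self._items.popitem(last=False)  # evict the least recently used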

gellenburg commented Apr 15, 2017

I should clarify what I mean by instance admins not having an onus to protect user privacy: it is the users' responsibility to protect the privacy of their own web-surfing habits.

Instance admins definitely have a responsibility for ensuring that SSL is enabled and properly configured, that their servers are regularly patched and updated, and that any security vulnerabilities they discover or that are brought to their attention are promptly taken care of.

Instance admins also have a responsibility for ensuring that the software they're running is properly configured and that they take steps to prevent any data-leakage incidents (lock down their servers, don't expose configuration files and passwords to the internet, etc.).

Above that, I do not feel it is an instance admin's (or Eugen's) responsibility to try to protect every user from every real or perceived threat that may be out there. If you attempt to apply protection to the lowest-common-denominator user on your system, you are not going to be able to provide effective protection for most of your users.

Gargron (Member) commented Apr 15, 2017

@ldidry But ideally you'd still crop/downsize images for the end user. I just meant that the cache wiping could be part of the current system, rather than replacing the current system with Camo.

Tryum commented Apr 15, 2017

Hi, this is tryum on the apoil.org instance!

This morning I threw out some ideas; I don't know if they're viable or feasible:

- If content is encrypted in storage and the keys are distributed via another channel (to decrypt the content client-side), does it still expose the admin to legal threats? (Must be state-dependent.)

- Hotlink the media to the source instance, but also distribute it via P2P (i.e. WebTorrent or any WebRTC data-channel tech...): the more viral the toot goes, the more distributed the media is, thus protecting small instances from the slashdot effect.

If those ideas are silly, please be gentle, I'm not a web techy ;)

norio commented Apr 15, 2017

@Gargron Hi, I'm pawoo.net founder (pixiv inc.).

We understand that mature images uploaded by our users are potentially problematic from a legal point of view, since they could end up hosted on servers in other countries.

However, we would like to protect our users' works as much as possible, as long as they are legal in Japan. We are caught in a dilemma.

As pawoo.net admins, we would like to obligate users to flag mature content as NSFW. As for images on the server, we propose that Mastodon...

  • not store NSFW media in the cache.
  • block NSFW images from other instances, just showing an NSFW label with a link to the original content.

We comply with the law of our country and will deal with our own content.

We will spare no technical effort to resolve this problem.

DanielGilbert commented Apr 15, 2017

The Varnish I suggested is there to cache the images, but only in memory (well, you can make it cache them on disk, but that's not the default behavior, at least on Debian).

@MrGilbert@social.gilbert.world here.

It would be illegal to have CP or CP-like images even in a volatile cache (i.e. RAM) here in my jurisdiction. Yes, it's hard for law enforcement to prove, but the laws are there anyway. Although there are apparently EU rules that would prohibit this, they haven't been adopted into local law.

Furthermore, I don't know if Cloudflare or any other big CDN does some kind of "matching" or "scanning" on the media they deliver, so that might not be an option either as the "mastodon train" picks up speed (sorry for that; imagine a little mastodon sitting in a train. It's super-cute).

@Tryum Encryption might be an option. But distributed encryption is rather complex, I guess.

@norio Cool that you are here! I guess it's not the NSFW per se; it's more the lolicon content, which is problematic in Western countries.

eeeple commented Apr 15, 2017

Just my 2 cents:

  • legal considerations are on a country-by-country basis, making a universal solution almost impossible.
  • there should probably be a legal warning when installing a Mastodon instance, warning admins about the potential legal problems that may arise.
  • Mastodon's documentation will probably need to include a legal section where admins can consult their local laws and act accordingly (maybe even create some kind of TL;DR like https://tldrlegal.com/).
  • caching as it is implemented right now is in its infancy and could really use more customization. It should be up to the instance's admin to choose the content-caching policy.
  • cost is to be taken into consideration, because bandwidth is expensive and resources are limited, both financially and technologically.

I hope this may help.

marcan commented Apr 15, 2017

This is a fundamental disconnect between the law and technology, and I doubt there is a technical solution. CP laws are so broken around the world that even trying to police CP content can actually cause you legal grief (a team I'm part of was once told by a lawyer to stop filtering out known CP content hashes from a system, because that was a legal liability). Mind you, that's for real CP (real children), not loli (drawn content), but the latter is considered equivalent in some jurisdictions...

Good luck with the attempt at a technical fix, but I will be very impressed if you manage to find one. I would suggest getting a lawyer if you want to accurately evaluate the legal implications.

spikewilliams commented Apr 15, 2017

@gellenburg

It would be great if we could rely on users to manage their own privacy, but many, probably most, users simply don't have the depth of technical knowledge that would equip them to make good decisions in that regard, much less the skills and time to effectively implement those decisions. They will tend to default to what the platform provides. If the platform wants to protect its users, it should be proactive in providing sensible privacy features.

DanielGilbert commented Apr 15, 2017

Maybe, in the short term, as @norio mentioned, it might be an option to not cache NSFW-flagged media. Although this would mean instance admins have to deal with higher traffic, so at some point people would start requesting a "block NSFW from my instance" option to save bandwidth. Which would, in turn, mean some kind of censorship, which we don't want at all.

Gosh, maybe the technical problem is even our smallest one...

delroth commented Apr 15, 2017

There are images that are not NSFW and will still cause admins in certain jurisdictions legal trouble. Looking at it only from the loli angle is very American-centric.

gellenburg commented Apr 15, 2017

@Tryum wrote:

- Hotlink the media to the source instance, but also distribute it via P2P (i.e. WebTorrent or any WebRTC data-channel tech...): the more viral the toot goes, the more distributed the media is, thus protecting small instances from the slashdot effect.

I fully support this, and would LOVE to see Mastodon implement something like WebTorrent for media delivery. That is the holy grail to me.

@norio wrote:

block NSFW images from other instances, just showing an NSFW label with a link to the original content.

This to me seems like a fair and just compromise.

@eeeple wrote:

caching as it is implemented right now is in its infancy, and could really use more customization. It should be up to the instance's admin to choose the content caching policy.

I totally agree with this. An instance admin should be in total control over what content is stored on, or passes through, their server, since they're paying for both. :-)

@spikewilliams wrote:

many - probably most - users simply don't have the depth of technical knowledge that would equip them to make good decisions in that regard

What a golden opportunity we have, then (all of us, users and admins), to educate users about important skills they can use both on Mastodon and throughout the rest of their internet "lives"!

If the platform wants to protect its users, it should be proactive in providing sensible privacy features.

Sure, perhaps. Just like there's ProtonMail.com and Gmail.com: one does a better job of protecting its users' privacy than the other.

(Edit: a spelling mistake or two.)

marcan commented Apr 15, 2017

@delroth Absolutely. Think of e.g. the DMCA implications of tooting cryptographic material. That doesn't even have to be media, you can fit a lot of things breaking a lot of laws around the world in 500 characters.

DanielGilbert commented Apr 15, 2017

@delroth It all started with the "lolicon controversy". I agree that there are other, non-NSFW images that are problematic, e.g. Nazi symbolism here in Germany. But the laws on this type of image are not as strict as those on CP. :)

@marcan I guess that's another issue, isn't it? I can ban a user who toots illegal stuff on my instance. But toots from other instances won't get stored on my server. I guess that's the point here. Correct me if I'm wrong, though.

I'm wondering if caching of incomplete content could solve the problem here. If I store half the data of an image on my machine, I might not face legal issues. But that's only "amateur lawyer" me.

Kovensky commented Apr 15, 2017

Also, looking at it from the other side: caching/hosting perfectly "US legally safe" erotic images in Japan may be troublesome if they're not censored.

KitRedgrave commented Apr 15, 2017

I would think there should also just be an option to globally not download or cache anything server-side, as there is with GNU Social, if you just don't want to deal with this mess. Your users would have to choose to click-to-load things, and it wouldn't be your liability as an admin (though IANAL, YMMV, etc etc).

Maybe this already got brought up and I missed it?

BrainShit commented Apr 15, 2017

Maybe this was mentioned here already and I just didn't realize.
How about blocking the cache based on content-warning tags? Obviously this would require a proper list of which warnings to use globally, but it could somewhat fix the problem if you crack down on people who don't properly tag their toots.

delroth commented Apr 15, 2017

@BrainShit this does not scale. 1. Defining this list will be extremely tricky. 2. It requires cooperation from everyone to tag their content properly, but the tagging does not matter to the author, only to their subscribers (if we assume authors share content that is legal under their own server's legislation). But even more importantly: 3. laws change, and people won't retroactively go and re-tag older content.

Exagone313 commented Apr 15, 2017

Hell no, please do not use P2P to share content between users. Downloading some content may be legal where uploading it is not (for example, in France, with copyrighted content like movies), and P2P works both ways.
As a platform, you have to moderate uploaded content to have a chance of staying legal. As a user, you are fucked.

stefafafan commented Apr 15, 2017

It seems too difficult to have users know whether or not the content they upload is okay globally. Not caching media flagged NSFW (or more like, not legal for other instances) does seem fine to me, but that is relying on the users to be able to correctly judge what is okay and what is not.

KitRedgrave's suggestion of just having an option to globally not download/cache stuff seemed nice to me.

BrainShit commented Apr 15, 2017

@delroth 1. I do agree with that one.
2. It also requires everyone to tag their content as NSFW if we're still talking about just blocking that. If there's an instance where people generally don't tag properly, you can still just block it completely.
3. While I do agree with that, people won't go back and re-tag their NSFW stuff either.

But I guess you are right that this doesn't really scale overall.

furoshiki commented Apr 15, 2017

I think a new content-policy strategy should focus on the publisher.

The publisher may declare the content policy. For example:

<body>
    <div class='h-entry'>
        <div class='status__content p-name emojify'>
            <div class='e-content' style='display: block; direction: ltr'>
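                <!-- proposed marker: when present, subscribers must not cache this status's media -->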
                <data class="e-mastodon-content-no-cache"></data>
                <p>
                    <a href="https://pawoo.net/media/231Kh9YxVJaHJrb6xug" rel="nofollow noopener" target="_blank">
                        <span class="invisible">https://</span>
                        <span class="ellipsis">pawoo.net/media/231Kh9YxVJaHJr</span>
                        <span class="invisible">b6xug</span>
                    </a>
                </p>
            </div>
        </div>
    </div>
</body>

e-mastodon-content-no-cache is a new parameter. Subscribers check it and decide whether to stop caching images. Subscribers could technically still cache images, but when the publisher says "DON'T DO THAT!", subscribers never do it.

We can consider more detailed rules:

  • e-mastodon-content-no-cache: subscribers MUST NOT cache images.
  • e-mastodon-content-may-not-cache: subscribers MAY cache images. The publisher has enough hardware resources to provide images to the subscribers.
  • e-mastodon-content-should-cache: subscribers SHOULD cache images. The publisher does not have enough hardware resources to provide images to the subscribers, but subscribers can ignore it.

What do you think?
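
As a quick illustration of the subscriber side, this sketch scans a status's markup for the proposed class before deciding to cache. It uses only the standard library; the class name is furoshiki's proposal above, not an existing Mastodon feature:

from html.parser import HTMLParser

class CachePolicyParser(HTMLParser):
    # Looks for <data class="e-mastodon-content-no-cache"> in a status.
    def __init__(self):
        super().__init__()
        self.no_cache = False

    def handle_starttag(self, tag, attrs):
        classes = (dict(attrs).get("class") or "").split()
        if tag == "data" and "e-mastodon-content-no-cache" in classes:
            self.no_cache = True

status_html = '<div class="e-content"><data class="e-mastodon-content-no-cache"></data></div>'
parser = CachePolicyParser()
parser.feed(status_html)
print(parser.no_cache)  # True: this publisher asked subscribers not to cache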

danmademe commented Apr 15, 2017

@furoshiki your idea requires trust...

Do you really trust everyone else on Mastodon?

yukihito23 commented Apr 15, 2017

As @Kovensky mentioned, Japan does not allow uncensored adult images (which are legal in the US) to exist within its jurisdiction, so the same problem happens on both sides.

Then the question becomes what kind of strategies the Japanese hosts are currently applying, as I think the amount of "particular" content flowing inbound to the Japanese jurisdiction would most likely be higher than the amount going outbound, just by comparing the volume of users.

DanielGilbert commented Apr 15, 2017

Well, just not caching files won't solve the problem, at least here in Germany. It's about the information that is displayed to the user. If there is CP displayed under my URL, police will knock at my door and seize my stuff, because I'm responsible for what happens under my URL. It's like that.

I must be able to control what is shown on my instance. There is a reason Twitter introduced national filters in 2012. They didn't do it because they had nothing better to do; they did it because they had to.

I will go as far as saying that Mastodon cannot exist in Germany in its current form. In the end, it will be much safer to connect between instances in the same jurisdiction. So yes, I agree with @gellenburg: we need some kind of two-way opt-in federation. But that's not all: I need to be able to control what's stored on my server and which toots are shown on it. And I must be able to delete them from my instance upon request if they're against the law in Germany.

Btw, our Federal Minister of Justice and Consumer Protection, Heiko Maas, is just about to issue a law against "hate speech", threatening Facebook & co. with hefty fines. You can read about it here: https://www.theguardian.com/media/2017/mar/14/social-media-hate-speech-fines-germany-heiko-maas-facebook
(sorry, couldn't find a better English source, but all in all it hits the nail on the head).

So this is not a "what could happen if" kind of thing; it is an actual, real threat.

gellenburg commented Apr 15, 2017

@DanielGilbert I wouldn't be surprised if Facebook and others just withdrew from Germany, or implemented geoblocking and refused to serve any content to anybody coming from a German IP address, if that passes and there's not some sort of "safe harbor" provision.

Laws like that will balkanize the Internet faster than any Copyright laws.

Edit to add:

I also find it hard to believe that Tutanota will be held responsible if I, as an American, send some neo-Nazi, pro-Hitler, Holocaust-denying propaganda to a German friend who has a Tutanota email address. As much as I despise those thoughts and materials, doing so is perfectly legal here in America. Freedom of speech and all that.

andy-twosticks commented Apr 15, 2017

Late back to this thread. Here in the UK:

  • they really do prosecute based on cache data.
  • they will most likely prosecute re: links, too - the poster, not the instance owner.

DanielGilbert commented Apr 15, 2017

@gellenburg: Sending it to him via DM is perfectly fine. Posting it on his public Facebook wall will put the service owner (Facebook) in trouble if they cannot delete it.

Internet laws in Germany are a very special topic, and from my impression, no one cares about them. Older people normally don't want to deal with that, and I'm talking about people in their 40s, btw. Unfortunately, these are the people making the laws at the moment.

eeeple commented Apr 15, 2017

@DanielGilbert This is extremely worrying, as it will probably create the need for whitelisting of other instances on the German ones, excluding all non-whitelisted instances as the default behavior.

We, as a community, should maybe create a "legal taskforce" (God I hate this term, but I can't find a better one) whose job is to collect and lay out all the liabilities and laws for every country in which instances exist (meaning probably every single country on earth at some point), so instance admins can refer to this as a baseline for what they are and are not allowed to do (not binding legal advice, just guidelines). It is a tremendous enterprise, but I can't help thinking it will be necessary at some point if we want more instance admins to take the plunge (and not risk prison and whatnot).
In parallel, there will be a need for technical solutions (and maybe another working group) to help admins comply with said legal requirements, and to help protect both the users' privacy and the admins' liability.
Just throwing stuff at the wall here to see what sticks. Thoughts?

nightpool (Collaborator) commented Apr 15, 2017

@furoshiki I'm not sure I understand. Your local cache is only based on the size of your instance users' timelines? If you have a small instance you will always have a small image cache. It's only dependent on your users' timelines, no one else's.

@danmademe You can delete remote posts on your server from the admin interface, so instances can delete content they don't want. Also, if you suspend a remote user on your local instance, all of their posts will be deleted as well.

Also, look at @dannyob's post above. He's the International Director at the Electronic Frontier Foundation. Cops and prosecutors respond to rational incentives as much as anyone else. They don't want to prosecute cases that will make them look bad - CP cases are supposed to be their "slam dunks".

@DanielGilbert For statuses hosted on other instances, Mastodon works more like a cache system than like Facebook, where everything is centralized. This is (theoretically) treated differently according to the EU rules. Not sure how that gets implemented into German law.

@andy-twosticks do you have any links to this? It goes against the EU directive in this area, so I would love to see what their rationale was.

ZiiX (Contributor) commented Apr 15, 2017

For commentary & discussion (moderating at all, or saying you blocked something because of X, Y or Z, opens you up to more issues):

https://www.eff.org/deeplinks/2017/04/ninth-circuit-sends-message-platforms-use-moderator-go-trial

Feature to incorporate & consider please @Gargron:

I would like the feature I've mentioned to Gargron before: a variable to keep ## (say 2) days' worth of cached images that were not uploaded by my instance's users, came through the federated timeline, and were not boosted or replied to (interacted with). Toots from people my users follow could use a different, longer variable - say 5 days of cached images. Also, a max data limit, just in case 5 days goes crazy with thousands of 8 MB posts by one bot. The goal: as close to hosting a personal Drupal, Discourse, WordPress site, etc. as you can get, with only in-instance user uploads, interactions, and statuses to care for.
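
A rough sketch, in Python, of the eviction pass such variables could drive - not the commenter's code; the entry fields and names are hypothetical:

```python
import os
import time

# Assumed knobs, mirroring the variables proposed above.
FEDERATED_TTL_DAYS = 2            # media that only came through the federated timeline
FOLLOWED_TTL_DAYS = 5             # media from accounts local users follow
MAX_CACHE_BYTES = 50 * 1024**3    # hard cap in case the TTLs aren't enough

def evict_cached_media(entries):
    """entries: iterable of objects with .path, .fetched_at (epoch seconds),
    .followed and .interacted booleans - all hypothetical fields."""
    now = time.time()
    kept = []
    for entry in entries:
        ttl = (FOLLOWED_TTL_DAYS if entry.followed else FEDERATED_TTL_DAYS) * 86400
        if now - entry.fetched_at > ttl and not entry.interacted:
            os.remove(entry.path)          # drop the local copy only
        else:
            kept.append(entry)
    # Enforce the size cap, evicting oldest-first among survivors.
    kept.sort(key=lambda e: e.fetched_at)
    total = sum(os.path.getsize(e.path) for e in kept)
    while kept and total > MAX_CACHE_BYTES:
        oldest = kept.pop(0)
        total -= os.path.getsize(oldest.path)
        os.remove(oldest.path)
```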

gellenburg commented Apr 15, 2017

Something that I mentioned to Gargron on .social that I feel warrants consideration:

There are countless image hosting sites now: Imgur, TwitPic (!), Tumblr, Blogger, Giphy, etc.

One thing that would clear a lot (but not everything) up is simply to forego hosting images and media on one's instance.

That would mean implementing stellar support for oEmbed and maybe PubSubHub. But that way I could just link to a video hosted at YouTube or Vimeo or LiveLeak, or a GIF hosted at Giphy or Imgur, or a photo hosted at Flickr or 500px or DeviantArt or Tumblr, and save everyone's bandwidth and storage.

Plus, I would think it would help admins out legally (except you, Germany) if the site was simply embedding content hosted on a third party.
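
For illustration, a minimal oEmbed lookup could look like this (a Python sketch; YouTube's documented oEmbed endpoint is used as the example, and the function name is ours):

```python
import json
import urllib.parse
import urllib.request

def fetch_oembed(media_url, endpoint="https://www.youtube.com/oembed"):
    """Ask a provider's oEmbed endpoint how to embed media_url.
    The JSON response carries fields like 'html', 'title' and 'thumbnail_url'."""
    query = urllib.parse.urlencode({"url": media_url, "format": "json"})
    with urllib.request.urlopen(f"{endpoint}?{query}") as resp:
        return json.load(resp)

# The returned 'html' is an embed snippet the instance could render
# instead of storing any media bytes itself, e.g.:
# info = fetch_oembed("https://www.youtube.com/watch?v=...")
```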

woganmay commented Apr 16, 2017

wogan@wogan.im here

A universal solution is impossible - too many regions, too many conflicting requirements. Google couldn't even solve this problem, much less the FOSS community. The way I see it, there are two high-level things that can be done:

  1. Making admins aware of their legal responsibilities
  2. Equipping admins to take action when necessary

(1): One tool that would be enormously helpful is an instance region flag on installation. If I specify that my instance is in the US, or ZA, or CN/JP/IR/ZW/whatever, a couple of bullet points about relevant content hosting legislation in my region will help me make better decisions as an admin.

Right now I think there's a real danger that people will spin up instances without understanding, at all, what laws apply to them. It doesn't need to be super-detailed, but just a few words on what's legal in your region would be helpful. Admins can, of course, go totally renegade and disregard them, but then at least Mastodon (the software) is doing its best to inform them of their responsibilities.

(2): I think admins should have several abilities:

  • Prevent media uploads completely - force users to upload elsewhere and post links.
  • An Admin-level view that shows all the media being posted to the system, so that they/someone can regularly scan over it to look for problematic things
  • A federation view that shows the regions (legal jurisdictions) that their instance is federating with, mainly as an FYI that shows where posted content might be distributed
  • The ability to automatically block instances based on their region/IP address (so a US admin might just block everything in the Japan IP range)

While I am personally in favor of open federation and individual responsibility, it's also pretty overwhelming if you're committed to managing a community, and have to take responsibility as an administrator (to some extent) for what the users do on your platform. Anything that can make it easier for an admin to navigate the legal minefields unique to their region will be helpful.

jack1243star commented Apr 16, 2017

@woganmay I agree with most of your points, but for the last bullet point of (2), I would argue that this is a very bad idea. Auto-blocking based on region (or any other criterion) does nothing to prevent illegal content, and is a big obstacle to free speech. We cannot trust the admins of the blocking server to communicate on this issue, so the user may not notice the block at all (perhaps a dark pattern, since users will assume Twitter-like global broadcasting).

EzoeRyou commented Apr 16, 2017

As I read this discussion, I suddenly feel like this is P2P technology 15 years ago all over again. Why has nobody learned anything from history?

15 years ago, we were so hyped about building distributed P2P mesh networks on the Internet.
We implemented file sharing, chat, forums, blogs, web pages and everything else on top of those distributed P2P mesh networks.

The result: we faced exactly the same issues we face today.

  1. Copyright infringement, child porn, and other illegal data (Nazi symbols in Germany, for example) spreading all over the place.

  2. The distributed cache burdening us as the network grows, so that the cost of joining the network - in terms of computational power, storage, and bandwidth - becomes too expensive for newcomers.

If we seriously tried to solve those problems, we would require thousands of full-time employees, money, politics and hardware comparable to Twitter, Facebook, Google or Microsoft. We'd just become one of them.

At that point, there will be other people who think Mastodon is too oppressive to its users, so they'll start developing alternatives which promise "a decentralized alternative to existing platforms, avoiding the risks of a single Mastodon community monopolizing your communication."

dabura667 commented Apr 16, 2017

I agree with @Tryum.

If hotlinking media from instance A is unfair when A is on a cheap server and instance B sends 5 million users fetching media from A... then one solution is simple encryption. For each piece of media, store a 4-byte nonce on instance A, then encrypt the media symmetrically with a key of SHA256(nonce || content-identifier).

So the user would fetch the encrypted blob from their instance's cache (or from instance A if it isn't cached), then fetch the 4-byte nonce from the hotlinked instance, and decrypt the content locally with AES. A single SHA256 hash and an AES decrypt would not be too slow, unless they were viewing the content on a potato.

It would at least be slightly better than having an unencrypted CP image on your computer.
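
A minimal sketch of the proposed construction, assuming AES-GCM via the Python `cryptography` package (the function names are illustrative, not an actual Mastodon API):

```python
import os
from hashlib import sha256
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

GCM_NONCE = b"\x00" * 12  # a fixed AEAD nonce is OK only because each key is single-use

def encrypt_media(content: bytes, content_id: bytes):
    """Origin side: derive the key from a short random nonce plus the
    content identifier; only the 4-byte nonce stays on the origin."""
    nonce4 = os.urandom(4)
    key = sha256(nonce4 + content_id).digest()   # 32 bytes -> AES-256
    blob = AESGCM(key).encrypt(GCM_NONCE, content, None)
    return nonce4, blob                          # blob is what remote caches hold

def decrypt_media(blob: bytes, nonce4: bytes, content_id: bytes) -> bytes:
    """Client side: blob comes from the local instance's cache,
    nonce4 from the origin instance."""
    key = sha256(nonce4 + content_id).digest()
    return AESGCM(key).decrypt(GCM_NONCE, blob, None)
```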

marcan commented Apr 16, 2017

@dabura667 You really want to talk to a lawyer before implementing something like that. The law does not work like technology does. Encrypting content could plausibly be taken as deliberate action to hinder the enforcement of the law in a situation like this. You might be better off with everything in the clear and cooperating with law enforcement if and when they ask (presuming they're interested in the source of the material, not servers it may have incidentally crossed). This isn't a legal opinion, I'm just saying that might be the case and you should really talk to a lawyer to figure that out.

DanielGilbert commented Apr 16, 2017

Totally agree with the encryption idea, although judges could argue that I can easily access the keys and therefore decrypt the data. Are there countries out there where encryption is illegal?

Unfortunately, it's Easter here, and family duty calls. I will have another look at the issue tomorrow.

marcan commented Apr 16, 2017

Are there countries out there where encryption is illegal?

@DanielGilbert Yes (for various gradations of "illegal").

DanielGilbert commented Apr 16, 2017

That's quite a lot. o.O

I did some small research:

From what I've found so far, I don't need to block proactively, only upon request. So a way to block toots from foreign instances might be sufficient for now - plus a solution for the cache, I guess.

danmademe commented Apr 16, 2017

After some time to think, I've realised that a good solution to this problem is more visibility for the reporting functionality.

Reporting should work like this: if a user on my instance has something reported, then I as the admin can take responsibility for it. If that post is on a remote instance, then it's reported to that instance's admin.

I'm not sure if this is what's happening now - I assume it is.

But I think that if a user on my instance has reported a user on another instance, then I also want to know, so I can review it and potentially apply a domain block or a user block.

From there, the functionality could be expanded so that other types of blocks are available, such as:

  • media blocks
  • URL blocks
  • cache blocks

I really think this problem could be handled better if there were better tools available to admins, or even another class of users who act as moderators, which would help large instances.

Why do I think this is a good solution? Because then users will have the power to control what they find offensive, and whole instances can become niche in what they allow.

andy-twosticks commented Apr 16, 2017

@nightpool I don't have any links re: prosecuting over cache data, but (a) the police of any country tend not to pay much attention to conventions they don't have to pay attention to, and (b) EU conventions will soon not apply in the UK anyway?

Re: encryption: not sure where people are going with this. UK law allows the police to prosecute anyone who refuses to give up the password, AND allows them to prosecute you anyway if you genuinely don't have it. (Maybe even if they only think your 160k block of random numbers is hiding something illegal, at least in theory.)
Even if you enabled genuine e2e encryption of private toots between users, that would not help the two users - and the admin would have to trust that no user had ever forgotten to set a toot private?

furoshiki commented Apr 17, 2017

#1865 suggests that the latest version of Mastodon has resolved this problem. Is a domain-based media blocker the best solution? The pawoo.net team thinks this is a good idea.

Technowix (Contributor) commented Apr 17, 2017

Well, it's a bit rough right now, but at least it permits people to communicate :3
Being able to "prevent caching" while "not muting" might be cool too :o

danmademe commented Apr 17, 2017

I think this topic has steered away from the original point of this issue. In terms of media caching strategies, I don't really have any solid advice other than some degree of control over where we cache from.

This topic is now closer to enhancements of the domain block system. Naturally I think this is a worthy and important discussion to have. Domain blocks need finer-grained control, so, as I suggested before, a good place to start is to add a couple more types of blocks:

  • media block
  • URL block
  • cache block

Cache blocking being the focus of this issue (a rough sketch of these block types follows below).
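
For illustration, finer-grained blocks might be modeled roughly like this (a hypothetical Python sketch; Mastodon's real domain block model is not shown here):

```python
from dataclasses import dataclass

@dataclass
class DomainBlock:
    """Hypothetical finer-grained block record, one per remote domain."""
    domain: str
    block_media: bool = False   # don't fetch or display their media
    block_urls: bool = False    # strip links pointing at the domain
    block_cache: bool = False   # never store local copies; hotlink or drop

BLOCKS = {
    "example.social": DomainBlock("example.social", block_media=True, block_cache=True),
}

def should_cache(remote_domain: str) -> bool:
    block = BLOCKS.get(remote_domain)
    return block is None or not block.block_cache
```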

westernotaku commented Apr 17, 2017

Could this be handled in a way where you don't have to censor lolicon artists? We have suffered enough for no reason at all, just for expressing ourselves in the form of drawings.

Maybe forbid certain IP ranges from viewing certain tags, for example #lolicon? If I'm not mistaken, while possession and distribution of lolicon can be illegal in places like Australia, the United Kingdom, Canada, New Zealand and France (not the majority of Western countries, btw), viewing it online is not.

jack1243star commented Apr 17, 2017

@westernotaku Please don't do IP- or region-based restrictions. They won't work with VPNs, and we have suffered enough for no reason at all, just by living in a country without choice.

Technowix (Contributor) commented Apr 18, 2017

So, well, what do we do? Right now there is a "bandage": "block caching, but you must mute".
But the wound is still wide open - we still can't communicate with instances that aren't under the same legal jurisdiction as ours...

PeterCxy commented Apr 19, 2017

I do think the problem is more about storage / bandwidth than about legal issues...

For me, I would just like an archiving mechanism with which old content (e.g. older than a year) can be moved to some other storage - for example, a remote FTP server with abundant storage space, or an rclone-encrypted Google Drive Unlimited remote - so that these less-viewed files can be safely removed from the main server to make space for newly generated content (while still being viewable on demand). Before we get to an agreement on the legal issues, implementing such a mechanism will, to me, solve the more urgent problems.
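
A minimal sketch of such an archiving pass (Python; the paths and the one-year cutoff are assumptions, and a real setup would push to an FTP or rclone remote instead of a local mount):

```python
import shutil
import time
from pathlib import Path

ARCHIVE_AFTER_DAYS = 365                # "older than a year", as proposed
MEDIA_DIR = Path("public/system")       # assumed media directory
ARCHIVE_DIR = Path("/mnt/archive")      # stand-in for an FTP/rclone remote

def archive_old_media():
    cutoff = time.time() - ARCHIVE_AFTER_DAYS * 86400
    for path in MEDIA_DIR.rglob("*"):
        if path.is_file() and path.stat().st_mtime < cutoff:
            dest = ARCHIVE_DIR / path.relative_to(MEDIA_DIR)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.move(str(path), str(dest))   # frees space on the main server
```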

Artoria2e5 (Contributor) commented Apr 20, 2017

@DanielGilbert
It would be illegal to have CP or CP-like images in the non-volatile cache (aka RAM) here in my jurisdiction.

Disputing this "aka RAM" part here. "Non-volatile" memory refers to storage devices that retain information after power loss - just the opposite of what people have in the SDRAM slots of their motherboards. Yes, there are non-volatile types of RAM devices, but people would normally call them flash storage.

(This isn't to say that caching such images in some tmpfs ramdisk is always safe - you may eventually bump into some swap space and accidentally write them onto the disk, for example.)

DanielGilbert commented Apr 20, 2017

@Artoria2e5

My fault. I meant "volatile".

https://www.heise.de/newsticker/meldung/Urteil-Kinderpornos-anklicken-ist-strafbar-931446.html

Translation of the relevant part:

"Already looking at child porn on the Internet is punishable. This follows from the existing legal situation and was now confirmed for the first time by an "Oberlandesgericht" (Higher Regional Court). Also the short-term download into the working memory, without a manual storage, brings users into the possession of the files, is stated in the reasoning of the OLG Hamburg from today's Monday."

Now, one might argue that a server cannot look at files - but I don't want to discuss that with any court here in Germany. ;)

kensoh commented Apr 23, 2017

Has the core team considered automatically deleting posts/images/videos after a certain time, e.g. 30 days, 2 weeks, 1 week, etc.? Or a setting for the instance owner to decide? I know this may be a digression from the discussion here, and contrary to current Mastodon functionality.

I'm raising this because I can imagine the load that instance owners are bearing. Even if an instance owner decides to stop accepting new users, the existing users' growing connections with more and more users outside that instance may already create exponential storage / bandwidth load on it.

As a large majority of instances are basically self-funded, the growth in storage / bandwidth might force some instance owners to pull the plug. That would kinda start a consolidation phase where only the instances with the deepest pockets survive, reducing the diversity of people/ideas/content that Mastodon is so good at.

Also, the appearance of Snapchat and Instagram/Facebook stories may suggest that people are OK with the idea that their digital content/data does not have to persist permanently. And with how fast information moves now, old posts might not be relevant to someone's followers anyway.

I'm a 2-week-old user who believes in the mission of Mastodon.

PS: I'll just use the chance to say thank you very much to the Mastodon maintainers and contributors =) It is amazing that a project of this scale and rapid growth is supported through an open-source community of contributors.

Gargron (Member) commented Jun 29, 2017

We've implemented some new features since this issue was opened, to help deal with the problem. There are also a couple open issues for more technically specific approaches, so I believe this issue can be closed.
