Limit the number of simultaneous file downloads #19546
Comments
Adding such a feature would require keeping some state about connections / downloads. That is possible with a distributed cache like Redis or memcached, but it still adds a lot of complexity (e.g. handling failed downloads, unlocking downloads, etc.). I think that problem is easier to solve with a proxy in the middle, as you mentioned.
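For reference, the kind of bookkeeping this would involve can be sketched in Python (an in-memory dict standing in for the Redis key with TTL; class and method names are illustrative, not anything from the Nextcloud codebase):

```python
import time


class DownloadLimiter:
    """Toy stand-in for the Redis-based scheme discussed in this thread.

    Per file id we track a cumulative "busy until" timestamp; the number of
    concurrent downloads is estimated as the remaining busy time divided by
    the estimated duration of one download.
    """

    def __init__(self, max_concurrent=5, bytes_per_sec=5_000_000):
        self.max_concurrent = max_concurrent
        self.bytes_per_sec = bytes_per_sec
        self._busy_until = {}  # file_id -> timestamp (a Redis key with TTL in real use)

    def try_acquire(self, file_id, file_size, now=None):
        now = time.time() if now is None else now
        # Estimated duration of one download, in seconds (at least 1s).
        duration = max(1, file_size // self.bytes_per_sec)
        busy_until = self._busy_until.get(file_id, now)
        remaining = max(0, busy_until - now)
        if remaining / duration >= self.max_concurrent:
            return False  # too many concurrent downloads; caller would reject
        self._busy_until[file_id] = now + remaining + duration
        return True
```

A real implementation would replace the dict with atomic Redis operations, since two web workers can race between the read and the write.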
Hi, @kesselb

```php
// Sketch: limit concurrent public downloads of a file via Redis.
// Assumes $fileId and $this->share are available in the surrounding code
// and an average download speed of 5 MB/s.
$redis = new Redis();
$redis->connect('127.0.0.1');

// Estimated duration of one download, in seconds.
$downloadTime = max(1, (int) floor($this->share->getNode()->getSize() / 5000000));

$oldTime = $redis->get($fileId);
if ($oldTime !== false) {
    // Remaining "busy" time accumulated by downloads still in flight.
    $downloadTimeRemain = max(0, (int) $oldTime - time());
    // Rough estimate of the number of concurrent downloads.
    $userCount = (int) ceil($downloadTimeRemain / $downloadTime);
    if ($userCount > 5) {
        throw new NotFoundException();
    }
    $redis->set($fileId, time() + $downloadTime + $downloadTimeRemain);
    $redis->expire($fileId, $downloadTime + $downloadTimeRemain);
} else {
    $redis->set($fileId, time() + $downloadTime);
    $redis->expire($fileId, $downloadTime);
}
```

It is not perfect, but I hope it can give you an idea how to improve it.
You can write an app that registers to that hook, checks if the download is possible (count < limit) and, if not, throws. The only disadvantage I see is that the hook triggers very late. For example, the download activity is already pushed. It is probably acceptable to move that hook a bit higher, or to introduce another event. cc @nickvergessen @rullzer: there is no event or hook in
Unfortunately I'm not a PHP programmer; this is my second attempt in PHP. It would be great if someone with experience in PHP could modify and improve it.
There is no such hook. Also, I'm not convinced of the use case. If you expect thousands of people to download the file, then you should have the resources to handle this, so add more servers. Doing such locking is IMO not really ideal.
Is your feature request related to a problem? Please describe.
Hi! In the case of a very big NC installation (50k+ users), there can be a problem if some user shares a file, publishes the link on some forum / social network / messenger, and lots of people begin to download the file. This can harm the server and the network channel. It would be perfect to have a mechanism for admins to manage the number of simultaneous connections/downloads.
Describe the solution you'd like
I was configuring nginx to manage this. All share URLs have the same pattern, domain.org/s/. First, I was thinking of adding a 'limit_conn' rule for the location
location ~ ^/s/([0-9a-zA-Z]+)$
, but this would be a global rule and would restrict all shared files of all users. Also, this variant will not work in Kubernetes, where you can have lots of pods with NC and nginx containers inside each pod. The second variant is to use some proxy in the middle, which counts accesses to the URL and stores this number in Redis/memcached. Storing this value in the DB (PostgreSQL, MySQL) would be very dangerous, because lots of UPDATE operations can bring down the DB.
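For what it's worth, a per-share variant of the `limit_conn` idea might look roughly like this (zone name, limit, and the named capture are illustrative, and it still counts per nginx instance only, so it does not solve the Kubernetes case):

```nginx
# http context: one counter slot per share token (names are made up here).
limit_conn_zone $share_token zone=share_dl:10m;

server {
    # The named capture makes the token available as $share_token.
    location ~ ^/s/(?<share_token>[0-9a-zA-Z]+) {
        limit_conn share_dl 5;     # at most 5 parallel connections per share
        limit_conn_status 429;
    }
}
```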
Describe alternatives you've considered
So, for now I don't have a working solution for this.
Additional context
This mechanism could also be applied to uploads to shared folders with write permissions.