Using AWS with a large number of files is unusable. Clearing the cache breaks, as does delete, and the whole experience of navigating into folders, uploading, and creating folders is very slow.
I've taken a look at the code, and the main problem seems to lie in the `browser_files` method, which stores a hash of all files in the S3 bucket in a single meta row...
I'm happy to help refactor this part, but would surely appreciate some help as well.
Hi @tostasqb, I was thinking about the same problem.
My idea was to move it into a database table like "cama_media".
Unfortunately, for now I am still busy with another project.
Regards!
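To make the trade-off concrete, here is a minimal sketch (plain Ruby, not actual CamaleonCMS code) contrasting the two designs. `SingleMetaRowStore` mimics the current approach, where the whole bucket listing lives in one serialized value, so every add or delete must deserialize and rewrite the entire hash; `PerRowStore` stands in for a hypothetical "cama_media" table, where each file is its own record. The class and method names are illustrative assumptions, not the plugin's API.

```ruby
require "json"

# Sketch of the current approach: the entire S3 listing is kept in ONE
# serialized "meta row", so every mutation round-trips the whole hash.
class SingleMetaRowStore
  def initialize
    @meta = JSON.generate({})          # one meta row holding all files
  end

  def add(path, attrs)
    files = JSON.parse(@meta)          # deserialize the ENTIRE listing
    files[path] = attrs
    @meta = JSON.generate(files)       # reserialize the ENTIRE listing
  end

  def list(folder)
    JSON.parse(@meta).select { |path, _| path.start_with?(folder) }
  end
end

# Sketch of the per-row alternative a "cama_media" table would allow:
# each file is its own record, so adds and deletes touch a single row,
# and folder listings can become an indexed query instead of a full scan.
class PerRowStore
  def initialize
    @rows = {}                         # stand-in for one DB row per file
  end

  def add(path, attrs)
    @rows[path] = attrs                # touches only one "row"
  end

  def list(folder)
    @rows.select { |path, _| path.start_with?(folder) }
  end
end
```

With thousands of files, the per-row version keeps upload/delete cost proportional to one record rather than to the whole bucket, which is exactly where the current implementation slows down.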