
Feature Request: Data ingested per day/week/month/year #544

Closed
OmgImAlexis opened this issue Feb 14, 2016 · 9 comments

Comments

@OmgImAlexis

I'd love to be able to see the amount of data being added to Plex. Other tools can show how many shows were added, which ones, and when; I'm hoping PlexPy could add to the graphs the amount of data added per x amount of time, as a way to gauge when I need to buy more storage.

@drzoidberg33
Contributor

This is not currently possible.

Please see #155, #270, and #430.

@Hellowlol
Contributor

I believe he's referring to storage, as in how many GB are added to the NAS each week. We could fetch every item with addedAt after xxx and figure out the size of those items.
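A minimal sketch of that idea, assuming the (addedAt, size) pairs have already been fetched from the server; the function and variable names here are illustrative, not PlexPy's actual code:

```python
from collections import defaultdict
from datetime import datetime

def ingest_by_week(items, since):
    """Sum file sizes of media items grouped by ISO week.

    `items` is an iterable of (added_at, size_bytes) tuples, where
    `added_at` is a datetime. Items added before `since` are skipped.
    Returns {(year, week): total_bytes}.
    """
    totals = defaultdict(int)
    for added_at, size in items:
        if added_at >= since:
            year, week, _ = added_at.isocalendar()
            totals[(year, week)] += size
    return dict(totals)

# Example: three items, the first two added in the same ISO week
items = [
    (datetime(2016, 2, 1), 700_000_000),   # Mon, week 5
    (datetime(2016, 2, 2), 300_000_000),   # Tue, week 5
    (datetime(2016, 2, 10), 500_000_000),  # Wed, week 6
]
print(ingest_by_week(items, since=datetime(2016, 1, 1)))
```

Swapping the `isocalendar()` key for `added_at.month` or `added_at.weekday()` gives the per-month or per-weekday views mentioned later in the thread.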

@drzoidberg33
Contributor

My bad.

@OmgImAlexis
Author

Yep, that's what I was talking about. Since I mainly use Plex on my file server, the number of drives I need to buy and how often I buy them depends on how much data my server ingests, so this would really help with that.

@Hellowlol
Contributor

I have started on this request. So far it supports a time range, weekdays, hour, month, and so on. The call is a little expensive (200 MB of memory on 25k media files), but at least it's fast. (It can be slow if you're using a huge time range, because of the sorting.) I still have some Mako work and JS (the worst part, imo) left.

@JonnyWong16
Contributor

What data are you calling? Can you just load the cached json from the media info tables?

@Hellowlol
Contributor

I just query the server for every item to get the file size. I could use the cached files, but I think that would use more memory since there is a lot of data in them. I only cache the item type and the time it was added. I'll test it.

@JonnyWong16
Contributor

The cache should only be in the kilobytes. Did you find a way to speed up getting file sizes?

@Hellowlol
Contributor

Not using your method; I needed to merge the data I'm pulling with the data in the db, and the overhead was just too great, resulting in a memory error :(. Getting all the file sizes takes about 3 seconds.
