Caching #86

Open
AuHau opened this issue Dec 9, 2018 · 8 comments

AuHau (Owner) commented Dec 9, 2018

The introduction of the Django-ORM-like framework makes the API wrapper classes easy to use, but it also adds quite a bit of overhead for fetching resources, which results in expensive API calls. For CLI usage this should not currently be a problem, but anybody using the API wrappers more intensively could hit the API's throttling limits quite soon.

To address this problem, I would like to add support for caching results, most probably at the low level of the toggl.utils.other.toggl calls.

Toggl's API supports bulk fetching of data via https://www.toggl.com/api/v8/me?with_related_data=true, which could be used to populate the cache.
The same endpoint also supports since=<timestamp>, which fetches only the changed data and could be used to keep the cached data from going stale.
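A rough sketch of how that could look, assuming the usual v8 response envelope with a top-level `data` key and HTTP basic auth with the API token; the function name `fetch_me` and the merge step are illustrative only, not part of toggl-cli:

```python
import time

import requests

API_ROOT = "https://www.toggl.com/api/v8"

def fetch_me(api_token, since=None):
    """Fetch the current user together with related data (projects, time entries, ...)."""
    params = {"with_related_data": "true"}
    if since is not None:
        # only return objects changed after the given UNIX timestamp
        params["since"] = int(since)
    response = requests.get(API_ROOT + "/me", params=params,
                            auth=(api_token, "api_token"))
    response.raise_for_status()
    return response.json()["data"]

# Populate the cache once via the bulk endpoint...
cache = fetch_me("<api token>")
last_sync = time.time()

# ...and later refresh it incrementally so the cached data does not go stale.
changes = fetch_me("<api token>", since=last_sync)
# merging `changes` back into `cache` is left out of this sketch
```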

AuHau (Owner, Author) commented Jan 21, 2019

This feature is currently on hold until Toggl finalizes their new API version (v9), which is still in development (their web client already uses it) but not yet in stable form.

sjvrijn (Contributor) commented Mar 11, 2019

While wanting to do some analysis over all my entries (~2k by now), I tried using Python's built-in @lru_cache decorator on the api.others.toggl function. It seems to work pretty well; I will submit a PR soon.
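A minimal sketch of that decoration, assuming the low-level helper takes only hashable arguments (the real signature of api.others.toggl may differ):

```python
from functools import lru_cache

import requests

@lru_cache(maxsize=128)
def toggl(url, api_token):
    """Simplified stand-in for the low-level request helper: repeated calls with
    the same arguments are served from memory instead of hitting the API again."""
    response = requests.get("https://www.toggl.com/api/v8" + url,
                            auth=(api_token, "api_token"))
    response.raise_for_status()
    return response.json()
```

Note that lru_cache requires every argument to be hashable, so if the real function receives e.g. a config object or a dict, that would need a small workaround.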
The main issues for proper inclusion into the project would be:

  • when to automatically reset the cache in more interactive use
  • how to enable clearing through the CLI
  • whether to enable it by default (as it can be confusing if people don't expect it...)

Any thoughts on these?

AuHau (Owner, Author) commented Mar 11, 2019

That is cool! Somehow I did not think of using @lru_cache directly. I guess it can actually work!

I originally wanted to implement a custom cache in order to use the bulk fetching capabilities of Toggl's API. I am reluctant to work on that before they release the new API version, but until then, using at least lru_cache is definitely a good idea!

Regarding your points: clearing for the CLI is not really necessary, because to my knowledge lru_cache is in-memory only, so when the CLI process dies, the cache disappears with it. The bigger problem will be with long-running processes...

I would add a new property to utils.Config that enables/disables caching. Maybe we could enable caching by default for the CLI, but have it turned off by default when using the API wrappers directly? What do you think?

Also, I would document this, mentioning that you can clear the cache with an api.others.toggl.cache_clear() call.
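For reference, the clearing and introspection hooks come for free with functools; assuming the decorator ends up on api.others.toggl (the import path below is illustrative only), the documented usage could look like:

```python
from toggl import api  # illustrative import path, adjust to the real module layout

api.others.toggl.cache_clear()        # drop every cached response
print(api.others.toggl.cache_info())  # hits/misses/maxsize/currsize, handy for debugging
```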

Thanks for the idea!

sjvrijn (Contributor) commented Mar 11, 2019

Clearing the cache would still be useful if, e.g., I fetch a project, update it, and then fetch it again (directly or indirectly). In that case the old version would still be cached, even though I just updated it. Any update call could therefore clear the cache to force a remote fetch without having to check anything.

So I think I would enable it by default for both, but allow changing the cache size through the .togglrc file, where setting it to 0 would disable it.
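A sketch of how such a .togglrc option could be wired up; the [options] section and the cache_size option name are hypothetical, not existing settings:

```python
import configparser
import os
from functools import lru_cache

config = configparser.ConfigParser()
config.read(os.path.expanduser("~/.togglrc"))

# With lru_cache, maxsize=0 caches nothing, so 0 effectively disables caching.
cache_size = config.getint("options", "cache_size", fallback=128)

def toggl(url):
    """Placeholder for the real low-level request helper."""
    ...

# Apply the cache with the user-configured size.
toggl = lru_cache(maxsize=cache_size)(toggl)
```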

AuHau (Owner, Author) commented Mar 11, 2019

Hmm, is your example about CLI usage, or generally? It is definitely a valid case for the API wrappers, but not really for the CLI...
Yup, updates should clear the cache. Unfortunately, lru_cache does not allow you to invalidate specific entries.

Yeah, configuring the size of the cache is definitely a good idea, but I would still keep it separate from turning caching on/off: a general boolean flag for enabling it, and another option for the size of the cache.

sjvrijn mentioned this issue Mar 11, 2019
Exr0n commented Oct 2, 2020

Hi,

Any updates on this? I would just like to be able to cache API calls so the list loads faster.

AuHau (Owner, Author) commented Oct 3, 2020

I don't plan to tackle this in the near future. I guess @sjvrijn is not really working on it anymore either.

If you would be interested in working on a PR, let me know! It would be most welcome!

Exr0n commented Oct 3, 2020

I could take a look, although I'm not too familiar with the frameworks used in this project. What needs to happen?
