Data caching issue #93
Hi @ASDWQad, you should be able to reuse the existing cache mechanism by extending DataSourceCacheMixin. Can you post your code here?
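For context, here is a minimal sketch of the general cache-mixin pattern being suggested, assuming a pandas-based data source. This is not pybroker's actual DataSourceCacheMixin code; the class name `CacheMixin` and the `_fetch_data` hook are illustrative stand-ins:

```python
import pandas as pd

# Hypothetical sketch of a per-symbol caching mixin (NOT pybroker's
# actual DataSourceCacheMixin implementation).
class CacheMixin:
    """Serves repeated queries from an in-memory cache keyed per symbol."""

    def __init__(self):
        self._cache = {}
        self.fetch_calls = 0  # counts real fetches, for demonstration only

    def query(self, symbols, start, end, timeframe):
        if isinstance(symbols, str):
            symbols = [symbols]
        frames = []
        for sym in symbols:
            key = (sym, timeframe, start, end)
            if key not in self._cache:
                self.fetch_calls += 1
                self._cache[key] = self._fetch_data(sym, start, end, timeframe)
            frames.append(self._cache[key])
        return pd.concat(frames, ignore_index=True)


class MyDataSource(CacheMixin):
    def _fetch_data(self, symbol, start, end, timeframe):
        # Stand-in for a real API call.
        return pd.DataFrame({"symbol": [symbol], "date": [start], "close": [100.0]})


ds = MyDataSource()
df1 = ds.query(["BTCUSDT", "ETHUSDT"], "2023-01-01", "2024-01-01", "1d")
df2 = ds.query("BTCUSDT", "2023-01-01", "2024-01-01", "1d")  # served from cache
```

Because the cache is keyed per symbol, the second query triggers no new fetch.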
Hi @edtechre, I have another question about the cache: when I query 1,000 symbols in a single call, the cache is not used, but when I query the same 1,000 symbols one by one, everything loads from cache.

```python
# querying 1000 symbols at once misses the cache
d = ds.query(symbols, start, end, timeframe)

# querying 1000 symbols one by one works
for s in symbols:
    d = ds.query(s, start, end, timeframe)
```
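The per-symbol workaround above can be made self-contained and the results merged back into one frame. `StubSource` below is a hypothetical stand-in for the data source in the snippet (a real pybroker `DataSource.query` takes the same arguments); the merge step is plain `pd.concat`:

```python
import pandas as pd

# Hypothetical stand-in for the data source above, so the snippet runs.
class StubSource:
    def query(self, symbols, start, end, timeframe):
        syms = [symbols] if isinstance(symbols, str) else list(symbols)
        return pd.DataFrame({"symbol": syms, "close": [100.0] * len(syms)})

ds = StubSource()
symbols = ["BTCUSDT", "ETHUSDT", "BNBUSDT"]
start, end, timeframe = "2023-01-01", "2024-01-01", "1d"

# Query one symbol at a time (each call can be served from cache),
# then merge the per-symbol frames into a single DataFrame.
frames = [ds.query(s, start, end, timeframe) for s in symbols]
d = pd.concat(frames, ignore_index=True)
```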
Thanks @tsunamilx, I am not aware of any cache limit. I will need to test this myself and report back.

Hi @tsunamilx, looking at the code, there is no difference between calling query with all symbols or individually for each symbol. See data.py. Perhaps your cache was invalidated somehow?
I have customized a Binance data source, but the API can only retrieve 1,000 data points per call. I have cached all the data from 2023/1/1 to 2024/1/1, but when reading from the cache it does not merge the kline pages: it returns only the first 1,000 cached data points.
Can this problem be solved? Or do I have to download the data locally and write a custom data source that imports the local files?
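One way to handle a 1,000-row page limit is to paginate inside the fetch step itself, so the cache stores the full merged range rather than a single page. The sketch below is a hedged illustration: `fetch_klines` is a hypothetical stand-in for the Binance klines endpoint, and a custom data source's fetch hook could loop like `fetch_all` does:

```python
PAGE_LIMIT = 1000  # the API returns at most 1,000 klines per call

def fetch_klines(symbol, start_ms, end_ms, limit=PAGE_LIMIT):
    # Simulated endpoint: one (timestamp, close) row per millisecond,
    # capped at `limit` rows per call. Stand-in for the real API.
    return [(t, 100.0) for t in range(start_ms, min(start_ms + limit, end_ms))]

def fetch_all(symbol, start_ms, end_ms):
    """Repeatedly request pages and merge them into one list of klines."""
    out = []
    cursor = start_ms
    while cursor < end_ms:
        page = fetch_klines(symbol, cursor, end_ms)
        if not page:
            break
        out.extend(page)
        cursor = page[-1][0] + 1  # resume just after the last returned bar

    return out

bars = fetch_all("BTCUSDT", 0, 2500)
```

With this loop, `bars` covers the full requested range (three pages merged), instead of only the first 1,000 rows.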