
Storing different bar frequencies #40

Closed
nwillemse opened this issue Dec 6, 2016 · 3 comments

Comments

@nwillemse

Hi @ranaroussi

Now that we have the blotter working and storing data in MySQL, I'm looking at the data in the bars table. It seems that, by default, the blotter stores 1-minute bar data. Questions:

  1. What if I want to store 1-minute, 15-minute, and daily bars? There seems to be no way to distinguish what the bar frequency is; it's almost like you need another column to specify the bar frequency.
  2. How does the strategy get the historical bar data during backtesting? For instance, I have CSV files with daily data that I would like to load into the qtpylib database's bars table so it can be used for backtesting (a rough sketch of what I mean is below this list).
  3. If you run a strategy on daily data, does the system use the tick data to create daily bars, or does the algo read daily data from the bars table to do the backtest?
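
To illustrate (2), something like the sketch below is what I have in mind; the table and column names here are just my guess, not qtpylib's actual schema:

```python
# Rough sketch only: load daily bars from a CSV, tag them with their
# frequency, and append them to the bars table in MySQL. The "freq"
# column is hypothetical -- it's the extra column suggested in (1).
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("mysql+pymysql://user:password@localhost/qtpy")  # placeholder connection string

daily = pd.read_csv("AAPL_daily.csv", parse_dates=["datetime"])
daily["symbol"] = "AAPL"
daily["freq"] = "1D"  # e.g. "1T" (1 min), "15T" (15 min), "1D" (daily)

daily.to_sql("bars", engine, if_exists="append", index=False)
```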

Regards,
Nick

@ranaroussi
Owner

ranaroussi commented Dec 7, 2016

Short answer: All market data is constructed from IB's tick data.

Longer answer: The blotter stores IB "tick" data as the ticks come in, and aggregates that data into 1-minute bars, which are stored in the database.
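
Conceptually, the aggregation is like a pandas resample. The snippet below is just an illustration of that idea under assumed column names ("last", "lastsize"); it is not the blotter's actual code:

```python
import pandas as pd

def ticks_to_1min_bars(ticks: pd.DataFrame) -> pd.DataFrame:
    """Illustration only: build 1-minute OHLCV bars from a tick DataFrame
    indexed by timestamp, with 'last' (price) and 'lastsize' columns."""
    bars = ticks["last"].resample("1min").ohlc()              # open/high/low/close
    bars["volume"] = ticks["lastsize"].resample("1min").sum()
    return bars.dropna(subset=["open"])                       # drop minutes with no ticks
```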

When back-testing, the historical data is pulled from the database. Currently, you cannot backtest using CSV files for historical data, but I will be introducing a mechanism for automatic backfill of historical data, as suggested in issue #2, in the very near future.

All time-based resolutions/frequencies larger than 1 minute are constructed from the 1-minute bars in the bars table, while all other resolutions (second-, tick-, and volume-based) are constructed from previously stored tick data in the ticks table.
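
Purely as an illustration of that idea (again, not the actual implementation), resampling stored 1-minute bars into a larger time-based resolution looks roughly like this:

```python
import pandas as pd

OHLCV_AGG = {"open": "first", "high": "max", "low": "min",
             "close": "last", "volume": "sum"}

def resample_bars(bars_1min: pd.DataFrame, resolution: str = "15min") -> pd.DataFrame:
    # bars_1min: timestamp-indexed DataFrame with open/high/low/close/volume columns
    return bars_1min.resample(resolution).agg(OHLCV_AGG).dropna(subset=["open"])
```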

When live-trading, OHLC bars are constructed in real time from IB's tick data, based on the strategy's resolution.
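
A toy sketch of that real-time idea (not qtpylib's actual code): accumulate incoming ticks into the current bar and hand the bar to the strategy once its time window rolls over.

```python
from datetime import datetime, timedelta

class BarBuilder:
    """Toy example: build OHLCV bars incrementally from a stream of ticks."""

    def __init__(self, resolution: timedelta = timedelta(minutes=1)):
        self.resolution = resolution
        self.bar = None  # the bar currently being built

    def _window_start(self, ts: datetime) -> datetime:
        secs = int(self.resolution.total_seconds())
        return datetime.fromtimestamp((int(ts.timestamp()) // secs) * secs)

    def on_tick(self, ts: datetime, price: float, size: int):
        start = self._window_start(ts)
        completed = None
        if self.bar is not None and start > self.bar["time"]:
            completed = self.bar        # the previous bar just closed
            self.bar = None
        if self.bar is None:
            self.bar = {"time": start, "open": price, "high": price,
                        "low": price, "close": price, "volume": 0}
        self.bar["high"] = max(self.bar["high"], price)
        self.bar["low"] = min(self.bar["low"], price)
        self.bar["close"] = price
        self.bar["volume"] += size
        return completed                # None until a bar completes
```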

I hope this makes sense :)
Ran

@ranaroussi
Owner

Closing this issue for now

@lemieuxm

lemieuxm commented Mar 7, 2019

@ranaroussi, while I understand the motivation behind using the lowest common denominator (tick data) and basing all computations on it, the full breadth of price changes in each instrument is not reported to end-user tools (this is true for any broker). The range of the bars you get when subscribing to bar data will be larger than the range you will see in the ticks. For a library like this, it really is preferable to base larger-time-period computations on bars computed at the source rather than on the subset of data TWS has visibility into. See issue #111 for an example.
