Polygon API hits 500 upon handle_data call #82

Closed

LiamBui opened this issue Feb 19, 2019 · 3 comments

LiamBui commented Feb 19, 2019

Hey there,

This seems to be a problem with the Polygon API as used through pylivetrader. In the handle_data function of my algorithm, I run the following command:

iwv_close = data.history(context.iwv, "price", context.lookback, "1d")

This causes the following 500 HTTP error from Polygon:

2019-02-19T18:28:00.858854+00:00 app[worker.1]: [2019-02-19 18:28:00.858001] ERROR: Executor: 500 Server Error: Internal Server Error for url: https://api.polygon.io/v1/historic/agg/day/IWV?limit=180&apiKey={API_KEY_OMITTED}
2019-02-19T18:28:00.858867+00:00 app[worker.1]: Traceback (most recent call last):
2019-02-19T18:28:00.858870+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python3.6/site-packages/pylivetrader/executor/executor.py", line 67, in wrapper
2019-02-19T18:28:00.858872+00:00 app[worker.1]:     func(*args, **kwargs)
2019-02-19T18:28:00.858874+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python3.6/site-packages/pylivetrader/executor/executor.py", line 88, in every_bar
2019-02-19T18:28:00.858875+00:00 app[worker.1]:     handle_data(algo, current_data, dt_to_use)
2019-02-19T18:28:00.858877+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python3.6/site-packages/pylivetrader/misc/events.py", line 218, in handle_data
2019-02-19T18:28:00.858879+00:00 app[worker.1]:     dt,
2019-02-19T18:28:00.858880+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python3.6/site-packages/pylivetrader/misc/events.py", line 237, in handle_data
2019-02-19T18:28:00.858882+00:00 app[worker.1]:     self.callback(context, data)
2019-02-19T18:28:00.858887+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python3.6/site-packages/pylivetrader/algorithm.py", line 203, in handle_data
2019-02-19T18:28:00.858889+00:00 app[worker.1]:     self._handle_data(self, data)
2019-02-19T18:28:00.858890+00:00 app[worker.1]:   File "main.py", line 15, in handle_data
2019-02-19T18:28:00.858892+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python3.6/site-packages/pylivetrader/data/bardata.py", line 180, in history
2019-02-19T18:28:00.858894+00:00 app[worker.1]:     self.data_frequency,
2019-02-19T18:28:00.858895+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python3.6/site-packages/pylivetrader/data/data_portal.py", line 75, in get_history_window
2019-02-19T18:28:00.858897+00:00 app[worker.1]:     end_dt=end_dt).swaplevel(
2019-02-19T18:28:00.858899+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python3.6/site-packages/pylivetrader/data/data_portal.py", line 51, in _get_realtime_bars
2019-02-19T18:28:00.858900+00:00 app[worker.1]:     assets, frequency, bar_count=bar_count)
2019-02-19T18:28:00.858902+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python3.6/site-packages/pylivetrader/backend/alpaca.py", line 405, in get_bars
2019-02-19T18:28:00.858904+00:00 app[worker.1]:     symbols, 'day' if is_daily else 'minute', limit=bar_count)
2019-02-19T18:28:00.858905+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python3.6/site-packages/pylivetrader/backend/alpaca.py", line 493, in _symbol_bars
2019-02-19T18:28:00.858907+00:00 app[worker.1]:     return parallelize(fetch, workers=25)(symbols)
2019-02-19T18:28:00.858909+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python3.6/site-packages/pylivetrader/backend/alpaca.py", line 106, in wrapper
2019-02-19T18:28:00.858910+00:00 app[worker.1]:     task_result = task.result()
2019-02-19T18:28:00.858912+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python3.6/concurrent/futures/_base.py", line 425, in result
2019-02-19T18:28:00.858914+00:00 app[worker.1]:     return self.__get_result()
2019-02-19T18:28:00.858916+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python3.6/concurrent/futures/_base.py", line 384, in __get_result
2019-02-19T18:28:00.858918+00:00 app[worker.1]:     raise self._exception
2019-02-19T18:28:00.858919+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python3.6/concurrent/futures/thread.py", line 56, in run
2019-02-19T18:28:00.858921+00:00 app[worker.1]:     result = self.fn(*self.args, **self.kwargs)
2019-02-19T18:28:00.858922+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python3.6/site-packages/pylivetrader/backend/alpaca.py", line 76, in wrapper
2019-02-19T18:28:00.858924+00:00 app[worker.1]:     return func(*args, **kwargs)
2019-02-19T18:28:00.858926+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python3.6/site-packages/pylivetrader/backend/alpaca.py", line 476, in fetch
2019-02-19T18:28:00.858927+00:00 app[worker.1]:     size, symbol, _from, to, query_limit).df
2019-02-19T18:28:00.858929+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python3.6/site-packages/alpaca_trade_api/polygon/rest.py", line 75, in historic_agg
2019-02-19T18:28:00.858930+00:00 app[worker.1]:     raw = self.get(path, params)
2019-02-19T18:28:00.858932+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python3.6/site-packages/alpaca_trade_api/polygon/rest.py", line 33, in get
2019-02-19T18:28:00.858933+00:00 app[worker.1]:     return self._request('GET', path, params=params)
2019-02-19T18:28:00.858935+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python3.6/site-packages/alpaca_trade_api/polygon/rest.py", line 29, in _request
2019-02-19T18:28:00.858936+00:00 app[worker.1]:     resp.raise_for_status()
2019-02-19T18:28:00.858938+00:00 app[worker.1]:   File "/app/.heroku/python/lib/python3.6/site-packages/requests/models.py", line 940, in raise_for_status
2019-02-19T18:28:00.858940+00:00 app[worker.1]:     raise HTTPError(http_error_msg, response=self)
2019-02-19T18:28:00.858942+00:00 app[worker.1]: requests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: https://api.polygon.io/v1/historic/agg/day/IWV?limit=180&apiKey={API_KEY_OMITTED}
2019-02-19T18:28:00.859037+00:00 app[worker.1]: [2019-02-19 18:28:00.858903] WARNING: Executor: Continuing execution

Hitting the same Polygon API URL with a manual GET request returns the following 500 error:

Server failure during read query at consistency LOCAL_ONE (1 responses were required but only 0 replicas responded, 2 failed)

Note that, when requesting manually, this error only occurs on the first request.

This seems to be a timeout error thrown by Cassandra on Polygon's side. As such, I believe a solution may be to increase Cassandra's tombstone failure threshold, or to handle it in pylivetrader by catching this exception and retrying; a second request in a row seems to succeed, at least when requesting manually.
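
A minimal retry sketch of that second approach, assuming the underlying call raises a plain requests HTTPError as in the traceback above; the wrapper name and its placement are hypothetical and not part of pylivetrader:

import time
import requests

def retry_on_server_error(fetch, attempts=3, delay=1.0):
    """Call fetch() and retry on 5xx responses, since a second request often succeeds."""
    for attempt in range(1, attempts + 1):
        try:
            return fetch()
        except requests.exceptions.HTTPError as exc:
            status = exc.response.status_code if exc.response is not None else None
            # Re-raise client errors (4xx) and the final failed attempt unchanged.
            if status is None or status < 500 or attempt == attempts:
                raise
            time.sleep(delay * attempt)  # simple linear backoff before retrying

# Hypothetical usage around the failing call:
# iwv_close = retry_on_server_error(
#     lambda: data.history(context.iwv, "price", context.lookback, "1d"))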

LiamBui commented Feb 19, 2019

Also note that this only happens when the bar_count arg of the data.history call is too high. In my case I have set context.lookback = 90; setting it to 45 does not cause this problem.
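
A caller-side workaround sketch: clamp the daily lookback to a value the v1 endpoint has been observed to accept. SAFE_DAILY_LOOKBACK and safe_daily_history are hypothetical names, and clamping silently shortens the requested window:

SAFE_DAILY_LOOKBACK = 45  # assumed cap: 45 bars worked here while 90 triggered the 500

def safe_daily_history(data, asset, field, lookback):
    """Request at most SAFE_DAILY_LOOKBACK daily bars to stay under the observed limit."""
    return data.history(asset, field, min(lookback, SAFE_DAILY_LOOKBACK), "1d")

# e.g. inside handle_data:
# iwv_close = safe_daily_history(data, context.iwv, "price", context.lookback)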

ttt733 commented Mar 8, 2019

Polygon definitely has a limit on that call - I think it's 50, from my own tests. Anything above that returns a server error. I will pass this along to Polygon and see if they can return a more useful error, or at least document the limit more clearly. I'll also consider adding some validation to pylivetrader for requests that exceed the limit, though I'd rather avoid doing so. I'll close the issue if, after talking to them, I decide that isn't worthwhile.
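
A rough sketch of that kind of validation, failing fast with a descriptive error instead of letting Polygon return a bare 500; the 50-bar limit and the function name are assumptions taken from this thread, not pylivetrader code:

POLYGON_V1_DAILY_LIMIT = 50  # assumed limit for /v1/historic/agg/day, per the tests above

def check_bar_count(bar_count, frequency):
    """Raise a clear error for requests that would exceed the observed daily-bar limit."""
    if frequency == "1d" and bar_count > POLYGON_V1_DAILY_LIMIT:
        raise ValueError(
            "Polygon v1 daily aggregates appear to cap out at "
            f"{POLYGON_V1_DAILY_LIMIT} bars; got bar_count={bar_count}")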

ttt733 commented May 10, 2019

We're updating pylivetrader to use the v2 Polygon aggregate method, which should not run into these issues.
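
For reference, a hedged sketch of calling the Polygon v2 aggregates endpoint directly; the URL follows the documented /v2/aggs/ticker/{ticker}/range/{multiplier}/{timespan}/{from}/{to} shape, but treat the exact path, parameters, and response fields as assumptions here rather than pylivetrader's implementation:

import os
import requests

def polygon_v2_daily_aggs(symbol, start, end, api_key=None):
    """Fetch daily aggregate bars for symbol between ISO dates start and end."""
    api_key = api_key or os.environ["POLYGON_API_KEY"]  # hypothetical env var name
    url = f"https://api.polygon.io/v2/aggs/ticker/{symbol}/range/1/day/{start}/{end}"
    resp = requests.get(url, params={"apiKey": api_key})
    resp.raise_for_status()
    return resp.json().get("results", [])

# e.g. polygon_v2_daily_aggs("IWV", "2018-10-01", "2019-02-19")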

ttt733 closed this as completed May 10, 2019