ATR is calculated wrongly and is extremely slow #135
Comments
There's another issue where this is discussed in the strategy repo: freqtrade/freqtrade-strategies#30. Feel free to improve performance (but honestly, I think you'll have problems with that). Every indicator that requires a loop will automatically be a lot slower than any indicator that can be vectorized (calculated in one go across the whole timerange), and there is very little room to optimize these calls. That said, I just noticed that we have two different calculations of ATR: the second version is in indicators (which is vendored from qtpylib). At first glance, that calculation seems better aligned with Investopedia. However, please investigate in the linked issue above, as I think this calculation also does not match TradingView (although I don't see TradingView as the gold standard; maybe their implementation is wrong).
Well, I found this code on Stack Overflow (https://stackoverflow.com/questions/40256338/calculating-average-true-range-atr-on-ohlc-data-with-python) and tested it myself. It's extremely fast.
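The linked Stack Overflow answer uses Wilder's smoothing, which pandas can vectorize via an exponentially weighted mean. A minimal sketch, assuming a DataFrame with `high`, `low`, and `close` columns (the column names and the function name are illustrative, not freqtrade's actual API):

```python
import pandas as pd


def wilder_atr(df: pd.DataFrame, period: int = 14) -> pd.Series:
    """Vectorized ATR with Wilder-style smoothing (sketch)."""
    prev_close = df["close"].shift(1)
    # True Range: the largest of the three candidate ranges per bar
    tr = pd.concat(
        [
            df["high"] - df["low"],
            (df["high"] - prev_close).abs(),
            (df["low"] - prev_close).abs(),
        ],
        axis=1,
    ).max(axis=1)
    # alpha = 1/period with adjust=False reproduces Wilder's
    # recursive smoothing after the warm-up period
    return tr.ewm(alpha=1 / period, adjust=False).mean()
```

Because `ewm` runs in compiled code, this avoids the Python-level loop entirely, which is where the speedup comes from.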
It might be fast, but it's most likely not aligned with the calculation from Investopedia, which does not mention the use of Wilder's MA at all. You can also use …
I could change to values.rolling(14).mean(). I haven't tested it yet.
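Swapping the smoother is a one-line change once the True Range series exists. A minimal sketch comparing the two on a precomputed series (`tr` here is an illustrative stand-in with constant values, not a real variable from the codebase):

```python
import pandas as pd

tr = pd.Series([1.0] * 20)  # pretend True Range values

# Simple rolling mean, as the Investopedia article describes
atr_sma = tr.rolling(14).mean()

# Wilder-style smoothing, as in the Stack Overflow answer
atr_wilder = tr.ewm(alpha=1 / 14, adjust=False).mean()
```

Note the two differ in warm-up behavior: the rolling mean is NaN for the first `period - 1` bars, while the EWM produces values from the first bar onward.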
As said, technical does contain an implementation of ATR already, which is fast.
Based on this article (https://www.investopedia.com/terms/a/atr.asp), the way ATR is calculated is wrong because it only takes 'close' into account and leaves out 'high' and 'low'. Besides, when tested, this method runs extremely slowly even with little data.
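The Investopedia article defines True Range from all three of high, low, and the previous close. A sketch of that calculation with a simple moving average on top (column names and the function name are assumptions for illustration):

```python
import pandas as pd


def investopedia_atr(df: pd.DataFrame, period: int = 14) -> pd.Series:
    """ATR as described on Investopedia: SMA of the True Range (sketch)."""
    prev_close = df["close"].shift(1)
    true_range = pd.concat(
        [
            df["high"] - df["low"],           # current bar's range
            (df["high"] - prev_close).abs(),  # gap up from prior close
            (df["low"] - prev_close).abs(),   # gap down from prior close
        ],
        axis=1,
    ).max(axis=1)
    return true_range.rolling(period).mean()
```

A close-only calculation misses both components involving the bar's high and low, so it understates volatility whenever price ranges widely within a bar.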