kraken convert-trade-data --pairs option doesn't work #9811
Please update to the latest develop version, where it'll be available and working. The error does indeed look like a wrong or broken file. Would you mind sharing the actual pair you suspect as the culprit? (Including the files, as I don't intend to download the full 10G that Kraken is giving you again, and I don't think I have the "last" ones available anymore.)
The issue is that I am not getting many hints about the order in which the files are processed. From the log you can see the last file that went through was ARPA/EUR, but the files don't seem to be processed in alphabetical order. They also aren't processed in the same order that's printed at the beginning, which says "Found csv files for ..." Or is the ARPA/EUR file the problem?
Processing should happen in the sequence of that output. I've just added a new debug statement which will show you the pair that's about to fail (see linked commit). As you're on Docker, please make sure to give CI time to complete before pulling the new version.
Would this be freqtrade:latest? How long do I need to wait? A day, or do you think the image is already ready?
Ok, I think I've found the problem files. With the latest image, the error was as follows:

2024-02-13 18:02:16,697 - freqtrade.data.converter.trade_converter_kraken - INFO - USDC/EUR: 5265057 trades, from 2020-01-08 15:17:49 to 2023-12-31 23:59:58

And in the above list we have ..., USDC/EUR, BTC/AED, ... I am attaching the BTC/AED files that showed up in Kraken_Trading_History_Q3_2023 and Kraken_Trading_History_Q4_2023. These files seem to be empty.
Hm, yeah, seems like that's an edge case I didn't test so far. It should be fixed with the commit closing this issue.
Describe your environment
Note: All issues other than enhancement requests will be closed without further comment if the above template is deleted or not filled out.
Describe the problem:
I've encountered two problems:
convert-trade-data breaks down in the middle of execution, leaving many files unconverted.
Secondly, I figured this might be because some of the very illiquid coins have bad files, so I wanted to restrict convert-trade-data to a specific set of pairs:
docker compose run --rm freqtrade convert-trade-data --exchange kraken --pairs BTC/USD ETH/USD XRP/USD DOGE/USD ADA/USD SOL/USD LTC/USD DOT/USD BCH/USD LINK/USD --format-from kraken_csv --format-to feather
This also didn't change anything, as the command starts converting all the files in the folder and not just the requested pairs.
Steps to reproduce:
Observed Results:
Command 2 converts many files but then breaks in the middle and stops.
Command 3 does exactly the same thing, even though I requested only a small number of pairs.
Relevant code exceptions or logs
The following command broke down in the middle of execution:
docker compose run --rm freqtrade convert-trade-data --exchange kraken --format-from kraken_csv --format-to feather
With the following error message:
Also, for the following command, convert-trade-data seems to ignore the --pairs option I added:
docker compose run --rm freqtrade convert-trade-data --exchange kraken --pairs BTC/USD ETH/USD XRP/USD DOGE/USD ADA/USD SOL/USD LTC/USD DOT/USD BCH/USD LINK/USD --format-from kraken_csv --format-to feather
In the logs it says:
2024-02-13 13:57:32,686 - freqtrade.configuration.configuration - INFO - Using pairs ['BTC/USD', 'ETH/USD', 'XRP/USD', 'DOGE/USD', 'ADA/USD', 'SOL/USD', 'LTC/USD', 'DOT/USD', 'BCH/USD', 'LINK/USD']
However, it afterwards goes ahead and starts converting all of the csv trade files located in trades_csv.