Use Unified CSV #223

Merged
merged 27 commits into main from use_unified_csv on Mar 21, 2024
Commits (27)
79ea4fa
Amend Readme.dev.md for Windows 10 Logging
macanudo527 Jan 31, 2024
228c46d
Update Fiat Pricing to Use Free Tier
macanudo527 Jan 31, 2024
6992592
Fix Chain Bug for Pionex
macanudo527 Jan 31, 2024
93f31f4
Update Rewards for Binance.com
macanudo527 Jan 31, 2024
5a554e1
Add UTF-8 encode to Bitbank Supplemental Plugin
macanudo527 Jan 31, 2024
42ea8b0
Fix zero fees labeled as -- bug in binance.com Supplemental Plugin
macanudo527 Jan 31, 2024
1cff8f9
Delete Unused Functions from Kraken CSV plugin
macanudo527 Jan 31, 2024
1b91636
Update Kraken CSV Plugin to Download Unified OHLCVT
macanudo527 Jan 31, 2024
bd9463a
Add Stubs for Progressbar
macanudo527 Jan 31, 2024
cabcf9e
Remove Google API key Requirement
macanudo527 Jan 31, 2024
c345b9d
Add Tests for Kraken CSV Plugin
macanudo527 Jan 31, 2024
10cf5b1
Fix Untradeable Assets Bug in CCXT Pair Converter
macanudo527 Jan 31, 2024
140bce4
Fix Directory Missing Error
Jan 31, 2024
0b668f9
Remove Deleted Endpoint for Binance.com :(
Jan 31, 2024
defe02d
Merge branch 'main' into use_unified_csv
macanudo527 Feb 1, 2024
0301da1
Clean up comments
macanudo527 Feb 22, 2024
a410bc5
Create Constant for 32kb chunk size
macanudo527 Feb 22, 2024
7e4d591
Refactor asset reversal if
macanudo527 Feb 22, 2024
b4c7d64
Add Note to Docs for Manually Downloading Unified CSV
Feb 26, 2024
a70cef3
Fix logging
Feb 26, 2024
f1aebc3
Merge branch 'main' into use_unified_csv
macanudo527 Feb 26, 2024
630fc13
Ignore *.pyi files
Mar 4, 2024
4aaaadc
Apply Commenting Suggestions
macanudo527 Mar 4, 2024
38d75c8
Add Logging of Unified CSV URL
Mar 4, 2024
4ee0fe3
Disable Link Check
Mar 4, 2024
ed01fb0
Remove Unified CSV id override
Mar 18, 2024
b74b6cb
Clarify Comments
Mar 11, 2024
1 change: 1 addition & 0 deletions .pylintrc
@@ -25,4 +25,5 @@ disable=
R0913, # too-many-arguments
R0914, # too-many-local-variables
R0915, # too-many-statements
ignore-patterns=.*\.pyi$
max-line-length=160
7 changes: 6 additions & 1 deletion README.dev.md
@@ -189,11 +189,16 @@ While every commit and push are automatically tested as described, sometimes it'
* sort imports: `isort .`
* run pre-commit tests without committing: `pre-commit run --all-files`

Logs are stored in the `log` directory. To generate debug logs, prepend the command line with `LOG_LEVEL=DEBUG`, e.g.:
Logs are stored in the `log` directory. To generate debug logs on Linux or Mac, prepend the command line with `LOG_LEVEL=DEBUG`, e.g.:
```
LOG_LEVEL=DEBUG dali_us -s -o output/ config/test_config.ini
```

In Windows Powershell, debug logs can be generated with the following command:
```
$env:LOG_LEVEL='DEBUG'; dali_us -s -o output/ config/test_config.ini
```

### Unit Tests
Unit tests are in the [tests](tests) directory. Please add unit tests for any new code.
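
As a minimal, self-contained sketch of the expected shape (the `normalize_fee` helper below is hypothetical and only mirrors the zero-fee normalization added elsewhere in this PR), a pytest-style test might look like:
```
from decimal import Decimal

def normalize_fee(raw: str) -> Decimal:
    # Hypothetical helper: Binance.com CSVs report zero fees as "--".
    return Decimal("0") if raw == "--" else Decimal(raw)

def test_normalize_fee_handles_zero_fee_marker() -> None:
    assert normalize_fee("--") == Decimal("0")
    assert normalize_fee("0.00123") == Decimal("0.00123")
```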

4 changes: 2 additions & 2 deletions docs/configuration_file.md
@@ -476,7 +476,6 @@ historical_price_type = <em>&lt;historical_price_type&gt;</em>
default_exchange = <em>&lt;default_exchange&gt;</em>
fiat_access_key = <em>&lt;fiat_access_key&gt;</em>
fiat_priority = <em>&lt;fiat_priority&gt;</em>
google_api_key = <em>&lt;google_api_key&gt;</em>
untradeable_assets = <em>&lt;untradeable_assets&gt;</em>
aliases = <em>&lt;untradeable_assets&gt;</em>
</pre>
@@ -486,7 +485,6 @@ Where:
* `default_exchange` is an optional string for the name of an exchange to use if the exchange listed in a transaction is not currently supported by the CCXT plugin. If no default is set, Kraken (US) is used. If you would like an exchange added, please open an issue. The currently available exchanges are "Binance.com", "Gate", "Huobi" and "Kraken".
* `fiat_access_key` is an optional access key that can be obtained from [Exchangerate.host](https://exchangerate.host/). It is required for any fiat conversions, which are typically required if the base fiat is other than USD.
* `fiat_priority` is an optional list of strings in JSON format (e.g. `["_1stpriority_", "_2ndpriority_"...]`) that ranks the priority of fiat in the routing system. If no `fiat_priority` is given, the default priority is USD, JPY, KRW, EUR, GBP, AUD, which is based on the volume of the fiat market paired with BTC (i.e., BTC/USD has the highest worldwide volume, then BTC/JPY, etc.).
* `google_api_key` is an optional string for the Google API Key that is needed by some CSV readers, most notably the Kraken CSV reader. It is used to download the OHLCV files for a market. No data is ever sent to Google Drive. This is only used to retrieve data. To get a Google API Key, visit the [Google Console Page](https://console.developers.google.com/) and set up a new project. Be sure to enable the Google Drive API by clicking [+ ENABLE APIS AND SERVICES] and selecting the Google Drive API.
* `untradeable_assets` is a comma-separated list of assets that have no market yet. These are typically assets that are farmed or given away as part of a promotion before a market is available to price them, so CCXT cannot automatically assign a price. If you get the error "The asset XXX or XXX is missing from graph" and the asset is untradeable, adding the untradeable asset to this list will resolve it.
* `aliases` is a list of aliases separated by semicolons. Each alias has 4 properties: exchange, from asset, to asset, factor. `exchange` is the name of the exchange if the alias is specific or `UNIVERSAL` if you want it applied to all exchanges. The current exchanges recognized by the CCXT plugin are "Binance.com", "Binance US", "Bitfinex", "Coinbase Pro", "Gate", "Huobi", "Kraken", "Okex", "Pionex" and "Upbit". `from asset` and `to asset` are the ISO codes in all caps of the assets you want to make an alias for. Finally, `factor` is the price factor for the alias (e.g. "1" if it is one to one). Here are some examples:

@@ -525,6 +523,8 @@ transaction age | candle used

Accuracy will improve once new CSV data is released, which is typically 2 weeks after the end of a quarter. Also, the Kraken REST API is very slow. It may take 20-30 seconds per transaction to retrieve prices for the latest quarter.

##### Note on Unified CSV File
The unified CSV file contains candles for all of the assets traded on the Kraken exchange. It is used to retrieve prices for transactions that are older than the latest quarter. The plugin will prompt you to download the unified CSV file when it is needed. You can also manually download the file from <!-- markdown-link-check-disable -->[Kraken Exchange](https://support.kraken.com/hc/en-us/articles/360047124832-Downloadable-historical-OHLCVT-Open-High-Low-Close-Volume-Trades-data)<!-- markdown-link-check-enable --> and put it in `.dali_cache/kraken/csv/`.
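
As a rough sketch of the lookup described above (hypothetical helper and names; the plugin's internals may differ), the cache directory is checked before any download prompt:
```
from pathlib import Path
from typing import Optional

_KRAKEN_CSV_CACHE = Path(".dali_cache/kraken/csv")

def find_unified_csv(file_name: str) -> Optional[Path]:
    # Use a manually downloaded unified OHLCVT file if it is already cached.
    candidate = _KRAKEN_CSV_CACHE / file_name
    if candidate.exists():
        return candidate
    # Otherwise the plugin prompts the user to download the unified CSV.
    return None
```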

### Binance Locked CCXT
This plugin makes use of the CCXT plugin, but locks all routes to Binance.com.
55 changes: 45 additions & 10 deletions src/dali/abstract_pair_converter_plugin.py
@@ -38,7 +38,7 @@
_SUCCESS: str = "success"

# exchangerates.host urls
_EXCHANGE_BASE_URL: str = "http://api.exchangerate.host/"
_EXCHANGE_BASE_URL: str = "http://api.exchangerate.host/historical"
_EXCHANGE_SYMBOLS_URL: str = "http://api.exchangerate.host/list"

_DAYS_IN_SECONDS: int = 86400
@@ -251,11 +251,21 @@ def _is_fiat(self, asset: str) -> bool:
return asset in self.__fiat_list

def _get_fiat_exchange_rate(self, timestamp: datetime, from_asset: str, to_asset: str) -> Optional[HistoricalBar]:
key: AssetPairAndTimestamp = AssetPairAndTimestamp(timestamp, from_asset, to_asset, _FIAT_EXCHANGE)
historical_bar: Optional[HistoricalBar] = self._get_bar_from_cache(key)

if historical_bar is not None:
LOGGER.debug("Retrieved cache for %s/%s->%s for %s", timestamp, from_asset, to_asset, _FIAT_EXCHANGE)
return historical_bar

self._check_fiat_access_key()
if from_asset != "USD":
raise RP2ValueError("Fiat conversion is only available from USD at this time.")
# Currency has to be USD on free tier
if from_asset != "USD" and to_asset != "USD":
raise RP2ValueError("Fiat conversion is only available to/from USD at this time.")
currency: str = from_asset if from_asset != "USD" else to_asset
result: Optional[HistoricalBar] = None
params: Dict[str, Any] = {_ACCESS_KEY: self.__fiat_access_key, _DATE: timestamp.strftime("%Y-%m-%d"), _CURRENCIES: to_asset}

params: Dict[str, Any] = {_ACCESS_KEY: self.__fiat_access_key, _DATE: timestamp.strftime("%Y-%m-%d"), _CURRENCIES: currency}
request_count: int = 0
# exchangerate.host only gives us daily accuracy, which should be suitable for tax reporting
while request_count < 5:
@@ -280,17 +290,42 @@ def _get_fiat_exchange_rate(self, timestamp: datetime, from_asset: str, to_asset
# }
# }
data: Any = response.json()

# Exchangerate.host only returns one rate for the whole day and does not provide OHLCV, so
# all rates are the same.
if data[_SUCCESS]:
market: str = f"USD{to_asset}"
result = HistoricalBar(
market: str = f"USD{to_asset}" if to_asset != "USD" else f"USD{from_asset}"
usd_rate: RP2Decimal = RP2Decimal(str(data[_QUOTES][market]))
usd_result = HistoricalBar(
duration=timedelta(seconds=_DAYS_IN_SECONDS),
timestamp=timestamp,
open=RP2Decimal(str(data[_QUOTES][market])),
high=RP2Decimal(str(data[_QUOTES][market])),
low=RP2Decimal(str(data[_QUOTES][market])),
close=RP2Decimal(str(data[_QUOTES][market])),
open=usd_rate,
high=usd_rate,
low=usd_rate,
close=usd_rate,
volume=ZERO,
)
self._add_bar_to_cache(key, usd_result)

# Exchangerate.host only returns one rate for the whole day and does not provide OHLCV, so
# all rates are the same.
# Note: the from_asset and to_asset are purposely reversed
reverse_key: AssetPairAndTimestamp = AssetPairAndTimestamp(timestamp, to_asset, from_asset, _FIAT_EXCHANGE)
reverse_rate: RP2Decimal = RP2Decimal("1") / usd_rate
reverse_result = HistoricalBar(
duration=timedelta(seconds=_DAYS_IN_SECONDS),
timestamp=timestamp,
open=reverse_rate,
high=reverse_rate,
low=reverse_rate,
close=reverse_rate,
volume=ZERO,
)
self._add_bar_to_cache(reverse_key, reverse_result)

result = usd_result
if from_asset != "USD":
result = reverse_result
break

except (JSONDecodeError, ReadTimeout) as exc:
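
To make the reciprocal-rate handling above concrete (a standalone sketch, not the plugin's API): exchangerate.host returns a single daily quote, so the reverse pair is just its inverse and both directions are cached as flat bars.
```
from decimal import Decimal

# Assumed daily USD->JPY quote (illustrative value only).
usd_to_jpy = Decimal("150.25")
jpy_to_usd = Decimal("1") / usd_to_jpy  # cached under the reversed (JPY, USD) key

# Only one rate exists per day, so all OHLC fields of the synthetic bar are equal.
bar = {field: jpy_to_usd for field in ("open", "high", "low", "close")}
```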
2 changes: 2 additions & 0 deletions src/dali/plugin/input/csv/binance_com_supplemental.py
@@ -103,6 +103,8 @@ def parse_autoinvest_file(self, file_path: str) -> List[AbstractTransaction]:
quote_asset_symbol: str = line[self.__AUTO_QUOTE_AMOUNT_SYMBOL].split()[1]
quote_asset_amount: str = line[self.__AUTO_QUOTE_AMOUNT_SYMBOL].split()[0]
crypto_fee: str = line[self.__AUTO_TRADING_FEE_SYMBOL].split()[0]
if crypto_fee == "--": # Zero fees are reported as -- for some reason
crypto_fee = "0"
crypto_out_with_fee: RP2Decimal = RP2Decimal(quote_asset_amount) + RP2Decimal(crypto_fee)
result.append(
OutTransaction(
4 changes: 2 additions & 2 deletions src/dali/plugin/input/csv/bitbank_supplemental.py
@@ -85,7 +85,7 @@ def parse_deposits_file(self, file_path: str) -> List[AbstractTransaction]:
with open(file_path, encoding="utf-8") as csv_file:
lines = reader(csv_file)

header = next(lines)
header = str(next(lines)).encode("utf-8")
self.__logger.debug("Header: %s", header)
for line in lines:
if line[self.__DEPOSIT_STATUS] == "DONE":
@@ -126,7 +126,7 @@ def parse_withdrawals_file(self, file_path: str) -> List[AbstractTransaction]:
with open(file_path, encoding="utf-8") as csv_file:
lines = reader(csv_file)

header = next(lines)
header = str(next(lines)).encode("utf-8")
self.__logger.debug("Header: %s", header)
for line in lines:
if line[self.__STATUS] == "DONE":
3 changes: 2 additions & 1 deletion src/dali/plugin/input/csv/pionex.py
@@ -156,9 +156,10 @@ def parse_transfers_file(self, file_path: str) -> List[AbstractTransaction]:

asset: str = (
line[self.__ASSET_TRANSFERED][: -len(line[self.__CHAIN_USED])]
if (line[self.__ASSET_TRANSFERED].endswith(line[self.__CHAIN_USED]))
if (line[self.__CHAIN_USED] != "" and line[self.__ASSET_TRANSFERED].endswith(line[self.__CHAIN_USED]))
else (line[self.__ASSET_TRANSFERED])
)
self.__logger.debug("Asset: %s", asset)

if line[self.__TRANSACTION_TYPE] == self.__DEPOSIT:
result.append(
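
The extra `!= ""` guard above matters because of a Python slicing subtlety; here is a standalone illustration (hypothetical values, not the plugin's code) of why an empty chain column previously wiped out the asset name:
```
asset = "USDTTRC20"
chain = ""

# Every string ends with "", and asset[:-len("")] is asset[:0], i.e. the empty string.
without_guard = asset[: -len(chain)] if asset.endswith(chain) else asset
assert without_guard == ""  # asset name lost

# The added guard skips the slice when no chain is given.
with_guard = asset[: -len(chain)] if chain != "" and asset.endswith(chain) else asset
assert with_guard == "USDTTRC20"
```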
58 changes: 2 additions & 56 deletions src/dali/plugin/input/rest/binance_com.py
@@ -133,6 +133,7 @@

# Types of Binance Dividends
_BNB_VAULT = "BNB Vault"
_EARN_REWARDS = "Earn Rewards"
_ETH_STAKING = "ETH 2.0 Staking"
_FLEXIBLE = "Flexible"
_FLEXIBLE_SAVINGS = "Flexible Savings"
@@ -148,7 +149,7 @@
_SAVINGS_TRAIL_FUND = "Savings Trail Fund"

_AIRDROP_LIST = [_SOLO_AIRDROP]
_INTEREST_LIST = [_FLEXIBLE, _FLEXIBLE_SAVINGS, _LOCKED, _LOCKED_SAVINGS, _SAVINGS_TRAIL_FUND]
_INTEREST_LIST = [_EARN_REWARDS, _FLEXIBLE, _FLEXIBLE_SAVINGS, _LOCKED, _LOCKED_SAVINGS, _SAVINGS_TRAIL_FUND]
_STAKING_LIST = [_ETH_STAKING, _LOCKED_STAKING, _BNB_VAULT, _LAUNCH_POOL, _GENERAL_STAKING, _LAUNCHPAD]
_INCOME_LIST = [_CASH_VOUCHER]

@@ -484,61 +485,6 @@ def _process_gains(
current_start = now_time - 1 # int(locked_redemptions[0][_TIME]) + 1
current_end = now_time # current_start + _THIRTY_DAYS_IN_MS

# Old system Flexible Savings

# Reset window
current_start = self._start_time_ms
current_end = current_start + _THIRTY_DAYS_IN_MS

# We will step backward in time from the switch over
while current_start < earliest_record_epoch:
self._logger.debug("Pulling flexible saving from older api system from %s to %s", current_start, current_end)

flexible_saving = self._client.sapi_get_lending_union_interesthistory(
params=({_START_TIME: current_start, _END_TIME: current_end, _LENDING_TYPE: _DAILY, _SIZE: _INTEREST_SIZE_LIMIT})
)
# [
# {
# "asset": "BUSD",
# "interest": "0.00006408",
# "lendingType": "DAILY",
# "productName": "BUSD",
# "time": 1577233578000
# },
# {
# "asset": "USDT",
# "interest": "0.00687654",
# "lendingType": "DAILY",
# "productName": "USDT",
# "time": 1577233562000
# }
# ]
processing_result_list = []
for saving in flexible_saving:
self._logger.debug("Flexible Saving: %s", json.dumps(saving))
saving[_EN_INFO] = "Flexible Savings (OLD)"
saving[_ID] = Keyword.UNKNOWN.value
saving[_DIV_TIME] = saving[_TIME]
saving[_AMOUNT] = saving[_INTEREST_FIELD]
processing_result_list.append(self._process_gain(saving, Keyword.INTEREST))
old_savings = True

for processing_result in processing_result_list:
if processing_result is None:
continue
if processing_result.in_transactions:
in_transactions.extend(processing_result.in_transactions)

# if we returned the limit, we need to roll the window forward to the last time
if len(flexible_saving) < _INTEREST_SIZE_LIMIT:
current_start = current_end + 1
current_end = current_start + _THIRTY_DAYS_IN_MS
else:
current_start = int(flexible_saving[0][_TIME]) + 1
current_end = current_start + _THIRTY_DAYS_IN_MS

current_end = min(current_end, earliest_record_epoch)

if old_savings:
# Since we are making a guess at the cut off, there might be errors.
self._logger.warning(