Description | Live Demo | Installation | How To | Documentation | Examples | Change Log | Wiki | Social | Notifications | Bugs | Contributing | Disclaimer | Commercial Support
A Python SDK by LUCIT for accessing and managing multiple local Binance order books with Python in a simple, fast, flexible, robust and fully functional way.
The organization of the DepthCache takes place in the same asyncio loop as the reception of the websocket data. The full stack of the UBS modules (REST, WebSocket and DepthCache) can be downloaded and installed from PyPI and Anaconda as a Python C extension for maximum performance.
Part of 'UNICORN Binance Suite'.
Get help with the integration of the UNICORN Binance Suite modules!
To run modules of the UNICORN Binance Suite you need a valid license!
Create a local depth_cache for Binance with just 3 lines of code
from unicorn_binance_local_depth_cache import BinanceLocalDepthCacheManager, DepthCacheOutOfSync
ubldc = BinanceLocalDepthCacheManager(exchange="binance.com", depth_cache_update_interval=100)
ubldc.create_depth_cache("BTCUSDT")
asks = ubldc.get_asks("BTCUSDT")
bids = ubldc.get_bids("BTCUSDT")
asks = ubldc.get_asks("BTCUSDT", limit_count=10)
bids = ubldc.get_bids("BTCUSDT", limit_count=10)
asks = ubldc.get_asks("BTCUSDT", threshold_volume=300000)
bids = ubldc.get_bids("BTCUSDT", threshold_volume=300000)
Catch an exception, if the depth_cache is out of sync while accessing its data
try:
    asks = ubldc.get_asks(market="BTCUSDT", limit_count=5, threshold_volume=300000)
    bids = ubldc.get_bids(market="BTCUSDT", limit_count=5, threshold_volume=300000)
except DepthCacheOutOfSync:
    asks = "Out of sync!"
    bids = "Out of sync!"
ubldc.stop_depth_cache("BTCUSDT")
When you instantiate UBLDC with `with`, `ubldc.stop_manager()` is automatically executed upon exiting the `with`-block.
with BinanceLocalDepthCacheManager(exchange="binance.com") as ubldc:
ubldc.create_depth_cache("BTCUSDT")
Without `with`, you must explicitly execute `ubldc.stop_manager()` yourself.
ubldc.stop_manager()
The Python package UNICORN Binance Local Depth Cache provides local order books for the Binance Exchanges Binance (+Testnet), Binance Futures (+Testnet) and Binance US.
The algorithm of the depth_cache management was designed according to these instructions:
- Binance Spot: "How to manage a local order book correctly"
- Binance Futures: "How to manage a local order book correctly"
- Binance US: "Managing a Local Order Book"
With `create_depth_cache()` the depth_cache is started and initialized, i.e. for each depth_cache that is to be created, a separate asyncio coroutine is inserted into the event loop of the stream. As soon as at least one depth update is received via websocket, a REST snapshot is downloaded and the depth updates are applied to it so that it is synchronized in real time. Once this is done, the status of the cache is set to "synchronous".
Data in the depth_cache can be accessed with 'get_asks()' and 'get_bids()'. If the state of the depth_cache is not synchronous during access, the exception 'DepthCacheOutOfSync' is thrown.
The depth_cache will immediately start an automatic re-initialization if a gap in the update IDs is detected (missing update event) or if the websocket connection is interrupted. As soon as this happens, the state of the depth_cache is set to "out of sync" and the exception 'DepthCacheOutOfSync' is thrown when the cache is accessed.
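The update logic described above can be sketched in plain Python. This is a simplified illustration only, not the module's actual implementation; the payload fields `U` (first update ID) and `u` (last update ID) follow the format of Binance's diff-depth stream:

```python
# Simplified sketch of the snapshot + diff-update sync algorithm described
# above -- NOT the module's actual implementation.

class OutOfSync(Exception):
    """Raised when a gap in the update IDs is detected."""

class MiniDepthCache:
    def __init__(self, snapshot):
        # REST snapshot: {"lastUpdateId": int, "bids": [...], "asks": [...]}
        self.last_update_id = snapshot["lastUpdateId"]
        self.bids = {price: qty for price, qty in snapshot["bids"]}
        self.asks = {price: qty for price, qty in snapshot["asks"]}
        self.synchronized = True

    def apply_update(self, update):
        if update["u"] <= self.last_update_id:
            return  # update is already contained in the snapshot
        if update["U"] > self.last_update_id + 1:
            # Missing update event -> cache must be re-initialized
            self.synchronized = False
            raise OutOfSync(f"gap before update {update['U']}")
        for book, side in ((self.bids, "b"), (self.asks, "a")):
            for price, qty in update.get(side, []):
                if float(qty) == 0.0:
                    book.pop(price, None)  # quantity 0 removes the level
                else:
                    book[price] = qty
        self.last_update_id = update["u"]
```

The real module does this inside an asyncio coroutine per depth_cache and triggers the automatic re-initialization instead of leaving the exception to the caller.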
A local depth_cache is the fastest way to access the current order book depth at any time while transferring as little data as necessary. A REST snapshot takes a lot of time and the amount of data that is transferred is relatively large. Continuous full transmission of the order book via websocket is faster, but the amount of data is huge. A local depth_cache is initialized once with a REST snapshot and then kept up to date with the diff depth updates received over the websocket connection. By transferring a small amount of data (only the changes), a local depth_cache is kept in sync in real time and also allows extremely fast (local) access to the data without exceeding the Binance request weight limits.
- Always know if the cache is in sync! If the depth_cache is out of sync, the exception `DepthCacheOutOfSync` is thrown, or ask with `is_depth_cache_synchronized()`.
- If a depth_cache is out of sync it gets refreshed automatically within a few seconds.
- 100% websocket auto-reconnect!
- Supported exchanges:

Exchange | Exchange string |
---|---|
Binance | binance.com |
Binance Testnet | binance.com-testnet |
Binance USD-M Futures | binance.com-futures |
Binance USD-M Futures Testnet | binance.com-futures-testnet |
Binance US | binance.us |

- Create multiple depth caches within a single object instance.
- Each depth_cache is managed in an asyncio coroutine.
- Start or stop multiple caches with just one command: `create_depth_cache()` or `stop_depth_cache()`.
- Control websocket out-of-sync detection with `websocket_ping_interval`, `websocket_ping_timeout` and `websocket_close_timeout`.
- Powered by UNICORN Binance REST API and UNICORN Binance WebSocket API.
- Available as a package via `pip` and `conda` as a precompiled C extension with stub files for improved IntelliSense functions and with source code for easier debugging. To the installation.
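Instead of only catching `DepthCacheOutOfSync` on access, the synchronization state can also be polled before reading. A small generic helper might look like this (an illustration only; `is_depth_cache_synchronized()` is the method named above, wrapped here in a plain callable so the sketch stays self-contained and library-independent):

```python
import time

def wait_until_synchronized(is_synced, timeout=10.0, interval=0.1):
    """Poll a zero-argument callable until it returns True or `timeout`
    seconds elapse. `is_synced` could wrap the cache state, e.g.
    lambda: ubldc.is_depth_cache_synchronized("BTCUSDT")."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if is_synced():
            return True
        time.sleep(interval)
    return False
```

With such a helper, `get_asks()`/`get_bids()` can be called only once the cache reports itself synchronized, instead of relying solely on catching the exception.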
If you like the project, please star it on GitHub!
This live demo script maintains DepthCaches from binance.com-futures and runs on a CCX13 virtual machine of HETZNER CLOUD.
(Refreshes once a minute!)
The module requires Python 3.7 and runs smoothly up to and including Python 3.12.
For the PyPy interpreter we offer packages only from Python version 3.9 and higher.
Anaconda packages are available from Python version 3.8 and higher, but only in the latest version!
The current dependencies are listed here.
If you run into errors during the installation take a look here.
When a new release is to be created, we start two GitHub Actions workflows:
Both start virtual Windows/Linux/Mac servers provided by GitHub in the cloud with preconfigured environments and create the respective compilations and stub files, pack them into wheels and conda packages and then publish them on GitHub, PYPI and Anaconda. This is a transparent method that makes it possible to trace the source code behind a compilation.
A Cython binary, PyPy or source code based CPython wheel of the latest version with `pip` from PyPI
Our Cython and PyPy wheels are available on PyPI; these wheels offer significant advantages for Python developers:
- Performance Boost with Cython Wheels: Cython is a programming language that supplements Python with static typing and C-level performance. By compiling Python code into C, Cython wheels can significantly enhance the execution speed of Python code, especially in computationally intensive tasks. This means faster runtimes and more efficient processing for users of our package.
- PyPy Wheels for Enhanced Efficiency: PyPy is an alternative Python interpreter known for its speed and efficiency. It uses Just-In-Time (JIT) compilation, which can dramatically improve the performance of Python code. Our PyPy wheels are tailored for compatibility with PyPy, allowing users to leverage this speed advantage seamlessly.
Both Cython and PyPy Wheels on PyPI make the installation process simpler and more straightforward. They ensure that you get the optimized version of our package with minimal setup, allowing you to focus on development rather than configuration.
pip install unicorn-binance-local-depth-cache
pip install unicorn-binance-local-depth-cache --upgrade
A Conda package of the latest version with `conda` from Anaconda
The `unicorn-binance-local-depth-cache` package is also available as a Cython version for the `linux-64`, `osx-64` and `win-64` architectures with Conda through the `lucit` channel.
For optimal compatibility and performance, it is recommended to source the necessary dependencies from the `conda-forge` channel.
conda config --add channels conda-forge
conda config --add channels lucit
conda install -c lucit unicorn-binance-local-depth-cache
conda update -c lucit unicorn-binance-local-depth-cache
From source of the latest release with PIP from GitHub
Run in bash:
pip install https://github.com/LUCIT-Systems-and-Development/unicorn-binance-local-depth-cache/archive/$(curl -s https://api.github.com/repos/LUCIT-Systems-and-Development/unicorn-binance-local-depth-cache/releases/latest | grep -oP '"tag_name": "\K(.*)(?=")').tar.gz --upgrade
Use the below command with the version (such as 2.1.1) you determined here:
pip install https://github.com/LUCIT-Systems-and-Development/unicorn-binance-local-depth-cache/archive/2.1.1.tar.gz --upgrade
From the latest source (dev-stage) with PIP from GitHub
This is not a release version and cannot be considered stable!
pip install https://github.com/LUCIT-Systems-and-Development/unicorn-binance-local-depth-cache/tarball/master --upgrade
https://lucit-systems-and-development.github.io/unicorn-binance-local-depth-cache/changelog.html
https://www.lucit.tech/unicorn-binance-local-depth-cache.html
https://github.com/LUCIT-Systems-and-Development/unicorn-binance-local-depth-cache/wiki
- Discussions
- Gitter
- https://t.me/unicorndevs
- https://dev.binance.vision
- https://community.binance.org
To receive notifications on available updates you can watch the repository on GitHub, or write your own script using `is_update_available()`.
Follow us on LinkedIn, X or Facebook!
To receive news (like inspection windows/maintenance) about the Binance APIs, subscribe to their Telegram groups:
- https://t.me/binance_api_announcements
- https://t.me/binance_api_english
- https://t.me/Binance_JEX_EN
- https://t.me/Binance_USA
- https://t.me/TRBinanceTR
- https://t.me/BinanceExchange
List of planned features - click if you need one of them or suggest a new feature!
Before you report a bug, try the latest release. If the issue still exists, provide the error trace, OS and Python version and explain how to reproduce the error. A demo script is appreciated.
If you don't find an issue related to your topic, please open a new issue!
UNICORN Binance Local Depth Cache is an open source project which welcomes contributions which can be anything from simple documentation fixes and reporting dead links to new features. To contribute follow this guide.
This project is for informational purposes only. You should not construe this information or any other material as legal, tax, investment, financial or other advice. Nothing contained herein constitutes a solicitation, recommendation, endorsement or offer by us or any third party provider to buy or sell any securities or other financial instruments in this or any other jurisdiction in which such solicitation or offer would be unlawful under the securities laws of such jurisdiction.
Under no circumstances will we be responsible or liable for any claims, damages, losses, expenses, costs or liabilities of any kind, including but not limited to direct or indirect damages for loss of profits.
Do you need a developer, operator or consultant? Contact us for a non-binding initial consultation!