Order Flow Imbalance Data Pipeline

A real-time streaming data pipeline built using Kafka and Docker. Consumes Bitcoin data from Deribit's API v2.1.1 and transforms limit order book market data into the net order flow imbalance and the mid-price.
Explore the docs »

View Demo · Report Bug · Request Feature

Table of Contents
  1. About The Project
  2. Getting Started
  3. Usage
  4. Roadmap
  5. Contributing
  6. License
  7. Contact
  8. Acknowledgments

About The Project


Initially this project was intended as a starting point for building an algorithmic trading system. I decided to explore HFT (High-Frequency Trading) and wanted to use market microstructure variables to inform my trading strategy, specifically order flow imbalance on perpetual cryptocurrency instruments.

However, I instead took the chance to turn this into a fun project to practise and learn Docker and Kafka. What ultimately came of it was a simple real-time streaming data pipeline.

Please check out my website for more info!

(back to top)

Built With

(back to top)

Installation

This app was built using Docker. This solution assumes you have Docker installed on your machine.

  1. Make sure you have API keys from Deribit and store them as CLIENT_ID_DERIBIT and CLIENT_SECRET_DERIBIT environment variables on your machine.

  2. Clone the repo

    git clone git@github.com:kostyafarber/crypto-lob-data-pipeline.git

  3. Run the Kafka and ZooKeeper containers

    cd src/kafka
    docker-compose -f docker-compose.yml up -d

  4. Run the producer container

    cd src/producer
    docker-compose -f docker-compose.yml up

  5. Finally, run the consumer container

    cd src/consumer
    docker-compose -f docker-compose.yml up
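The credentials from step 1 are what the producer uses to authenticate. Deribit API v2 authenticates via the JSON-RPC `public/auth` method with the `client_credentials` grant; the helper below just builds that request from the two environment variables (the function name is mine, not the repo's):

```python
import os


def build_auth_request() -> dict:
    """JSON-RPC request to authenticate against Deribit API v2 using the
    CLIENT_ID_DERIBIT / CLIENT_SECRET_DERIBIT environment variables."""
    return {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "public/auth",
        "params": {
            "grant_type": "client_credentials",
            "client_id": os.environ["CLIENT_ID_DERIBIT"],
            "client_secret": os.environ["CLIENT_SECRET_DERIBIT"],
        },
    }
```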

(back to top)
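To make the producer's role concrete, here is a minimal sketch of that service: subscribe to Deribit's order book channel over websocket and forward each update to Kafka. The topic name "lob", the broker address, and the 100ms channel interval are my assumptions, not the repo's actual configuration.

```python
# Sketch of a Deribit -> Kafka producer. Assumes the `websockets` and
# `kafka-python` packages; topic and broker names are illustrative.
import json
import os


def encode_tick(message: dict) -> bytes:
    """Serialise one Deribit subscription notification into a Kafka payload."""
    return json.dumps(message["params"]["data"]).encode("utf-8")


def main() -> None:
    import asyncio
    import websockets                  # pip install websockets
    from kafka import KafkaProducer    # pip install kafka-python

    producer = KafkaProducer(
        bootstrap_servers=os.getenv("KAFKA_BROKER", "kafka:9092")
    )

    async def stream() -> None:
        async with websockets.connect("wss://www.deribit.com/ws/api/v2") as ws:
            # Subscribe to order book updates for the BTC perpetual.
            await ws.send(json.dumps({
                "jsonrpc": "2.0",
                "method": "public/subscribe",
                "params": {"channels": ["book.BTC-PERPETUAL.100ms"]},
            }))
            async for raw in ws:
                msg = json.loads(raw)
                if msg.get("method") == "subscription":
                    producer.send("lob", encode_tick(msg))

    asyncio.run(stream())

# To run: main() -- requires network access and a reachable Kafka broker.
```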

What you should see is output that looks something like this:

demo gif

On the left is the raw JSON being published to the Kafka broker; on the right the JSON has been transformed, with the order flow imbalance and mid-price printed to the console.
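The transform itself is small enough to sketch. Below is one common definition of order flow imbalance over consecutive best bid/ask updates (following Cont, Kukanov & Stoikov, 2014), alongside the mid-price; the field names are illustrative, not necessarily those the repo uses:

```python
# Consumer-side transform sketch: mid-price and per-update order flow
# imbalance from two consecutive L1 (top-of-book) snapshots.

def mid_price(bid: float, ask: float) -> float:
    """Mid-price of the top of the book."""
    return (bid + ask) / 2.0


def order_flow_imbalance(prev: dict, curr: dict) -> float:
    """Order flow imbalance contribution between two consecutive L1 updates."""
    e = 0.0
    # Bid side: new size counts as inflow when the bid holds or improves;
    # the old size counts as outflow when the bid holds or worsens.
    if curr["bid_price"] >= prev["bid_price"]:
        e += curr["bid_size"]
    if curr["bid_price"] <= prev["bid_price"]:
        e -= prev["bid_size"]
    # Ask side: mirror image with opposite signs.
    if curr["ask_price"] <= prev["ask_price"]:
        e -= curr["ask_size"]
    if curr["ask_price"] >= prev["ask_price"]:
        e += prev["ask_size"]
    return e
```

Summing these per-update contributions over a window gives the net order flow imbalance the pipeline prints.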

Architecture

The pipeline was built with microservices principles in mind. Kafka, the producer, and the consumer each run in their own separate Docker container and have no knowledge of each other apart from being on the same bridge network I defined in the docker-compose.yml files.
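The bridge-network wiring described above might look like this across the compose files (network, service, and image names here are illustrative, not necessarily those in the repo):

```yaml
# src/kafka/docker-compose.yml -- creates the shared bridge network.
networks:
  pipeline-net:
    driver: bridge

services:
  zookeeper:
    image: confluentinc/cp-zookeeper
    networks: [pipeline-net]
  kafka:
    image: confluentinc/cp-kafka
    networks: [pipeline-net]

# src/producer/docker-compose.yml and src/consumer/docker-compose.yml
# would then join the same network instead of creating their own:
#
# networks:
#   pipeline-net:
#     external: true
```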

architecture diagram

(back to top)

Roadmap

  • [ ] Add another script to turn the consumer portion into a producer.

  • [ ] Add the Docker YouTube video to the Acknowledgments.

See the open issues for a full list of proposed features (and known issues).

(back to top)

Contributing

Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.

If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement". Don't forget to give the project a star! Thanks again!

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your Changes (git commit -m 'Add some AmazingFeature')
  4. Push to the Branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

(back to top)

License

Distributed under the MIT License. See LICENSE.txt for more information.

(back to top)

Contact

Kostya Farber - kostya.farber@gmail.com

Project Link: https://kostyafarber.github.io/projects/crpyto-perpetual-futures-kafka-streaming

(back to top)

Acknowledgments


(back to top)
