crypto_futures/prompts/MAIN
2026-02-26 17:25:42 +05:00

Main points:
We're building a Python app for collecting real-time orderbook and trade data from several cryptocurrency exchanges for archival, processing, prediction and trade execution - basically an algorithmic trading bot. We'll be using the following software stack:
1. TimescaleDB as archive database (full credentials for connection -> postgres://postgres:DMl0h3qMnkiWWKRJSIgR@localhost:5432/trading);
2. Redis as RAM cache and message broker (credentials for connection -> localhost:6379);
3. Celery for general multiprocessing, scheduling and background tasks.
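As a minimal sketch of how the stack above could be wired together (the app name, task queue layout and table schema are assumptions, not decisions from this spec):

```python
# Sketch: wiring TimescaleDB, Redis and Celery together.
# Queue, app and table names below are illustrative assumptions.
from celery import Celery

# Connection strings taken from the stack description above.
TIMESCALE_DSN = "postgres://postgres:DMl0h3qMnkiWWKRJSIgR@localhost:5432/trading"
REDIS_URL = "redis://localhost:6379/0"

# Redis doubles as Celery broker and result backend.
app = Celery("trading", broker=REDIS_URL, backend=REDIS_URL)

# Hypothetical hypertable DDL for archived trades; run once against
# TimescaleDB so time-partitioned inserts and range queries stay fast.
TRADES_DDL = """
CREATE TABLE IF NOT EXISTS trades (
    ts       TIMESTAMPTZ NOT NULL,
    exchange TEXT NOT NULL,
    symbol   TEXT NOT NULL,
    price    DOUBLE PRECISION NOT NULL,
    qty      DOUBLE PRECISION NOT NULL,
    side     TEXT NOT NULL
);
SELECT create_hypertable('trades', 'ts', if_not_exists => TRUE);
"""
```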
Additional context:
The app will be connecting to 6 cryptoexchanges (the number may change later): Binance, Bybit, OKX, Bitget, KuCoin and MEXC. We'll be working with both spot and futures markets.
Our interest is in data of the finest granularity, meaning WebSocket channels with orderbook depth and individual trades updating in real time; OHLCV data is out of the picture. Expect at least 2 client connections for every trading pair on every cryptoexchange, which makes >1000 active connections delivering >10000 pieces of valuable data every second. The sheer amount of data calls for an event-driven architecture pattern for our app.
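The event-driven core could be sketched as a single in-process bus: many WebSocket readers act as producers pushing raw messages onto one queue, and a dispatch loop fans them out to registered handlers. The class and event-field names here are illustrative assumptions:

```python
# Sketch of an event-driven fan-in/fan-out core using only asyncio.
# Event shape ({"kind": ..., ...}) is an assumed convention.
import asyncio
from typing import Awaitable, Callable

Handler = Callable[[dict], Awaitable[None]]

class EventBus:
    def __init__(self) -> None:
        # Bounded queue so a slow consumer backpressures producers
        # instead of exhausting RAM under >10000 events/second.
        self.queue: asyncio.Queue[dict] = asyncio.Queue(maxsize=100_000)
        self.handlers: dict[str, list[Handler]] = {}

    def subscribe(self, kind: str, handler: Handler) -> None:
        self.handlers.setdefault(kind, []).append(handler)

    async def publish(self, event: dict) -> None:
        await self.queue.put(event)

    async def run(self) -> None:
        # Single dispatch loop; handlers must be fast, or offload
        # heavy work (archiving, analytics) to Celery tasks.
        while True:
            event = await self.queue.get()
            for handler in self.handlers.get(event["kind"], []):
                await handler(event)
            self.queue.task_done()
```

Keeping dispatch in one loop makes ordering per event kind deterministic; parallelism comes from running several bus instances in separate processes (e.g. one per exchange).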
Usage of specific cryptoexchange libraries is discouraged for two reasons: 1. every cryptoexchange differs from the others, and this applies to their APIs as well; and 2. our code blocks for handling WebSocket connections (e.g. connecting, receiving packets, sending subscription and heartbeat packets, reconnecting on network errors) should be uniform, reusing the same logic as much as possible.
We have our own workstation on site, so we can deploy the app locally. The hardware infrastructure is as follows:
- Lenovo RD450X dual-CPU motherboard (10 Gigabit NIC version);
- 2x Intel Xeon 2686v4;
- 8x 64GB DDR4 ECC REG Samsung 2400 MHz 4DRx4 LRDIMM [M386A8K40BM1-CRC];
- 2x Samsung PM1735 1.6 TB NVMe SSD (in ZFS RAID).
We'll be utilizing our own spin on momentum trading strategies, analyzing data to predict the direction and power of price movements based on the insights we can find in the orderbook and trades, such as the volume of aggressive and passive trades; the amount of large trades, iceberg trades, repeating trades and impulses; correlation with BTC and between the same coins on different exchanges, and so on. Having the latest orderbook data, we'll dynamically predict the amount of slippage, look for the closest order concentrations for our stop loss, and estimate the movement power for our target. And when all three of those factors are optimal, we'll execute the trade.
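The slippage prediction from the latest orderbook can be sketched as walking the ask side until the desired quantity is filled and comparing the volume-weighted fill price to the best ask. The book layout (a list of (price, qty) levels, best price first) is an assumed convention:

```python
# Sketch: expected slippage for a market buy against the current asks.
# Book format [(price, qty), ...] sorted best-first is an assumption.
def estimate_slippage(asks: list[tuple[float, float]], qty: float) -> float:
    """Return expected slippage as a fraction of the best ask price."""
    if not asks or qty <= 0:
        raise ValueError("empty book or non-positive quantity")
    best = asks[0][0]
    remaining, cost = qty, 0.0
    for price, size in asks:
        take = min(remaining, size)   # consume this level
        cost += take * price
        remaining -= take
        if remaining <= 0:
            break
    if remaining > 0:
        raise ValueError("orderbook too thin for requested quantity")
    vwap = cost / qty                 # average fill price
    return (vwap - best) / best
```

A mirror-image walk over the bids gives the sell-side estimate; the same level walk also yields the order concentrations mentioned above for stop-loss placement.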
The same strategy should be applicable both to real-time and archived data for backtesting. Every step of the code should be logged for ease of debugging.
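One way to get that live/backtest symmetry is to have the strategy consume an async iterator of events, so the only difference between live trading and backtesting is which iterator feeds it: a WebSocket feed or a replay of archived rows. The names, event shape and toy signal rule below are assumptions for illustration:

```python
# Sketch: the same strategy loop over live or archived data.
# Event fields and the toy "large trade" rule are illustrative.
import asyncio
import logging
from typing import AsyncIterator, Iterable

log = logging.getLogger("strategy")

async def replay_feed(rows: Iterable[dict]) -> AsyncIterator[dict]:
    # Backtest source: wrap stored rows (e.g. fetched from TimescaleDB)
    # in the same async interface a live WebSocket feed would expose.
    for row in rows:
        yield row

async def run_strategy(feed: AsyncIterator[dict]) -> list[str]:
    signals = []
    async for event in feed:
        log.debug("event: %r", event)  # log every step for debugging
        if event["kind"] == "trade" and event["qty"] >= 10.0:
            signals.append(f"large-trade:{event['symbol']}")
    return signals
```

Because `run_strategy` never knows which feed it is reading, a backtest exercises exactly the code path that runs in production.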
Somewhere down the road, when the first block (data collection) and the second (data processing and prediction) are ready, we'll work on the frontend and API.