Crypto Currency Forecasting App for ML System Design Course on ODS.ai
Also read the article "Yet another architecture of ML crypto trading system." on Medium
More visualizations and metrics are available in the presentation
The App consists of 6 main parts.
Different parts of the App can be installed and run independently.
In this part RAW DATA is mined from LOB, TRADE and NEWS.
Here FEATURES are extracted from RAW DATA, and a DATASET is created from FEATURES and/or PREDICTIONS.
Here MODELS are trained/tuned using DATASETS and then stored in the MODELS REGISTRY.
MODELS loaded from the MODELS REGISTRY make PREDICTIONS using DATASETS.
METRICS are collected from the pipeline.
Here the AGENT trades using PREDICTIONS.
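The flow between the six parts above can be sketched in plain Python; all function and model names here are illustrative placeholders, not the app's actual API (in the real system the stages communicate via Kafka and InfluxDB):

```python
# Illustrative sketch of the dataflow between the six parts.
# Names are hypothetical; the real app wires these stages through Kafka topics.

def get_data(sources):
    """Mine RAW DATA from LOB, TRADE and NEWS sources."""
    return {src: f"raw-{src}" for src in sources}

def extract_features(raw_data):
    """Extract FEATURES from RAW DATA."""
    return [f"feature({v})" for v in raw_data.values()]

def make_dataset(features, predictions=None):
    """Create a DATASET from FEATURES and/or earlier PREDICTIONS."""
    return features + (predictions or [])

def predict(model, dataset):
    """A MODEL loaded from the registry makes PREDICTIONS from a DATASET."""
    return [f"{model}:{x}" for x in dataset]

def trade(predictions):
    """The AGENT trades using PREDICTIONS."""
    return {"orders": len(predictions)}

raw = get_data(["lob", "trade", "news"])
dataset = make_dataset(extract_features(raw))
orders = trade(predict("tft", dataset))
```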
cd docker
sudo openssl req -x509 -nodes -newkey rsa:2048 -keyout influxdb-selfsigned.key -out influxdb-selfsigned.crt -days 365
cp .env.secret.db.example .env.secret.db
docker compose -f docker-compose.kafka.yaml up -d
docker compose -f docker-compose.db.yaml up -d
docker compose -f docker-compose.get_data.yaml build
docker compose -f docker-compose.get_data.yaml up -d
docker compose -f docker-compose.extract_features.yaml up -d
docker compose -f docker-compose.collect_metrics.yaml up -d
cp .env.secret.mlflow.example .env.secret.mlflow
htpasswd -c .htpasswd ccf
docker compose -f docker-compose.mlflow.yaml up -d
Set sensitive environment variables for models (the password from .htpasswd and the InfluxDB token from .env.secret.db)
cp .env.secret.model.example .env.secret.model
docker compose -f docker-compose.train.mlflow.influxdb.yaml up -d
docker compose -f docker-compose.predict.mlflow.kafka.influxdb.yaml up -d
docker compose -f docker-compose.ui.yaml up -d
docker compose -f docker-compose.system.yaml up -d
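Before bringing the stack up, it can help to verify that the secrets and certificates created in the steps above exist in the docker/ directory. A minimal sketch (the `preflight` helper is hypothetical, not part of the app):

```python
# Hypothetical preflight check: verify the files created in the steps above
# exist before running `docker compose up`.
from pathlib import Path

REQUIRED = [
    "influxdb-selfsigned.key", "influxdb-selfsigned.crt",
    ".env.secret.db", ".env.secret.mlflow", ".env.secret.model", ".htpasswd",
]

def preflight(directory="."):
    """Return the list of required files missing from `directory`."""
    return [f for f in REQUIRED if not (Path(directory) / f).is_file()]

missing = preflight()
print("preflight OK" if not missing else f"missing: {missing}")
```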
cd docker
sudo openssl req -x509 -nodes -newkey rsa:2048 -keyout influxdb-selfsigned.key -out influxdb-selfsigned.crt -days 365
cp .env.secret.db.example .env.secret.db
docker compose -f docker-compose.kafka.yaml up -d
docker compose -f docker-compose.db.yaml up -d
docker compose -f docker-compose.data.feature.metric.yaml build
docker compose -f docker-compose.get_data.yaml up -d
docker compose -f docker-compose.extract_features.yaml up -d
docker compose -f docker-compose.collect_metrics.yaml up -d
docker compose -f docker-compose.train.local.influxdb.yaml up -d
docker compose -f docker-compose.predict.local.kafka.influxdb.yaml up -d
docker compose -f docker-compose.ui.yaml up -d
pip install -r requirements.txt
pip install -r src/ccf/requirements_data.txt
pip install -r src/ccf/requirements_features.txt
pip install -r src/ccf/requirements_ml.txt
pip install -r src/ccf/requirements_predictions.txt
pip install -r src/ccf/requirements_metrics.txt
pip install -r src/ccf/requirements_trade.txt
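The per-part requirement files above follow one naming pattern, so they can be installed in a loop. A sketch that prints the commands as a dry run (replace the `print` with `subprocess.run(cmd, check=True)` to actually install):

```python
# Build the pip command for each part's requirements file listed above.
import sys

PARTS = ["data", "features", "ml", "predictions", "metrics", "trade"]

def install_commands(parts=PARTS):
    """Return one pip invocation per requirements file."""
    return [
        [sys.executable, "-m", "pip", "install", "-r",
         f"src/ccf/requirements_{p}.txt"]
        for p in parts
    ]

for cmd in install_commands():
    print(" ".join(cmd))  # dry run only
```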
cd work
- Linux (default)
PYTHONPATH=../src/ python ../src/ccf/get_data.py -cd conf -cn get_data-kafka-binance-btc-usdt
- Windows (for example)
cmd /C "set PYTHONPATH=../src && python ../src/ccf/get_data.py -cd conf -cn get_data-kafka-binance-btc-usdt"
PYTHONPATH=../src/ python ../src/ccf/extract_features.py -cd conf -cn extract_features-kafka-binance-btc-usdt
- Train once
PYTHONPATH=../src/ python ../src/ccf/train.py -cd conf -cn train-mid-lograt-tft-kafka-binance-btc-usdt
- Tune every ~1 hour
while true; do PYTHONPATH=../src/ python ../src/ccf/train.py -cd conf -cn train-mid-lograt-tft-kafka-binance-btc-usdt; sleep 3600; done
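The bash one-liner above retrains unconditionally every hour. A slightly more defensive variant in Python (a sketch; `tune_loop` and its parameters are illustrative, and the command shown is the same train.py invocation as above) stops after repeated failures instead of looping forever:

```python
# Sketch of the hourly tuning loop with basic failure handling.
import subprocess
import time

def tune_loop(cmd, interval_s=3600, max_failures=3, max_runs=None):
    """Run `cmd` every `interval_s` seconds; stop after `max_failures`
    consecutive non-zero exits (or after `max_runs` runs, if given)."""
    failures, runs = 0, 0
    while failures < max_failures and (max_runs is None or runs < max_runs):
        result = subprocess.run(cmd)
        failures = 0 if result.returncode == 0 else failures + 1
        runs += 1
        time.sleep(interval_s)
    return runs

# Example (mirrors the bash loop above; run from the work/ directory
# with PYTHONPATH=../src/ set):
# tune_loop(["python", "../src/ccf/train.py", "-cd", "conf",
#            "-cn", "train-mid-lograt-tft-kafka-binance-btc-usdt"])
```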
PYTHONPATH=../src/ python ../src/ccf/predict.py -cd conf -cn predict-mid-lograt-tft-kafka-binance-btc-usdt
PYTHONPATH=../src/ python ../src/ccf/collect_metrics.py -cd conf -cn collect_metrics-kafka-binance-btc-usdt
- Monitor metrics with InfluxDB (host: localhost:8086, user: ccf, password: see .env.secret.db)
- Monitor metrics with MLflow (host: localhost:5000, user: ccf, password: see .env.secret.model)
- TensorBoard (localhost:6007)
cd work
tensorboard --logdir tensorboard/ --host 0.0.0.0 --port 6007
PYTHONPATH=../src/ python ../src/ccf/trade.py -cd conf -cn trade-kafka-binance-btc-tusd-fast-rl-1
PYTHONPATH=../src/ python ../src/ccf/trade.py -cd conf -cn trade-kafka-binance-btc-tusd-fast-rl-2
...