Wire the trading bot to real Alpaca market data and persist pipeline
state to the database so the dashboard displays live information.
- Add a market-data service that fetches OHLCV bars from Alpaca and
publishes them to the market:bars Redis Stream; the signal generator
consumes the bars and injects current_price into each signal for
position sizing
- Sentiment analyzer now persists Article and ArticleSentiment rows to
the DB after scoring, with duplicate detection and error handling
- API gateway runs a background portfolio sync task that snapshots
Alpaca account state into PortfolioSnapshot/Position DB tables
during market hours
- TradeSignal carries a signal_id UUID; signal generator and trade
executor both persist their records to DB with cross-references
- 303 unit tests pass (57 new tests added)
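For the market:bars publishing step above, a minimal sketch of flattening one OHLCV bar into the string-to-string fields a Redis XADD expects; the field names and schema here are assumptions, not the service's actual wire format:

```python
from typing import Dict

def bar_to_fields(symbol: str, o: float, h: float, l: float,
                  c: float, v: int, ts: str) -> Dict[str, str]:
    """Flatten one OHLCV bar into flat string fields for XADD.

    Field names are illustrative; the real market-data service's
    schema is not shown in this changelog.
    """
    return {
        "symbol": symbol,
        "open": str(o),
        "high": str(h),
        "low": str(l),
        "close": str(c),
        "volume": str(v),
        "timestamp": ts,
    }

# Publishing side (needs a live Redis, so shown as a comment only):
# import redis
# r = redis.Redis(decode_responses=True)
# r.xadd("market:bars",
#        bar_to_fields("AAPL", 189.2, 190.1, 188.9, 189.8,
#                      1_200_000, "2024-01-02T14:30:00Z"))
```

Keeping the conversion pure makes it unit-testable without a Redis server, which matches how the new unit tests can run in isolation.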
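The signal_id cross-reference described above could be modeled roughly like this; the exact field set of TradeSignal is an assumption:

```python
import uuid
from dataclasses import dataclass, field

@dataclass
class TradeSignal:
    symbol: str
    action: str  # e.g. "buy" or "sell"; field set is illustrative
    # The shared UUID is what lets the signal generator's DB row and the
    # trade executor's DB row reference the same logical signal.
    signal_id: str = field(default_factory=lambda: str(uuid.uuid4()))
```

Generating the UUID at signal creation (rather than at insert time) means both services persist the identical key without coordinating.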
Add integration tests for the news pipeline (test_news_pipeline.py) and
trading flow (test_trading_flow.py) using real Redis with mocked FinBERT
and Alpaca. Add seed_strategies.py to insert default strategies (momentum,
mean_reversion, news_driven) with equal weights. Add smoke_test.sh for
end-to-end stack validation. Update pyproject.toml with an integration
pytest marker and scripts package discovery.
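The equal-weight defaults that seed_strategies.py inserts might be built roughly like this; the strategy names come from the text above, but the row schema is an assumption:

```python
# Strategy names from the changelog; "weight"/"enabled" columns are
# illustrative assumptions about the strategies table.
DEFAULT_STRATEGIES = ["momentum", "mean_reversion", "news_driven"]

def default_strategy_rows():
    """Build equal-weight rows for the default strategies."""
    weight = 1.0 / len(DEFAULT_STRATEGIES)
    return [{"name": name, "weight": weight, "enabled": True}
            for name in DEFAULT_STRATEGIES]
```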
- pyproject.toml with core deps and optional dep groups per service
- shared/config.py: Pydantic BaseSettings with TRADING_ env prefix
- shared/redis_streams.py: StreamPublisher/StreamConsumer wrappers
- shared/telemetry.py: OpenTelemetry + Prometheus metric export
- tests for Redis Streams helpers (5 passing)
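shared/config.py gets the TRADING_ prefix for free from Pydantic BaseSettings' env_prefix; as a stdlib-only approximation of that behavior (the field names here are assumptions, not the real settings schema):

```python
import os
from dataclasses import dataclass

@dataclass
class Settings:
    # Illustrative fields; the actual shared/config.py schema is not
    # shown in this changelog.
    redis_url: str = "redis://localhost:6379/0"
    alpaca_paper: bool = True

    @classmethod
    def from_env(cls, prefix: str = "TRADING_") -> "Settings":
        """Overlay defaults with TRADING_-prefixed environment variables."""
        settings = cls()
        url = os.environ.get(prefix + "REDIS_URL")
        if url:
            settings.redis_url = url
        paper = os.environ.get(prefix + "ALPACA_PAPER")
        if paper is not None:
            settings.alpaca_paper = paper.lower() in ("1", "true", "yes")
        return settings
```

With BaseSettings the per-field lookup and type coercion happen automatically; this sketch only shows the naming convention services rely on.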
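One shape the StreamPublisher wrapper in shared/redis_streams.py could take, with the Redis client injected so it can be exercised without a live server; the class and method names are assumptions about that module's API:

```python
import json

class StreamPublisher:
    """Minimal wrapper over XADD. The real shared/redis_streams.py API
    is not shown in this changelog, so names here are assumptions."""

    def __init__(self, client, stream: str):
        self.client = client  # a redis.Redis instance, or any stub with .xadd
        self.stream = stream

    def publish(self, payload: dict) -> str:
        # One JSON field keeps nested payloads intact despite the flat
        # string-to-string field model of Redis Streams.
        return self.client.xadd(self.stream, {"payload": json.dumps(payload)})
```

Injecting the client is what lets the Redis Streams helper tests run against a stub or a test Redis instance interchangeably.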