Wire the trading bot to real Alpaca market data and persist pipeline
state to the database so the dashboard displays live information.
- Add a market-data service that fetches OHLCV bars from Alpaca and
publishes them to the market:bars Redis Stream; the signal generator
consumes the bars and injects current_price into signals for
position sizing
- The sentiment analyzer now persists Article + ArticleSentiment rows
to the DB after scoring, with duplicate and error handling
- API gateway runs a background portfolio sync task that snapshots
Alpaca account state into PortfolioSnapshot/Position DB tables
during market hours
- TradeSignal now carries a signal_id UUID; the signal generator and
trade executor both persist their records to the DB with
cross-references
- 303 unit tests pass (57 new tests added)
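The market-data path above can be sketched roughly as follows, assuming redis-py; the field names and the `bar_to_fields`/`publish_bar` helpers are illustrative, not the actual service API:

```python
# Sketch of publishing one OHLCV bar to the market:bars stream.

def bar_to_fields(symbol, open_, high, low, close, volume, ts):
    """Flatten one OHLCV bar into the flat string map a Redis Stream entry stores."""
    return {
        "symbol": symbol,
        "open": str(open_),
        "high": str(high),
        "low": str(low),
        "close": str(close),
        "volume": str(volume),
        "timestamp": ts,
    }

def publish_bar(redis_client, fields, stream="market:bars"):
    # XADD appends the entry; the signal generator reads it via a
    # consumer group and takes current_price from the close field.
    return redis_client.xadd(stream, fields)
```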
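The signal_id cross-referencing and current_price-based sizing combine roughly like this; all field names other than `signal_id` and `current_price` are assumptions for illustration:

```python
import uuid
from dataclasses import dataclass, field

@dataclass
class TradeSignal:
    symbol: str
    side: str
    current_price: float
    # Each signal gets its own UUID so the executor's trade record can
    # reference the row the signal generator persisted.
    signal_id: str = field(default_factory=lambda: str(uuid.uuid4()))

def size_position(signal: TradeSignal, dollar_allocation: float) -> int:
    """Whole-share position size derived from the injected current_price."""
    return int(dollar_allocation // signal.current_price)
```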
I1: Add graceful shutdown (SIGTERM/SIGINT) to all 5 background services
I2: Fix Dockerfile healthcheck to use curl on /metrics endpoint
I3: Fix StreamConsumer.ensure_group() to only catch BUSYGROUP errors
I4: Fix SimulatedBroker to reject orders with insufficient cash/shares
I5: Move ORM attribute access inside DB session context in trades routes
I6: Add Redis-based rate limiting (10 req/min/IP) on all auth endpoints
I8: Prevent backtest background task garbage collection
I9: Use Numeric(16,6) instead of Float for financial columns in migration
I10: Add index on trades.created_at for time-range queries
I11: Bind infrastructure ports to 127.0.0.1 in docker-compose
I12: Add migrations init service; all app services depend on it
I13: Fix user enumeration in login_begin (return options for non-existent users)
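A minimal sketch of the shutdown handling in I1; the real services are asyncio-based, but the pattern is the same: flip a shutdown flag on SIGTERM/SIGINT and let the run loops drain and exit:

```python
import signal
import threading

# Shared flag the service loops poll between units of work.
shutdown = threading.Event()

def _request_shutdown(signum, frame):
    shutdown.set()

def install_signal_handlers():
    signal.signal(signal.SIGTERM, _request_shutdown)
    signal.signal(signal.SIGINT, _request_shutdown)
```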
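The I3 fix can be sketched as below, assuming redis-py, where XGROUP CREATE on an existing group raises a ResponseError whose message starts with BUSYGROUP; only that case is treated as success:

```python
def ensure_group(redis_client, stream, group):
    """Create the consumer group, swallowing only 'already exists'."""
    try:
        redis_client.xgroup_create(stream, group, id="0", mkstream=True)
    except Exception as exc:  # redis.exceptions.ResponseError in practice
        if "BUSYGROUP" not in str(exc):
            raise  # real errors (connection loss, bad args) propagate
```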
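The I6 limit (10 req/min/IP) fits a fixed-window counter; this sketch assumes redis-py's `incr`/`expire`, and the key layout is an assumption:

```python
def allow_request(redis_client, ip, limit=10, window_seconds=60):
    """Return True while the IP is under the per-window request limit."""
    key = f"ratelimit:auth:{ip}"
    count = redis_client.incr(key)
    if count == 1:
        # First hit in the window starts the countdown; the key expires
        # on its own, resetting the counter for the next window.
        redis_client.expire(key, window_seconds)
    return count <= limit
```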
- pyproject.toml with core deps and optional dep groups per service
- shared/config.py: Pydantic BaseSettings with TRADING_ env prefix
- shared/redis_streams.py: StreamPublisher/StreamConsumer wrappers
- shared/telemetry.py: OpenTelemetry + Prometheus metric export
- tests for Redis Streams helpers (5 passing)
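The TRADING_ env-prefix convention in shared/config.py behaves roughly like this stdlib stand-in (the helper name and dict shape are illustrative; the real module uses Pydantic BaseSettings):

```python
import os

def load_settings(environ=None, prefix="TRADING_"):
    """Collect TRADING_-prefixed env vars into lowercase settings keys,
    e.g. TRADING_REDIS_URL -> settings['redis_url']."""
    env = os.environ if environ is None else environ
    return {
        key[len(prefix):].lower(): value
        for key, value in env.items()
        if key.startswith(prefix)
    }
```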