feat: real data pipeline — market data, DB persistence, portfolio sync, signal-trade linkage
Wire the trading bot to real Alpaca market data and persist pipeline state to the database so the dashboard displays live information.

- Add a market-data service that fetches OHLCV bars from Alpaca and publishes them to the market:bars Redis Stream; the signal generator consumes bars and injects current_price into signals for position sizing
- The sentiment analyzer now persists Article + ArticleSentiment rows to the DB after scoring, with duplicate and error handling
- The API gateway runs a background portfolio sync task that snapshots Alpaca account state into the PortfolioSnapshot/Position DB tables during market hours
- TradeSignal carries a signal_id UUID; the signal generator and trade executor both persist their records to the DB with cross-references
- 303 unit tests pass (57 new tests added)
This commit is contained in:
parent 5a6b20c8f1
commit e2a3bd456d
19 changed files with 2238 additions and 72 deletions
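The first bullet describes publishing OHLCV bars to a Redis Stream, whose XADD command takes a flat map of string fields. A minimal sketch of how a bar might be flattened for that call (the field names and the helper itself are illustrative assumptions, not the commit's actual schema):

```python
from datetime import datetime, timezone


def bar_to_stream_fields(ticker: str, o: float, h: float, low: float,
                         c: float, v: int, ts: datetime) -> dict[str, str]:
    """Flatten an OHLCV bar into the string field map Redis Streams expect.

    Hypothetical helper: the real market-data service may use different
    field names on the ``market:bars`` stream.
    """
    return {
        "ticker": ticker,
        "open": str(o),
        "high": str(h),
        "low": str(low),
        "close": str(c),
        "volume": str(v),
        "timestamp": ts.isoformat(),  # ISO 8601 keeps ordering lexicographic
    }


fields = bar_to_stream_fields(
    "AAPL", 189.2, 190.1, 188.9, 189.8, 1_200_000,
    datetime(2024, 1, 2, 14, 30, tzinfo=timezone.utc),
)
# a real publisher would then call: redis_client.xadd("market:bars", fields)
```

Keeping every value a string sidesteps Redis Streams' lack of typed fields; consumers such as the signal generator parse the numbers back out before injecting current_price into signals.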
@@ -3,7 +3,7 @@
 from datetime import datetime
 from enum import Enum
 from typing import Any
-from uuid import UUID
+from uuid import UUID, uuid4

 from pydantic import BaseModel, Field

@@ -96,6 +96,7 @@ class AccountInfo(BaseModel):

 class TradeSignal(BaseModel):
     """Published to ``signals:generated`` by the signal generator."""

+    signal_id: UUID = Field(default_factory=uuid4)
     ticker: str
     direction: SignalDirection
     strength: float = Field(ge=0.0, le=1.0)
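The hunk above adds a signal_id generated at construction time, which is what lets the trade executor's DB record point back at the signal that produced it. A stdlib stand-in sketching that linkage (dataclasses instead of pydantic, and TradeRecord is a hypothetical shape, not the commit's actual table):

```python
from dataclasses import dataclass, field
from uuid import UUID, uuid4


@dataclass
class TradeSignal:
    """Stand-in for the pydantic model: each signal gets a UUID on creation."""
    ticker: str
    signal_id: UUID = field(default_factory=uuid4)


@dataclass
class TradeRecord:
    """Hypothetical executor-side record carrying the signal's id."""
    signal_id: UUID  # foreign-key-style back-reference to the signal row
    ticker: str
    status: str = "submitted"


sig = TradeSignal(ticker="AAPL")
trade = TradeRecord(signal_id=sig.signal_id, ticker=sig.ticker)
assert trade.signal_id == sig.signal_id  # trade joins back to its signal
```

Because default_factory runs per instance, two signals never share an id, so persisting both rows with the same UUID gives the dashboard an unambiguous signal-to-trade join key.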