Pipeline view for Stage 2 — Ingestion & order-book reconstruction and Stage 3 — Canonical state & event log.
Data flow
How raw market and external data becomes the trusted internal objects every bot reads.
- Raw ingest — CLOB v2 WebSocket, CLOB v2 REST, on-chain subgraph, sports feeds, news feeds. Each source has its own adapter.
- Adapter normalisation — Each adapter (`intel.sportsfeed_adapter`, `intel.news_ingest`, etc.) maps the raw payload to a typed schema with a freshness stamp.
- Provenance hashing — Every normalised object carries the hash of its source response. Two bots seeing the same `MarketSnapshot` have identical provenance.
- Quality ranking — `intel.market_quality_ranker` assigns a quality score per market. Bots can require a minimum quality threshold.
- Freshness budgets — Every consumer declares the maximum age it will accept. Anything older fires `INTEL_FEED_STALE` and the safe fallback runs.
- Internal trust boundary — Once an object is in the internal store, no bot is allowed to consume the raw upstream payload. The boundary is enforced at code-review time.
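The adapter steps above can be sketched as follows. This is a minimal illustration, not the real adapter code: the field names, the SHA-256 choice, and the `normalise` helper are assumptions; the source only specifies that normalised objects carry a typed schema, a freshness stamp, and a hash of the raw source response.

```python
import hashlib
import json
import time
from dataclasses import dataclass

@dataclass(frozen=True)
class MarketSnapshot:
    """Hypothetical typed schema for a normalised market object."""
    market_id: str
    best_bid: float
    best_ask: float
    fetched_at: float   # freshness stamp (epoch seconds)
    provenance: str     # hash of the raw upstream response

def normalise(raw_response: bytes) -> MarketSnapshot:
    # Hash the raw bytes before parsing: two bots that normalise the
    # same upstream response therefore carry identical provenance.
    provenance = hashlib.sha256(raw_response).hexdigest()
    payload = json.loads(raw_response)
    return MarketSnapshot(
        market_id=payload["market_id"],
        best_bid=float(payload["best_bid"]),
        best_ask=float(payload["best_ask"]),
        fetched_at=time.time(),
        provenance=provenance,
    )
```

Because provenance is a pure function of the raw bytes, it can be compared across bots without trusting either bot's parsing.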
Freshness budgets per object
| Object | Default budget | On stale |
|---|---|---|
| `MarketSnapshot` | 1000 ms | REJECT |
| `OrderBookSnapshot` | 500 ms | REJECT |
| Sports feed | 30 s | IGNORE_SIGNAL |
| News feed | 120 s | IGNORE_SIGNAL |
| On-chain reconcile | 60 s | PAUSE_MARKET |
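A freshness check against the budgets in the table might look like the sketch below. The dictionary keys and the `check_freshness` helper are hypothetical names; the budgets and stale actions come from the table, and a real implementation would also fire `INTEL_FEED_STALE` rather than just returning the action.

```python
import time
from enum import Enum

class StaleAction(Enum):
    REJECT = "REJECT"
    IGNORE_SIGNAL = "IGNORE_SIGNAL"
    PAUSE_MARKET = "PAUSE_MARKET"

# Default budget (seconds) and stale action per object type,
# mirroring the table above. Key names are illustrative.
FRESHNESS_BUDGETS = {
    "MarketSnapshot": (1.0, StaleAction.REJECT),
    "OrderBookSnapshot": (0.5, StaleAction.REJECT),
    "SportsFeed": (30.0, StaleAction.IGNORE_SIGNAL),
    "NewsFeed": (120.0, StaleAction.IGNORE_SIGNAL),
    "OnChainReconcile": (60.0, StaleAction.PAUSE_MARKET),
}

def check_freshness(object_type, fetched_at, now=None):
    """Return None if the object is within budget, else its stale action.

    The caller treats a non-None result as INTEL_FEED_STALE and runs
    its declared safe fallback.
    """
    budget, action = FRESHNESS_BUDGETS[object_type]
    now = time.time() if now is None else now
    return action if (now - fetched_at) > budget else None
```

Passing `now` explicitly keeps the check deterministic in tests; consumers that want a tighter budget than the default can still reject fresh-enough objects on their own.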
Why this layer exists
Raw exchange feeds change shape. Network conditions vary. Sources disagree. Without a normalisation layer, every bot reinvents the same parsing and the same staleness handling, and they disagree in subtle ways. The data-flow layer makes "the world" a single typed object that every bot reads from.