Data Pipeline & Market Scanning
Operator & jurisdiction: BASIS is operated by BASIS DIGITAL INFRASTRUCTURE LTD, a Seychelles IBC (LEI: 254900IX2F2KCWNSSS64).
Research context: Base58 Labs acts as a Research Partner for market microstructure research, structural alpha capture, and execution systems design.
Accounting convention: Platform dashboards may display values in USDT as an internal USD-equivalent accounting unit only. USDT is not a deposit or withdrawal asset. Deposits and withdrawals use native assets only, including BTC, ETH, SOL, and PAXG.
In structural alpha capture, the first failure mode is accepting a price that is not truly executable.
The data pipeline converts raw venue signals into a validated signal stream for the execution precision layer. That stream feeds BHLE, the BASIS execution engine built for sub-50μs routing and 100K+ operations-per-second throughput on proprietary routing infrastructure.
A visible spread is not automatically tradable. A signal is only eligible if it passes price validation, venue health checks, cost modeling, and deterministic execution constraints.
1. Signal sources
BASIS ingests market-state inputs such as:
top-of-book quotes
order book snapshots and deltas
trade prints
mark prices and index prices
funding rates and open interest where relevant
Operational signals are treated as first-class inputs:
withdrawal status
deposit status
API latency and error rates
throttling and rate-limit conditions
maintenance notices
settlement or transfer interruptions
Where strategy modules require it, BASIS also evaluates:
gas conditions
block congestion
confirmation latency
bridge or settlement state
wallet and contract interaction health
A venue registry defines which feeds are eligible and how much confidence each source receives.
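As a minimal sketch of such a registry (venue names, feed labels, and weights here are illustrative assumptions, not the production configuration):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class VenueEntry:
    """Illustrative registry entry: which feeds a venue may contribute,
    and how much confidence its observations receive (0.0 to 1.0)."""
    eligible_feeds: frozenset
    confidence: float

# Hypothetical registry; names and weights are placeholders.
VENUE_REGISTRY = {
    "venue_a": VenueEntry(frozenset({"top_of_book", "trades", "depth"}), 0.9),
    "venue_b": VenueEntry(frozenset({"top_of_book", "trades"}), 0.6),
}

def feed_weight(venue: str, feed: str) -> float:
    """Return the confidence weight for a feed, or 0.0 if ineligible."""
    entry = VENUE_REGISTRY.get(venue)
    if entry is None or feed not in entry.eligible_feeds:
        return 0.0
    return entry.confidence
```

Downstream validation can then treat an ineligible or unknown feed as zero-weight rather than as a special case.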
2. Normalization
Each venue exposes data differently. To make signals comparable, BASIS transforms all inputs into a canonical internal format.
Symbol naming: canonical symbol mapping
Quote currency conventions: unified quote handling
Precision and tick size: scaled numeric normalization
Timestamp format: clock alignment and drift monitoring
Depth representation: standardized depth ladder format
API semantics: common event schema
Example canonical event:
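A minimal sketch in Python; field names and values are illustrative assumptions, not the production schema:

```python
# Illustrative canonical event. Prices and sizes are integer-scaled to
# avoid floating-point precision loss; timestamps carry both exchange
# time and drift-monitored local receive time in nanoseconds.
canonical_event = {
    "symbol": "BTC-USD",          # canonical symbol, mapped from venue naming
    "venue": "venue_a",           # registry key for the originating venue
    "type": "top_of_book",        # common event schema type
    "bid_px": 97_123_50,          # price * 10**price_scale
    "ask_px": 97_124_10,
    "bid_qty": 2_500_000,         # quantity * 10**qty_scale
    "ask_qty": 1_200_000,
    "price_scale": 2,
    "qty_scale": 6,
    "ts_exchange_ns": 1_700_000_000_000_000_000,
    "ts_received_ns": 1_700_000_000_000_150_000,
    "sequence": 884_213,          # used for sequence-integrity checks
}
```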
Normalization reduces semantic mismatch before any opportunity model is applied.
3. Cross-validation and outlier rejection
A single venue can publish stale, lagged, or erroneous prices. BASIS therefore applies multi-source validation before any signal reaches execution.
Collect comparable observations across eligible venues.
Estimate fair reference levels using robust statistics such as medians and trimmed means.
Reject observations outside dynamic deviation thresholds.
Require temporal consistency across successive updates.
Promote only validated signals to the execution queue.
This process reduces the probability of trading on a ghost gap or stale book.
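The steps above can be sketched as follows; the median reference and the fixed 0.5% threshold are illustrative simplifications (production thresholds would be dynamic, and a trimmed mean works similarly):

```python
import statistics

def validate_observations(prices, max_rel_dev=0.005):
    """Cross-validate per-venue mid prices against a robust reference.

    Uses the median as the fair reference level and rejects observations
    whose relative deviation exceeds the threshold. Returns the
    reference and the list of accepted observations.
    """
    if len(prices) < 3:
        return None, []   # too few independent sources to cross-validate
    reference = statistics.median(prices)
    accepted = [p for p in prices
                if abs(p - reference) / reference <= max_rel_dev]
    return reference, accepted
```

A stale or erroneous print far from the robust reference is dropped before it can reach the execution queue.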
4. Venue health scoring
A large spread can indicate opportunity, but it can also indicate operational stress. BASIS scores venues continuously and uses those scores as part of the eligibility gate.
Withdrawal availability: determines settlement realism
Deposit availability: affects inventory mobility
API latency: impacts execution precision
Error rate: indicates feed stability
Throttling conditions: limit order placement reliability
Maintenance windows: can invalidate live pricing
Sequence integrity: detects missing or corrupted updates
Low health scores can down-rank or fully exclude a venue from signal generation.
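One way to combine these factors is a weighted score with hard-failure overrides; the factor names, weights, and thresholds below are illustrative assumptions, not the production model:

```python
def venue_health(metrics, weights=None):
    """Combine per-factor scores (each in [0, 1]) into one health score.

    Hard operational failures (withdrawals halted, active maintenance)
    zero the score outright rather than merely lowering it, because they
    invalidate settlement realism or live pricing entirely.
    """
    weights = weights or {
        "withdrawals": 0.25, "deposits": 0.15, "latency": 0.20,
        "error_rate": 0.15, "throttling": 0.15, "sequence": 0.10,
    }
    if metrics.get("withdrawals", 0.0) == 0.0 or metrics.get("maintenance"):
        return 0.0   # settlement is unrealistic: exclude the venue
    return sum(weights[k] * metrics.get(k, 0.0) for k in weights)

# Illustrative eligibility-gate thresholds.
EXCLUDE_BELOW, DOWNRANK_BELOW = 0.3, 0.7
```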
5. Market scanning and opportunity detection
After normalization and validation, the signal engine scans for executable structural alpha, including:
cross-venue price dislocations
funding and basis differentials
spot and derivative mispricings
on-chain versus off-chain valuation gaps where relevant
Detection alone is not sufficient. Every candidate must also pass:
depth sufficiency checks
transfer and settlement feasibility checks
fee and slippage modeling
route construction checks
state-machine risk controls
Trust in the signal engine comes from deterministic execution rules, mathematical constraints, and explicit state transitions. A candidate either satisfies the full rule set or it does not enter execution.
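For instance, the fee and slippage modeling step reduces to a net-edge computation: a candidate whose modeled edge is non-positive is rejected regardless of headline spread size. A minimal sketch, assuming a basis-point parameterization of costs (an illustrative simplification, not the production cost model):

```python
def net_edge(buy_px, sell_px, qty, taker_fee_bps, est_slippage_bps):
    """Model edge after costs for a cross-venue candidate.

    Fees and slippage are expressed in basis points of total notional
    across both legs; a non-positive result fails the rule set.
    """
    gross = (sell_px - buy_px) * qty
    notional = (buy_px + sell_px) * qty
    costs = notional * (taker_fee_bps + est_slippage_bps) / 10_000
    return gross - costs
```

A 10 bp visible dislocation with 10 bp of round-trip costs is not an opportunity; the deterministic rule set discards it.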
6. Execution handoff
The validated signal stream is handed to the orchestration layer only when all required constraints are satisfied:
data freshness is within tolerance
venue health is above threshold
executable depth is sufficient
modeled edge remains positive after costs
routing path is stable
risk state permits action
This architecture is designed to prioritize determinism over headline spread size.
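The handoff constraints above amount to a single all-or-nothing predicate. A minimal sketch, assuming illustrative field names and thresholds (the 50 ms freshness bound and 0.7 health floor are placeholders):

```python
import time

def ready_for_handoff(signal, now_ns=None, max_age_ns=50_000_000,
                      min_health=0.7):
    """Final handoff gate: every constraint must hold simultaneously."""
    now_ns = now_ns if now_ns is not None else time.time_ns()
    return all((
        now_ns - signal["ts_received_ns"] <= max_age_ns,  # data freshness
        signal["venue_health"] >= min_health,             # venue health
        signal["executable_depth"] >= signal["size"],     # depth sufficiency
        signal["net_edge"] > 0,                           # edge after costs
        signal["route_stable"],                           # routing path
        signal["risk_permits"],                           # risk state
    ))
```

If any single constraint fails, the signal never reaches the orchestration layer, which is what "determinism over headline spread size" means in practice.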
7. Why this matters
The data pipeline is the first control surface for execution quality. If inputs are inconsistent, stale, or operationally compromised, even fast infrastructure will route bad decisions quickly. BASIS therefore treats market scanning as a constrained systems problem, not a simple spread detector.
Next: read Execution Orchestration.