Hayaiti Research
Internal R&D · perpetuals desk
Perpetual swap pricer with funding model
Internal R&D: a perpetual swap pricer that decomposes mark price, index price, and funding rate cleanly enough to be useful for both research and execution. The point of the project was to have something we trusted before we wrote anything that depended on it.
Throughput: 981/s · p95: 89ms · Errors: 0.02%
- Industry: Crypto / Derivatives
- Timeline: 3 weeks
- Team: 1
- Service: Software
- Project tier: Internal R&D — anonymized
The Problem
What was broken.
We had three notebooks doing roughly the same calculation in three slightly different ways. None of them tested. Ad-hoc disagreements between them quietly turned into bad assumptions in research.
Our Approach
How we framed it.
We wrote one pricer with explicit interfaces for the index source, the mark calculation, and the funding-rate model; wrote unit tests against worked examples in published exchange documentation; and replaced the three notebooks with imports.
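The three interfaces can be sketched with `typing.Protocol`. The names, signatures, and the concrete plug-in below are hypothetical illustrations; the real module isn't reproduced in this case study.

```python
from typing import Protocol, runtime_checkable

class IndexSource(Protocol):
    """Where the composite index (spot) price comes from."""
    def index_price(self, symbol: str) -> float: ...

class MarkModel(Protocol):
    """How the mark price is derived from the index and book state."""
    def mark_price(self, index: float, mid: float) -> float: ...

@runtime_checkable
class FundingModel(Protocol):
    """Per-venue funding plug-in: the interval is part of the model
    rather than a hard-coded 8 hours."""
    interval_hours: int
    def funding_rate(self, mark: float, index: float) -> float: ...

class PlainPremiumFunding:
    """Hypothetical concrete plug-in: raw premium, 8-hour interval."""
    interval_hours = 8

    def funding_rate(self, mark: float, index: float) -> float:
        return (mark - index) / index
```

Keeping the funding interval on the plug-in, not in the pricer, is what lets one pricer serve venues with different funding schedules.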
Capability proof
What this case demonstrates.
This case makes the hidden work visible: strategy, architecture, delivery control, quality evidence, and handoff.
01 / Product judgment
Problem framed before UI
We had three notebooks doing roughly the same calculation in three slightly different ways. None of them tested. Ad-hoc disagreements between them quietly turned into bad assumptions in research.
02 / Technical depth
6 stack decisions
Python, NumPy, Pandas, Polars, asyncio, pytest
03 / Delivery discipline
3 delivery checkpoints
Spec + tests-first / Pricer + replay client / Live client + writeup
04 / Handoff quality
4 shipped artifacts
Tested perpetual-swap pricer with per-venue plug-ins / Replay client for backtest use / Async live client for execution use / README documenting assumptions and limitations
Production artifacts
Inspect the work behind the visible result.
Each case exposes the surfaces, systems, evidence, and handoff package that make the shipped product usable after launch.
Experience layer
Buyer or user surface
Single tested pricer with explicit interfaces for index source, mark, and funding rate. Async live client + replay client for backtest. Assumptions documented in the README.
Proof 01
Reproducible pricer for a perpetual swap with a clean funding-rate model.
Proof 02
Wrote the test suite from exchange documentation before writing the implementation.
Proof 03
Tested perpetual-swap pricer with per-venue plug-ins
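The tests-first step (Proof 02) can be sketched as a small pytest suite. The worked-example numbers below are hypothetical stand-ins for the figures published in exchange docs, and `premium_funding` is an illustrative funding rule, not the production pricer.

```python
# Sketch of a tests-first funding suite. Numbers are hypothetical
# stand-ins for worked examples from exchange documentation.
import pytest

def premium_funding(mark: float, index: float, interest: float = 0.0001) -> float:
    """Illustrative funding rule: premium plus an interest component
    clamped to +/-0.05% per interval."""
    premium = (mark - index) / index
    return premium + max(min(interest - premium, 0.0005), -0.0005)

@pytest.mark.parametrize(
    "mark,index,expected",
    [
        (100.00, 100.00, 0.0001),   # flat book: pure interest component
        (100.50, 100.00, 0.0045),   # positive premium, clamp binds
        (99.50, 100.00, -0.0045),   # negative premium, clamp binds
    ],
)
def test_funding_matches_worked_example(mark, index, expected):
    assert premium_funding(mark, index) == pytest.approx(expected)
```

Writing the table first means the implementation has a target before a line of pricer code exists.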
Production signals
Risk-aware
Security and compliance boundaries named.
Handoff-ready
Owner can keep operating after delivery.
Before / after · product UI mockup
Industry · Crypto / Derivatives
import numpy as np, pandas as pd
spot, k, t, r, sigma = 62_410.20, 65_000, 0.0822, 0.0532, 0.6420
call_px = bs.price(spot, k, t, r, sigma, kind='call')
greeks = bs.greeks(spot, k, t, r, sigma, kind='call')
print(pd.Series(greeks))
gamma 0.000028
vega 82.41
theta -12.18
RuntimeWarning: divide by zero encountered in log (rho)
Before: Pricing lived in a Jupyter notebook; quotes refreshed when someone manually re-ran a cell.
After: Single tested pricer with explicit interfaces for index source, mark, and funding rate. Async live client + replay client for backtest. Assumptions documented in the README.
How the engagement ran.
- 01 · Week 1
Spec + tests-first
Wrote the test suite from exchange documentation before writing the implementation.
- 02 · Week 2
Pricer + replay client
Implemented the pricer and a replay client that consumes recorded WebSocket dumps.
- 03 · Week 3
Live client + writeup
Async live WebSocket client + 6-page README that future-us will thank present-us for.
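The Week 2 replay client can be sketched as a generator over recorded dumps. The one-JSON-object-per-line dump format is an assumption for illustration; the actual dump schema isn't shown in this case study.

```python
import json
from pathlib import Path
from typing import Iterator

def replay(dump: Path) -> Iterator[dict]:
    """Yield ticks from a recorded WebSocket dump, one JSON object per
    line, so backtests consume the same message shape as the live feed."""
    with dump.open() as f:
        for line in f:
            line = line.strip()
            if line:  # skip blank lines in the recording
                yield json.loads(line)
```

Because downstream consumers take an iterator of ticks, swapping the live feed for a replay is a one-line change in research code.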
Deliverables
What we shipped.
- ✓Tested perpetual-swap pricer with per-venue plug-ins
- ✓Replay client for backtest use
- ✓Async live client for execution use
- ✓README documenting assumptions and limitations
Outcomes.
Delivered outcomes:
- Single pricer used by all internal research notebooks
- Tested against exchange-published worked examples
- Clean separation: index source vs mark vs funding model
- Async WebSocket clients for live use, replay clients for backtest
- Documented assumptions, in plain English, in the README
Honest challenges
What we got wrong (or almost wrong).
The pretty version of any case study skips this part. We don't.
- 01
Funding rate intervals differ across venues; we modeled funding as a per-venue plug-in instead of hard-coding 8 hours.
- 02
Index sources occasionally include outliers; we added a documented MAD-based filter rather than a magic constant.
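The MAD-based filter mentioned above can be sketched like this. The threshold `k` and the function name are illustrative; the production filter's parameters are documented in the README, not reproduced here.

```python
import statistics

def mad_filter(prices: list[float], k: float = 5.0) -> list[float]:
    """Drop index-source prices more than k median-absolute-deviations
    from the median. k is a documented parameter, not a magic constant."""
    med = statistics.median(prices)
    mad = statistics.median(abs(p - med) for p in prices)
    if mad == 0:
        return list(prices)  # degenerate window: nothing to scale by
    return [p for p in prices if abs(p - med) <= k * mad]
```

MAD is preferred over standard deviation here because a single bad print inflates the standard deviation enough to mask itself, while the median absolute deviation stays stable.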
In our own words
The README was the deliverable. Future-us forgets the details; present-us writes them down with the assumptions in plain English so the next research project doesn't relitigate the same funding-rate edge cases.
From the Hayaiti team
Engineering · design · security
Technical blueprint
How the work holds together.
Buyers should see that the visual layer is backed by architecture, quality gates, and operational ownership.
Experience
1. Application
2. Data
3. Operations
4. Security
5. Stack used · 6 technologies

Related
Other cases like this.
Northwind Studios
Onboarding rebuild · Series A SaaS
Cut signup-to-aha-moment from 9 minutes to under 90 seconds.
FintechFoundry Capital
Broker API + nightly reconciliation
Stand up a reliable broker integration with nightly reconciliation in three weeks.
Quant Trading · Hayaiti Research
Cross-sectional momentum harness
Stand up a reproducible backtest harness for a momentum strategy on equities.
Want this level of production quality on your project?
Send a short brief and we'll reply with scope, timeline, price direction, and the first technical recommendation.