project 1: the “order book” aggregator
scenario: a crypto exchange (like the one you worked on at blazpay) is crashing because the frontend is polling the database for prices. the mission: build a real-time websocket ingestion engine that normalizes order book data from binance & coinbase into a unified kafka stream.
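before touching producers, the "unified" part of the mission means agreeing on one normalized message shape for both exchanges. a minimal sketch of what that record could look like — the field names (exchange, symbol, sequence, bids/asks as price/size strings) are assumptions for illustration, not a mandated schema:

```python
import json
from dataclasses import dataclass, asdict


@dataclass
class OrderBookUpdate:
    """one normalized depth update, whichever exchange produced it."""
    exchange: str                 # "binance" or "coinbase"
    symbol: str                   # normalized pair, e.g. "BTC-USD"
    sequence: int                 # exchange update id, used to enforce ordering
    timestamp_ms: int             # event time in milliseconds
    bids: list[tuple[str, str]]   # (price, size) kept as strings to avoid float drift
    asks: list[tuple[str, str]]

    def to_kafka_value(self) -> bytes:
        # serialize for the unified topic; json keeps messages easy to inspect
        return json.dumps(asdict(self)).encode("utf-8")
```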
- tech: python (asyncio), kafka, docker.
- challenge: handle connection drops and ensure strict ordering of updates.
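both halves of the challenge fit in one small loop: reconnect with exponential backoff when the socket drops, and discard any frame whose update id is not strictly newer than the last one seen. a sketch, assuming an `aiohttp` websocket client and a binance-style `"u"` (final update id) field — the field name differs per exchange:

```python
import asyncio

import aiohttp


async def consume_forever(url: str, handle, max_backoff: float = 30.0) -> None:
    """stay connected across drops and enforce strict ordering of updates."""
    backoff = 1.0
    last_seq = -1
    while True:
        try:
            async with aiohttp.ClientSession() as session:
                async with session.ws_connect(url, heartbeat=15) as ws:
                    backoff = 1.0  # connected again: reset the backoff window
                    async for msg in ws:
                        if msg.type != aiohttp.WSMsgType.TEXT:
                            continue
                        event = msg.json()
                        seq = event.get("u", -1)  # binance: "u" = final update id
                        if seq <= last_seq:
                            continue  # duplicate or stale frame: skip it to keep ordering strict
                        last_seq = seq
                        await handle(event)
        except aiohttp.ClientError:
            pass  # handshake or network failure: fall through to the backoff sleep
        await asyncio.sleep(backoff)  # the socket dropped; wait before reconnecting
        backoff = min(backoff * 2, max_backoff)
```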
- dev to prod:
  - create a python `aiohttp` script to connect to the binance websocket.
  - push raw json events to a kafka topic `raw_orders`.
  - prod requirement: dockerize the producer with a `restart: always` policy. implement “dead letter queue” logic (if kafka is down, save to a local csv buffer). a producer sketch covering these steps follows this list.
  - create a python
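putting the producer steps above together, a minimal sketch — the stream url, broker address, and csv path are placeholders, and `aiokafka` is one choice of async kafka client rather than anything the brief mandates:

```python
import asyncio
import csv

import aiohttp
from aiokafka import AIOKafkaProducer
from aiokafka.errors import KafkaError

BINANCE_WS = "wss://stream.binance.com:9443/ws/btcusdt@depth"  # placeholder stream
KAFKA_BOOTSTRAP = "kafka:9092"                                 # placeholder broker address
TOPIC = "raw_orders"
DLQ_PATH = "dlq_buffer.csv"


def write_to_dlq(payload: str) -> None:
    # dead letter queue: append the raw event to a local csv buffer when kafka is unreachable
    with open(DLQ_PATH, "a", newline="") as f:
        csv.writer(f).writerow([payload])


async def run_producer() -> None:
    producer = AIOKafkaProducer(bootstrap_servers=KAFKA_BOOTSTRAP)
    await producer.start()
    try:
        async with aiohttp.ClientSession() as session:
            async with session.ws_connect(BINANCE_WS) as ws:
                async for msg in ws:
                    if msg.type != aiohttp.WSMsgType.TEXT:
                        continue
                    raw = msg.data  # raw json event, forwarded without transformation
                    try:
                        await producer.send_and_wait(TOPIC, raw.encode("utf-8"))
                    except KafkaError:
                        write_to_dlq(raw)  # broker down: buffer locally instead of dropping
    finally:
        await producer.stop()


if __name__ == "__main__":
    asyncio.run(run_producer())
```

running this under compose with `restart: always` means docker brings the container back after any crash, while the csv buffer keeps events from being lost during a kafka outage; a separate replay job can drain the buffer back into `raw_orders` once the broker is healthy again.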