Role Overview
You will own and ship core platform systems that power how health data flows through Terra — from upstream data suppliers (wearables, health apps, sensors) into our unified schema, and out to downstream developers and AI consumers via our REST API, Streaming API, and webhooks.
What You Will Build
- Unified API & Data Normalisation: Design and extend systems that ingest raw health data from 100+ upstream providers (Garmin, Fitbit, Oura, Apple Health, Google Fit, Withings, etc.) and normalise it into Terra’s standardised schema.
- Upstream Data Supplier Connector: Architect and build Terra’s new upstream connector capability, allowing data suppliers to push data directly into the platform. Define ingestion contracts, enforce data quality SLAs, and ensure schema compliance.
- Streaming API: Scale our real-time streaming infrastructure for live sensor data (heart rate, steps, distance) from Apple Watch and BLE/ANT+ devices via WebSocket listeners. Sub-second latency matters.
- Authentication & Developer Experience: Own the authentication layer for both upstream suppliers and downstream developers. Build OAuth flows, API key management, and RBAC + least privilege controls.
- Webhooks & Event Delivery: Design and harden the webhook delivery system that pushes health data events to downstream consumers. Guaranteed delivery, retry logic, at scale.
- Teams API: Contribute to our Teams API — enabling coaches, researchers, and organisations to aggregate and analyse group data across athletes, patients, and study participants.
- Health Scores & Derived Data: Build backend systems that transform raw health metrics into validated scores (recovery, strain, stress, immunity) backed by research.
- Infrastructure & Reliability: Maintain 99.5%+ uptime on a platform that processes billions of events. Design for fault tolerance, observability, and horizontal scale.
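The normalisation work described above can be sketched as a set of per-provider adapters that map raw payloads into one unified record type. This is a minimal illustration only: the field names in the raw payloads and the `UnifiedHeartRate` shape are hypothetical, and Terra's real schema is far richer.

```python
from dataclasses import dataclass
from typing import Any, Callable

# Hypothetical simplified unified record; the real Terra schema differs.
@dataclass
class UnifiedHeartRate:
    provider: str
    user_id: str
    timestamp: str  # ISO 8601
    bpm: float

# One small adapter per provider maps its raw payload into the
# unified shape; the raw field names here are illustrative.
def from_fitbit(raw: dict[str, Any]) -> UnifiedHeartRate:
    return UnifiedHeartRate(
        provider="fitbit",
        user_id=raw["userId"],
        timestamp=raw["dateTime"],
        bpm=float(raw["value"]["bpm"]),
    )

def from_garmin(raw: dict[str, Any]) -> UnifiedHeartRate:
    return UnifiedHeartRate(
        provider="garmin",
        user_id=raw["owner"],
        timestamp=raw["startTimeLocal"],
        bpm=float(raw["heartRate"]),
    )

# Registry keyed by provider name; adding a provider means adding
# one adapter function, leaving downstream consumers untouched.
ADAPTERS: dict[str, Callable[[dict[str, Any]], UnifiedHeartRate]] = {
    "fitbit": from_fitbit,
    "garmin": from_garmin,
}

def normalise(provider: str, raw: dict[str, Any]) -> UnifiedHeartRate:
    return ADAPTERS[provider](raw)
```

The design choice worth noting is that all provider-specific knowledge lives at the edge, in the adapters; everything downstream (streaming, webhooks, derived scores) sees only the unified schema.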
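The webhook delivery guarantees mentioned above usually rest on retries with exponential backoff. A minimal sketch, assuming a generic `send` callable that returns an HTTP status code (not Terra's actual delivery code):

```python
import random
import time

def deliver_with_retries(send, event, max_attempts=5, base_delay=0.5):
    """Attempt webhook delivery with exponential backoff and jitter.

    `send` is any callable taking the event and returning an HTTP-like
    status code; any 2xx response counts as delivered. In a production
    system, events that exhaust their retries would be parked in a
    dead-letter queue for later redelivery rather than dropped.
    """
    for attempt in range(max_attempts):
        status = send(event)
        if 200 <= status < 300:
            return True
        # Exponential backoff (0.5s, 1s, 2s, ...) with jitter, so a
        # consumer outage does not trigger a synchronized retry storm.
        delay = base_delay * (2 ** attempt) * random.uniform(0.5, 1.0)
        time.sleep(delay)
    return False
```

The jitter factor is the important detail: without it, every undelivered event retries on the same schedule, hammering a recovering consumer at the worst possible moment.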
Qualifications
- 5+ years of production backend engineering; shipped systems that handle real traffic at scale.
- Deep API design experience: you have built REST APIs for external developers and understand versioning, rate limiting, pagination, and error handling.
- Strong with distributed systems: message queues, event-driven architectures, stream processing, data pipelines.
- Proficiency in Python and/or Go; deeply fluent in at least one.
- Database expertise: PostgreSQL, Redis, time-series stores; you know when to use each and how to make it fast.
- Cloud-native mindset: AWS/GCP, containers, CI/CD, IaC; robust DevOps skills.
- Security-first thinking: encryption in transit/at rest (TLS 1.2+, AES‑256); data minimisation; GDPR, HIPAA, SOC 2 built-in.
- Autonomous and fast: high velocity, high standards; you see a problem, own it, and ship the fix.
Bonus
Being an athlete or having a strong passion for fitness data analytics is a plus.
