Role Overview
We are looking for an ambitious, curious and versatile Senior Data Engineer to drive business impact through a robust and flexible data platform. Our ideal future teammate cares deeply about delivering a world‑class, reliable data platform and developer experience that supercharges the work of analysts, analytics engineers and data scientists across the company.
Responsibilities
- Data platform ownership and vision: define and execute the technical roadmap for our core data platform, ensuring the highest levels of reliability, observability, data quality and flexibility.
- Build for self‑service: design and implement tooling that empowers Analytics Engineers and Data Scientists to build their own pipelines and models safely and autonomously, removing yourself as a bottleneck.
- Leverage our data: drive business impact within your squad through sophisticated ways of operationalising data in our products.
- Raise the bar: lead through influence and collaborate with fellow Data team members to grow and improve our analytics and machine learning & AI platform and set technical standards across the organisation.
Your profile
- 5+ years of experience in a similar data engineering, analytics engineering or backend engineering role. You have a proven track record of building and scaling data systems in production environments.
- You are highly proficient in Python and write maintainable, testable, and efficient code. You are also a SQL expert who understands how to optimise complex queries and design resilient data models.
- Experience or a deep interest in managing “Data Infrastructure as Code.” We lean heavily on open‑source solutions like Dagster and self‑host on GCP via Kubernetes.
- You are comfortable within the Google Cloud (GCP) ecosystem (or similar) and have experience with data warehouses like BigQuery and with cloud storage patterns.
- You advocate for CI/CD, automated testing, and observability. You believe that a data platform should be treated with the same discipline as a core product.
- A clear and bold communicator who knows how to “scale themselves” through precise documentation and written guides while also enjoying direct collaboration across squads.
Technologies we use
- Programming languages: Python, SQL.
- Core data platform tools and frameworks: Dagster, dlt, Airbyte, dbt, data‑diff, Elementary.
- Data lake and warehouse: BigQuery, Iceberg + BigLake.
- Infrastructure: Cloud Run, Cloud Storage, GKE (Google Kubernetes Engine), Terraform.
How we reward our team
- Dynamic hybrid working environment with a diverse and driven team.
- Huge opportunity for learning in a high-growth environment, with progression opportunities based on success in the role.
- 25 days of holiday allowance plus public holidays.
- 1 birthday day off plus 2 tenure-based additional days off.
- Subsidised Private Medical Insurance including dental, vision & mental health therapy.
- Bi‑annual performance reviews and tailored development plans.
- Competitive salary + EMI options scheme.
- Annual compensation review.
- Team lunch provided once a week.
- Quarterly team socials and annual sports day.
- Enhanced maternity/paternity/adoption policy as a day-one right.
- Community volunteer days.
- Cycle to work scheme.
- Dog friendly office and depots.
- MacBook Air or Pro (depending on your preference).
