La Fosse is currently recruiting a contract Data Engineer (12 months) on behalf of a global financial services company.
They require an engineer with extensive experience building and maintaining Data Lakehouses in Azure. The ideal candidate will have a strong background in both Scala and Python, with specific experience building high-performance, scalable pipelines in Databricks with Spark.
The new data platform will serve a range of end users in the Capital Markets division, so experience across areas such as derivatives, risk, PnL, trade lifecycle, and market data is highly desirable.
Key Responsibilities
- Engineer robust, scalable, and efficient data pipelines using Spark/Databricks
- Demonstrate best practices in the design & development of data pipelines
- Data modelling on the medallion architecture, ensuring the schema is intuitive and serves business users' analytical and reporting objectives
What you’ll need
- Expert in building data pipelines in Databricks with Spark (preferably on Azure)
- Financial Services sector knowledge (ideally covering trading, derivatives, risk, and trade lifecycle)
- Expert-level programming in Python (Scala is also highly desirable)
- Expert data modelling experience on the medallion architecture
Hybrid model: 1 day per week in City of London
Duration: 12 months
IR35 – Outside
Start date: April 2026
