A leading global investment management firm is seeking Senior Java Engineers (Contract) to join its Market Data Platform team on a high-impact engagement. This is a critical hire, with the opportunity to step into a highly visible area of the business where performance, scale and reliability truly matter.
This team sits at the heart of a systematic trading environment, responsible for ingesting and processing vast volumes of real-time market data through direct exchange connectivity. The platform operates at extreme scale – handling 15–20 billion data points per day, with peak rates of over 1 million events per second – and accuracy is non-negotiable.
This opportunity would suit an engineer with deep Java expertise who has worked on complete data platforms, particularly those handling real-time or market data using Kafka.
The Role
The successful candidate will be responsible for designing, developing and optimising components of a large-scale data platform used across the investment business. Working in a highly technical environment, they will:
- Build and evolve end-to-end data platforms supporting real-time and historical data use cases
- Develop primarily in Java
- Work extensively with Kafka and streaming architectures to handle real-time data flows
- Contribute to systems processing billions of events per day, with peak throughput in the millions per second
- Improve and modernise existing components, with a strong focus on latency, scalability and resilience
- Partner closely with data management teams, quantitative researchers and other engineering groups
- Take ownership of production systems, from requirements gathering through to deployment and optimisation
Required Experience
- Extensive commercial Java development experience, particularly on backend or data-intensive systems
- Demonstrable experience building or contributing to complete data platforms
- Hands-on experience working with real-time data, ideally using Kafka
- Background in market data, trading systems or other time-sensitive data domains is highly desirable
- Solid experience working on Linux and strong understanding of Git
- Experience with databases or data storage technologies such as MongoDB, Postgres, Iceberg or similar
- Comfortable working in fast-paced, performance-critical environments
- Strong problem-solving skills, attention to detail and ability to work autonomously
Advantageous to Have
- Experience with distributed systems and large-scale data processing
- Performance tuning and optimisation of low-latency systems
- Exposure to orchestration frameworks and modern data infrastructure
- Contributions to open-source projects
- Experience working with or around Large Language Models (LLMs)
Contract Details
- Rate: £500–575 per day (Inside IR35 – some flexibility for the right candidate)
- Location: Hybrid – 3 days per week onsite in the City of London
- Start: ASAP
Interview Process
- Screening call with Mike
- 30-minute introductory discussion with the hiring manager
- 90-minute final-stage onsite interview (60 minutes technical, 30 minutes with leadership)
