Data Architect

Company: Bounteous
Location: Greater London
Job Description:

Ideal Background

  • Financial services background (banking, payments, capital markets)
  • Mix of mainframe and modern technology experience
  • Track record of modernizing legacy systems
  • Production support and incident management experience

Must-Have

  • Change Data Capture: CDC design and operations (IBM, Precisely, or equivalent); subscription management, bookmarks, replay, backfill.
  • Db2 & z/OS knowledge: Db2 catalog, z/OS fundamentals, batch windows, performance considerations.
  • Integration patterns: Kafka/MSK hands-on, CDC-to-target pipelines, UPSERT/MERGE logic; Python/SQL; strong troubleshooting.
  • Data quality mindset: Write validation tests before migration; golden-source reconciliation.
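The UPSERT/MERGE and replay requirements above can be sketched in a few lines. This is a minimal, hypothetical example (SQLite standing in for the real target; the `accounts` table, event shape, and `version` guard are illustrative assumptions, not from the posting) showing why idempotent apply logic makes CDC replay and backfill safe:

```python
import sqlite3

# Hypothetical sketch: applying CDC change events idempotently via UPSERT.
# Table name, event shape, and the "version" column are illustrative
# assumptions; a real pipeline would target Postgres/Kafka sinks, not SQLite.

def apply_event(conn, event):
    """Apply one CDC event; safe to replay because the UPSERT is idempotent
    and stale or duplicate events are rejected by the version guard."""
    if event["op"] in ("insert", "update"):
        conn.execute(
            """
            INSERT INTO accounts (id, balance, version)
            VALUES (:id, :balance, :version)
            ON CONFLICT(id) DO UPDATE SET
                balance = excluded.balance,
                version = excluded.version
            WHERE excluded.version > accounts.version
            """,
            event,
        )
    elif event["op"] == "delete":
        conn.execute("DELETE FROM accounts WHERE id = :id", event)

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL, version INTEGER)"
)

events = [
    {"op": "insert", "id": 1, "balance": 100.0, "version": 1},
    {"op": "update", "id": 1, "balance": 250.0, "version": 2},
    {"op": "update", "id": 1, "balance": 250.0, "version": 2},  # replayed duplicate
]
for e in events:
    apply_event(conn, e)

row = conn.execute("SELECT balance, version FROM accounts WHERE id = 1").fetchone()
```

Replaying the duplicate event leaves the row unchanged, which is the property that lets a CDC subscription be rewound to an earlier bookmark without corrupting the target.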

Data Architecture Fundamentals (Must-Have)

  • Logical data modeling: Entity-relationship diagrams, normalization (1NF through Boyce-Codd/BCNF), denormalization trade-offs; identify functional dependencies and anomalies.
  • Physical data modeling: Table design, partitioning strategies, indexes; SCD types; dimensional vs. transactional schemas; storage patterns for OLTP vs. analytics.
  • Normalization & design: Normalize to 3NF/BCNF for transactional systems; know when to denormalize for query performance; trade-offs between 3NF, Data Vault, and star schemas.
  • Domain-Driven Design: Bounded contexts and subdomains; aggregates and aggregate roots; entities vs. value objects; repository patterns; ubiquitous language.
  • Event-driven architecture: Domain events and contracts; CDC as event streams; idempotency and replay patterns; mapping Db2 transactions to event-driven architectures; saga orchestration.
  • CQRS patterns: Command/query separation; event sourcing and state reconstruction; eventual consistency; when CQRS is justified for mainframe migration vs. overkill.
  • Database internals: Index structures (B-tree, bitmap, etc.), query planning, partitioning strategies; how Db2 vs. PostgreSQL differ in storage and execution.
  • Data quality & validation: Designing test suites for schema conformance; referential integrity checks; sampling and reconciliation strategies.
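The validation and reconciliation bullet above can be illustrated with a small sketch. This is a hypothetical example (the record layout, key position, and helper names are assumptions for illustration) of golden-source reconciliation: compare per-row content fingerprints between a source extract and the migrated target to find missing and drifted rows:

```python
import hashlib

# Hypothetical sketch of golden-source reconciliation: compare per-row
# content hashes between a source extract and a migrated target.
# The record layout and helper names are illustrative assumptions.

def row_fingerprint(row):
    """Stable hash of a row's fields (order-sensitive)."""
    joined = "|".join(str(v) for v in row)
    return hashlib.sha256(joined.encode()).hexdigest()

def reconcile(source_rows, target_rows, key_index=0):
    """Return keys missing from the target and keys whose content differs."""
    src = {r[key_index]: row_fingerprint(r) for r in source_rows}
    tgt = {r[key_index]: row_fingerprint(r) for r in target_rows}
    missing = sorted(k for k in src if k not in tgt)
    mismatched = sorted(k for k in src if k in tgt and src[k] != tgt[k])
    return missing, mismatched

source = [(1, "GBP", 100.0), (2, "USD", 55.5), (3, "EUR", 9.99)]
target = [(1, "GBP", 100.0), (3, "EUR", 10.0)]  # row 2 missing, row 3 drifted

missing, mismatched = reconcile(source, target)
```

In practice the same idea scales by hashing partitions rather than individual rows, then drilling into only the partitions whose aggregate hashes disagree.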

Nice-to-Have

None listed.


Posted: April 11th, 2026