Data Test Engineer - (Banking, Kafka, MongoDB, API, Migration)
Rate: Competitive (Inside IR35)
Contract Start: Immediate
Contract End: December 2026
Location: London (hybrid, 2-3 days per week on site)
** Must have experience within the Banking industry **
We are seeking a skilled Data Test Engineer to join our team responsible for validating the quality, reliability, and performance of Direct-to-Consumer (D2C) Reporting APIs delivered on the Operational Data Store (ODS) platform. This role is critical in ensuring robust and accurate data processing through comprehensive testing of Java-based APIs, event-driven streaming pipelines, and data persistence layers.
Key Responsibilities
- Design, develop, and execute functional, integration, and non-functional tests for D2C Reporting APIs.
- Validate data accuracy and event flows within event-driven streaming pipelines using Kafka, Flink, and Kafka Streams.
- Test data persistence and integrity in MongoDB databases.
- Implement automated testing frameworks and quality controls to support production readiness.
- Collaborate with development and operations teams to ensure seamless deployment and performance on the OpenShift platform.
- Identify, document, and track defects and issues, ensuring timely resolution.
- Contribute to continuous improvement of testing processes and tools.
Required Skills and Experience
- Must have proven experience in the banking industry.
- Strong experience testing Java-based APIs.
- Strong knowledge of event-driven streaming technologies such as Kafka, Flink, and Kafka Streams.
- Experience with data persistence testing in MongoDB.
- Familiarity with automated testing tools and frameworks.
- Experience working with containerized environments, preferably OpenShift or Kubernetes.
- Ability to design and execute comprehensive test plans covering functional, integration, and non-functional aspects.
- Strong analytical skills with attention to detail in validating data accuracy and event flows.
- Excellent communication and collaboration skills.
#LI-Hybrid