- Job Reference: hbuk/TP/158/2489
- Contract Type: Day Rate Contractor
- Closing Date: 20 May 2026
- Job Category: Tech, Data and Digital
- Business Unit: Systems and Infrastructure (UKT)
- Location: London, United Kingdom
- Posted on: 20 April 2026
We are looking for a Data Engineer to design, build and maintain a scalable on-premises data warehouse using modern data engineering practices. You will work with Python, Apache Airflow and dbt in a Unix/Linux environment to develop robust data pipelines, transform data into curated models and support analytics and reporting.
This role requires someone comfortable working in a non-cloud (on-premises) environment and taking strong ownership of infrastructure-aware data solutions.
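Day to day, this typically means orchestrating dbt transformations from Airflow on our own infrastructure. The snippet below is a purely illustrative sketch of that pattern, not our actual code: the DAG name, schedule and project path are hypothetical, and it assumes Airflow 2.4+ with the dbt CLI available on the worker.

```python
# Illustrative sketch only: a minimal Airflow DAG that builds and tests
# dbt models on a nightly schedule. All names and paths are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

DBT_PROJECT_DIR = "/opt/dbt/warehouse"  # hypothetical on-prem project path

with DAG(
    dag_id="nightly_warehouse_refresh",  # hypothetical DAG name
    start_date=datetime(2026, 1, 1),
    schedule="0 2 * * *",                # refresh at 02:00 each night
    catchup=False,
    tags=["dbt", "warehouse"],
):
    # Build the curated models (e.g. staging and mart layers).
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command=f"dbt run --project-dir {DBT_PROJECT_DIR}",
    )

    # Run dbt's schema and data tests against the freshly built models.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command=f"dbt test --project-dir {DBT_PROJECT_DIR}",
    )

    dbt_run >> dbt_test
```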
Main Responsibilities
- Design and build end-to-end ETL/ELT pipelines using Python and SQL.
- Develop and orchestrate workflows using Apache Airflow.
- Implement transformation logic and data models using dbt (medallion/star schema where applicable).
- Work in a Unix/Linux environment for scheduling, scripting and deployment.
- Maintain CI/CD pipelines and version control.
- Translate complex business requirements into clear technical specifications.
- Administer, maintain, and provide support for data analytics platforms.
- Build and maintain solutions across the Microsoft BI stack (SSRS, SSIS, SSAS, T-SQL).
- Perform unit testing and resolve issues through effective troubleshooting (see the illustrative sketch after this list).
- Collaborate with cross-functional teams to ensure seamless system integration.
- Deliver insightful visualisations and reports to support business projects.
- Create, maintain, and update comprehensive technical documentation.
- Operate in line with the Bank's Risk Management framework (including sub-frameworks) and relevant risk and compliance policies and procedures, actively promoting adherence to them and ensuring appropriate and timely escalation of any concerns to your line manager or relevant senior stakeholders.
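Unit testing in this stack usually means fast pytest-style checks on the small, pure transformation functions that pipeline tasks call, as sketched below. The function and validation rules shown are hypothetical examples, not taken from an existing codebase.

```python
# Illustrative sketch only: pytest-style unit tests for a small, pure
# transformation helper of the kind a pipeline task might call.
# The function name and validation rules are hypothetical.
import pytest


def normalise_account_id(raw: str) -> str:
    """Strip whitespace and upper-case an account identifier."""
    if not raw or not raw.strip():
        raise ValueError("account id must not be empty")
    return raw.strip().upper()


def test_normalise_account_id_strips_and_uppercases():
    assert normalise_account_id("  gb29nwbk  ") == "GB29NWBK"


def test_normalise_account_id_rejects_empty_values():
    with pytest.raises(ValueError):
        normalise_account_id("   ")
```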
Ideal Candidate
Research (by Harvard University) shows that women are particularly likely to second-guess themselves and not apply - so if you are worried you don't meet all the criteria, get in touch anyhow and let us do the worrying…
- Strong programming skills in Python (data pipelines, APIs and scripting).
- Hands-on experience with Apache Airflow, including configuring, maintaining and optimising DAGs.
- Strong experience with dbt (models, macros, tests and documentation).
- Exposure to containerisation (Docker).
- Knowledge and experience of ETL/ELT, data warehousing, BI and cloud platforms.
- Advanced SQL (T-SQL or PL/SQL).
- Good understanding of Linux/Unix systems.
- Experience designing, developing and maintaining solutions using the Microsoft BI stack (SSRS, SSIS, SSAS, T-SQL).
- Experience with agile delivery, test automation and CI/CD.
- Skilled at developing dashboards and data visualisations in tools such as Power BI, Tableau or Qlik.