AWS Engineer at Scrumconnect Consulting, Newcastle, £400-£450 per day

Contract Description

  • Date Opened: 28/04/2026
  • Job Type: Contract

Job Description

About Scrumconnect Consulting
Scrumconnect Consulting is a multi-award-winning digital consultancy, recognised for delivering impactful and innovative technology solutions across UK government departments. Our work has positively influenced the lives of over 40 million UK citizens.

We are passionate about user-centred design, agile delivery, and building digital services that make a real difference. Our teams work at the forefront of innovation, helping organisations transform and deliver high-quality, scalable solutions that truly matter.


Overview:

We are looking for a highly skilled AWS DevOps / Data Engineer to design, build, and maintain scalable, resilient, and high-performing cloud-based data solutions. The ideal candidate will combine strong DevOps expertise with data engineering capability, working across modern cloud technologies and automation frameworks.


Key Responsibilities:

  • Translate technical requirements into effective DevOps toolchains to enable product delivery

  • Design and implement resilient, scalable, and highly available services

  • Automate infrastructure, deployments, and testing using Infrastructure as Code (IaC)

  • Ensure deployment strategies are repeatable, scalable, and reliable

  • Provide technical leadership, mentoring, and coaching to junior team members

  • Support delivery teams by troubleshooting and resolving complex cloud-related issues

  • Work with data pipelines and processing frameworks to ensure data quality and reliability

  • Perform data analysis to identify and resolve root causes of issues

  • Collaborate with cross-functional teams to deliver end-to-end solutions

Key Skills & Experience:

  • Strong expertise in AWS cloud services, including:
    CloudWatch, IAM, S3, Glue, ECR, EC2, EMR, DynamoDB, Lake Formation

  • Experience with Infrastructure as Code (Terraform)

  • Proficiency in Python, SQL, and familiarity with PySpark

  • Experience with Apache Spark and data processing frameworks

  • Hands-on experience with Apache Airflow for workflow orchestration

  • Experience using Jupyter Notebooks and/or Amazon Athena for data querying and validation

  • Strong understanding of EMR and log analysis

  • Experience with GitLab for CI/CD pipelines, release tagging, and version control

  • Knowledge of Docker and containerisation

  • Understanding of encryption techniques (server-side and client-side)

  • Familiarity with Amazon Textract and Comprehend

  • Strong understanding of data modelling concepts, including dimensional models and slowly changing dimensions

Additional Skills:

  • Ability to translate customer requirements into functional technical solutions

  • Familiarity with engineering best practices and standards

  • Strong understanding of data structures and solution design principles

  • Experience working in agile delivery environments