Data Architect at Enterprise Blueprints, Manchester/Remote, 6 Months, £Contract Rate

6 Months or more | Information Technology

Contract Description

Enterprise Blueprints is a boutique IT architecture consultancy formed by architects and focused on architecture. Our passion for architecture has driven our growth over 15 years and led to our acquisition by Bain & Company at the beginning of 2023, retaining our culture and collaboration whilst opening even more doors to exciting projects and opportunities in the IT architecture space.


Our culture is built on our values of contribution and trusted relationships; this applies to our team and our clients. We have an inclusive culture which encourages ideas, input, and collaboration; we are keen for you to bring your skills, experience, and personality to the team and the engagements you are a part of.


We work with architects who want to make an impact. When you join EB, you join an organisation that values your entrepreneurial spirit and your contribution to Enterprise Blueprints as well as to our clients. You have the chance to do more than just deliver: there are opportunities to carve a niche for yourself, add value, use your experience, develop your skills, and define your career trajectory.


About the role:

We are seeking an experienced Data Architect to support a large-scale transformation programme within the financial services industry. This contract role offers the opportunity to work on complex data architecture, pipeline design, and real-time data processing within a cloud-based environment.


  • 6-month engagement
  • Inside IR35
  • Hybrid working - a mixture of remote and office presence required in Manchester


Required Skills & Experience:

  • Proven experience as a Data Architect within the financial services industry.
  • Strong expertise in data warehousing, schema design, and data modelling.
  • Hands-on experience with time series data ingestion and event-driven architectures.
  • Proficiency in data pipeline frameworks (ETL, ELT) and cloud-based data transformation.
  • Extensive knowledge of Google Cloud Platform (GCP), particularly BigQuery.
  • Experience with Apache Kafka for real-time data streaming and event processing.