Data Architect at Defaqto, Haddenham/Remote, 3 Months initial, £Day Rate Option

Contract Description

About Defaqto

We are one of the UK’s most trusted providers of independent financial information, ratings and market intelligence, helping consumers, advisers and financial institutions make smarter financial decisions. We maintain the UK’s largest financial product database and deliver expert, unbiased product ratings - including our industry‑recognised Star Ratings - which assess product quality across insurance, banking, investments, pensions and more.
  • This role can be filled via fixed term contract or via daily rate (Inside IR35) via umbrella company.
  • Initially 3 months, with the potential for extension.

Role Overview:

This is a focused, high-value engagement to lead the data architecture of Defaqto's redesigned market data model. We have an initial view of the logical structure we need, and we are looking for an experienced architect to test, challenge, and refine that thinking - then translate it into a signed-off logical model and implementation recommendations for the CTO and Head of Data to act on.

What you'll do

Context:

We have developed an initial view of the logical architecture - centred on four core entities: Package, Component, Attribute, and Market - but this is a starting point, not a fixed brief. We are looking for an architect who will interrogate that thinking, validate it against real business requirements, and produce a model that genuinely fits how our data needs to work. The initial scope is the market data product estate and the primary research data platform.

Technology choices for physical implementation have not been finalised. Google BigQuery is our existing lakehouse platform and the likely foundation, but decisions on the transformation layer, tooling, and overall implementation approach will be made during this engagement in collaboration with the CTO and Head of Data. The logical model itself is a technology-agnostic deliverable; implementation recommendations are expected alongside it.

The Engagement:

This contract covers the design and architecture phase of the data model project (Phase 1 of a phased delivery programme). The primary output is a signed-off logical data model and a technology implementation recommendation, produced in close collaboration with the CTO and Head of Data. The implementation recommendation should be grounded in Defaqto's existing technology landscape and make a clear case for the chosen approach.

A Senior Data Engineer (a permanent hire, to be recruited) will own the build and ongoing implementation once the architecture is agreed. The architect's role is to design the model, recommend the implementation approach, and provide sufficient documentation that the engineering team can execute without ongoing dependency on the contractor.

Scope & Constraints:

In Scope
  • Logical model design for the market data product estate - core entities, relationships, and attributes
  • Technology implementation recommendation - physical implementation approach suited to the existing stack, presented to CTO and Head of Data
  • Compatibility view specification for research platform continuity during transition
  • Business rule formalisation for deduplication, ratings hierarchy, and attribute priority logic
  • Stakeholder workshops and sign-off facilitation
  • Data dictionary and handover documentation
Out of Scope
  • Transformation layer build and implementation (owned by Senior Data Engineer)
  • Research platform frontend adaptation (Phase 2 workstream)
  • Full estate migration beyond the initial market data scope (Phase 2+)
  • Ongoing data governance or catalogue ownership
  • Graph database design or implementation

What you'll need to succeed:

Essential requirements
  • Demonstrable experience leading logical and physical data model design for analytical data warehouses
  • Strong command of entity-relationship modelling, normalisation, and dimensional design
  • Experience with cloud data warehouse platforms - Google BigQuery is the existing lakehouse technology; hands-on experience with BigQuery or equivalent (Snowflake, Redshift, Databricks) is expected
  • Ability to produce technology-agnostic logical models and translate them into well-reasoned implementation recommendations
  • Proven ability to facilitate workshops with mixed technical and non-technical stakeholders and produce signed-off artefacts
  • Experience producing data dictionaries and technical documentation to a standard usable by engineering teams
  • Comfortable presenting architecture options and trade-offs to senior technology leadership
  • Ability to work independently and manage delivery to milestone-based timelines

Desirable requirements
  • Experience in financial services, insurance, or fintech data domains
  • Familiarity with dbt - sufficient to design a schema that a dbt-based transformation layer can implement effectively
  • Experience designing compatibility or migration layers for legacy platform transitions
  • Exposure to data cataloguing tools (Google Dataplex, DataHub, Atlan)
  • Understanding of product data or market intelligence data structures
  • Experience working within a phased, non-big-bang migration approach

Important to know:

Location:
This is a hybrid working role, usually with 2 days each week in the office - the office base can be London or Haddenham. From time to time, for example when workshops are scheduled, you may need to be in the office more frequently.