Job Description:
Location: Lisburn
Contract Length: 3 Months (2-3 days per week)
Rate: £450.00 per day
Role:
OKTO is seeking a Senior Data Engineer to lead the architecture and implementation of a mission-critical digital operations system across several major projects. The role is foundational to our mission: connecting diverse system endpoints such as power, airflow, heating, medical gas and fire alarms into a centralised, resilient, cloud-based infrastructure.
You will be responsible for designing the end-to-end data lifecycle, from high-frequency time-series ingestion to the creation of a “Master Dashboard” that ensures data from critical systems remains accurate and trustworthy.
Responsibilities:
- Data architecture & modelling: Design and implement a scalable architecture on Microsoft Azure to organise raw system data into curated, reporting-ready layers.
- Operational system integration: Build robust pipelines to ingest and process high-frequency time-series and IoT-like data from critical infrastructure.
- Cloud database development: Develop and manage the foundational cloud database using Azure Databricks, Synapse and Data Lake to serve as the single source of truth.
- Real-time processing: Implement and optimise streaming workloads using Kafka, Azure Event Hubs and Spark Streaming to ensure low-latency data availability for life-critical monitoring.
- Governance & compliance: Enforce strict data quality frameworks and maintain PII/PHI compliance, ensuring the integrity of sensitive healthcare and operational data.
- Visualisation & reporting: Collaborate with stakeholders to design and deploy the “Master Dashboard” using Power BI, utilising DAX and semantic models for actionable insights.
Requirements:
- Azure mastery: Deep expertise in the Azure ecosystem, specifically Databricks, Azure Data Factory (ADF), Synapse and Unity Catalog.
- Data engineering: Proficiency in Python, PySpark and Spark SQL for ETL/ELT pipeline development.
- Time-series expertise: Proven experience handling real-time, high-frequency data feeds (e.g. aviation, sensor or industrial data).
- Schema design: Advanced knowledge of Star and Snowflake Schemas and Lakehouse architectures.
- Quality assurance: Hands-on experience with data validation tools like Great Expectations and observability platforms like Grafana.
- DevOps: Familiarity with CI/CD processes using Azure DevOps or GitHub Actions.
Preferred Qualifications:
- 6+ years of experience in data engineering with a focus on resilient, high-availability systems.
- Strong understanding of security protocols, including RBAC and row-level security and data masking.
- Ability to work closely with AI/ML teams to support predictive maintenance or digital data products.
Job Type: Fixed term contract
Contract length: 3 months
Pay: £500.00 per day
Experience:
- Data Engineering: 6 years (required)
Work authorisation:
- United Kingdom (required)
Work Location: On the road