SC Cleared Senior Data Engineer - Microsoft Fabric/Azure

  • Layer7
  • 05/02/2026
Contractor | Information Technology | Telecommunications

Job Description

Location: London/Hybrid (2-3 days on-site per week, remainder remote)
Duration: 6 months + extension
Clearance Requirement: Active SC (or SC-clearable with BPSS as a minimum)
Rate: £500 per day (Inside IR35)

SFIA 4

Overview
We're seeking an SC-cleared Senior Data Engineer to design, build, and operate data solutions that power mission-critical analytics within a complex UK public-sector environment.

You'll lead on scalable, production-grade data pipelines on Microsoft Fabric (OneLake/Delta Lake, Data Factory, Synapse Data Engineering), using PySpark, Spark SQL, Python and SQL. You'll help modernise legacy estates, mentor other engineers, and turn raw data into reliable, secure, and actionable intelligence for high-stakes decision-makers.
This role is aligned to SFIA Level 4 (Enable) - ideal for an engineer who is hands-on, collaborative, and comfortable shaping standards and delivery across multiple squads.

What you'll do

  • Engineer production-grade data pipelines on Microsoft Fabric (OneLake/Delta Lake, Data Factory, Synapse Data Engineering notebooks) using PySpark/Spark SQL/Python and SQL, with a strong focus on:
    • Performance & scalability
    • Resilience & reliability
    • Testing & data quality
    • Monitoring & observability
  • Support reporting & MI use cases, delivering robust transformations and data models that feed downstream tools such as Power BI and other analytics platforms.
  • Own CI/CD and DevOps practices:
    • Build and maintain data pipeline CI/CD and Infrastructure as Code (eg Terraform)
    • Manage automated testing and release governance
    • Use Git/GitLab for version control, code review and branching strategies
  • Lead and mentor engineers:
    • Provide technical guidance, code reviews, and best-practice patterns
    • Contribute to architecture decisions and standards across squads
    • Help uplift engineering capability across the wider data function
  • Work in an Agile delivery environment:
    • Collaborate with product, data, and platform teams using Jira/Confluence
    • Refine and translate requirements into robust engineering tasks and user stories
    • Communicate clearly with both technical and non-technical stakeholders
  • Embed security and compliance by design:
    • Ensure solutions meet BPSS/SC constraints and departmental data-handling policies
    • Build secure, auditable, and compliant data flows suitable for UK government workloads

Essential skills & experience

  • Microsoft Fabric & Azure data engineering
    • Hands-on experience with Microsoft Fabric (OneLake/Delta Lake, Data Factory, Synapse Data Engineering notebooks)
    • Strong PySpark, Spark SQL, Python and SQL skills for large-scale batch processing and transformations
  • Data engineering at scale in complex domains
    • Proven track record delivering in government or similarly complex, regulated environments
    • Performance tuning, optimisation, and proactive data-quality management
  • CI/CD & DevOps
    • Building and maintaining CI/CD pipelines for data workloads
    • Exposure to Infrastructure as Code (eg Terraform)
    • Automated testing and controlled release processes
  • Version control & collaboration
    • Strong experience with Git/GitLab, including:
      • Branching strategies (eg trunk-based, feature branches)
      • Pull/merge request workflows
      • Peer code review and approval processes
  • APIs & integration
    • Designing, building, and consuming APIs/data services
    • Safely and reliably moving and exposing data across platforms and domains
  • Agile delivery & documentation
    • Working in Agile teams with Jira and Confluence
    • Writing clear technical documentation and communicating complex ideas concisely
  • Security clearance
    • BPSS (minimum) and SC-cleared or SC-clearable for UK government work

Desirable

  • Data warehousing & modelling
    • Dimensional data modelling and modern data-warehouse concepts
    • Experience with tools such as dbt for transformation and modelling
  • Power BI familiarity
    • Ability to work with BI developers and validate end-to-end data flows into Power BI
    • Understanding of how data models enable self-service MI and reporting
  • Certifications (nice to have)
    • Microsoft Fabric Associate Data Engineer (or higher)
    • Azure AI Fundamentals or similar, with awareness of ML/AI services and patterns