Senior Data Engineer

  • Gigaclear
  • Shippon, Oxfordshire
  • 24/04/2026
Full time · Information Technology · Telecommunications · SQL · Python · Testing · CRM

Job Description

As a Senior Data Engineer within the Data Engineering team, you will play a key role in building, enhancing, and maintaining our enterprise data platform on Snowflake. You will develop and optimise scalable data pipelines and models that bring data from core business systems into Snowflake, enabling analytics, reporting, and data-driven insights across the organisation.

You will translate the data platform strategy into high-quality technical solutions, ensuring our Snowflake environment is reliable, well-structured, and performant. You will champion engineering best practices and contribute to standards that improve the quality, consistency, and usability of data assets.

Your work will ensure the business has access to trusted, timely, and well-modelled data to support decision-making, operational reporting, and the foundations for advanced analytics and future AI/ML capabilities.

Key Accountabilities & Responsibilities

Snowflake Data Engineering Delivery
Design, build, and maintain high-quality data pipelines and models in Snowflake to support business analytics, BI, and operational reporting needs.

Data Architecture Implementation
Translate the defined data architecture and standards into implemented solutions including ingestion, transformation, storage, and performance optimisation.

Pipeline Development & Orchestration
Develop robust ELT/ETL pipelines using dbt and workflow/orchestration tools (e.g., Argo Workflows), ensuring reliability, maintainability, and adherence to engineering best practices.

Performance & Cost Optimisation
Implement Snowflake warehouse configurations and query optimisation techniques to ensure efficient usage and predictable cost.
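The cost side of this responsibility comes down to simple arithmetic over Snowflake's per-size credit rates. A minimal sketch of that reasoning is below; the rates and the 60-second billing minimum reflect Snowflake's published pricing model, but treat them as assumptions to verify against current Snowflake documentation, and the function name is illustrative only.

```python
# Rough Snowflake warehouse credit estimator: a back-of-the-envelope sketch
# of the cost reasoning behind warehouse right-sizing. Credit rates double
# with each size step (X-Small = 1 credit/hour); billing is per second,
# with a 60-second minimum each time the warehouse resumes.

CREDITS_PER_HOUR = {
    "XSMALL": 1, "SMALL": 2, "MEDIUM": 4, "LARGE": 8,
    "XLARGE": 16, "XXLARGE": 32,
}

def estimate_credits(size: str, active_seconds: float) -> float:
    """Estimate credits consumed by one resume/suspend cycle of a warehouse."""
    billable = max(active_seconds, 60.0)  # 60-second minimum per resume
    return CREDITS_PER_HOUR[size.upper()] * billable / 3600.0

# Example: a MEDIUM warehouse running a 5-minute dbt job
print(round(estimate_credits("MEDIUM", 300), 3))
```

A useful consequence of the doubling rates: a larger warehouse that halves runtime costs roughly the same, so sizing up only pays off when queries actually scale with the extra compute.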

Data Quality & Governance Execution
Apply data quality checks, lineage tracking, and security standards across the data estate. Ensure compliance with data policies, InfoSec controls, and regulatory requirements as required.
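The checks referred to here are typically not-null, uniqueness, and accepted-values tests of the kind dbt ships as generic tests. A hedged Python sketch of that pattern follows; the function and column names are hypothetical, for illustration only.

```python
# Illustrative row-level data-quality checks, mirroring the shape of
# dbt's generic tests (not_null, unique, accepted_values). Each check
# returns the offending rows so failures are easy to report.

def check_not_null(rows, column):
    """Rows where the column is missing or null."""
    return [r for r in rows if r.get(column) is None]

def check_unique(rows, column):
    """Rows whose column value has already been seen."""
    seen, dupes = set(), []
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.append(r)
        seen.add(v)
    return dupes

def check_accepted_values(rows, column, allowed):
    """Rows whose column value falls outside the allowed set."""
    return [r for r in rows if r.get(column) not in allowed]

customers = [
    {"id": 1, "status": "active"},
    {"id": 2, "status": "lapsed"},
    {"id": 2, "status": "unknown"},
    {"id": None, "status": "active"},
]
print(len(check_not_null(customers, "id")))
print(len(check_unique(customers, "id")))
print(len(check_accepted_values(customers, "status", {"active", "lapsed"})))
```

In practice these assertions would run in-warehouse as dbt tests rather than in Python; the value of the pattern is the same either way: every failure is a concrete set of rows, not just a boolean.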

Tooling & Feature Adoption
Leverage Snowflake capabilities (Tasks, Streams, Snowpark, Time Travel, Secure Data Sharing) to improve automation, reduce manual effort, and enhance data accessibility across the business.

Collaboration & Support
Work closely with analysts, data consumers, and business stakeholders to support data product delivery, troubleshoot data issues, and enable effective usage of Snowflake datasets.

Enablement for Analytics & Data Science
Implement dimensional models that provide clean, well-structured, reusable datasets for reporting, scenario modelling, and emerging ML/AI use cases.
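At its core, dimensional modelling means deduplicating entities into dimension tables keyed by surrogate keys, then resolving those keys in fact tables. A toy sketch of that mechanic, with purely illustrative table and column names:

```python
# A toy star-schema build: assign surrogate keys to a customer dimension
# and resolve them in a fact table. In a real pipeline this would be a
# pair of dbt models in Snowflake; the logic is the same.

def build_dimension(rows, natural_key):
    """Deduplicate on the natural key and assign incrementing surrogate keys."""
    dim, key_map = [], {}
    for r in rows:
        nk = r[natural_key]
        if nk not in key_map:
            key_map[nk] = len(dim) + 1  # surrogate key
            dim.append({"sk": key_map[nk], **r})
    return dim, key_map

def build_fact(events, key_map, natural_key):
    """Replace the natural key with the dimension's surrogate key."""
    return [{"customer_sk": key_map[e[natural_key]], "amount": e["amount"]}
            for e in events]

source = [{"customer_id": "C1", "name": "Ann"},
          {"customer_id": "C2", "name": "Bob"},
          {"customer_id": "C1", "name": "Ann"}]   # duplicate source row
dim_customer, keys = build_dimension(source, "customer_id")
fact_orders = build_fact([{"customer_id": "C2", "amount": 30.0}],
                         keys, "customer_id")
print(len(dim_customer), fact_orders[0]["customer_sk"])
```

Decoupling facts from natural keys this way is what makes the resulting datasets reusable: reporting, scenario modelling, and ML features can all join on the same stable surrogate keys.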

Monitoring, Reliability & Operations
Implement and maintain monitoring, alerting, logging, and cost-management processes for Snowflake and data pipelines to ensure a stable and well-maintained platform.

Continuous Improvement of Engineering Practices
Contribute to shared engineering standards to simplify development and accelerate delivery across the team.

Knowledge & Skills

  • Proven experience in delivering cloud-based data engineering solutions, ideally with Snowflake.
  • Strong hands-on proficiency with SQL, Python, and dbt for data transformations, modelling, and pipeline automation.
  • Practical experience administering Snowflake environments, including role-based access control (RBAC) management.
  • Experience with data ingestion and replication tools such as Airbyte, Fivetran, Hevo, or similar.
  • Working knowledge of cloud services (AWS preferred).
  • Strong understanding of data modelling and data governance principles.
  • Experience supporting BI/reporting tools (e.g., Power BI) and enabling them through well-designed Snowflake data models.
  • Solid knowledge of CI/CD and version-controlled development practices using Git.

Desirable

Enterprise System Familiarity
Exposure to CRM (Salesforce), BSS/OSS (Netadmin), Call Centre, Telephony, or similar enterprise data sources.

Data Migration Experience
Participation in migrating data platforms (e.g., PostgreSQL or other cloud RDBMS) into a data warehouse like Snowflake with minimal disruption and strong data validation controls.

Change & Adoption Support
Experience supporting business teams during platform transitions (e.g., training, documentation, user onboarding, issue resolution).

Best Practice Contribution
Experience contributing to naming conventions, schema standards, environment management, testing frameworks, and security patterns for data platforms.

Continuous Learning & Innovation
Interest in staying up to date with the latest technologies, modern data stack tooling, and best practices to contribute to ongoing platform evolution.

Infrastructure as Code
Exposure to Terraform would be advantageous.

Gigaclear is a growing fibre broadband (FTTP/FTTH) company, building fibre-to-the-premises infrastructure in some of the most difficult-to-reach areas of the UK and empowering those communities with broadband to rival any city.