  • Home
  • Find IT Jobs
  • Register CV
  • Career Advice
  • Contact us
  • Employers
    • Register as Employer
    • Pricing Plans
Sorry, that job is no longer available. Here are some results that may be similar to the job you were looking for.

70 jobs found

Current search: fabric data engineer
Connells Group HQ
Data Delivery Manager
Connells Group HQ Milton Keynes, Buckinghamshire
Job Description

We are seeking an experienced Data Delivery Manager to join our Group Technology team in Milton Keynes. This is a middle-management role responsible for creating and evolving data delivery roadmaps, and for the subsequent end-to-end delivery of agreed change initiatives across cross-functional technology teams, through all delivery phases, to meet business needs. We offer a hybrid working arrangement with one day per week in our Milton Keynes office.

Key Responsibilities:
• Take ownership of and drive forward the end-to-end delivery of technology data change initiatives across cross-functional technology delivery teams, including 3rd-party partners, through all delivery phases.
• In collaboration with key product stakeholders, develop and maintain a data delivery roadmap that delivers the vision and strategic priorities.
• Identify and secure the resources needed to support our delivery teams, identify and remove blockers to delivery, and work with peers, engineers, operations and change colleagues to launch the products, platforms and features we need.
• Ensure the team pursues a balanced portfolio of change, with time dedicated to maintenance, operational tasks, efforts to address technical risk and learning, as well as feature delivery.
• Manage 3rd-party development deliveries, ensuring key milestones are communicated and excellent working relationships are maintained.
• Manage and oversee externally developed data roadmaps against agreed schedules, priorities and estimates, ensuring priorities are understood by all parties.
• Own the delivery change governance framework, ensure it is followed for all changes and make it visible to all stakeholders.
• Coordinate regular service reviews and updates on in-flight projects, and ensure the roadmaps are aligned and understood.
• Champion a learning and continuous-improvement culture, driving improvements to how the teams work: the methods they use, the tools they employ, and the principles and practices they adopt.

Experience and Skills Required:
• Data architecture understanding: knowledge of data platforms, data modelling, ingestion processes and preparation techniques in Microsoft Fabric.
• Data governance and compliance: familiarity with policies around data quality, security and regulatory compliance.
• Analytics and reporting: ability to oversee data visualisations, dashboards and actionable insights for stakeholders.
• Preferably educated to graduate level in a technology or software-related engineering degree, with 5 years' experience delivering change in agile software engineering environments.
• Background in delivering technology in customer-facing industries.
• Understanding of, and experience delivering in, SAFe and agile frameworks such as Scrum or Kanban.
• Experience of delivering change using tools such as Jira.

Connells Group is the leading UK estate agency and property services group, with over 80 different brands and 1,200 branches UK-wide. Alongside a significant high-street estate agency presence, it has a strong financial services operation, offering all services to support sales, purchases, lettings, mortgages, building surveys and valuations, conveyancing, auctions and more.

Connells Group UK is an equal opportunities employer and positively encourages applications from suitably qualified and eligible candidates regardless of sex, race, disability, age, sexual orientation, transgender status, religion or belief, marital status, or pregnancy and maternity.

Ref: CF00776
16/03/2026
Full time
Tenth Revolution Group
Fabric Data Engineer - Outside IR35 - Hybrid
Tenth Revolution Group
Fabric Data Engineer - Outside IR35 - Hybrid

We are seeking a skilled Fabric Data Engineer to design, build, and optimize scalable data solutions using Microsoft Fabric. The ideal candidate will have strong expertise in modern data architecture, cloud-based analytics, and end-to-end data pipeline development. You will work closely with data analysts, data scientists, and business stakeholders to deliver high-quality, reliable, and secure data solutions that drive strategic decision-making.

Key Responsibilities:
• Design, develop, and maintain scalable data pipelines using Microsoft Fabric, including Data Factory for orchestration and ingestion, Lakehouse architecture for unified analytics, and Warehouse for structured data modeling.
• Build and optimize data solutions leveraging Azure Data Factory, Azure Synapse Analytics, and Power BI.
• Develop ETL/ELT processes to ingest, transform, and load data from various sources (APIs, databases, flat files, streaming sources).
• Implement data modeling techniques (star schema, snowflake schema, medallion architecture).
• Ensure data quality, governance, and security standards are met.
• Monitor, troubleshoot, and optimize data workflows for performance and cost efficiency.
• Collaborate with cross-functional teams to translate business requirements into technical solutions.
• Support CI/CD deployment and version-control best practices.

Required Qualifications:
• Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.
• 3+ years of experience in data engineering or related roles.
• Hands-on experience with Microsoft Fabric.
• Strong SQL skills and experience with data warehousing concepts.
• Proficiency in Python or Spark for data transformation.
• Experience with cloud platforms, preferably Microsoft Azure.
• Understanding of data governance, security, and compliance frameworks.

Key Skills: data pipeline development, cloud data architecture, data modeling and optimization, ETL/ELT frameworks, performance tuning, stakeholder communication.

To apply for this role, please submit your CV or contact Dillon Blackburn.

Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK, offering more opportunities across the country than any other recruitment agency. We're the proud sponsor and supporter of SQLBits, the Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.
16/03/2026
Contractor
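The medallion architecture (bronze, silver, gold) named in the responsibilities above is, at its core, a sequence of refinement stages: raw data lands unmodified, is then cleaned and typed, and is finally aggregated into business-ready tables. A minimal pure-Python sketch of the idea; the record fields and cleaning rules are invented for illustration, not taken from the role:

```python
# Sketch of a medallion (bronze/silver/gold) pipeline.
# Bronze: raw records as ingested, kept unmodified.
# Silver: deduplicated, normalised, typed records.
# Gold: aggregated, reporting-ready output.
# All field names and rules below are illustrative only.
from collections import defaultdict

bronze = [  # raw ingest: strings, duplicates, bad rows left as-is
    {"order_id": "1", "region": " North ", "amount": "100.0"},
    {"order_id": "2", "region": "south", "amount": "not-a-number"},
    {"order_id": "1", "region": " North ", "amount": "100.0"},  # duplicate
    {"order_id": "3", "region": "South", "amount": "50.5"},
]

def to_silver(rows):
    """Deduplicate on order_id, normalise text fields, enforce types."""
    seen, silver = set(), []
    for r in rows:
        if r["order_id"] in seen:
            continue
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # a real pipeline would quarantine bad rows
        seen.add(r["order_id"])
        silver.append({"order_id": int(r["order_id"]),
                       "region": r["region"].strip().title(),
                       "amount": amount})
    return silver

def to_gold(rows):
    """Aggregate silver rows into a revenue summary per region."""
    totals = defaultdict(float)
    for r in rows:
        totals[r["region"]] += r["amount"]
    return dict(totals)

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'North': 100.0, 'South': 50.5}
```

In Fabric or Spark the same shape appears as three Lakehouse layers with transformations between them; the staged structure, not the storage engine, is what the term describes.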
Connells Group HQ
Data Platform Engineer
Connells Group HQ Milton Keynes, Buckinghamshire
Job Description

We are seeking a Data Platform Engineer to join our Group Technology team in Milton Keynes. You will play a key role in delivering the Connells Group Data Platform, including design, capacity, management and configuration-management responsibilities. As a Data Platform Engineer you will work day-to-day in a team of technical specialists, liaising with 3rd-party providers, developing your core skills and expertise while maturing the overall processes and procedures around the service. The role also supports the business objectives and strategy through the delivery of secure, supportable and scalable cloud data platforms.

Key Responsibilities:
• Provide expertise around the Data Platform capability.
• Maintain the Data Platform for all Connells users, managing critical downtime and the risk of disruption.
• Undertake root-cause analysis and resolution of incidents.
• Undertake out-of-hours support as required for operational running.

Team Roles and Responsibilities:
• Use Agile methodologies for platform development.
• Deliver Platform as Code.
• Work within the Cloud Platform design pattern to implement technical and financial observability.
• Undertake platform capacity management.
• Support projects where appropriate.
• Undertake proactive monitoring and react to escalations from other IT teams.
• Collaborate with other members of the team.

Experience and Skills Required:

Essential:
• Demonstrable experience in similar technical roles.
• Experience of incident resolution, requests, changes and problem-solving activities delivered to agreed SLAs.
• Experience of implementing cloud technologies.
• Experience of Microsoft Fabric.
• Experience of managing SQL Server.
• Good team communication, with experience of sharing and presenting new ideas and approaches with the team.
• Willingness to learn, adopt and advance best practices, procedures and qualitative standards.
• Able to operate in a complex environment under pressure.
• Able to operate with both legacy and current technology.
• Analytically minded, with strong attention to detail.

Desirable:
• Experience of GitHub, GitHub Actions, Terraform, Platform as Code and Zero Trust architectures.
• Experience of advanced tools for operational monitoring.
• Ability to operate and influence at all levels within the organisation.
• Experience of tools for alerting and monitoring.
• Cloud cost monitoring and reporting.
• STEM degree or postgraduate qualification.

Connells Group UK is an equal opportunities employer and positively encourages applications from suitably qualified and eligible candidates regardless of sex, race, disability, age, sexual orientation, transgender status, religion or belief, marital status, or pregnancy and maternity.

Don't meet every single requirement? Studies have shown that women and people of colour are less likely to apply to jobs unless they meet every single qualification. At Connells Group we are dedicated to building a diverse, inclusive and authentic workplace. So, if you're excited about this role but your experience doesn't fit perfectly with every aspect of the job description, we encourage you to apply anyway. You may be just the right candidate for this or other opportunities.

Ref: CF00744
16/03/2026
Full time
Tenth Revolution Group
Databricks Data Architect - Hybrid - Permanent
Tenth Revolution Group City, London
Databricks Data Architect - Hybrid - Permanent

Job Summary: We are looking for a Databricks Data Architect with Databricks certifications to design and implement scalable data platforms using the Databricks Lakehouse architecture. The role involves building modern data pipelines, optimizing data platforms, and enabling analytics and machine learning capabilities across the organization.

Key Responsibilities:
• Design and implement data architectures on the Databricks Lakehouse Platform.
• Build and optimize ETL/ELT pipelines using Apache Spark and Databricks.
• Implement Delta Lake and the medallion architecture (Bronze, Silver, Gold).
• Ensure data governance, security, and performance optimization.
• Integrate Databricks with cloud platforms and enterprise data sources.
• Collaborate with data engineers, analysts, and stakeholders to deliver scalable data solutions.

Required Skills & Qualifications:
• Databricks certification (e.g., Databricks Certified Data Engineer Associate/Professional).
• Strong experience with Databricks, Apache Spark, and Delta Lake.
• Proficiency in Python, SQL, and data pipeline development.
• Experience with cloud platforms (Azure, AWS, or GCP).
• Solid understanding of modern data architecture and big data solutions.

To apply for this role, please submit your CV or contact Dillon Blackburn.

Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK, offering more opportunities across the country than any other recruitment agency. We're the proud sponsor and supporter of SQLBits, the Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.
16/03/2026
Full time
Harvey Nash IT Recruitment UK
Technical Architect
Harvey Nash IT Recruitment UK Chester, Cheshire
Technical Architect - Microsoft Fabric
Chester - Hybrid working, 2 days per week on site
Salary: Up to £90,000 per annum

A leading client in Chester seeks a Technical Architect to design and deliver data and AI solutions on the Microsoft Fabric platform. As Technical Lead for a small team, you'll oversee end-to-end architecture, develop scalable analytics solutions, and stay hands-on. Responsibilities include delivering Fabric solutions (OneLake, Lakehouse, Warehouses, Power BI), leading architecture and performance optimisation, and enabling advanced analytics and machine learning with Fabric and Azure ML.

Key skills and responsibilities:
• Design and deliver scalable, secure, end-to-end Microsoft Fabric solutions aligned with the business's objectives (OneLake, Lakehouse, Warehouses and Power BI).
• Design and manage enterprise-grade workloads and semantic models to enable robust analytics and reporting, utilising Fabric and Azure ML.
• Plan and size Fabric initiatives, including capacity needs, effort estimates, infrastructure requirements, team resourcing and delivery timelines.
• Extensive knowledge of data modelling, DAX, Python/PySpark and SQL/KQL.
• Proven record of technical leadership and effective stakeholder engagement.
• Lead architectural design, capacity planning and performance optimisation efforts.
• Implement governance, security and DevOps best practices.
• Lead a small team of data engineers and contribute to the development of the platform strategy.

Interested? Please submit your updated CV to Emma Siwicki at Harvey Nash for immediate consideration.
16/03/2026
Full time
La Fosse Associates Limited
HPC SRE - Quant Research, HFT
La Fosse Associates Limited
Network Site Reliability Engineer - Python/Go, Observability, Monitoring, HPC

Within the Network Engineering Team, this role is critical in ensuring our client's High-Performance Computing (HPC) environments are supported by a resilient, data-driven, software-defined network foundation. We are seeking a networks-focused Site Reliability Engineer (SRE) with a focus on observability, telemetry and monitoring. In this role, you will apply a software-engineering mindset to network operations, bridging the gap between traditional networking and modern SRE practice. You will be responsible for ensuring our high-performance network infrastructure is not just functional but deeply visible, and you will build the tooling and automation that let the team move from reactive troubleshooting to proactive, automated remediation and "self-healing" infrastructure.

Key Responsibilities:
• Reliability engineering: apply SRE principles to the network; define and maintain SLIs, SLOs and error budgets for network latency, packet loss and availability.
• HPC connectivity and performance: support low-latency, high-throughput network architectures (e.g., RDMA, RoCE) designed for intensive HPC and financial-data workloads.
• Advanced telemetry: design and manage high-cardinality telemetry pipelines to collect and analyze flow logs, metrics and traces at scale.
• Network automation (Python/Go): build and maintain internal software tools, APIs and "self-healing" scripts to automate routine operations and complex failure recoveries.
• Infrastructure as Code (IaC): use Terraform to manage complex network configurations and observability stacks (Prometheus, Grafana, OpenSearch) as code.
• Observability and monitoring: implement automated alerting and dashboarding that provide real-time insight into network health and traffic patterns.
• Incident management and post-mortems: lead technical troubleshooting for complex outages and conduct blameless post-mortems to drive systemic improvements.

Your Present Skillset:
• 3+ years of experience in a Network Reliability Engineering (NRE), SRE or network operations role within a high-performance environment.
• Software-engineering mindset: strong proficiency in Python and Go for building automation, custom exporters or network management tools.
• Observability stack expertise: hands-on experience with Prometheus, Grafana, OpenSearch/Elasticsearch and distributed tracing.
• Networking fundamentals: deep knowledge of TCP/IP, BGP, EVPN and routing/switching concepts in a high-bandwidth environment.
• Infrastructure as Code: proven experience using Terraform for scalable, repeatable, version-controlled network deployments.
• HPC awareness: familiarity with the networking requirements of high-performance computing, such as non-blocking fabrics and low-latency interconnects.

Desirable Experience:
• Streaming telemetry: experience with gNMI, gRPC or Kafka for real-time network data streaming.
• CI/CD for networking: familiarity with "NetDevOps" workflows, including automated testing (pytest / go test) and pipeline validation for network changes.
• Container networking: knowledge of Kubernetes networking, CNI plugins and service mesh (e.g., Istio or Cilium).
• Traffic engineering: experience with segment routing or advanced load-balancing strategies for high-performance workloads.
16/03/2026
Full time
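The SLI/SLO/error-budget vocabulary in the SRE role above reduces to simple arithmetic: an SLI is a measured ratio of good events to total events, an SLO is the target for that ratio, and the error budget is the failure rate the SLO still permits. A small sketch of that arithmetic, with invented traffic numbers:

```python
# Sketch of SLI / SLO / error-budget arithmetic as used in SRE practice.
# The probe counts below are invented for illustration.

def availability_sli(good_events: int, total_events: int) -> float:
    """SLI: fraction of events that met the success criterion."""
    return good_events / total_events

def error_budget_remaining(sli: float, slo: float) -> float:
    """Fraction of the allowed failure budget still unspent.

    budget = 1 - slo  (allowed failure rate)
    spent  = 1 - sli  (observed failure rate)
    """
    budget = 1.0 - slo
    spent = 1.0 - sli
    return 1.0 - spent / budget

# Example: 999,500 successful probes out of 1,000,000, against a 99.9% SLO.
sli = availability_sli(999_500, 1_000_000)        # 0.9995
remaining = error_budget_remaining(sli, 0.999)    # half the budget left
print(f"SLI={sli:.4%}, error budget remaining={remaining:.0%}")
```

The same ratio works for latency or packet-loss SLIs once "good event" is defined (e.g., a probe under a latency threshold); alerting on the budget's burn rate, rather than on raw failures, is what turns the SLO into an operational signal.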
La Fosse Associates Limited
Network Automation Lead - NetDevOps, Arista, Python, Ansible
La Fosse Associates Limited
Lead Network Automation Engineer - Python/Golang, DC Fabric Design, HPC

This role sits within our client's Global Infrastructure team, working to automate a cutting-edge, high-performance datacentre and research platform. You will build the tooling, pipelines, and guardrails that turn network intent into safe, repeatable code. The role focuses on NetDevOps practices, ensuring scalability and reliability across a global on-premise footprint.

Key Responsibilities:
- Architect: Design and evolve the network automation framework (source of truth, templating, validation).
- Build: Develop idempotent automation for datacentre fabrics and integrate with vendor APIs (Arista, NetQ).
- Standardise: Capture network intent as code, ensuring rigorous peer reviews and quality gates.
- Monitor: Instrument the network with telemetry, alerting, and SLOs to drive automated remediation.
- Collaborate: Work with platform engineers to translate complex designs into automated workflows.

Your Skills:
- Development: Advanced Python; Go is highly preferred for performant tooling.
- Automation: Strong Ansible (roles/collections) and workflow orchestration (e.g. Temporal).
- Tools: NetBox/Nautobot, Git, and network validation (pyATS, NAPALM).
- Systems: Linux networking, Bash, Docker/Kubernetes, and HashiCorp Vault.
- Observability: Experience with Prometheus, Grafana, and logging/tracing.
- Knowledge: Familiarity with datacentre fabrics, QoS, and high-performance networking is a plus.
16/03/2026
Full time
Nexere Consulting Limited
Lead Data Platform Engineer - Databricks - IaC - Terraform - Azure Data Factory - Data Lakehouse
Nexere Consulting Limited
Lead Data Platform Engineer - Databricks - IaC - Terraform - Azure Data Factory - Data Lakehouse

The Data Platform Engineer designs, develops, automates, and maintains secure, scalable, and compliant data platforms that enable the firm to efficiently manage, analyse, and utilise data. The role ensures that data solutions are robust and reliable while meeting regulatory obligations and safeguarding client confidentiality.

Key Responsibilities:
- Design and architect scalable, secure, and compliant data platforms and solutions, producing technical documentation and securing approvals through governance bodies such as Architecture Review Boards.
- Build and deliver robust data solutions using Databricks, PySpark, Spark SQL, Azure Data Factory, and Azure services.
- Develop APIs and write efficient Python, PySpark, and SQL code to support data integration, processing, and automation.
- Implement and manage CI/CD pipelines and automated deployments using Azure DevOps to enable reliable releases across environments.
- Develop and maintain infrastructure-as-code (e.g. Terraform, ARM) to provision and manage cloud resources, including ADF pipelines, Databricks assets, and Unity Catalog components.
- Monitor, troubleshoot, and optimise data platform performance, reliability, and costs, identifying bottlenecks and recommending improvements.
- Create dashboards and observability tools to report on platform performance, usage, incidents, and operational KPIs.

Knowledge, Skills & Experience:
- Degree in Computer Science, Data Engineering, or a related field.
- Proven experience designing and building cloud-based data platforms, ideally within Azure.
- Strong hands-on expertise with Databricks, PySpark, Spark SQL, and Azure Data Factory.
- Solid understanding of Data Lakehouse architecture and modern data platform design.
- Proficiency in Python for data engineering, automation, and data processing.
- Experience developing and integrating REST APIs for data services.
- Strong DevOps experience, including CI/CD, automated testing, and release management for data platforms.
- Experience with Infrastructure as Code tools such as Terraform or ARM templates.
- Knowledge of data modelling, ETL/ELT pipelines, and data warehousing concepts.
- Familiarity with monitoring, logging, and alerting tools (e.g. Azure Monitor).

Desirable:
- Experience with additional Azure services (e.g. Fabric, Azure Functions, Logic Apps).
- Knowledge of cloud cost optimisation for data platforms.
- Understanding of data governance and regulatory compliance (e.g. GDPR).
- Experience working in regulated or professional services environments.
16/03/2026
Full time
Joseph Harry Ltd
Data Engineering Manager Azure AI Finance Croydon London
Joseph Harry Ltd Croydon, Surrey
Data Engineering Manager (Architect, Architecture, Data, Development, Engineer, Engineering, Management, Head of, Agile, Microsoft Azure, ML, AI, Automation, Finance, Financial Services, Fabric, Synapse, Databricks, Snowflake, SQL) required by our financial client in Croydon, London.

You MUST have the following:
- Good experience as a Data Engineering Manager/Lead Data Architect
- Strong management experience: inheriting teams, raising standards and performance
- Strategy that aligns with the needs of the business
- Excellent design and architecture ability
- MS SQL Server
- Azure
- AI (even if gained outside work)
- Agile
- Experience in a financial environment

The following are DESIRABLE, not essential:
- Microsoft Fabric, Synapse, Databricks or Snowflake

Role: You will inherit a team of 3, comprising two permanent staff and one contractor. The contractor is senior and the two permanent staff are more junior, making this a very hands-on role. The role is all-encompassing, covering data architecture, engineering for technical delivery, and management: line-managing the team and aligning the company's strategy with the roadmap for the data environment. You will also own the data governance and regulatory compliance requirements. On the engineering and architecture side, you will have good experience of leading companies from on-premise virtual machines to Azure, and will be seasoned in taking data projects from inception through design, architecture and technical delivery, contributing to the engineering yourself. In addition to taking environments to the cloud, you will have some exposure to AI and ML and be comfortable assessing which tools and products are most appropriate for the business's goals and evolution. On the managerial side, you will have led teams and have line-management experience; having inherited teams previously would also be ideal. You will have worked in an FCA-regulated environment and be familiar with the requirements for compliance from a data perspective. The journey with this team will be to implement better monitoring and automation, migrate to the cloud, and then adopt AI and ML. As the business is c.200 people and the management team is strong and Agile, this could happen very quickly. The technology department has a hybrid working setup: you will have the flexibility to come into the office as you wish, although in the initial months it will probably be appropriate to be in the office 2-3 times per week.

Salary: £100k - 125k + Bonus + Pension
13/03/2026
Full time
Joseph Harry Ltd
Data Engineering Manager Azure AI Finance Croydon London
Joseph Harry Ltd Croydon, Surrey
Data Engineering Manager (Architect, Architecture, Data, Development, Engineer, Engineering, Management, Head of, Agile, Microsoft Azure, ML, AI, Automation, Finance, Financial Services, Fabric, Synapse, Databricks, Snowflake, SQL) required by our financial client in Croydon, London.

You MUST have the following:
- Good experience as a Data Engineering Manager/Lead Data Architect
- Strong management experience: inheriting teams, raising standards and performance
- Strategy that aligns with the needs of the business
- Excellent design and architecture ability
- MS SQL Server
- Azure
- AI (even if gained outside work)
- Agile
- Experience in a financial environment

The following are DESIRABLE, not essential:
- Microsoft Fabric, Synapse, Databricks or Snowflake

Role: You will inherit a team of 3, comprising two permanent staff and one contractor. The contractor is senior and the two permanent staff are more junior, making this a very hands-on role. The role is all-encompassing, covering data architecture, engineering for technical delivery, and management: line-managing the team and aligning the company's strategy with the roadmap for the data environment. You will also own the data governance and regulatory compliance requirements. On the engineering and architecture side, you will have good experience of leading companies from on-premise virtual machines to Azure, and will be seasoned in taking data projects from inception through design, architecture and technical delivery, contributing to the engineering yourself. In addition to taking environments to the cloud, you will have some exposure to AI and ML and be comfortable assessing which tools and products are most appropriate for the business's goals and evolution. On the managerial side, you will have led teams and have line-management experience; having inherited teams previously would also be ideal. You will have worked in an FCA-regulated environment and be familiar with the requirements for compliance from a data perspective. The journey with this team will be to implement better monitoring and automation, migrate to the cloud, and then adopt AI and ML. As the business is c.200 people and the management team is strong and Agile, this could happen very quickly. The technology department has a hybrid working setup: you will have the flexibility to come into the office as you wish, although in the initial months it will probably be appropriate to be in the office 2-3 times per week.

Salary: £125k - 150k + Bonus + Pension
13/03/2026
Full time
Joseph Harry Ltd
Data Engineering Manager Azure AI Finance Tunbridge Wells Kent
Joseph Harry Ltd Tunbridge Wells, Kent
Data Engineering Manager (Architect, Architecture, Data, Development, Engineer, Engineering, Management, Head of, Agile, Microsoft Azure, ML, AI, Automation, Finance, Financial Services, Fabric, Synapse, Databricks, Snowflake, SQL) required by our financial client in Tunbridge Wells, Kent.

You MUST have the following:
- Good experience as a Data Engineering Manager/Lead Data Architect
- Strong management experience: inheriting teams, raising standards and performance
- Strategy that aligns with the needs of the business
- Excellent design and architecture ability
- MS SQL Server
- Azure
- AI (even if gained outside work)
- Agile
- Experience in a financial environment

The following are DESIRABLE, not essential:
- Microsoft Fabric, Synapse, Databricks or Snowflake

Role: You will inherit a team of 3, comprising two permanent staff and one contractor. The contractor is senior and the two permanent staff are more junior, making this a very hands-on role. The role is all-encompassing, covering data architecture, engineering for technical delivery, and management: line-managing the team and aligning the company's strategy with the roadmap for the data environment. You will also own the data governance and regulatory compliance requirements. On the engineering and architecture side, you will have good experience of leading companies from on-premise virtual machines to Azure, and will be seasoned in taking data projects from inception through design, architecture and technical delivery, contributing to the engineering yourself. In addition to taking environments to the cloud, you will have some exposure to AI and ML and be comfortable assessing which tools and products are most appropriate for the business's goals and evolution. On the managerial side, you will have led teams and have line-management experience; having inherited teams previously would also be ideal. You will have worked in an FCA-regulated environment and be familiar with the requirements for compliance from a data perspective. The journey with this team will be to implement better monitoring and automation, migrate to the cloud, and then adopt AI and ML. As the business is c.200 people and the management team is strong and Agile, this could happen very quickly. The technology department has a hybrid working setup: you will have the flexibility to come into the office as you wish, although in the initial months it will probably be appropriate to be in the office 2-3 times per week.

Salary: £125k - 150k + Bonus + Pension
13/03/2026
Full time
Joseph Harry Ltd
Data Engineering Manager Azure AI Finance Tunbridge Wells Kent
Joseph Harry Ltd Tunbridge Wells, Kent
Data Engineering Manager (Architect, Architecture, Data, Development, Engineer, Engineering, Management, Head of, Agile, Microsoft Azure, ML, AI, Automation, Finance, Financial Services, Fabric, Synapse, Databricks, Snowflake, SQL) required by our financial client in Tunbridge Wells, Kent.

You MUST have the following:
- Good experience as a Data Engineering Manager/Lead Data Architect
- Strong management experience: inheriting teams, raising standards and performance
- Strategy that aligns with the needs of the business
- Excellent design and architecture ability
- MS SQL Server
- Azure
- AI (even if gained outside work)
- Agile
- Experience in a financial environment

The following are DESIRABLE, not essential:
- Microsoft Fabric, Synapse, Databricks or Snowflake

Role: You will inherit a team of 3, comprising two permanent staff and one contractor. The contractor is senior and the two permanent staff are more junior, making this a very hands-on role. The role is all-encompassing, covering data architecture, engineering for technical delivery, and management: line-managing the team and aligning the company's strategy with the roadmap for the data environment. You will also own the data governance and regulatory compliance requirements. On the engineering and architecture side, you will have good experience of leading companies from on-premise virtual machines to Azure, and will be seasoned in taking data projects from inception through design, architecture and technical delivery, contributing to the engineering yourself. In addition to taking environments to the cloud, you will have some exposure to AI and ML and be comfortable assessing which tools and products are most appropriate for the business's goals and evolution. On the managerial side, you will have led teams and have line-management experience; having inherited teams previously would also be ideal. You will have worked in an FCA-regulated environment and be familiar with the requirements for compliance from a data perspective. The journey with this team will be to implement better monitoring and automation, migrate to the cloud, and then adopt AI and ML. As the business is c.200 people and the management team is strong and Agile, this could happen very quickly. The technology department has a hybrid working setup: you will have the flexibility to come into the office as you wish, although in the initial months it will probably be appropriate to be in the office 2-3 times per week.

Salary: £100k - 125k + Bonus + Pension
13/03/2026
Full time
Joseph Harry Ltd
Data Engineering Manager Azure AI Finance Tunbridge Wells Kent
Joseph Harry Ltd Tunbridge Wells, Kent
Data Engineering Manager (Architect, Architecture, Data, Development, Engineer, Engineering, Management, Head of, Agile, Microsoft Azure, ML, AI, Automation, Finance, Financial Services, Fabric, Synapse, Databricks, Snowflake, SQL) required by our financial client in Tunbridge Wells, Kent.

You MUST have the following:
- Good experience as a Data Engineering Manager/Lead Data Architect
- Strong management experience: inheriting teams, raising standards and performance
- Strategy that aligns with the needs of the business
- Excellent design and architecture ability
- MS SQL Server
- Azure
- AI (even if gained outside work)
- Agile
- Experience in a financial environment

The following are DESIRABLE, not essential:
- Microsoft Fabric, Synapse, Databricks or Snowflake

Role: You will inherit a team of 3, comprising two permanent staff and one contractor. The contractor is senior and the two permanent staff are more junior, making this a very hands-on role. The role is all-encompassing, covering data architecture, engineering for technical delivery, and management: line-managing the team and aligning the company's strategy with the roadmap for the data environment. You will also own the data governance and regulatory compliance requirements. On the engineering and architecture side, you will have good experience of leading companies from on-premise virtual machines to Azure, and will be seasoned in taking data projects from inception through design, architecture and technical delivery, contributing to the engineering yourself. In addition to taking environments to the cloud, you will have some exposure to AI and ML and be comfortable assessing which tools and products are most appropriate for the business's goals and evolution. On the managerial side, you will have led teams and have line-management experience; having inherited teams previously would also be ideal. You will have worked in an FCA-regulated environment and be familiar with the requirements for compliance from a data perspective. The journey with this team will be to implement better monitoring and automation, migrate to the cloud, and then adopt AI and ML. As the business is c.200 people and the management team is strong and Agile, this could happen very quickly. The technology department has a hybrid working setup: you will have the flexibility to come into the office as you wish, although in the initial months it will probably be appropriate to be in the office 2-3 times per week.

Salary: £80k - 100k + Bonus + Pension
13/03/2026
Full time
Joseph Harry Ltd
Data Engineering Manager Azure AI Finance Brighton
Joseph Harry Ltd Brighton, Sussex
Data Engineering Manager (Architect Architecture Data Development Engineer Engineering Management Head of Agile Microsoft Azure ML AI Automation Finance Financial Services Fabric Synapse DataBricks Snowflake SQL) required by our financial client in Brighton.

You MUST have the following:
- Good experience as a Data Engineering Manager/Lead Data Architect
- Strong management experience - inheriting teams, raising standards and performance
- Strategy aligned with the needs of the business
- Excellent design and architecture ability
- MS SQL Server
- Azure AI - even if gained outside work
- Agile
- Experience in a financial environment

The following are DESIRABLE, not essential:
- Microsoft Fabric, Synapse, Databricks or Snowflake

Role:
You will inherit a team of 3, comprising two permanent staff and one contractor. The contractor is senior and the two permanent staff are more junior, making this a very hands-on role. It will be all-encompassing, involving data architecture, engineering for technical delivery, and management covering line-management of the team and alignment of the company's strategy with the roadmap for the data environment. You will also own data governance and regulatory compliance requirements. On the engineering and architecture side, you will have good experience of leading companies from on-premises virtual machines to Azure. You will be seasoned in taking data projects from inception through design, architecture and technical delivery, contributing to the engineering yourself.

In addition to taking environments to the cloud, you will have some exposure to AI and ML and be comfortable assessing which tools and products are most appropriate for the business's goals and evolution. On the managerial side, you will have led teams and have line-management experience; having inherited teams previously would also be ideal. You will have worked in an FCA-regulated environment and be familiar with the requirements for data compliance. The journey with this team will be to implement better monitoring and automation, migrate to the cloud, and then adopt AI and ML. As the business is c.200 people and the management team is strong and Agile, this could happen very quickly. The technology department has a hybrid working setup: you will have the flexibility to come into the office as you wish although, in the initial months, 2-3 days/week will probably be appropriate.

Salary: £80k - 100k + Bonus + Pension
13/03/2026
Full time
Tenth Revolution Group
Lead Databricks Engineer - Hybrid - Permanent
Tenth Revolution Group
Lead Databricks Engineer - Hybrid - Permanent

Overview
We are seeking an experienced Lead Azure Databricks Engineer to design, build and optimise our enterprise data platforms. The role requires deep technical expertise, strong delivery capability and experience working in highly regulated environments within the Lloyd's of London market. You will play a key role in shaping our cloud data platform, enabling advanced analytics and ensuring secure, scalable and reliable solutions across the organisation. This is a fully hands-on engineering role.

Key responsibilities
- Design and build enterprise data platforms using Microsoft Azure services including Azure Data Factory, Azure Data Lake Storage, Azure Key Vault, Azure Functions and Azure Databricks
- Develop scalable pipelines using Delta Lake, PySpark and Unity Catalog
- Work with underwriting, actuarial, delegated authority, bordereaux, exposure management, reinsurance, finance and risk teams to deliver solutions aligned to Lloyd's market requirements and regulations, including Solvency II
- Translate business requirements into scalable cloud data solutions with architects, SMEs and product owners
- Lead the resolution of complex technical challenges and drive delivery
- Promote best practices in data engineering, cloud architecture, CI/CD and data lifecycle management
- Provide technical leadership and mentoring to engineers
- Improve platform performance, resilience, cost efficiency and observability
- Deliver reliable, high-quality data pipelines across the platform

Required experience and skills
- Strong engineering experience with Microsoft Azure data services and Azure Databricks in large-scale environments
- Deep knowledge of Delta Lake, medallion architecture, distributed compute and lakehouse data platforms
- Strong development skills in Python, PySpark and Spark SQL
- Experience implementing CI/CD for data platforms using Azure DevOps
- Knowledge of data governance, lineage, access control and secure cloud engineering
- Excellent communication skills and the ability to work with senior stakeholders
- Proven delivery mindset with strong ownership and accountability in fast-paced environments

To apply for this role please submit your CV or contact Dillon Blackburn (see below). Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK, offering more opportunities across the country than any other recruitment agency. We're the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.
13/03/2026
Full time
Joseph Harry Ltd
Data Engineering Manager Azure AI Finance Croydon London
Joseph Harry Ltd Croydon, Surrey
Data Engineering Manager (Architect Architecture Data Development Engineer Engineering Management Head of Agile Microsoft Azure ML AI Automation Finance Financial Services Fabric Synapse DataBricks Snowflake SQL) required by our financial client in Croydon, London.

You MUST have the following:
- Good experience as a Data Engineering Manager/Lead Data Architect
- Strong management experience - inheriting teams, raising standards and performance
- Strategy aligned with the needs of the business
- Excellent design and architecture ability
- MS SQL Server
- Azure AI - even if gained outside work
- Agile
- Experience in a financial environment

The following are DESIRABLE, not essential:
- Microsoft Fabric, Synapse, Databricks or Snowflake

Role:
You will inherit a team of 3, comprising two permanent staff and one contractor. The contractor is senior and the two permanent staff are more junior, making this a very hands-on role. It will be all-encompassing, involving data architecture, engineering for technical delivery, and management covering line-management of the team and alignment of the company's strategy with the roadmap for the data environment. You will also own data governance and regulatory compliance requirements. On the engineering and architecture side, you will have good experience of leading companies from on-premises virtual machines to Azure. You will be seasoned in taking data projects from inception through design, architecture and technical delivery, contributing to the engineering yourself.

In addition to taking environments to the cloud, you will have some exposure to AI and ML and be comfortable assessing which tools and products are most appropriate for the business's goals and evolution. On the managerial side, you will have led teams and have line-management experience; having inherited teams previously would also be ideal. You will have worked in an FCA-regulated environment and be familiar with the requirements for data compliance. The journey with this team will be to implement better monitoring and automation, migrate to the cloud, and then adopt AI and ML. As the business is c.200 people and the management team is strong and Agile, this could happen very quickly. The technology department has a hybrid working setup: you will have the flexibility to come into the office as you wish although, in the initial months, 2-3 days/week will probably be appropriate.

Salary: £80k - 100k + Bonus + Pension
13/03/2026
Full time
Joseph Harry Ltd
Data Engineering Manager Azure AI Finance Brighton
Joseph Harry Ltd Brighton, Sussex
Data Engineering Manager (Architect Architecture Data Development Engineer Engineering Management Head of Agile Microsoft Azure ML AI Automation Finance Financial Services Fabric Synapse DataBricks Snowflake SQL) required by our financial client in Brighton.

You MUST have the following:
- Good experience as a Data Engineering Manager/Lead Data Architect
- Strong management experience - inheriting teams, raising standards and performance
- Strategy aligned with the needs of the business
- Excellent design and architecture ability
- MS SQL Server
- Azure AI - even if gained outside work
- Agile
- Experience in a financial environment

The following are DESIRABLE, not essential:
- Microsoft Fabric, Synapse, Databricks or Snowflake

Role:
You will inherit a team of 3, comprising two permanent staff and one contractor. The contractor is senior and the two permanent staff are more junior, making this a very hands-on role. It will be all-encompassing, involving data architecture, engineering for technical delivery, and management covering line-management of the team and alignment of the company's strategy with the roadmap for the data environment. You will also own data governance and regulatory compliance requirements. On the engineering and architecture side, you will have good experience of leading companies from on-premises virtual machines to Azure. You will be seasoned in taking data projects from inception through design, architecture and technical delivery, contributing to the engineering yourself.

In addition to taking environments to the cloud, you will have some exposure to AI and ML and be comfortable assessing which tools and products are most appropriate for the business's goals and evolution. On the managerial side, you will have led teams and have line-management experience; having inherited teams previously would also be ideal. You will have worked in an FCA-regulated environment and be familiar with the requirements for data compliance. The journey with this team will be to implement better monitoring and automation, migrate to the cloud, and then adopt AI and ML. As the business is c.200 people and the management team is strong and Agile, this could happen very quickly. The technology department has a hybrid working setup: you will have the flexibility to come into the office as you wish although, in the initial months, 2-3 days/week will probably be appropriate.

Salary: £100k - 125k + Bonus + Pension
12/03/2026
Full time
Joseph Harry Ltd
Data Engineering Manager Azure AI Finance Brighton
Joseph Harry Ltd Brighton, Sussex
Data Engineering Manager (Architect Architecture Data Development Engineer Engineering Management Head of Agile Microsoft Azure ML AI Automation Finance Financial Services Fabric Synapse DataBricks Snowflake SQL) required by our financial client in Brighton.

You MUST have the following:
- Good experience as a Data Engineering Manager/Lead Data Architect
- Strong management experience - inheriting teams, raising standards and performance
- Strategy aligned with the needs of the business
- Excellent design and architecture ability
- MS SQL Server
- Azure AI - even if gained outside work
- Agile
- Experience in a financial environment

The following are DESIRABLE, not essential:
- Microsoft Fabric, Synapse, Databricks or Snowflake

Role:
You will inherit a team of 3, comprising two permanent staff and one contractor. The contractor is senior and the two permanent staff are more junior, making this a very hands-on role. It will be all-encompassing, involving data architecture, engineering for technical delivery, and management covering line-management of the team and alignment of the company's strategy with the roadmap for the data environment. You will also own data governance and regulatory compliance requirements. On the engineering and architecture side, you will have good experience of leading companies from on-premises virtual machines to Azure. You will be seasoned in taking data projects from inception through design, architecture and technical delivery, contributing to the engineering yourself.

In addition to taking environments to the cloud, you will have some exposure to AI and ML and be comfortable assessing which tools and products are most appropriate for the business's goals and evolution. On the managerial side, you will have led teams and have line-management experience; having inherited teams previously would also be ideal. You will have worked in an FCA-regulated environment and be familiar with the requirements for data compliance. The journey with this team will be to implement better monitoring and automation, migrate to the cloud, and then adopt AI and ML. As the business is c.200 people and the management team is strong and Agile, this could happen very quickly. The technology department has a hybrid working setup: you will have the flexibility to come into the office as you wish although, in the initial months, 2-3 days/week will probably be appropriate.

Salary: £125k - 150k + Bonus + Pension
12/03/2026
Full time
Gails
Data Scientist & Engineer
ABOUT THE ROLE
  • Develop advanced analytics/data science solutions to solve problems focused on forecasting, new site selection, ordering, production, rota scheduling, logistics and online services optimisation.
  • Extend functionality of our Bread GPT service (Large Language Model insight synthesis engine).
  • Data engineering: build and develop ETL processes in Microsoft Fabric to support reporting, insight and applied AI models.
  • A hands-on role working with other staff and partners.
  • Utilise data science and analytics to enhance application functionality and performance.
  • Work with the data team to create and deploy machine learning models and AI-driven solutions for real-world applications.
  • Ensure the continuous development and delivery of solutions; monitor and evolve them.
  • Mentor and guide junior team members, fostering a culture of continuous learning and improvement.
  • Develop effective working relationships with colleagues within and beyond the Technology team to ensure that a consistent, high-quality service is delivered.

ARE YOU THE MISSING INGREDIENT?
  • Ideally a bachelor's degree in Computer Science, Analytics, Engineering, or a related field.
  • Minimum of 3 years of experience, with excellent knowledge of Python and preferably R.
  • Knowledge of ETL processes; ideally a basic understanding of the Microsoft ETL stack (Data Factory/Synapse/Fabric).
  • Knowledge of databases (SQL & NoSQL) and API development/integration.
  • Understanding of software development and application design.
  • Proven experience in building data science solutions and developing customised LLM applications.
  • Strong interest in technology.
  • Excellent problem-solving skills and attention to detail.
  • Effective business analysis skills: the ability to gather, document and analyse business requirements, and experience creating user stories, process flows and wireframes.
  • Ability to work effectively in a fast-paced, dynamic environment.
  • Strong communication and collaboration skills.
  • A "can do" outlook and approach to work.
  • Ability to think around issues and look at the bigger picture, providing solutions through a variety of problem-solving techniques.
  • Ability to prioritise issues according to business needs, to escalate when necessary or appropriate, and to problem-solve.

Preferred Qualifications:
  • Experience in manufacturing, retail or hospitality industries.
  • Knowledge of programming languages and frameworks.

BENEFITS BAKED IN
  • Free food and drink when working
  • 50% off food and drink when not working
  • 33 days holiday
  • Pension Scheme
  • Discounts and savings from high-street retailers and restaurants
  • 24-hour GP service
  • Cycle to work scheme
  • Twice-yearly pay review
  • Development programmes for you to RISE with GAIL's
12/03/2026
Full time
Scope AT Limited
Network Specialist - SMPTE 2110/PTP/Arista/EVPN/Multicast/QoS - Broadcast Networks
Network Specialist - SMPTE 2110/PTP/Arista/EVPN/Multicast/QoS - Broadcast Networks
Location: London (Hybrid)
Type: Permanent

We are working with a leading sports broadcast organisation that is building a brand-new IP-based production and broadcast facility in London. They are looking for a Network Specialist to join a small founding engineering team responsible for designing and operating the network infrastructure supporting live production workflows. This is a unique opportunity to help build a modern broadcast network from the ground up, supporting the high-bandwidth, low-latency media transport used in live sports broadcasting.

Key Responsibilities
  • Design, deploy, and support high-performance IP media networks for live broadcast environments
  • Work with SMPTE 2110 media transport supporting video, audio, and metadata streams
  • Design and troubleshoot PTP (IEEE 1588) timing infrastructure
  • Build and operate Arista-based spine-leaf fabrics using EVPN/VXLAN
  • Configure and optimise multicast networking (PIM/IGMP)
  • Implement QoS policies to support latency-sensitive media traffic
  • Troubleshoot complex issues within a mission-critical, high-availability environment

Required Experience
  • Strong experience with data centre or high-performance networks
  • Hands-on experience with Arista switching platforms
  • Deep understanding of multicast networking
  • Experience designing resilient, low-latency network infrastructure
  • Understanding of PTP timing architectures
  • Experience working in environments where network uptime is critical

Experience in broadcast, media, or live production environments is highly desirable.
12/03/2026
