Senior BI Architect - Birmingham

Reporting to the CTI as a key member of our newly formed Data Engineering team, the Senior BI Architect will lead the design, development, and ongoing enhancement of Compass Community's data and reporting infrastructure. You will be the strategic owner of our Azure Data Platform, overseeing services such as Azure Data Lake, Data Warehouse, Data Factory, Databricks, and Power BI. This role combines technical leadership, agile delivery, and stakeholder engagement to drive data-driven decision-making across departments including safeguarding, education, residential care, HR, finance, and compliance. Your work will directly contribute to improving outcomes for children and ensuring operational excellence.

Core Responsibilities

Strategic & Technical Leadership
- Shape and evolve the BI and data platform strategy to meet business, compliance, and care objectives
- Lead a multi-disciplinary data engineering team using Agile (Scrum) methodologies
- Oversee the development and scalability of the strategic data platform
- Manage the BI function and ensure alignment with business goals

Platform Ownership & Governance
- Administer and govern the Azure Fabric Platform, including:
  - Azure Data Lake & Data Warehouse (storage)
  - Azure Data Factory (data integration)
  - Databricks (data transformation and modelling)
  - Power BI (analytics and dashboards)
- Ensure data integrity, access control, platform monitoring, and incident resolution
- Explore AI capabilities to enhance data presentation and consumption

Agile Delivery & Operational Oversight
- Act as Scrum Master for data engineering sprints
- Translate business needs into actionable user stories and deliverables
- Balance sprint workload between new development and support tasks
- Oversee testing, small changes, and continuous integration

Stakeholder Engagement & Insight Development
- Collaborate with Data Insight Leads and business teams to create impactful dashboards
- Support internal and external reporting (e.g., Ofsted, Local Authorities)
- Promote data literacy and self-service analytics across the organisation

Quality Assurance & Compliance
- Maintain high standards of data quality, consistency, and availability
- Develop and manage a comprehensive data dictionary
- Ensure GDPR compliance and responsible data usage, especially around child protection
- Support statutory and operational reporting with accurate, timely data

Key Skills & Competencies
- Deep expertise in Azure BI architecture and cloud services
- Strong grasp of data modelling, integration, and warehousing best practices
- Proven experience in Agile/Scrum delivery and the full development lifecycle
- Ability to translate complex data into actionable insights for non-technical audiences
- Familiarity with safeguarding and compliance in social care or education settings
- Passion for using data to improve outcomes for children and families

Education & Experience
- 3-5 years in BI architecture and team leadership
- Degree in Data Science, Computer Science, Information Systems, or a related field
- Extensive hands-on experience with Azure Fabric, SQL warehousing, and Databricks
- Track record in MI/BI product development using Agile and Waterfall methods
- Experience managing cross-functional teams and sprint activities
- Background in social care, childcare, or education is desirable

Success Measures
- Delivery and evolution of a secure, scalable Strategic Data Platform
- High-performing team with clear objectives and development plans
- Reliable data models supporting business and regulatory reporting
- Measurable business impact via Power BI dashboards
- Positive feedback from stakeholders and customer satisfaction surveys
- Strong adoption of BI tools and self-service analytics
- Full compliance with GDPR and safeguarding protocols
- Ownership of up-to-date technical documentation

General Expectations
- Operate with professionalism, purpose, and pace
- Embody Compass Community's REACH values in all activities
- Maintain confidentiality and adhere to data protection standards
- Comply with company policies including Equal Opportunities, IT usage, and Health & Safety
- Be flexible and responsive to evolving business needs
- Travel to Compass Community locations as required

Salary/Package
- Basic salary of £75k-£80k
- Discretionary company bonus
- Share Option Scheme
- 4% pension
- Life insurance at 3 x salary
- 25 days annual leave plus statutory, with 1 extra day every year for the first 3 years
- Blue Light Card
- Medicash, including discounted gym memberships and more

Click apply now or speak with Chris Holliday for further information.
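The data dictionary responsibility in the listing above is often bootstrapped programmatically before being curated by hand. Below is a minimal Python/pandas sketch of one way to do that from a tabular extract; the dataset, file-free sample data, and column names are purely illustrative assumptions, not Compass Community systems.

```python
# Illustrative only: generate a starter data dictionary from a tabular extract.
# The sample columns below are hypothetical examples, not real organisational data.
import pandas as pd

def build_data_dictionary(df: pd.DataFrame, dataset_name: str) -> pd.DataFrame:
    """Return one row per column with its type, null rate and distinct count."""
    records = []
    for col in df.columns:
        records.append({
            "dataset": dataset_name,
            "column": col,
            "dtype": str(df[col].dtype),
            "null_pct": round(df[col].isna().mean() * 100, 1),
            "distinct_values": int(df[col].nunique(dropna=True)),
        })
    return pd.DataFrame(records)

if __name__ == "__main__":
    sample = pd.DataFrame({
        "placement_id": [1, 2, 3],
        "local_authority": ["Birmingham", "Solihull", None],
        "start_date": pd.to_datetime(["2024-01-10", "2024-02-02", "2024-03-15"]),
    })
    print(build_data_dictionary(sample, "placements"))
```

The generated table would then be enriched manually with business definitions, owners and sensitivity classifications.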
17/10/2025
Full time
Senior Data Engineer - Winchester/London (Hybrid) - £84,000 + 10% bonus

- 1-2 days per week in office
- Greenfield Enterprise Data Platform build
- Flexible working
- Private medical

Ada Meher are working with the market leader in Digital Telecommunication & Broadcasting technology solutions as they recruit for a Senior Data Engineer to join them on their migration project from on-prem to an AWS-based Enterprise Data Platform. The successful candidate will work hands-on with batch and streaming pipelines, designing systems from scratch and taking a more strategic view of the future roadmap.

The business is flexible in nature, with a commitment to work-life balance that allows employees to work the hours that suit them around life's other commitments. They are results-focused but do ask for a presence in either the Winchester or Central London offices 1-2 days a week, based on business need.

To Be Considered:
- Demonstrable expertise and experience working on large-scale Data Engineering projects
- Strong experience in Python/PySpark, Databricks & Apache Spark
- Hands-on experience with both batch & streaming pipelines
- Strong experience in AWS and associated tooling (e.g. S3, Glue, Redshift, Lambda, Terraform)
- Experience designing Data Engineering platforms from scratch

Alongside their commitment to work-life balance, the business also has an extremely competitive benefits package including, but not limited to, private medical, enhanced pension contributions and wellness/gymflex programmes.

We are expecting a strong response to this Senior Data Engineer role, so please apply or send in a CV today to avoid missing out!
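Since the listing above centres on batch and streaming pipelines built with Python/PySpark on AWS, here is a minimal PySpark batch sketch of the kind of work involved. It assumes raw events land as JSON in S3 and that the cluster already has S3 credentials; the bucket paths and column names are hypothetical.

```python
# Hypothetical batch job: read raw JSON events from S3, keep valid rows, write Parquet.
# Bucket paths and column names are assumptions for illustration only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-events-batch").getOrCreate()

raw = spark.read.json("s3a://example-raw-bucket/events/2025/10/17/")

cleaned = (
    raw
    .filter(F.col("event_id").isNotNull())            # drop malformed events
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .withColumn("event_date", F.to_date("event_ts"))  # partition key for the lake
    .dropDuplicates(["event_id"])                     # idempotent re-runs
)

(cleaned
    .write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3a://example-curated-bucket/events/"))
```

A streaming variant would read the same schema from Kafka or Kinesis with Structured Streaming rather than a one-off batch read.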
17/10/2025
Full time
Position: Senior Data Engineer
Hybrid - Birmingham
6 months - Outside IR35

Overview:
Join a leading UK company as a Senior Data Engineer and play a key role in a major data transformation project. You will have the opportunity to design and deliver a new Azure-based data platform, modernising the organisation's data management and reporting processes. This hands-on role offers architectural influence and is ideal for an experienced engineer with a strong background in setting up new environments, creating data pipelines, and enabling self-service analytics through Power BI.

Key Responsibilities:
- Design, build, and maintain Azure data pipelines using Azure Data Factory, Synapse, or Fabric.
- Implement a data lakehouse architecture (Bronze/Silver/Gold) and establish best-practice ETL/ELT frameworks.
- Ingest and integrate data from multiple core systems, including ERP, finance, supply chain, and CRM platforms.
- Develop and optimise SQL data models and support the creation of Power BI-ready datasets.
- Apply and document data governance, quality, and validation rules within the platform.
- Collaborate with Finance and IT stakeholders to translate reporting needs into technical solutions.
- Monitor, troubleshoot, and optimise data pipelines for performance and cost efficiency.
- Define reusable components, standards, and documentation to support long-term scalability.

Essential Skills & Experience:
- Proven experience building Azure data platforms end-to-end (Data Factory, Synapse, Fabric, or Databricks).
- Strong SQL development and data modelling capability.
- Experience integrating ERP or legacy systems into cloud data platforms.
- Proficiency in Python or PySpark for transformation and automation.
- Understanding of data governance, access control, and security within Azure.
- Hands-on experience preparing data for Power BI or other analytics tools.
- Excellent communication skills - able to bridge technical and non-technical stakeholders.
- Strong documentation habits and attention to detail.

Desirable Skills & Experience:
- Experience with AS400, Tagetik, or similar finance systems.
- Familiarity with Power BI Premium, RLS, and workspace governance.
- Knowledge of Azure DevOps and CI/CD for data pipelines.
- Exposure to data quality tools or frameworks.
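The Bronze/Silver/Gold lakehouse pattern named in the listing above typically takes raw "bronze" data, applies validation and conformance, and publishes a cleaned "silver" table for modelling. Here is a minimal PySpark/Delta sketch of that step, assuming a Databricks-style environment; the table locations, columns and rules are illustrative assumptions only.

```python
# Hypothetical Bronze -> Silver step for a medallion (Bronze/Silver/Gold) lakehouse.
# Paths, column names and validation rules are placeholders for illustration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("bronze-to-silver-orders").getOrCreate()

bronze = spark.read.format("delta").load("/mnt/lake/bronze/erp_orders")

silver = (
    bronze
    .filter(F.col("order_id").isNotNull())                 # basic validation rule
    .withColumn("order_date", F.to_date("order_date"))
    .withColumn("net_amount", F.col("net_amount").cast("decimal(18,2)"))
    .dropDuplicates(["order_id"])                          # one row per order
    .withColumn("_ingested_at", F.current_timestamp())     # lightweight lineage metadata
)

(silver
    .write
    .format("delta")
    .mode("overwrite")
    .save("/mnt/lake/silver/orders"))
```

A further "gold" step would aggregate silver tables into Power BI-ready star-schema datasets.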
17/10/2025
Full time
A leading global bank with a strong presence in Corporate and Investment Banking is seeking a Director to drive strategic and technical leadership for engineering teams and to strengthen their engineering capability across multiple applications and platforms. This exciting newly recreated role will drive high-quality, scalable, and secure software delivery while fostering a collaborative, innovative, and results-oriented team environment.

Key Responsibilities:
- Lead and mentor onshore/offshore engineering, DevOps, DevSecOps, and application development teams.
- Collaborate with Cross Product Domain Leads and Product Owners to deliver solutions that meet business, regulatory, and growth requirements.
- Oversee domain architecture, solution design, and technology stack optimization for cloud and on-premises systems.
- Ensure high-quality, adaptable software solutions while reducing total cost of ownership.
- Manage vendor and internal resources, driving career development, engagement, and performance excellence.
- Partner with infrastructure, security, and application management teams to enable smooth delivery and continuous improvement.

Experience & Skills:
- 15+ years leading engineering teams in Corporate/Investment Banking, ideally across Risk, Finance, and Regulatory Reporting systems.
- Expertise in microservices architecture, system integration, DevOps, DevSecOps, cloud (Azure), and on-premises platforms.
- Proven experience in Agile and Waterfall methodologies, IT controls, vendor management, and strategic value delivery.
- Strong leadership, strategic thinking, commercial acumen, and global stakeholder management skills.
- Passionate about diversity, inclusion, sustainability, and fostering a high-performing team culture.

Technical Competencies:
- Languages: .NET, C#, Java, SQL
- Databases: Oracle, SQL, PostgreSQL, MongoDB, Redis
- Messaging & Middleware: MQ, IIB/ACE, DataStage
- Cloud & Orchestration: Azure Databricks, Camunda 8, Kubernetes (AKS/EKS/GKE), Docker, Helm
- Backend/Frontend: Java 21, Spring Boot 3.x, Angular 15+, React 18+, REST APIs
- CI/CD & DevSecOps: Jenkins, GitHub/GitLab/Bitbucket, SonarQube, Prometheus, Grafana, ELK Stack
- Security & Secrets Management: OAuth2/OpenID Connect, HashiCorp Vault, CyberArk

This is a high-impact leadership role offering the opportunity to shape the bank's engineering capability and deliver transformational technology solutions at enterprise scale. Robert Walters Operations Limited is an employment business and employment agency and welcomes applications from all candidates.
17/10/2025
Full time
Senior Data Engineer (Databricks)
Location: London (Hybrid)
Rate: Negotiable, depending on experience
Duration: 6 months (initial)

We're looking for a Senior Data Engineer (Databricks) to join a world-leading energy organisation on a key transformation programme within their trading and supply division. This is an exciting opportunity to play a pivotal role in building modern, scalable data solutions using Azure cloud technologies.

The Role
As a Senior Data Engineer, you'll be responsible for designing and developing robust data foundations and end-to-end solutions that drive value across the business. You'll help shape and embed data-driven thinking across both technical and business teams, ensuring the organisation continues to lead with insight and innovation. You'll act as a subject matter expert, guiding technical decisions, mentoring junior engineers, and ensuring data engineering best practices are consistently applied.

Key Responsibilities
- Design and build data solutions aligned with business and IT strategy.
- Lead development of scalable data pipelines and models using Azure and Databricks.
- Support data foundation initiatives and ensure effective rollout across business units.
- Act as a bridge between technical and non-technical stakeholders, presenting insights clearly.
- Oversee change management, incident management, and data quality improvement.
- Contribute to best practice sharing and community-building initiatives within the data engineering space.

Required Skills & Experience
- Cloud platforms: strong expertise in AWS / Azure / SAP
- ETL/ELT pipelines: advanced proficiency
- Data modelling: expert level
- Data integration & ingestion: skilled
- Databricks, SQL, Synapse, Data Factory and related Azure services
- Version control / DevOps tools: GitHub, Azure DevOps, Actions
- Testing & automation tools: PyTest, SonarQube

Desirable Experience
- Experience leading or running scrum teams
- Exposure to planning tools such as BPC
- Familiarity with external data ecosystems and documentation tools (e.g., MkDocs)

The Project
You'll be joining a large-scale programme focused on modernising a global data warehouse platform using Azure technologies. The project aims to deliver a unified and standardised view of data across international operations - a key enabler for smarter, data-driven trading decisions. If you're a data engineer with deep Azure and Databricks experience, and you enjoy solving complex challenges within a global business, this contract offers a chance to make a real impact on a high-profile initiative.

Interested? Please apply now with your updated CV and reach out to Tom Johnson at Certain Advantage - Ref: 79413
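The listing above names PyTest among the testing tools for pipeline work. A minimal sketch of what pipeline-level unit tests can look like follows; the transformation function, record shape and trading context are hypothetical, not the client's code.

```python
# Minimal PyTest sketch for a data pipeline transformation.
# The function and field names are illustrative assumptions only.

def standardise_trades(rows):
    """Drop rows without a trade id and normalise currency codes to upper case."""
    cleaned = []
    for row in rows:
        if not row.get("trade_id"):
            continue  # reject incomplete records
        cleaned.append({**row, "currency": row["currency"].upper()})
    return cleaned

def test_rows_without_trade_id_are_dropped():
    rows = [{"trade_id": "T1", "currency": "gbp"}, {"trade_id": None, "currency": "usd"}]
    assert [r["trade_id"] for r in standardise_trades(rows)] == ["T1"]

def test_currency_codes_are_upper_cased():
    rows = [{"trade_id": "T2", "currency": "eur"}]
    assert standardise_trades(rows)[0]["currency"] == "EUR"
```

Running `pytest` against files like this, typically wired into SonarQube-backed CI, is one common way such teams keep transformations regression-safe.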
17/10/2025
Full time
Data Services Manager
Remote (UK-based), quarterly travel to London
£85,000 + benefits
Permanent, full-time

About the Role
My client is seeking an experienced Data Services Manager to lead a high-performing data engineering function within a fast-paced financial services organisation. Reporting to the Senior Data Services Manager, you'll drive the design, delivery, and continuous improvement of data solutions that underpin strategic decision-making and regulatory compliance. You'll manage and mentor Data Engineers, ensuring scalable, high-quality data products and operational excellence across both on-premise and cloud platforms.

Key Responsibilities
- Lead, coach, and develop a team of Data Engineers to deliver robust, scalable data solutions.
- Provide technical leadership across the Microsoft and Azure Data Platform (SQL Server, Power BI, Azure Data Factory, Databricks, Data Lake).
- Champion Data Governance, Data Quality, and Data Management best practices.
- Collaborate with business and technology teams to align data solutions with strategic goals.
- Oversee Agile delivery, CI/CD, and DataOps processes.
- Support prioritisation, planning, and resource management within the Data Services function.
- Contribute to architectural decisions as part of the organisation's Technical Design Authority.

About You
- 10-15 years' experience in Data Engineering, Business Intelligence, or Analytics roles.
- 2+ years in a lead or management position.
- Financial services experience is essential, ideally covering lending, servicing, or securitisation data.
- Deep technical expertise in Microsoft and Azure data technologies.
- Strong knowledge of data modelling (Kimball), ETL/ELT, and hybrid cloud architectures.
- Proven ability to drive quality, governance, and best practices within engineering teams.
- Excellent communication, stakeholder management, and leadership skills.

If you're interested, get in touch ASAP with a copy of your most up-to-date CV and email me at or call me on .

Please Note: This is a permanent role for UK residents only. This role does not offer sponsorship. You must have the right to work in the UK with no restrictions. Some of our roles may be subject to successful background checks, including a DBS and credit check.

TRG are the go-to recruiter for Power BI and Azure Data Platform roles in the UK, offering more opportunities across the country than any other. We're the proud sponsor and supporter of SQLBits, Power Platform World Tour, the London Power BI User Group, Newcastle Power BI User Group and Newcastle Data Platform and Cloud User Group. To find out more and speak confidentially about your job search or hiring needs, please contact me directly at
17/10/2025
Full time
Data Manager - Remote - Up to £85,000
Employment Type: Permanent

Are you a seasoned data professional ready to take the next step into strategic leadership? We're working with a forward-thinking financial services organisation seeking a Data Manager to lead and mentor a high-performing data engineering team. This is a pivotal role within a growing Data Services function, offering the opportunity to shape data strategy, drive innovation, and influence enterprise-wide decision-making. You'll be at the forefront of delivering scalable, high-quality data solutions that support business-critical domains such as Servicing, Securitisations, and Finance.

What You'll Be Doing:
- Leading and mentoring Data Engineers to deliver robust data products and solutions.
- Championing data governance, quality, and compliance across the organisation.
- Collaborating with cross-functional teams to align data initiatives with strategic goals.
- Driving Agile delivery, automation, and continuous improvement.
- Influencing architectural decisions as part of the Technical Design Authority.

What We're Looking For:
- Proven leadership in data engineering or BI teams.
- Hands-on experience across the Microsoft Data Platform (SQL Server, Azure, Power BI, Databricks).
- Strong understanding of data governance, privacy, and compliance frameworks.
- Expertise in hybrid cloud/on-premise data architectures and DevOps/DataOps practices.
- Excellent stakeholder engagement and mentoring capabilities.

Why Apply?
- Be part of a data-driven transformation journey.
- Work with cutting-edge technologies and a passionate team.
- Influence enterprise architecture and strategic data decisions.
- Enjoy a collaborative, growth-oriented culture.
17/10/2025
Full time
Business Product Owner - Data Analytics, Azure, Data Engineering, AI, ML - Mainly Remote

A Product Owner (Technical and Business) is required to join a global Professional Services business based in Central London. This is practically a remote role, with occasional travel required to London, Europe and the States. We need someone who has worked on LARGE and SIGNIFICANT products for large enterprise businesses. If you have come from the Big 4 on the Professional Services circuit, even better!

The platform primarily serves two key personas: Data and Intelligence Delivery specialists, who manage data ingestion, transformation, and orchestration processes, and Assurance professionals, who use the analysers to enhance audit quality and client service. This being said, we need DATA-HEAVY Product Owners who have managed complex, global products. Read on for more details.

Experience required:
- Technical proficiency: Familiarity with Azure services (e.g., Data Lake, Synapse, Fabric) and Databricks for data engineering, analytics, performance optimisation, and governance. Experience with implementing and optimising scalable cloud infrastructure is highly valued.
- Backlog management: Demonstrated expertise in maintaining and prioritising product backlogs, writing detailed user stories, and collaborating with development teams to deliver sprint goals.
- Agile product ownership: Experience in SAFe or similar agile frameworks, including daily scrum leadership and sprint planning.
- Cross-team collaboration: Effective working across engineering, analytics, and business teams to ensure seamless execution.
- KPI management: Ability to track, analyse, and interpret KPIs to guide product improvements and communicate results to stakeholders.
- Technical acumen: Solid understanding of modern data platforms, including experience with medallion architecture, AI/ML applications, and cloud-native infrastructures.
- Communication skills: Excellent communication skills for conveying technical concepts to various audiences, including engineers, business partners, and senior leadership.
- Collaboration and flexibility: Experience working with distributed teams in dynamic, fast-paced environments.
- Innovation mindset: Passion for leveraging advanced analytics, AI, and cloud technologies to deliver cutting-edge solutions.

This is a great opportunity and salary is dependent upon experience. Apply now for more details.
17/10/2025
Full time
About the Role
We're working with a market-leading organisation that's undergoing a major transformation, moving from manual, Excel-based reporting to a fully automated, intelligence-driven data ecosystem. As Data Architect, you'll be responsible for designing and implementing the Azure-based data platform that becomes the single source of truth across the business. This is a hands-on, strategic role where you'll build scalable, governed data architecture and shape how data is used across Finance, Operations, and Commercial functions.

What You'll Be Doing

Data Architecture & Platform Design
- Design and implement an enterprise data lake on Azure Data Lake Gen2, using Bronze/Silver/Gold architecture.
- Build and maintain scalable ETL/ELT pipelines in Azure Data Factory to integrate data from core systems (AS400, Tagetik, CRM, Esker, Slimstock).
- Develop the overall data model, data dictionaries, and lineage documentation.
- Deliver a stable "batch-first" integration strategy with AS400 during its .NET migration, with a roadmap toward API integration.

Data Governance & Quality
- Implement the technical foundation for data governance - quality checks, metadata management, and master data validation.
- Embed business rules and validation logic directly within data pipelines.
- Define and manage data security and access controls (Azure and Power BI row-level security).

Implementation & Optimisation
- Lead the hands-on build, testing, and deployment of the Azure data platform.
- Monitor platform performance and optimise pipelines for cost, scalability, and speed.
- Define and document technical standards and best practices.
- Oversee the migration from legacy tools (Domo, Vecta) to the new Power BI ecosystem.

What You'll Bring

Technical Skills
- Strong hands-on experience with Azure Data Lake Gen2, Azure Data Factory, and Azure Active Directory.
- Advanced skills in data modelling (conceptual, logical, physical) and SQL for complex transformations.
- Proven ability to design and build high-performance ETL/ELT pipelines.
- Understanding of data governance, security, and access control frameworks.
- Knowledge of batch and real-time data integration and experience with ODBC connectors or REST APIs.
- Familiarity with Databricks and/or Microsoft Fabric is a bonus.

Experience
- 3+ years in a Data Architect or senior data engineering role.
- Proven record of designing and delivering cloud-based data platforms, ideally in Azure.
- Background working with complex ERP or transactional systems.
- Experience supporting or leading data transformation initiatives within a business setting.
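The governance responsibilities above include embedding business rules and validation logic directly within data pipelines. A small, framework-agnostic Python sketch of that idea follows: rules are declared as data and applied as a pipeline step, with failing records quarantined. The field names, rules and thresholds are illustrative assumptions, not the client's standards.

```python
# Illustrative validation step: declare business rules as data, apply them to each record,
# and separate passing rows from quarantined ones. Fields and rules are placeholders.
from typing import Callable

Rule = tuple[str, Callable[[dict], bool]]

RULES: list[Rule] = [
    ("customer_id is present", lambda r: bool(r.get("customer_id"))),
    ("net_amount is non-negative", lambda r: r.get("net_amount", 0) >= 0),
    ("currency is a 3-letter code", lambda r: isinstance(r.get("currency"), str) and len(r["currency"]) == 3),
]

def validate(rows: list[dict]) -> tuple[list[dict], list[dict]]:
    """Return (valid_rows, quarantined_rows); quarantined rows carry the failed rule names."""
    valid, quarantined = [], []
    for row in rows:
        failures = [name for name, check in RULES if not check(row)]
        if failures:
            quarantined.append({**row, "_failed_rules": failures})
        else:
            valid.append(row)
    return valid, quarantined

if __name__ == "__main__":
    good, bad = validate([
        {"customer_id": "C1", "net_amount": 120.0, "currency": "GBP"},
        {"customer_id": None, "net_amount": -5.0, "currency": "GB"},
    ])
    print(len(good), "valid;", len(bad), "quarantined")
```

In an Azure Data Factory or Databricks pipeline, the same pattern would typically run inside a transformation activity, with quarantined rows written to a separate table for data stewards to review.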
17/10/2025
Full time
Senior Data Engineer (AWS, Kafka, Python) - London / WFH - to £85k

Are you a tech-savvy Data Engineer with AWS expertise combined with client-facing skills? You could be joining a global technology consultancy with a range of banking, financial services and insurance clients in a senior, hands-on Data Engineer role.

As a Senior Data Engineer you will design and build end-to-end real-time data pipelines using AWS-native tools, Kafka and modern data architectures, applying AWS Well-Architected principles to ensure scalability, security and resilience. You'll collaborate directly with clients to analyse requirements, define solutions and deliver production-grade systems, leading the development of robust, well-tested and fault-tolerant data engineering solutions.

Location / WFH:
There's a hybrid work-from-home model with two days a week in the London, City office (or at a client site in London).

About you:
- You are an experienced Data Engineer within financial services environments
- You have expertise with AWS, including Lake Formation and transformation layers
- You have strong Python coding skills
- You have experience with real-time data streaming using Kafka
- You're collaborative and pragmatic with excellent communication and stakeholder management skills
- You're comfortable taking ownership of projects and working end-to-end
- You have a good knowledge of distributed systems and DevOps tooling
- Ideally you will also have Databricks experience

What's in it for you? As a Senior Data Engineer you will earn a highly competitive package:
- Salary to £85k
- Bonus c.15%
- Pension (up to 7% employer contribution), Life Assurance, Income Protection
- Private medical care for you and your family, including mental health
- Travel insurance
- Charitable giving
- Gym membership for you and your family
- Flexible holiday scheme

Apply now to find out more about this Senior Data Engineer (AWS, Kafka, Python) opportunity.

At Client Server we believe in a diverse workplace that allows people to play to their strengths and continually learn. We're an equal opportunities employer whose people come from all walks of life and will never discriminate based on race, colour, religion, sex, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status. The clients we work with share our values.
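Real-time streaming with Kafka and Python, as described in the listing above, usually starts with a consumer loop like the one sketched below using the kafka-python package. The broker address, topic name and payload fields are assumptions for illustration, not details of the consultancy's clients.

```python
# Minimal streaming consumer sketch using the kafka-python package (pip install kafka-python).
# Broker address, topic name and payload fields are hypothetical.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "payments.events",
    bootstrap_servers=["localhost:9092"],
    group_id="risk-enrichment",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # A production pipeline would validate, enrich and forward each event downstream;
    # here we simply flag large payments as an example transformation.
    if event.get("amount", 0) > 10_000:
        print(f"large payment {event.get('payment_id')} at offset {message.offset}")
```

At scale the same logic would typically run with consumer groups partitioned across workers, with the Well-Architected concerns (retries, dead-letter topics, monitoring) layered on top.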
17/10/2025
Full time
Data Engineer
Join the team powering the future of data at Vivedia
Location: Sheffield - Hybrid (3 days per week)
Salary: £50,000 to £60,000 depending on experience
Department: Data Team

At Vivedia, data isn't just numbers - it's the engine driving every decision, every innovation, and every customer experience. We're growing, and we're looking for a Data Engineer who's ready to build the foundations of a smarter, more connected future.

Why You'll Love This Role
You'll play a pivotal role in designing, building, and optimising the data systems that power our business. From robust pipelines to scalable platforms, your work will help every team at Vivedia unlock insights, drive performance, and make data-driven decisions that matter. This is your chance to shape the future of our data ecosystem - and see your impact ripple across an entire organisation.

What You'll Do

Build the backbone of our data universe
- Design, develop, and deploy scalable, secure, and reliable data pipelines.
- Bring together data from internal systems, APIs, and third-party platforms into a unified warehouse or data lake.
- Model data to ensure consistency, accessibility, and reusability across the company.

Empower smarter decisions
- Partner with Data Analysts, Data Scientists, and Business Leaders to translate data needs into technical solutions.
- Optimise systems to deliver fast, accurate, and actionable insights.
- Support the creation of dashboards, KPIs, and reports that guide strategic decisions.

Champion data quality and governance
- Implement validation, monitoring, and quality checks to ensure data accuracy and trust.
- Support compliance with privacy regulations like GDPR.
- Maintain rigorous security standards for data access, encryption, and storage.

Drive innovation and continuous improvement
- Explore and integrate emerging tools in cloud, automation, and data architecture.
- Lead or support migrations to modern platforms such as AWS, Azure, GCP, Snowflake, or Databricks.
- Proactively identify opportunities to streamline and optimise performance.

What You'll Bring
- Experience: Proven hands-on experience as a Data Engineer or in a similar data-focused role.
- Technical skills: Proficiency in SQL and Python. Strong grasp of ETL/ELT pipelines, data modelling, and data warehousing. Experience with cloud platforms (AWS, Azure, GCP) and tools like Snowflake, Databricks, or BigQuery. Familiarity with streaming technologies (Kafka, Spark Streaming, Flink) is a plus.
- Tools & frameworks: Airflow, dbt, Prefect, CI/CD pipelines, Terraform.
- Mindset: Curious, data-obsessed, and driven to create meaningful business impact.
- Soft skills: Excellent communication and collaboration - translating complex technical ideas into business insight is your superpower.

Why Vivedia
We're a company where data meets purpose - where your ideas are valued, your work makes a visible difference, and innovation thrives. Join us and be part of a team that's building the future of data, together.

Ready to make an impact? Apply now and help us turn data into something extraordinary.
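Airflow appears in the listing above among the orchestration tools. A small Airflow DAG sketch is shown below to illustrate the typical extract-then-transform wiring; the task bodies are stubs and the DAG id, schedule and dates are arbitrary examples rather than anything specific to Vivedia.

```python
# Sketch of a daily Airflow DAG wiring an extract step ahead of a transform step.
# The callables are stubs; names, schedule and dates are arbitrary examples.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_orders(**_):
    print("pull orders from the source API into the lake")  # placeholder extract logic

def transform_orders(**_):
    print("run warehouse transformations (e.g. trigger dbt)")  # placeholder transform logic

with DAG(
    dag_id="orders_daily",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    tags=["example"],
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    transform = PythonOperator(task_id="transform_orders", python_callable=transform_orders)

    extract >> transform  # transform only runs after a successful extract
```

In practice the transform step often hands off to dbt, with Terraform-managed infrastructure and CI/CD deploying the DAG code.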
17/10/2025
Full time
Data Engineer Join the Team Powering the Future of Data at Vivedia Location: Sheffield - Hybrid (3 days per week) Salary: £50,000 to £60,000 depending on experience Department: Data Team At Vivedia, data isn't just numbers - it's the engine driving every decision, every innovation, and every customer experience. We're growing, and we're looking for a Data Engineer who's ready to build the foundations of a smarter, more connected future. Why You'll Love This Role You'll play a pivotal role in designing, building, and optimising the data systems that power our business. From robust pipelines to scalable platforms, your work will help every team at Vivedia unlock insights, drive performance, and make data-driven decisions that matter. This is your chance to shape the future of our data ecosystem - and see your impact ripple across an entire organisation. What You'll Do Build the backbone of our data universe Design, develop, and deploy scalable, secure, and reliable data pipelines. Bring together data from internal systems, APIs, and third-party platforms into a unified warehouse or data lake. Model data to ensure consistency, accessibility, and reusability across the company. Empower smarter decisions Partner with Data Analysts, Data Scientists, and Business Leaders to translate data needs into technical solutions. Optimise systems to deliver fast, accurate, and actionable insights. Support the creation of dashboards, KPIs, and reports that guide strategic decisions. Champion data quality and governance Implement validation, monitoring, and quality checks to ensure data accuracy and trust. Support compliance with privacy regulations like GDPR. Maintain rigorous security standards for data access, encryption, and storage. Drive innovation and continuous improvement Explore and integrate emerging tools in cloud, automation, and data architecture. Lead or support migrations to modern platforms such as AWS, Azure, GCP, Snowflake, or Databricks. Proactively identify opportunities to streamline and optimise performance. What You'll Bring Experience: Proven hands-on experience as a Data Engineer or in a similar data-focused role. Technical Skills: Proficiency in SQL and Python. Strong grasp of ETL/ELT pipelines, data modelling, and data warehousing. Experience with cloud platforms (AWS, Azure, GCP) and tools like Snowflake, Databricks, or BigQuery. Familiarity with streaming technologies (Kafka, Spark Streaming, Flink) is a plus. Tools & Frameworks: Airflow, dbt, Prefect, CI/CD pipelines, Terraform. Mindset: Curious, data-obsessed, and driven to create meaningful business impact. Soft Skills: Excellent communication and collaboration - translating complex technical ideas into business insight is your superpower. Why Vivedia We're a company where data meets purpose - where your ideas are valued, your work makes a visible difference, and innovation thrives. Join us and be part of a team that's building the future of data, together. Ready to make an impact? Apply now and help us turn data into something extraordinary.
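The posting lists Airflow among its tools, so a minimal sketch of the kind of pipeline it describes might look like the DAG below. The DAG id, callables and data are hypothetical, and it assumes Airflow 2.x (2.4 or later for the schedule argument); a real pipeline would add retries, alerting and incremental loads rather than toy in-memory rows.

```python
# Illustrative sketch only: DAG id, callables and targets are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**_):
    # e.g. pull orders from an internal API or third-party platform
    return [{"order_id": 1, "amount": 120.0}]


def transform(ti, **_):
    rows = ti.xcom_pull(task_ids="extract")
    # normalise and add derived fields before loading
    return [{**r, "amount_gbp": round(r["amount"], 2)} for r in rows]


def load(ti, **_):
    rows = ti.xcom_pull(task_ids="transform")
    print(f"would load {len(rows)} rows into the warehouse")  # placeholder


with DAG(
    dag_id="orders_daily",              # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_t = PythonOperator(task_id="extract", python_callable=extract)
    transform_t = PythonOperator(task_id="transform", python_callable=transform)
    load_t = PythonOperator(task_id="load", python_callable=load)

    extract_t >> transform_t >> load_t
```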
17/10/2025
Full time
Job Title: Innovation Delivery Manager Location: Flexible (Peterborough, Sunderland, Kent, Bristol) ad hoc travel Role Overview The Innovation Engineer plays a key role within the Group's Innovation function, helping to shape and deliver the Technology Innovation and GenAI roadmap. This role blends hands-on engineering with strategic exploration, focusing on the practical application of emerging technologies - particularly Generative AI - to solve real business challenges. As part of a multidisciplinary team, the Innovation Engineer will lead the design, prototyping, and early-stage development of innovative solutions, helping to inform and influence the Group's future technology direction. Key Responsibilities: Explore and evaluate emerging technologies to identify opportunities for innovation and business impact. Design and deliver a pipeline of proofs of concept (PoCs) and early-stage projects aligned with the Innovation strategy. Act as a technical SME for our Reference Architecture, defining the evolution and optimisation of this platform Collaborate with business stakeholders, architects, data SMEs, and delivery teams to shape priorities and approaches. Build and test prototypes, ensuring they are scalable and production-ready where appropriate. Lead infrastructure and environment setup for innovation projects, including CI/CD pipelines and cloud environments. Support the development of data pipelines, processing, and storage for AI/ML workloads. Monitor and evaluate model performance, accuracy, and resource utilisation. Champion delivery best practices across the team including AI Governance, EDA alignment and SDLC compliance Contribute to the rollout of successful innovations across wider business functions and applications. Support the development of AI literacy across the Group, coaching stakeholders and co-developing new ways of working. Stay current with technology trends and GenAI developments, bringing fresh thinking and insight to the team. Drive knowledge sharing across the team including lessons learnt in PoCs, platform specialisms, and providing guidance, mentorship, and coaching to junior team members Key Skills and Experience: Proven experience in an engineering role, with a track record of delivering innovative technology solutions. Strong programming skills in Python, C#, or similar languages. Experience with CI/CD pipelines, containerisation (Docker/Kubernetes) Familiarity with cloud platforms (Azure, AWS, or GCP) Understanding of data engineering principles, including ETL, data pipelines, and database management. Working knowledge of ML/AI concepts and technologies, including model deployment and monitoring. Strong problem-solving skills and the ability to work across disciplines in a fast-paced, exploratory environment. Excellent communication and collaboration skills, with the ability to engage both technical and non-technical stakeholders. Desirable Experience working in innovation, R&D, or emerging technology teams. Passion for GenAI and a strong understanding of its potential impact on business and technology. Experience in regulated industries such as insurance or financial services. Experience with tools such as LangChain, Databricks, LLMOps, and Kubernetes. Behaviours: Team player Self-motivated with a drive to learn and develop Logical thinker with a professional and positive attitude Passion to innovate and improve processes Personality and a sense of humour
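As a purely illustrative sketch of the GenAI proof-of-concept and model-evaluation work described above, the snippet below sends a couple of hypothetical test prompts to a hosted model and reports a simple keyword pass rate. It assumes the openai Python client (v1.x) and an example model name; the prompts, keywords and pass criterion are invented, and a real PoC would also track latency, cost and far more rigorous accuracy measures, in line with the governance responsibilities above.

```python
# Illustrative sketch only: prompts, keywords and model name are hypothetical.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

TEST_CASES = [
    {"prompt": "Summarise this claim note: customer reports water damage to kitchen.",
     "must_mention": ["water damage"]},
    {"prompt": "Classify this email as complaint or query: 'Where is my renewal document?'",
     "must_mention": ["query"]},
]


def evaluate(model: str = "gpt-4o-mini") -> float:
    """Return the fraction of test cases whose response contains the expected keywords."""
    passed = 0
    for case in TEST_CASES:
        resp = client.chat.completions.create(
            model=model,
            messages=[{"role": "user", "content": case["prompt"]}],
        )
        answer = (resp.choices[0].message.content or "").lower()
        if all(k.lower() in answer for k in case["must_mention"]):
            passed += 1
    return passed / len(TEST_CASES)


if __name__ == "__main__":
    print(f"keyword pass rate: {evaluate():.0%}")
```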
17/10/2025
Full time
A growing UK-based professional services organisation are looking for a Data & Development Lead in this newly created role, to help shape and deliver their Data & Innovation Strategy, which will be critical to their ongoing success. This is a hybrid role, with 3 days per week in one of their multiple office locations across the UK - there is one in Richmond, North Yorkshire. This role blends hands-on technical delivery with strategic oversight, where you'll manage a small team of 3 to deliver excellent data engineering, automation and systems development solutions. You will define and maintain a fit-for-purpose data and systems architecture that is aligned to business needs, and lead your team on the design and delivery of modern data and technology solutions. This will involve providing technical direction, encouraging best practice, and cultivating a collaborative and supportive team environment. Their tech stack currently spans things like Databricks, Microsoft Azure, Power Platform, Power BI, M365, Copilot, and various applications such as Workday. Requirements: Experience guiding data strategy and designing and delivering data and system architectures Experience leading small high-performing teams in an agile environment Hands-on experience with Azure data technologies and Databricks Strong understanding of data integration, automation, and system design An interest in emerging technologies such as AI Benefits: Salary up to £95,000 depending on experience Annual performance-based bonus 25 days annual leave plus bank holidays Private healthcare Life insurance Electric vehicle purchase scheme Please Note: This is a role for UK residents only. This role does not offer Sponsorship. You must have the right to work in the UK with no restrictions. Some of our roles may be subject to successful background checks including a DBS and Credit Check. Tenth Revolution Group / Nigel Frank are the go-to recruiter for Data and AI roles in the UK, offering more opportunities across the country than any other. We're the proud sponsor and supporter of SQLBits, and the London Power BI User Group. To find out more and speak confidentially about your job search or hiring needs, please contact me directly at
17/10/2025
Full time
Data Engineering Lead (Azure) London (3 days in office) Up to £105,000 + 5% Bonus + Benefits Are you ready to take the lead on building a cutting-edge data platform? We're seeking a Data Engineering Lead to shape and grow data engineering capabilities at a global, forward-thinking organisation. This is a greenfield opportunity to architect a modern Azure-based platform, lead a talented team, and make a direct impact on business-wide decision making. Why this role? Spearhead the creation of a new data platform from the ground up. Balance hands-on engineering with team leadership and mentoring. Collaborate with analysts, data scientists, and business leaders to deliver impactful solutions. Hybrid working: London office 3 days per week, remote flexibility the rest. Join a supportive and inclusive culture, with a global footprint and a focus on innovation. What you'll be doing: Designing and building scalable data pipelines to support analytics, reporting, and machine learning. Leading the implementation of Azure data solutions and ensuring robust ETL/ELT processes. Defining architecture strategies that allow for scalable, flexible development. Embedding best practices in DevOps, CI/CD, and engineering standards. Mentoring and growing a high-performing team of data engineers. Driving the integration of MS Fabric and low-code platforms (e.g. TimeXtender) for self-service BI. Staying on top of industry trends and introducing new technologies to advance the data platform. What we're looking for: Strong track record in data engineering leadership roles. Deep expertise in the Microsoft Azure data stack (data lakes, warehousing, modelling, etc.). Solid background in data architecture, data governance, and data management. Experience in modern data engineering practices (pipelines, orchestration, DevOps). Excellent leadership and stakeholder engagement skills. Bonus: experience with Databricks or TimeXtender. Package & Benefits: Salary up to £100,000 + 5% bonus. Hybrid working: London office 3 times per week. Comprehensive benefits package. Global organisation with opportunities to make a real impact. Supportive, diverse, and inclusive working culture.
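On the "engineering standards" and CI/CD point, one common pattern is to keep transformation logic in plain functions so it can be unit-tested in CI before any pipeline runs. The function, field names and cleansing rules below are hypothetical, sketched only to illustrate the idea; the test runs under pytest.

```python
# Illustrative sketch only: the cleansing rules and field names are hypothetical.
def standardise_customer(record: dict) -> dict:
    """Apply basic cleansing rules to a raw customer record."""
    return {
        "customer_id": str(record["customer_id"]).strip(),
        "email": record.get("email", "").strip().lower() or None,
        "country": (record.get("country") or "UNKNOWN").upper(),
    }


def test_standardise_customer():
    raw = {"customer_id": " 42 ", "email": "  Jane@Example.COM ", "country": "gb"}
    clean = standardise_customer(raw)
    assert clean == {"customer_id": "42", "email": "jane@example.com", "country": "GB"}
```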
17/10/2025
Full time
Role: Lead Tech Architect IR35: Outside SC: Yes Start: 21/10 3-month contract Rate: £600 Requirements: SC Clearance Technical Architect from an engineering background Experience in a 2nd line support role, helping the team troubleshoot Skillset: Databricks, Alterna, AWS Cloud, Terraform If interested please send over your CV
17/10/2025
Full time
BI Developer Rebuild critical dashboards that drive commercial decisions. 6-month fixed-term contract. Play a key role in a major enterprise-wide transformation programme. Work with modern data platforms and Tableau Cloud. We're on the lookout for a BI Developer to join a leading organisation's Bristol team on an FTC basis. In this role you'll be instrumental in rebuilding business-critical Tableau dashboards as part of a wider transformation programme. You'll be working closely with commercial teams, data engineers, and analytics specialists to deliver high-impact reporting that supports confident decision-making. What you'll be doing Your focus will be recreating and enhancing key sales dashboards in Tableau, using data from newly implemented platforms. You'll work directly with stakeholders to understand their needs and translate them into clear, actionable dashboards. You'll collaborate with data engineers to ensure data pipelines and models are aligned with reporting requirements. You'll be part of a multi-capability project team, contributing to the successful delivery of a wider transformation programme. You'll also support the management of the Tableau Cloud platform and get involved in other analytics initiatives across the business. What experience you'll need to apply Experience working as a BI Developer Proven experience designing and building dashboards in Tableau Strong SQL skills End-to-end experience on analytics projects, from requirements gathering to build and support Experience with cloud data platforms like Databricks or Snowflake What you'll get in return for your experience This is a fixed-term contract role paying the equivalent of a £65,000 per annum salary, on a contract basis, plus a discretionary bonus and great benefits. What's next? If this role sounds like a good fit, click the apply button to send in your CV and we'll arrange a call to chat through your experience!
17/10/2025
Full time
I'm currently working with a leading insurance broker who is looking to hire a Lead Azure Data Engineer on an initial 12-month fixed-term contract, with strong potential for extension. This is a great opportunity for an experienced Data Engineer to take on a leadership role, mentoring a small team of junior engineers while playing a key part in driving the development of an Azure-based data lakehouse. Key requirements: Proven experience working as a principal or lead data engineer Strong background working with large datasets, with proficiency in SQL, Python, and PySpark Experience managing and mentoring engineers with varying levels of experience Hands-on experience deploying pipelines within Azure Databricks, ideally following the Medallion Architecture framework Hybrid working: Minimum two days per week on-site in London. If this sounds like something you'd be interested in, please don't hesitate to get in touch for more details. Thanks, Oliver
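For readers unfamiliar with the Medallion Architecture mentioned here, a bronze-to-silver step on Azure Databricks typically reads raw Delta data, applies quality rules and deduplication, and writes a curated Delta table. The sketch below is illustrative only: the table names, columns and rules are hypothetical and not taken from the client's platform.

```python
# Illustrative bronze -> silver step; table names and rules are hypothetical.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

# On Databricks this returns the cluster's existing session.
spark = SparkSession.builder.getOrCreate()

bronze = spark.read.table("bronze.policies_raw")

latest = Window.partitionBy("policy_id").orderBy(F.col("ingested_at").desc())

silver = (
    bronze
    .filter(F.col("policy_id").isNotNull())                        # basic quality rule
    .withColumn("premium", F.col("premium").cast("decimal(18,2)"))
    .withColumn("_rn", F.row_number().over(latest))
    .filter(F.col("_rn") == 1)                                     # keep latest record per policy
    .drop("_rn")
)

(silver.write
       .format("delta")
       .mode("overwrite")
       .saveAsTable("silver.policies"))
```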
17/10/2025
Full time
A rapidly growing company in the B2B Software as a Service (SaaS) space are looking for a Deployed Engineer to join their expanding team in London (hybrid working - 2-3 days a week in their modern office space). Their product is a platform that acts as a digital twin of a business - integrating internal and external data from a variety of sources to act as a single source of truth, which powers actionable insights at scale. When combined with AI algorithms, the platform drives strategic decision-making, and enables planning and effective execution, allowing businesses to achieve their targeted state. They are a true pioneer in their field! They believe the future of B2B SaaS is about delivering tailored, dynamic solutions for their clients, rather than implementing static tools. This is where you come in - you'll be working within a team who believe value is created not just in the codebase, but in the implementation layer - making this role ideal for someone who thrives in dynamic, customer-facing environments. The role: Adapt and deploy a powerful data platform to solve complex business problems Design scalable generative AI workflows using modern platforms like Palantir AIP Execute advanced data integration using PySpark and distributed technologies Collaborate directly with clients to understand priorities and deliver outcomes What We're Looking For: Strong skills in PySpark, Python, and SQL Ability to translate ambiguous requirements into clean, maintainable pipelines Quick learner with a passion for new technologies Experience in startups or top-tier consultancies is a plus Nice to Have (not essential): Familiarity with dashboarding tools, TypeScript, and API development Exposure to Airflow, dbt, Databricks Experience with ERP (e.g. SAP, Oracle) and CRM systems What's On Offer: Salary: £50,000-£75,000 + share options Hybrid working: 2-3 days per week in a vibrant Soho office A highly social culture with regular team events and activities Work alongside seasoned tech and business leaders Be part of a mission-driven company with a strong social impact ethos If you're excited by the idea of working at the intersection of AI, data, and enterprise transformation - and want to be part of a fast-scaling, values-led team - we'd love to hear from you. Please Note: This is a role for UK residents only. This role does not offer Sponsorship. You must have the right to work in the UK with no restrictions. Some of our roles may be subject to successful background checks including a DBS and Credit Check. Tenth Revolution Group / Nigel Frank are the go-to recruiter for Data and AI roles in the UK, offering more opportunities across the country than any other. We're the proud sponsor and supporter of SQLBits, and the London Power BI User Group. To find out more and speak confidentially about your job search or hiring needs, please contact me directly.
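As an illustration of the PySpark data-integration work mentioned above, the sketch below joins a hypothetical internal dataset with an external feed into one conformed view. Paths, schemas and join keys are invented for the example, and a deployed solution would of course follow whatever platform conventions the client uses.

```python
# Illustrative integration sketch; paths, schemas and join keys are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("integration-sketch").getOrCreate()

internal = spark.read.parquet("s3://example-bucket/internal/accounts/")     # hypothetical path
external = spark.read.option("header", True).csv("s3://example-bucket/external/firmographics.csv")

conformed = (
    internal.alias("i")
    .join(external.alias("e"), F.col("i.company_number") == F.col("e.company_number"), "left")
    .select(
        "i.account_id",
        "i.company_number",
        F.coalesce("e.company_name", "i.company_name").alias("company_name"),
        F.col("e.employee_count").cast("int").alias("employee_count"),
    )
    .dropDuplicates(["account_id"])
)

conformed.write.mode("overwrite").parquet("s3://example-bucket/curated/accounts/")
```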
17/10/2025
Seasonal
Position: Senior Data Engineer Hybrid - Birmingham 6 months - Outside IR35 Overview: Join a leading UK company as a Senior Data Engineer and play a key role in a major data transformation project. You will have the opportunity to design and deliver a new Azure-based data platform, modernising the organisation's data management and reporting processes. This hands-on role offers architectural influence and is ideal for an experienced engineer with a strong background in setting up new environments, creating data pipelines, and enabling self-service analytics through Power BI. Key Responsibilities: Design, build, and maintain Azure data pipelines using Azure Data Factory, Synapse, or Fabric. Implement a data lakehouse architecture (Bronze/Silver/Gold) and establish best-practice ETL/ELT frameworks. Ingest and integrate data from multiple core systems, including ERP, finance, supply chain, and CRM platforms. Develop and optimise SQL data models and support the creation of Power BI-ready datasets. Apply and document data governance, quality, and validation rules within the platform. Collaborate with Finance and IT stakeholders to translate reporting needs into technical solutions. Monitor, troubleshoot, and optimise data pipelines for performance and cost efficiency. Define reusable components, standards, and documentation to support long-term scalability. Essential Skills & Experience: Proven experience building Azure data platforms end-to-end (Data Factory, Synapse, Fabric, or Databricks). Strong SQL development and data modelling capability. Experience integrating ERP or legacy systems into cloud data platforms. Proficiency in Python or PySpark for transformation and automation. Understanding of data governance, access control, and security within Azure. Hands-on experience preparing data for Power BI or other analytics tools. Excellent communication skills - able to bridge technical and non-technical stakeholders. Strong documentation habits and attention to detail. Desirable Skills & Experience: Experience with AS400, Tagetik, or similar finance systems. Familiarity with Power BI Premium, RLS, and workspace governance. Knowledge of Azure DevOps and CI/CD for data pipelines. Exposure to data quality tools or frameworks.
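To illustrate the data governance, quality, and validation responsibility, one lightweight approach is to express each rule as a predicate and count violating rows before promoting data from the Silver to the Gold layer. The table name, rules and threshold below are hypothetical, sketched only to show the pattern.

```python
# Illustrative validation sketch; table name and rules are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
silver = spark.read.table("silver.sales_orders")

rules = {
    "order_id_not_null": F.col("order_id").isNotNull(),
    "amount_non_negative": F.col("amount") >= 0,
    "currency_is_gbp": F.col("currency") == "GBP",
}

total = silver.count()
for name, predicate in rules.items():
    passed = silver.filter(predicate).count()
    print(f"{name}: {total - passed} failing rows of {total}")
    # A real pipeline would log these metrics and fail the run above an agreed threshold.
```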