  • Home
  • Find IT Jobs
  • Register CV
  • Career Advice
  • Contact us
  • Employers
    • Register as Employer
    • Pricing Plans
  • Recruiting? Post a job
  • Sign in
  • Sign up
Sorry, that job is no longer available. Here are some results that may be similar to the job you were looking for.

48 jobs found

Current search: senior data engineer databricks
Michael Page
Senior Azure Data Engineer - London
Senior Azure Data Engineer
We are seeking an Azure Data Engineer to join the analytics department within the financial services industry. This role focuses on designing, implementing, and maintaining data solutions using Azure technologies to support business decision-making and insights.

Client Details
Our client is a large organisation within the financial services industry, dedicated to providing innovative solutions and leveraging technology to drive business success. They are known for their commitment to excellence and their focus on delivering impactful data-driven strategies.

Description
  • Design and develop data pipelines and workflows using Azure technologies.
  • Implement and manage data storage solutions, ensuring optimal performance and security.
  • Collaborate with analytics teams to understand data requirements and deliver solutions accordingly.
  • Monitor and maintain the performance of Azure-based data systems.
  • Ensure data integrity and accuracy across all platforms.
  • Provide technical expertise on Azure data engineering best practices.
  • Optimise data processes for efficiency and scalability.
  • Troubleshoot and resolve data-related issues promptly.

Profile
A successful Azure Data Engineer should have:
  • Proven experience in data engineering within the financial services industry.
  • Strong expertise in Azure data technologies, including Data Factory, Databricks, and Synapse Analytics.
  • Proficiency in SQL and other data query languages.
  • Knowledge of data modelling, ETL processes, and data warehousing concepts.
  • Experience in working with large datasets and ensuring data quality.
  • Excellent problem-solving and analytical skills.
  • A degree in Computer Science, Data Science, or a related field.

Job Offer
  • Competitive salary ranging from £80,000 to £95,000 per annum.
  • Comprehensive benefits package.
  • Opportunities for professional growth within the financial services industry.
  • A supportive and innovative work environment.

If you are an experienced Senior Azure Data Engineer looking for your next permanent opportunity, we encourage you to apply and take the next step in your career!
11/03/2026
Full time
VC Talent
Senior Data Engineer
Senior Data Engineer
A successful FTSE-listed organisation with a friendly, close-knit culture is looking for a Senior Data Engineer to help shape and deliver its evolving data platform. This is a hands-on, high-impact role in a smaller organisation where you will act as both a senior engineer and strategic data partner, designing solutions that support both operational and long-term business goals.

You will play an important role in the company's migration to Microsoft Fabric, while continuing to build and optimise its existing SQL Server-based data warehouse environment. While primarily a backend engineering role, you will also support reporting via SSRS/Microsoft Power BI, with an increasing focus on AI-driven capabilities over time. As the Senior Data Engineer, you will work closely with the Solutions Architect and collaborate with teams across the business, with occasional travel to France and Germany. This opportunity suits someone with in-depth SQL experience who is looking to work with modern tools like Microsoft Fabric.

Essential Skills
  • Strong SQL Server experience (T-SQL, SSIS, SSRS, stored procedures, functions, triggers)
  • Data warehouse architecture, build and maintenance
  • Excellent communication skills and the ability to gather requirements from non-technical stakeholders at all levels
  • Degree in Computer Science or a STEM subject
  • Comfortable working 4 days per week onsite in Vauxhall

Desirable
  • Microsoft Fabric, Azure Synapse, Azure Data Factory, Snowflake, Databricks
  • Python or any other relevant language
  • Experience with tools such as Power BI, Qlik or Tableau

The role offers a package of £64k+, plus a bonus of up to 15%, along with a generous pension, 28 days' holiday, private medical insurance, permanent health insurance, study support, and a modern office with an onsite gym. If your experience aligns, apply with an up-to-date CV as soon as possible, as this is expected to be a popular opportunity.
11/03/2026
Full time
Peregrine
Data Engineer
We are Data Services. Our mission is to unlock the value of data by delivering high-quality, reliable, and secure data services that are accessible, understandable, and actionable. We continuously evolve our offerings, leveraging modern cloud-based technologies and fostering strong partnerships to help our colleagues in the Bank navigate the complexities of a data-driven world and achieve their strategic objectives.

Active SC Clearance required.

Job Description:
The world of data in central banking is evolving rapidly. With the rise of detailed data collection in financial regulation and the swift advancements in cloud-native data technologies, the demand for visionary data engineers is growing. We're seeking a senior Data Engineer to join our Data Engineering team and play a pivotal role in shaping the Bank's strategic cloud-first data platform. As a senior member of the team, you will play a key role in designing and delivering robust, scalable data solutions that support the Bank's core responsibilities around monetary policy, financial stability, and regulatory supervision. You'll contribute to technical design decisions, mentor engineers, and collaborate across teams to ensure our data infrastructure continues to evolve and meet future demands.

Role Responsibilities
  • Lead the design, development, and deployment of scalable, secure, and cost-effective distributed data solutions using Azure services (e.g., Azure Databricks, Azure Data Lake Storage, Azure Data Factory).
  • Architect and implement advanced data pipelines using Databricks, Delta Lake, Python and Spark, ensuring performance, reliability, and maintainability across cloud and on-prem environments.
  • Champion data quality, governance, and observability, ensuring data is accurate, timely, and fit for purpose for analytics, BI, and operational use cases.
  • Drive the modernization of legacy systems, leading the migration of data infrastructure to Azure with minimal disruption and long-term scalability.
  • Act as a technical authority on Azure-native data engineering, guiding best practices and setting standards across the team.
  • Mentor and coach junior and mid-level engineers, fostering a culture of continuous learning, innovation, and technical excellence.
  • Collaborate with architects, analysts, and stakeholders to align data engineering efforts with strategic business goals and enterprise data strategy.
  • Evaluate and introduce emerging technologies, tools, and methodologies to enhance the Bank's data capabilities.
  • Own the end-to-end delivery of complex data solutions, from requirements gathering to production deployment and support.
  • Contribute to the development of reusable frameworks, templates, and patterns to accelerate delivery and ensure consistency across projects.

Minimum Criteria
  • Extensive experience with Azure services including Azure Databricks, Azure Data Lake Storage, and Azure Data Factory.
  • Advanced proficiency in SQL, Python, and Spark (PySpark), with a strong focus on performance optimization and distributed processing.
  • Proven experience in CI/CD practices using industry-standard tools (e.g., GitHub Actions, Azure DevOps).
  • Strong understanding of data architecture principles and cloud-native design patterns.

Essential Criteria
  • Demonstrated ability to lead technical delivery, mentor engineering teams, and collaborate with stakeholders to ensure alignment between data solutions and business strategy.
  • Proficiency in Linux/Unix environments and shell scripting.
  • Deep understanding of source control, testing strategies, and agile development practices.
  • Self-motivated with a strategic mindset and a passion for driving innovation in data engineering.

Desirable Criteria
  • Experience delivering data pipelines on Hortonworks/Cloudera on-prem and leading cloud migration initiatives.
  • Familiarity with Apache Airflow, data modelling, and metadata management.
  • Experience influencing enterprise data strategy and contributing to architectural governance.
10/03/2026
Full time
Searchability NS&D
Data Scientist / AI Engineer
Cheltenham, Gloucestershire
Must have active enhanced DV (West) Clearance
Junior to Lead levels available
£45k to £95k DoE plus 15% clearance bonus
Must be willing to be full-time on-site in Cheltenham
Skills required in machine learning, GenAI, NLP, Customer Engagement/Consultancy

Who are we?
We are recruiting Junior, Senior and Lead Data Scientists with AI specialism and enhanced DV Clearance for a prestigious client to work on a portfolio of public and private sector projects. Our client is a global leader in technology, consulting, and engineering services, at the forefront of innovation to evolve the world of digital, cloud, and platforms. You'll experience excellent career progression opportunities to develop your skillset and personal profile in an inclusive culture.

What will the Data Scientist be doing?
Our client is seeking individuals with strong technical expertise in machine learning, GenAI, computer vision, and data science, alongside solid skills in solution architecture and software engineering to design and scale impactful solutions. This role involves working closely with clients to identify challenges, define solutions, communicate their value clearly, and lead teams to successful delivery. There are also opportunities to publish whitepapers and represent the organisation at conferences, all within an inclusive and diverse working environment.

Key Skills and Requirements:
  • Proficient in AI techniques including machine learning, GenAI, NLP, deep learning, graph analytics, and time series analysis.
  • Strong communicator with the ability to simplify complex concepts, manage stakeholders, and motivate and lead Agile teams to deliver robust outcomes.
  • Experienced in securing work through RFI/RFPs, bids, and presentations across public and private sectors.
  • Skilled in data science platforms (e.g. Databricks, AzureML) and cloud services (AWS, Azure, GCP), with knowledge of tools like Terraform.
  • Experienced in deploying solutions using Docker, Kubernetes, and CI/CD tools.

To be Considered:
Please either apply by clicking online or emailing me directly at . For further information please call me on or - I can make myself available outside of normal working hours to suit, from 7 am until 10 pm. If unavailable, please leave a message and either myself or one of my colleagues will respond. By applying for this role, you give express consent for us to process and submit (subject to required skills) your application to our client in conjunction with this vacancy only. Also feel free to connect with me on LinkedIn; just search for Henry Clay-Davies. I look forward to hearing from you.

KEY SKILLS: Data Science / Data Scientist / AI Engineer / AI / Machine Learning / ML / NLP / GenAI / Stakeholder Engagement / Customer Engagement / AWS / Azure / Cloud / Docker / Kubernetes / CI/CD / Deep Learning
10/03/2026
Full time
Datatech
Senior Data Engineer - (Python & SQL)
Senior Data Engineer (Python & SQL)
Location: London, with hybrid working (Monday to Wednesday in the office)
Salary: £70,000 to £85,000 depending on experience
Reference: J13026

An AI-first SaaS business that transforms high-quality first-party data into trusted, decision-ready insight at scale is looking for a Senior Data Engineer to join its growing data and engineering team.

This role sits at the core of data engineering. You will work with data that is often imperfect and transform it into well-structured, reliable datasets that other teams can depend on. The focus is on engineering high-quality data foundations rather than analytics or cloud infrastructure alone. You will design and build clear, maintainable data pipelines using Python and SQL within a modern data and AI platform, with a strong focus on data quality, robustness, and long-term reliability. You will also play an important mentoring role within the team, supporting and guiding other data engineers and helping to raise engineering standards through thoughtful, hands-on leadership.

Why join
  • A supportive and inclusive environment where different perspectives are welcomed and people are encouraged to contribute and be heard
  • Clear progression, with space to deepen your technical expertise and grow your confidence at a sustainable pace
  • A team that values collaboration, good communication, and shared ownership over hero culture
  • The opportunity to work on meaningful data engineering problems where quality genuinely matters

What you will be doing
  • Designing and building cloud-based data and machine learning pipelines that prepare data for analytics, AI, and product use
  • Writing clear, well-structured Python, PySpark, and SQL to transform and validate data from multiple upstream sources
  • Taking ownership of data quality, consistency, and reliability across the pipeline lifecycle
  • Shaping scalable data models that support a wide range of downstream use cases
  • Working closely with Product, Engineering, and Data Science teams to understand data needs and constraints
  • Mentoring and supporting other data engineers, sharing knowledge and encouraging good engineering practices
  • Contributing to the long-term health of the data platform through thoughtful design and continuous improvement

What we are looking for
  • Strong experience using Python and SQL to transform large, real-world datasets in production environments
  • A deep understanding of data structures, data quality challenges, and how to design reliable transformation logic
  • Experience working with modern data platforms such as Azure, GCP, AWS, Databricks, Snowflake, or similar
  • Confidence working with imperfect data and making it fit for consumption downstream
  • Experience supporting or mentoring other engineers through code reviews, pairing, or informal guidance
  • Clear, thoughtful communication and a collaborative mindset

You do not need to meet every requirement listed. What matters most is strong, hands-on experience using Python and SQL to work confidently with complex, real-world data, apply sound engineering judgement, and help others grow through your experience.

Right to work in the UK is required. Sponsorship is not available now or in the future. Apply to find out more about the role. If you have a friend or colleague who may be interested, referrals are welcome. For each successful placement, you will be eligible for our general gift or voucher scheme. Datatech is one of the UK's leading recruitment agencies specialising in analytics and is the host of the critically acclaimed Women in Data event. For more information, visit (url removed)
09/03/2026
Full time
Senior Data Engineer (Python & SQL) Location London with hybrid working Monday to Wednesday in the office Salary 70,000 to 85,000 depending on experience Reference J13026 An AI first SaaS business that transforms high quality first party data into trusted, decision ready insight at scale is looking for a Senior Data Engineer to join its growing data and engineering team. This role sits at the core of data engineering. You will work with data that is often imperfect and transform it into well structured, reliable datasets that other teams can depend on. The focus is on engineering high quality data foundations rather than analytics or cloud infrastructure alone. You will design and build clear, maintainable data pipelines using Python and SQL within a modern data and AI platform, with a strong focus on data quality, robustness, and long term reliability. You will also play an important mentoring role within the team, supporting and guiding other data engineers and helping to raise engineering standards through thoughtful, hands on leadership. 
Why join A supportive and inclusive environment where different perspectives are welcomed and people are encouraged to contribute and be heard Clear progression with space to deepen your technical expertise and grow your confidence at a sustainable pace A team that values collaboration, good communication, and shared ownership over hero culture The opportunity to work on meaningful data engineering problems where quality genuinely matters What you will be doing Designing and building cloud based data and machine learning pipelines that prepare data for analytics, AI, and product use Writing clear, well-structured Python, PySpark, and SQL to transform and validate data from multiple upstream sources Taking ownership of data quality, consistency, and reliability across the pipeline lifecycle Shaping scalable data models that support a wide range of downstream use cases Working closely with Product, Engineering, and Data Science teams to understand data needs and constraints Mentoring and supporting other data engineers, sharing knowledge and encouraging good engineering practices Contributing to the long term health of the data platform through thoughtful design and continuous improvement What we are looking for Strong experience using Python and SQL to transform large, real world datasets in production environments A deep understanding of data structures, data quality challenges, and how to design reliable transformation logic Experience working with modern data platforms such as Azure, GCP, AWS, Databricks, Snowflake, or similar Confidence working with imperfect data and making it fit for consumption downstream Experience supporting or mentoring other engineers through code reviews, pairing, or informal guidance Clear, thoughtful communication and a collaborative mindset You do not need to meet every requirement listed. 
What matters most is strong, hands-on experience using Python and SQL to work confidently with complex, real-world data, apply sound engineering judgement, and help others grow through your experience. Right to work in the UK is required. Sponsorship is not available now or in the future. Apply to find out more about the role. If you have a friend or colleague who may be interested, referrals are welcome. For each successful placement, you will be eligible for our general gift or voucher scheme. Datatech is one of the UK's leading recruitment agencies specialising in analytics and is the host of the critically acclaimed Women in Data event. For more information, visit (url removed)
TRIA
Senior Data Engineer
TRIA
Senior Data Engineer - Sponsorship Offered 65,000 - 72,000 + 20% Bonus + Excellent Benefits London 2-3 days on-site Our client is a leading global hospitality brand undergoing an exciting period of rapid growth and transformation. With significant investment in data and technology, they are building a world-class data platform to power decision-making across every area of the business - from supply chain and logistics to marketing, customer sales and in-store operations. We are seeking an experienced Senior Data Engineer with deep expertise in Databricks to design, build, and optimise the client's data platform. This role will be pivotal in developing scalable data pipelines, enabling advanced analytics, and driving data quality and governance across the organisation. You'll work closely with data scientists, analysts, and business stakeholders to transform raw data into trusted, actionable insights that power critical business decisions. Required Qualifications 6+ years of experience in data engineering 3+ years of hands-on experience with Azure Databricks Strong working knowledge of Azure Strong knowledge of data modelling, ETL/ELT design, and data lakehouse concepts. To apply for this role, please email across your CV ASAP.
09/03/2026
Full time
IntaPeople
Data Architect
IntaPeople Nantgarw, Cardiff
Data Architect Hybrid RCT (South Wales) IntaPeople are proud and excited to have been appointed to recruit an experienced Data Architect for a Welsh-based not-for-profit sector client on an exclusive growth project. This is a very exciting opportunity to join their fast-growing Data function in this newly created position. You will be joining the data team as one of the first handful of team members in this area of the business, which will work with external partners to build out the organisation's data capability offering. As a Data Architect, you will be responsible for designing, building, and maintaining robust, scalable, and secure data pipelines and platforms that enable data-driven decisions at an enterprise level. Working closely with the Head of Data Engineering, you will help grow this data function through the recruitment of further data engineering resources, whilst working closely with solutions architects and software engineers. You will also get the opportunity to progress into a leadership role, should this suit your ambitions and capabilities. You will shape, govern and assure the organisation's data architecture, defining, designing and maintaining strategic data models, standards, flows and governance structures that support organisational goals, ensure compliance, foster collaboration across business areas, and enable the organisation to make data-driven decisions. Essential Skills Proven experience as a Senior Data Engineer or Data Architect (or similar/related role). Experience with enterprise-level data sets. Expertise and practical experience in designing and aligning data models across multiple subject areas, applying recognised patterns and industry standards. Familiarity with structured architectural approaches such as TOGAF (data architecture) or equivalent. Proven experience defining and evolving data governance, including data quality, metadata, lineage, and policy assurance across services. 
Strong capability in data profiling, source system analysis and identifying links across problem domains to define common, reusable solutions. Experience of communicating technical information and data to a non-technical audience, and of working collaboratively with analysts, architects, and product owners to deliver data solutions that meet user and organisational needs. Ability to lead and mentor other team members. Demonstrable knowledge of data modelling and data warehousing within platforms such as Azure or AWS. Practical experience with Microsoft Azure services, including Azure Data Lake (Gen2), Synapse, Event Hubs, and Cosmos DB, within scalable cloud-based architectures. Robust understanding of data governance, data quality, and metadata management. Desirable skills Experience with Azure Data Factory, Databricks, or Apache Spark, following modern ETL/ELT principles. Experience in using Git, Azure DevOps, or GitHub Actions for version control, CI/CD, and collaborative data delivery. Experience with Big Data. Certification in data architecture or governance frameworks (e.g., TOGAF, DAMA, DCAM, EDMC). Experience of using programming languages such as Python, Scala and SQL. Welsh language skills. Key Responsibilities (at a glance): Establish data strategies and data modelling internally within the data estate. Lead the design and oversight of enterprise-aligned data models and supporting data architecture, ensuring that all modelling approaches follow organisational standards and recognised patterns, and enable scalable, high-quality data flows across services. Provide expert architectural guidance to technical teams delivering cloud-based data platforms, ensuring that data integration, modelling, metadata and design decisions align with organisational and enterprise-wide standards. Work closely with other business leaders to maintain governance and compliance within their data estate. 
Work closely with data analysts, data engineers, enterprise and solution architects, DevOps, and business stakeholders through regular communication and collaborative planning to ensure data solutions are closely aligned with business objectives and effectively meet user needs. Contribute to the development and execution of the Data Strategy by maintaining thorough documentation of data processes, architectures, and workflows, ensuring all technical and process information is systematically recorded and updated, and that data initiatives deliver business value and are aligned with broader technology and organisational goals. Research emerging technologies and upcoming trends. Provide oversight to teams building data processing pipelines and integration patterns, ensuring their artefacts are consistent with data architecture principles and metadata strategies. Lead the introduction of foundational data management capabilities to improve trust, accessibility, and efficiency in an organisation that currently has limited data management practices, including governance, metadata standards, and quality controls. Design, implement, and optimise physical data models that align with pipeline architecture, using approaches that ensure efficient query performance, scalable storage, and robust integration, delivering adaptable and resource-efficient data processing to meet the organisation's evolving analytical and operational demands. Manage the aspirations of a variety of stakeholders to enable successful project delivery, reconciling priorities that may differ or even conflict to meet business and project needs. 
What you'll get in return (at a glance) A salary of circa £62,500 - £67,500 (depending on experience) 28 days annual leave + public bank holidays Hybrid working - based in their brand new, modern offices 1-2 days per week A flexible working environment Competitive Legal and General pension scheme (8% employer contribution) 4 x salary death in service cover The opportunity to work on modern and industry-changing projects Progression and development opportunities Free rail travel throughout Wales and discounted travel throughout the UK Salary sacrifice schemes such as cycle to work and electric vehicle A chance to truly contribute to large-scale digitalisation projects within Wales For more information click APPLY now or for a confidential chat call Nathan Handley on (phone number removed). This role is commutable from Swansea, Bridgend, Pontypridd, Cardiff, Newport and surrounding areas.
06/03/2026
Full time
Experis
Enterprise Data Manager (Head of)
Experis
Enterprise Data Manager/Head of Data Hybrid: 2-3 days per week in the office (London) Permanent Paying up to 115k + Bonus Experis are delighted to be partnering with a well-established organisation as they continue to evolve and mature their enterprise data capability during a significant phase of data platform transformation. We are supporting them in the search for an Enterprise Data Manager to take ownership of the organisation's enterprise-wide data agenda. This is a senior, strategic role focused on data leadership, governance, stakeholder alignment and platform direction, sitting above the existing Data Engineering Manager and working across multiple business functions. This role provides enterprise-level oversight and cohesion across a complex, federated data landscape. The organisation is midway through modernising its data platform and needs an experienced data leader who can bring clarity, alignment and strategic direction across teams. This is a highly visible role operating across technology and business leadership, treating data as a strategic asset and ensuring the organisation is positioned to scale its data and AI capabilities responsibly. What You'll Be Doing Leading and evolving the enterprise data strategy, turning high-level intent into a clear, actionable roadmap. Providing enterprise leadership across the data ecosystem, bringing alignment between data engineering, architecture, governance and reporting functions. Acting as the senior authority on data across the organisation, influencing senior stakeholders and shaping data-driven decision making. Providing strategic oversight of the organisation's modern data platform including Azure, Databricks, Lakehouse architecture and early exploration of Microsoft Fabric. Guiding key decisions around sources of truth, data product lifecycle management and platform operating models. 
Managing the strategic delivery partnership with Telefónica Tech, ensuring strong collaboration, knowledge transfer and service delivery. Overseeing vendor-delivered workstreams and shaping the future sourcing strategy as the internal capability evolves. Supporting the organisation's data governance and AI governance foundations, ensuring strong controls before AI capability is scaled further. Operating across a federated data landscape, aligning multiple teams and reducing duplication while strengthening the organisation's overall data ecosystem. Experience Required Proven experience operating at Head of Data / Enterprise Data Manager / senior data leadership level. Strong track record working within large, complex or regulated organisations. Deep understanding of modern data platforms, data governance and enterprise data operating models. Experience managing external vendors, managed services and strategic delivery partners. Comfortable leading through organisational change, transformation and ambiguity. Experience working with Databricks, Lakehouse architectures or Microsoft Fabric. Background in regulated or data-rich industries such as finance, insurance, legal or media. If you'd like to learn more, please contact Jacob Ferdinand at
06/03/2026
Full time
Cathcart Technology
Head of Data
Cathcart Technology Glasgow, Lanarkshire
Head of Data required to lead and evolve enterprise-wide data platforms within a global organisation in Glasgow. This is a senior role responsible for building scalable platforms, maintaining high standards of data governance and quality, leading a high-performing team, and shaping the organisation's long-term data strategy. The Organisation This is a large, global organisation where data underpins critical business functions. Over the past five years, the organisation has been developing its data capabilities and recently launched its first enterprise data platform. The next phase is to replicate these platforms across multiple business domains while enhancing governance, reliability, and value from the data estate. The firm continues to invest in cloud-based platforms, analytics, and AI, with a focus on secure, scalable, and high-quality data solutions. Senior technology leaders are trusted to guide strategy as well as deliver results operationally. The Role You will take end-to-end ownership of the enterprise data platforms, ensuring they are robust, reliable, and scalable, while driving improvements in data quality and governance. You'll manage a team of data engineers and act as the central coordinator across data architects, governance, and reporting teams, ensuring alignment and successful delivery across the business. In addition, you will contribute to developing and executing the firm's data strategy, supporting innovation and longer-term ambitions including AI and advanced analytics initiatives. What You'll Be Doing Leading the design, implementation, and optimisation of enterprise data platforms. Ensuring data governance, master data management, and quality standards are embedded across all platforms. Managing, mentoring, and developing a team of data engineers, while coordinating cross-functional teams of architects, governance, and reporting specialists. Building and maintaining scalable, reliable, and reusable data pipelines across multiple sources. Collaborating with senior stakeholders to translate business priorities into actionable data initiatives. Driving the adoption of cloud services, analytics, and AI to enhance the data estate. Managing vendors and third-party partners to ensure delivery, performance, and value. What They're Looking For Proven experience leading enterprise data platform initiatives. Strong technical expertise in data engineering, data management, and cloud platforms (e.g., Azure, MS Fabric, Databricks). Track record of delivering complex, high-value data solutions with strong governance and quality controls. Experienced people leader capable of managing teams and coordinating cross-functional stakeholders. Skilled at shaping and delivering data strategy, with a vision for AI and advanced analytics. Excellent stakeholder management and communication skills at senior and executive levels. Understanding of business functions such as finance, HR, compliance, and operational processes. The Offer A competitive salary and benefits package is on offer, alongside hybrid working (typically 2-3 days per week in their city centre office). This is a senior, high-profile leadership role with the opportunity to shape the enterprise data landscape, build a high-performing team, and drive strategic innovation across the organisation. If this sounds of interest, please apply or reach out to Murray Simpson. Cathcart Technology is acting as an Employment Agency in relation to this vacancy.
06/03/2026
Full time
Vermelo RPO
Innovation Senior Business Analyst
Vermelo RPO
Innovation Senior Business Analyst This is a flexible, hybrid role and can be based from any of our offices in Peterborough, Manchester, Chesterfield, Stoke or Sunderland. We also have largely remote options available. Must be able to travel on an ad-hoc basis. Role Purpose The Innovation Senior Business Analyst plays a key role in the Group's Innovation function, helping to shape and deliver the Technology Innovation and GenAI roadmap. This role acts as the critical bridge between business needs and technical solutions, identifying opportunities where emerging technologies - particularly Generative AI - can drive meaningful change. Working within a multidisciplinary team, the Innovation Senior Business Analyst will lead the discovery, analysis, and validation of innovation initiatives, ensuring they are aligned with business goals and deliver measurable value. Key Accountabilities & Responsibilities Facilitate workshops, interviews, and discovery sessions to understand the business value stream, pain points and opportunities Identify and shape a pipeline of innovation initiatives that potentially reduce waste and improve processes, productivity and quality Collaborate with engineers, architects, data teams, 3rd party providers, and business SMEs to detail individual PoC requirements and success criteria that will allow us to evaluate the feasibility of prototypes Work in a tight team to design, build, test, and validate prototypes, ensuring they are feasible and clearly demonstrate business value Support the rollout and scaling of successful PoCs across business functions. Help build AI literacy across the organisation, supporting adoption of new technologies and co-developing new ways of working Operate within an Agile framework, contributing to backlog management, sprint planning, and iterative delivery Co-develop and enforce AI governance policies and protocols Skills, Experience & Knowledge Experience with Lean thinking and value stream mapping. 
Significant experience as a senior-level Business Analyst, including skills in process mapping, requirements gathering and performance analysis Strong analytical and problem-solving skills, with a keen eye for detail Appreciation of value creation, commercial priorities and business case analysis Experience working in fast-paced, digital environments and Agile delivery teams. Skilled in stakeholder engagement and facilitation. Comfortable operating at all levels of the business, influencing and gaining trust Understanding of data and technology, ideally AI/ML concepts, and their business applications. Comfortable working with ambiguity and shaping early-stage ideas into tangible outcomes. Preferred Familiarity with innovation accelerators and PoC frameworks. Experience with GenAI and an understanding of its potential impact on business. Exposure to tools and platforms such as Azure, GCP, LangChain, MLflow, Databricks, Kubernetes, and CI/CD pipelines. Experience in regulated industries such as insurance or financial services. Background in digital transformation, R&D, or emerging technology teams. What we offer in return? A collaborative and fast-paced work environment Health care cash plan Yearly bonus scheme 24 days annual leave plus Bank Holidays and the ability to buy additional leave (annual leave also increases with service) Life Assurance 4x annual salary Vibrant, modern offices About the business: Markerstudy is a leading provider of private insurance in the UK, insuring around 5% of the private cars on UK roads, 20% of commercial vehicles and over 30% of motorcycles, with total premium levels of circa £1.2bn. Markerstudy also has a large and growing direct presence in the market, having acquired and successfully integrated Co-op Insurance Services in 2021, BGLi in 2022 and Atlanta in 2024.
05/03/2026
Full time
Michael Page Technology
Senior Data Engineer - Databricks
Michael Page Technology Basingstoke, Hampshire
This is an exciting opportunity for a Senior Data Engineer to own the Azure/Databricks data & AI platform end-to-end, from architecture and pipelines to governance, data quality, observability and ML enablement, so the business gets trusted, timely, and cost-efficient data and models. Client Details Senior Data Engineer The organisation is a well-established entity within the financial services industry. It operates as a medium-sized firm and focuses on providing reliable and innovative solutions to its clients. Description Senior Data Engineer Develop and oversee Azure-based solutions to support analytics functions. Design and evolve the Azure/Databricks modern data platform architecture. Build, optimise and maintain scalable ETL/ELT pipelines and data models. Implement data governance, lineage and cataloguing using Purview/Unity Catalog. Establish data quality frameworks, monitoring and observability across pipelines. Manage orchestration, platform operations and ITIL-aligned incident/change processes. Ensure strong data security, access controls and regulatory compliance. Support and enable machine learning and AI solutions within Databricks. Monitor and optimise cloud and compute costs using Azure FinOps tooling. Provide guidance and mentorship to the analytics team on Azure best practices. Profile Senior Data Engineer A successful senior data engineer will have deep, hands-on experience with Databricks and Azure, with the ability to own platform architecture, delivery and operations end-to-end (including DevOps, monitoring, reliability and cost control). They will be fluent across data engineering, governance, ML enablement and ITIL-aligned change and incident management. Proven expertise in delivering Azure and Databricks data platform solutions. Strong background in designing and optimising complex ETL/ELT pipelines. Hands-on experience with data governance, lineage and cataloguing tools. 
Proven ability to implement data quality, monitoring and observability practices. Experience leading platform operations, including incident and change management. Demonstrated capability in mentoring others and collaborating with diverse stakeholders. Strong leadership skills with the ability to guide and develop teams. Excellent analytical and problem-solving abilities. Job Offer Senior Data Engineer Competitive salary up to £70,000 + Bonus & Benefits. Standard benefits package provided. Permanent position within a reputable organisation. Chance to develop and lead innovative Azure-driven projects. Take the next step in your career by applying for this Senior Data Engineer position in Basingstoke today. Join a trusted organisation and make a significant impact.
04/03/2026
Full time
The Portfolio Group
AI Platform Engineer
The Portfolio Group City, London
AI Platform Engineer London Excellent Salary + Benefits Join an award-winning, internationally recognised B2B consultancy as an AI Platform Engineer, owning the cloud-native platform that underpins conversational AI and generative AI products at scale. Sitting at the core of AI delivery, you will design, build, and operate the runtime, infrastructure, and operational layers supporting RAG pipelines, LLM orchestration, vector search, and evaluation workflows across AWS and Databricks. Working closely with senior AI engineers and product teams, you'll ensure AI systems are scalable, observable, secure, and cost-efficient, turning experimental AI into reliable, production-grade capabilities. With further scope of responsibilities detailed below: Own and evolve the AI platform powering conversational assistants and generative AI products. Build, operate, and optimise RAG and LLM-backed services, improving latency, reliability, and cost. Design and run cloud-native AI services across AWS and Databricks, including ingestion and embedding pipelines. Scale and operate vector search infrastructure (Weaviate, OpenSearch, Algolia, AWS Bedrock Knowledge Bases). Implement strong observability, CI/CD, security, and governance across AI workloads. Enable future architectures such as multi-model orchestration and agentic workflows. Required Skills & Experience Strong experience designing and operating cloud-native platforms on AWS (Lambda, API Gateway, DynamoDB, S3, CloudWatch). Hands-on experience with Databricks and large-scale data or embedding pipelines. Proven experience building and operating production AI systems, including RAG pipelines, LLM-backed services, and vector search (Weaviate, OpenSearch, Algolia). Proficiency in Python, with experience deploying containerised services on Kubernetes using Terraform. Solid understanding of distributed systems, cloud architecture, and API design, with a focus on scalability and reliability. 
Demonstrable ownership of observability, performance, cost efficiency, and operational robustness in production environments. Why Join? You'll own the foundational AI platform behind a growing suite of generative AI products, working with senior AI leaders on systems used by real customers at scale. This role offers deep technical ownership, long-term impact, and an excellent compensation package within a market-leading organisation. INDAMS Portfolio Payroll Ltd is acting as an Employment Agency in relation to this vacancy.
02/03/2026
Full time
VIQU IT
Data and AI Product Lead
VIQU IT City, London
Data & AI Product Lead London Hybrid Permanent Up to £110,000 VIQU have partnered with a leading insurance organisation seeking a Data & AI Product Lead to shape and drive their UK&I data and AI product strategy. As a Data & AI Product Lead, you will define the vision, roadmap, and operating model for a growing data product ecosystem, working closely with underwriting, pricing, claims, risk, data science, and engineering teams to deliver scalable, high-value data and AI products aligned to commercial outcomes. This Data & AI Product Lead role offers genuine enterprise-level influence and strategic ownership across the organisation. Key Responsibilities: • Define and drive the data and AI product strategy as a Data & AI Product Lead, aligned to business and data objectives. • Develop and maintain comprehensive product roadmaps and a target data product catalogue. • Partner with underwriting, pricing, claims, risk, data science, and technology teams to identify opportunities and prioritise initiatives. • Own the end-to-end lifecycle of data and AI products from ideation through to deployment and continuous optimisation. • Establish and embed a clear data product operating model and governance framework. • Define KPIs and ROI measures to track performance, adoption, and commercial impact. • Engage senior stakeholders, translating complex data and AI concepts into clear business value. • Ensure regulatory compliance and high standards of data quality and integrity. Key Requirements: • Proven experience as a Data & AI Product Lead, Senior Data Product Manager, or similar end-to-end product ownership role. • Strong insurance domain knowledge across pricing, underwriting, claims, and risk. • Demonstrable experience defining product strategy, roadmaps, prioritisation frameworks, and value measurement. • Practical knowledge of cloud data platforms such as Azure and Databricks, with the ability to engage engineering teams credibly. 
• Experience establishing governance frameworks for data products, ensuring regulatory compliance and data integrity. • Strong executive-level stakeholder management and communication skills. • Strategic, structured thinker able to operate in ambiguity and drive clarity at pace. Data & AI Product Lead London Hybrid Permanent Up to £110,000 Apply today to speak with VIQU in confidence or contact Belle Hegarty at (url removed). Know someone exceptional for this Data & AI Product Lead position? Refer them and receive up to £1,000 if successful (terms apply). Follow VIQU IT Recruitment on LinkedIn for more exciting opportunities.
25/02/2026
Full time
Datatech
Principal Data Engineer (MS Azure)
Datatech
Principal Data Engineer (MS Azure) Location: UK Remote Salary: Up to £68,000 home-based nationally, or up to £74,000 home-based for those living within the M25, dependent on experience, plus a £600 per annum home working allowance Job Ref: J13058 We are looking for a Principal Data Engineer to shape the technical direction of data engineering across a cloud-based Enterprise Data Platform built on Microsoft Azure. This role suits someone with deep hands-on data engineering experience who can also set standards, guide teams, and influence how data solutions are designed and delivered at scale. You will play a key role in ensuring the platform is engineered to a consistently high standard and can evolve to meet future organisational needs. We value diverse perspectives and are committed to creating an environment where people with different backgrounds and experiences can do their best work. The environment The Enterprise Data Platform is built on Microsoft Azure, using Databricks, Microsoft Fabric, and Power BI to deliver trusted, governed data and analytics. 
What you will be doing Setting data engineering standards, patterns, and best practices Acting as a trusted senior technical authority across data engineering and analytics Shaping solution design and architectural decisions on Azure Ensuring data pipelines are scalable, reliable, and production ready Championing modern engineering practices including CI/CD and automation Working in a forward deployed way with delivery teams to support progress and remove blockers Managing and developing data engineers, supporting growth and high quality delivery What we are looking for Strong experience designing and building data platforms on Microsoft Azure Hands on experience with Databricks and Microsoft Fabric Experience working with analytics and reporting tools such as Power BI Experience managing and mentoring data engineers Excellent communication skills with the ability to explain complex technical ideas clearly to both technical and non-technical audiences A collaborative approach and an interest in raising engineering standards across teams If you have strong Azure based data engineering experience and want to shape how data engineering is delivered at scale, make an application today to find out more. Alternatively, you can refer a friend or colleague by taking part in our fantastic referral schemes! If you have a friend or colleague who would be interested in this role, please refer them to us. For each relevant candidate that you introduce to us (there is no limit) and we place, you will be entitled to our general gift/voucher scheme. Datatech is one of the UK's leading recruitment agencies in the field of analytics and host of the critically acclaimed event, Women in Data. For more information, visit our website: (url removed)
25/02/2026
Full time
Datatech
Principal Data Architect DV Cleared
Datatech
Principal Data Architect, Secure Government and Defence Programmes Location: UK, hybrid, client site as required Security Requirement: Developed Vetting (DV) clearance is mandatory. This role is only suitable for candidates who currently hold active DV clearance. Overview A leading UK consulting organisation delivering mission-critical digital and data transformation across defence, national security, and sensitive government environments is seeking a Principal Data Architect. This is a senior, client-facing role focused on defining and delivering secure, scalable data platforms within highly classified programmes. Due to the nature of the work and access to classified systems, only candidates with active DV clearance can be considered. Role Responsibilities Define and lead enterprise-level data architecture for complex, secure transformation programmes Architect end-to-end data platforms covering ingestion, integration, storage, governance, and analytics Design secure, resilient, and scalable architectures aligned with defence and national security requirements Translate mission and operational needs into technical data solutions Provide technical leadership across engineering teams and client stakeholders Contribute to capability growth, technical strategy, and thought leadership within secure environments Technical Environment Architectural responsibility across modern and legacy secure platforms, including: Multi-cloud platforms and secure data environments Data lakes, warehouses, and distributed data systems Data ingestion, orchestration, and integration tooling Cloud ecosystems such as AWS, Azure, and GCP Databricks, Snowflake, and similar modern data platforms Infrastructure automation, DevOps, and secure deployment patterns Data governance, metadata, and secure access frameworks Analytics, semantic layers, and enterprise reporting platforms Seniority and Leadership Expectations Operate at Principal Consultant or equivalent leadership level Own architecture 
strategy across large, complex client programmes Lead multidisciplinary teams across data, engineering, and delivery functions Provide strategic technical direction and advisory support to senior stakeholders Support business development, technical assurance, and capability development Essential Requirements Active DV clearance (mandatory, non-negotiable) Strong experience designing and delivering enterprise data architectures Experience working in defence, national security, or highly regulated government environments Deep understanding of secure data platform design and cloud architectures Strong stakeholder engagement and consulting capability Experience leading teams and delivering complex programmes
24/02/2026
Full time
EMBS Engineering
Senior Data Engineer - Azure & Snowflake
EMBS Engineering
Senior Data Engineer - Azure & Snowflake Location: Central London - 3-4 days onsite each week Salary: Negotiable + benefits We are supporting an enterprise-level client who is investing heavily in a modern cloud data platform that will sit at the centre of its data strategy. This programme will enable more advanced analytics, reporting and insight across multiple business functions. We are looking to appoint three experienced Senior Data Engineers with strong Azure and Snowflake expertise. The Role This is a senior, hands-on engineering position within a high-performing data team. You will play a key role in shaping, developing and enhancing a large-scale Azure-based data platform, ensuring it is scalable, reliable and built to enterprise standards. The position requires regular collaboration with stakeholders and an onsite presence in Central London 3-4 days per week, so this is not a fully remote role. What You Will Be Doing Building and enhancing scalable data pipelines using Azure and Snowflake Developing and improving ETL/ELT processes across batch and micro-batch workloads Working extensively with Azure Data Factory, Azure SQL, Azure Storage and Azure Functions Designing and maintaining data warehouse structures including star and snowflake schemas Applying recognised data warehousing approaches such as Kimball and Inmon Writing and optimising complex SQL queries to support analytics and reporting Ensuring strong data governance, quality, validation and reconciliation processes Partnering with BI teams to enable effective reporting solutions Contributing to architectural decisions around performance, scalability and infrastructure Identifying and resolving issues to improve platform reliability and efficiency What We Are Looking For 7+ years in software engineering or development 5+ years working within data-focused environments At least 2 years hands-on experience with Azure cloud data platforms Strong expertise across the Azure Data Platform including 
Data Factory, SQL, Storage and Functions Proven experience in SQL development and data modelling Experience building both periodic batch and micro-batch data pipelines Solid understanding of enterprise data warehouse design and loading strategies A minimum of 1 year hands-on experience with Snowflake Experience working with large-scale enterprise datasets Strong analytical mindset with a clear focus on data integrity and performance Desirable Experience Advanced Snowflake performance tuning and optimisation Python and or Databricks exposure Experience designing full end-to-end data platform architectures Background supporting enterprise BI ecosystems Familiarity with CI/CD pipelines and infrastructure-as-code practices Additional Details Visa candidates will be considered Salary is open and negotiable depending on experience Immediate requirement If you are an experienced Senior Data Engineer with strong Azure and Snowflake expertise and are comfortable with a London-based hybrid working model, we d love to hear from you.
23/02/2026
Full time
Pontoon
Senior Data Engineer
Pontoon Warwick, Warwickshire
Pontoon is an employment consultancy. We put expertise, energy, and enthusiasm into improving everyone's chance of being part of the workplace. We respect and appreciate people of all ethnicities, generations, religious beliefs, sexual orientations, gender identities, and more. We do this by showcasing their talents, skills, and unique experience in an inclusive environment that helps them thrive.

Join Our Team as a Senior Data Engineer!
Are you a passionate Data Engineer with a flair for innovation? Do you thrive in a dynamic environment where your skills can shape the future of data architecture? If so, we have the perfect opportunity for you! Our client, a leader in the Utilities sector, is seeking a Senior Data Engineer for a temporary role of 3 months.

Role: Senior Data Engineer
Duration: 3 Months (extension options)
Location: Warwick (Hybrid - 1 day on site)
Rate: £500-£550 per day (umbrella)

Role Overview:
As a Senior Data Engineer, you will play a pivotal role in enhancing the Interconnectors Data Platform (ICDP), a cloud-based data warehouse essential for commercial, financial modelling, and operational decision-making. With the platform evolving towards a modernised Medallion Architecture and Azure-native ingestion patterns, your expertise will drive architectural direction and technical leadership.

Key Responsibilities:
Data Architecture & Platform Engineering:
- Lead the design and implementation of scalable data architectures using Bronze/Silver/Gold layered models.
- Shape the platform's architectural roadmap, ensuring alignment with cutting-edge engineering practices.
- Develop secure and observable ingestion and transformation pipelines.
Pipeline Development & Operations:
- Spearhead the migration from legacy ETL tools to modern Azure-based pipelines, using Azure Functions, Azure Data Factory (ADF), and event-driven frameworks.
- Build and maintain high-performance SQL transformations, curated layers, and reusable data models.
- Embed CI/CD, testing, version control, and observability into workflows.
Data Quality & Governance:
- Ensure robust data validation, reconciliation, profiling, and auditability across platform layers.
- Collaborate with business stakeholders to guarantee analytical and operational needs are met.
Leadership:
- Mentor fellow data engineers, fostering technical growth within the ICDP team.
- Collaborate with Product teams, IT&D, and external partners to achieve high-quality outcomes.
- Serve as a technical authority on engineering approaches, patterns, and standards.

Required Skills & Experience:
Essential Technical Skills:
- Python: Strong hands-on experience in building production-grade data pipelines and orchestration.
- Advanced SQL: Expert-level skills in analytical SQL, query optimisation, and data modelling.
- Azure Cloud: Familiarity with Azure Functions, Azure Data Factory, Azure Storage, and cloud security fundamentals.
- Data Warehousing: In-depth understanding of data architecture principles and scalable enterprise data design.
- Version Control: Proficient in Git, CI/CD, automated testing, and modern engineering practices.
- Pipeline Design: Experience with API ingestion, SFTP ingestion, and resilient pipeline design.
Soft Skills:
- Exceptional problem-solving and architectural thinking abilities.
- Strong communication and stakeholder collaboration skills.
- Capability to lead and provide clarity in complex technical environments.
Desirable Experience:
- Involvement in data-platform re-architecture programmes.
- Exposure to Medallion/Lakehouse patterns or Databricks-style ecosystems.
- Experience in regulated or high-assurance data environments.

Why Join Us?
This is your chance to be part of a transformative journey in the Utilities industry! Not only will you be enhancing your skills, but you will also contribute to a vital platform that impacts decision-making at every level. If you're ready to take on this exciting challenge and make a significant impact, we want to hear from you!

Apply now and become a key player in our client's innovative team! Candidates will ideally show evidence of the above in their CV to be considered. Please be advised that if you haven't heard from us within 48 hours, your application has unfortunately not been successful on this occasion; we may, however, keep your details on file for any suitable future vacancies and contact you accordingly. We use generative AI tools to support our candidate screening process. This helps us ensure a fair, consistent, and efficient experience for all applicants. Rest assured, all final decisions are made by our hiring team, and your application will be reviewed with care and attention.
18/02/2026
Contractor
Triad
Senior Data Engineer (AWS, Airflow, Python)
Triad
Senior Data Engineer (AWS, Airflow, Python)
Based at client locations, working remotely, or based in our Godalming or Milton Keynes offices.
Salary up to £65k plus company benefits.

About Us
Triad Group Plc is an award-winning digital, data, and solutions consultancy with over 35 years' experience, primarily serving the UK public sector and central government. We deliver high-quality solutions that make a real difference to users, citizens and consumers. At Triad, collaboration thrives, knowledge is shared, and every voice matters. Our close-knit, supportive culture ensures you're valued from day one. Whether working with cutting-edge technology or shaping strategy for national-scale projects, you'll be trusted, challenged, and empowered to grow. We nurture learning through communities of practice and encourage creativity, autonomy, and innovation. If you're passionate about solving meaningful problems with smart and passionate people, Triad could be the place for you.

- Glassdoor score of 4.7
- 96% of our staff would recommend Triad to a friend
- 100% CEO approval

See for yourself some of the work that makes us all so proud:
- Helping law enforcement with secure intelligence systems that keep the UK safe
- Supporting the UK's national meteorological service in leveraging supercomputers for next-level weather forecasting
- Assisting a UK government department responsible for consumer product safety with systems to track unsafe products
- Powering systems that help the government monitor and reduce greenhouse gas emissions from commercial transport

Role Summary
Triad is seeking a Senior Data Engineer to play a key role in delivering high-quality data solutions across a range of client assignments, primarily within the UK public sector. You will design, build, and optimise cloud-based data platforms, working closely with multidisciplinary teams to understand data requirements and deliver scalable, reliable, and secure data pipelines. This role offers the opportunity to shape data architecture, influence technical decisions, and contribute to meaningful, data-driven outcomes.

Key Responsibilities
- Design, develop, and maintain scalable data pipelines to extract, transform, and load (ETL) data into cloud-based data platforms, primarily AWS.
- Create and manage data models that support efficient storage, retrieval, and analysis of data.
- Utilise AWS services such as S3, EC2, Glue, Aurora, Redshift, DynamoDB and Lambda to architect and maintain cloud data solutions.
- Maintain modular Terraform-based IaC for reliable provisioning of AWS infrastructure.
- Develop, optimise and maintain robust data pipelines using Apache Airflow.
- Implement data transformation processes using Python to clean, preprocess, and enrich data for analytical use.
- Collaborate with data analysts, data scientists, developers, and other stakeholders to understand and integrate data requirements.
- Monitor, optimise, and tune data pipelines to ensure performance, reliability, and scalability.
- Identify data quality issues and implement data validation and cleansing processes.
- Maintain clear and comprehensive documentation covering data pipelines, models, and best practices.
- Work within a continuous integration environment with automated builds, deployments, and testing.

Skills and Experience
- Strong experience designing and building data pipelines on cloud platforms, particularly AWS.
- Excellent proficiency in developing ETL processes and data transformation workflows.
- Strong SQL skills (PostgreSQL) and advanced Python coding capability (essential).
- Experience working with AWS services such as S3, EC2, Glue, Aurora, Redshift, DynamoDB and Lambda (essential).
- Understanding of Terraform codebases to create and manage AWS infrastructure.
- Experience developing, optimising, and maintaining data pipelines using Apache Airflow.
- Familiarity with distributed data processing systems such as Spark or Databricks.
- Experience working with high-performing, low-latency, or large-volume data systems.
- Ability to collaborate effectively within cross-functional, agile, delivery-focused teams.
- Experience defining data models, metadata, and data dictionaries to ensure consistency and accuracy.

Qualifications & Certifications
A degree or equivalent qualification in Computer Science, Data Science, or a related discipline (desirable). Due to the nature of this position, you must be willing and eligible to achieve a minimum of SC clearance. To be eligible, you must have been a resident in the UK for a minimum of 5 years and have the right to work in the UK.

Triad's Commitment to You
As a growing and ambitious company, Triad prioritises your development and well-being:
- Continuous Training & Development: Access to top-rated Udemy Business courses.
- Work Environment: Collaborative, creative, and free from discrimination.
- Benefits: 25 days of annual leave, plus bank holidays; matched pension contributions (5%); private healthcare with Bupa; gym membership support or Lakeshore Fitness access; Perkbox membership; cycle-to-work scheme.

What Our Colleagues Have to Say
Please see for yourself on Glassdoor and our "Day in the Life" videos at the bottom of our Careers Page.

Our Selection Process
After applying for the role, our in-house talent team will contact you to discuss Triad and the position. If shortlisted, you will be invited for:
- A technical test including numerical, logical and verbal reasoning
- A technical interview with our consultants
- A management interview to assess cultural fit
We aim to complete interviews and progress candidates to offer stage within 2-3 weeks of the initial conversation.

Other Information
If this role is of interest to you or you would like further information, please contact X and submit your application now. Triad is an equal opportunities employer and welcomes applications from all suitably qualified people regardless of sex, race, disability, age, sexual orientation, gender reassignment, religion, or belief. We are proud that our recruitment process is inclusive and accessible to disabled people who meet the minimum criteria for any role. Triad is a signatory to the Tech Talent Charter and a Disability Confident Leader.
18/02/2026
Full time
The Portfolio Group
Senior Software Delivery & Quality Manager - Generative AI
The Portfolio Group City, London
Join an award-winning, internationally recognised B2B consultancy as a Senior Delivery & Quality Manager - Generative AI, leading the structured delivery and quality governance of AI capabilities deployed in regulated Legal and HR environments. Working closely with the Director of Search & Generative AI, you will translate roadmap priorities into sequenced delivery plans, ensuring Generative AI initiatives are executed with pace, clarity, and production readiness. You'll sit at the intersection of AI engineering, platform, and product, providing disciplined execution oversight, structured QA visibility, and evidence-based release governance.

Key duties in this dynamic role will include:
- Lead end-to-end delivery execution of Generative AI initiatives from planning through release.
- Translate AI roadmap priorities into coordinated backlogs, milestones, and release plans.
- Establish structured quality dashboards covering AI output, retrieval performance, and system behaviour.
- Consolidate delivery and QA signals to support informed go/no-go decisions.
- Ensure releases meet robustness, compliance, and regulatory standards.
- Act as the connective layer between engineering execution, quality oversight, and senior stakeholders.

Required Experience
- Strong background in technical delivery leadership within engineering-led environments.
- Proven experience delivering complex digital, data, or AI-enabled systems into production.
- Experience overseeing QA, release governance, and structured performance reporting.
- Ability to interpret technical quality signals and translate them into clear delivery insight.
- Experience of BI reporting tools and dashboards; experience of Databricks BI or OpenSearch Dashboards highly desirable.
- Comfortable operating in ambiguity, managing risk, and maintaining delivery momentum.
- Exposure to regulated or high-trust domains (Legal, HR, Finance, Healthcare) strongly preferred.
- Familiarity with Generative AI, RAG, or ML systems advantageous.

Why Join?
You'll play a pivotal role in ensuring Generative AI systems are delivered coherently, responsibly, and at scale, working alongside senior AI leadership in a market-leading consultancy. This is a high-impact role combining disciplined delivery management with structured AI quality oversight, offering strong ownership, visibility, and long-term influence.

SDQM(phone number removed)AM INDAMS
Portfolio Payroll Ltd is acting as an Employment Agency in relation to this vacancy.
13/02/2026
Full time
London Borough of Barnet
Insight & Intelligence Manager (18 Months FTC)
London Borough of Barnet Barnet, London
Directorate: Strategy & Innovation
Contract Type: 18 Months Fixed Term Contract
Hours: 36
Salary: 62,766 - 69,984
Location: Colindale
Closing Date: Midnight, 9 March 2026

About Barnet Council
Barnet is a borough with much to be proud of. Our excellent schools, vibrant town centres, vast green spaces and diverse communities all help make it a great place to live and work. As a council we want to build on these strengths as we move into the future. We are growing and developing as an organisation to meet the challenges facing our borough, and we are committed to working with partner organisations and residents to make Barnet even better. As an organisation, our staff are committed to Our Values: Learning to Improve, Caring, Inclusive, Collaborative, which drive everything we do.

About the role
This is an exciting time to join Barnet as we grow our Digital, Data and Technology (DDaT) capabilities and accelerate our digital transformation journey. We're investing in smarter services, better use of data and modern technology, and you'll play a key part in shaping this future. We're looking for an exceptional Insight & Intelligence Manager to play a leading role in shaping how we use data, analytics and emerging technologies to improve outcomes for our residents. This is a key leadership role within our Insight & Intelligence Hub, responsible for driving data-informed decision-making across the organisation, developing advanced analytical solutions, and guiding a talented team of analysts and data scientists. You'll blend technical excellence with strategic insight, helping Barnet unlock the full value of its data through innovative approaches, strong stakeholder relationships, and a commitment to building organisational data literacy.

This is a hybrid role. You will be expected to attend monthly in-person team days in our Colindale office. We also come into the office to meet service stakeholders, work together on collaboration, discovery and user testing sessions, and attend department days. Please click here to download the job description for this role.

About you
You're an experienced analytics or data science professional with a track record of delivering high-impact insight. You combine technical depth with the ability to translate complex concepts into actionable recommendations for senior leaders.

You'll bring:
- Expertise in advanced analytics, data science, machine learning and data engineering.
- Experience designing secure, scalable data solutions using platforms such as Azure, Data Factory, Synapse, Databricks or Fabric.
- Strong proficiency in Python, SQL and/or R, and confidence in developing data pipelines and operationalising models.
- Excellent communication and stakeholder engagement skills, with the ability to influence decisions and tell compelling data stories.
- Experience leading, coaching or developing analysts or data scientists.
- A passion for driving data culture, improving data literacy, and championing ethical, secure use of data.

In this role, you will:
- Lead the design and delivery of sophisticated analytical, data science and AI projects.
- Build, maintain and optimise data pipelines, analytical models and data architecture.
- Develop and operationalise machine learning solutions, applying MLOps best practice.
- Produce high-quality, accessible dashboards and tools using platforms such as Power BI.
- Work closely with senior leaders to shape strategic insight, policy development and service transformation.
- Champion best practice in data quality, governance, privacy and information security.
- Develop and mentor a team of analysts and data scientists, supporting high performance and continuous learning.
- Strengthen cross-council collaboration, working with ICT, service leads, data engineers and external partners.
- Support delivery of the Council's DDaT Strategies.

If you thrive in a collaborative environment, enjoy solving complex problems, and want to make a tangible difference to public services, we'd love to hear from you.

What we offer
- 31 days annual leave, plus public and bank holidays.
- Access to the Local Government Pension Scheme, which provides a valuable guaranteed income in your retirement together with security for your dependents.
- Work-life balance options, which may include hybrid working, flexitime, job share, home working and part-time work.
- A vast range of lifestyle discounts from major retailers, supermarkets, energy suppliers and more.
- A broad range of payroll benefits including cycle to work, eye care vouchers, travel and gym membership.
- Excellent training and development opportunities.
- Employee well-being training programmes, including confidential employee assistance.

How to apply
Read the job description and person specification before clicking 'Apply' to commence the online application form. If you would like any further information about the role before applying, please contact James Rapkin, Head of Organisational Insight & Intelligence.

Barnet Council is committed to safeguarding and promoting the welfare of children, young people, and vulnerable adults, and expects all staff and volunteers to share this commitment. Barnet operates stringent safer recruitment procedures; these may include AI detection screening, biometric ID/right to work checks, qualification and registration checks, up to 6 years of Employment Data and Insights to Accelerate Screening (Konfir), up to 5 years of employment history references, DBS (Disclosure & Barring Service) checks, credit checks, social media and sanctions checks, and occupational health screening.

To deliver Barnet Council's commitment to equality of opportunity in the provision of services, all staff are expected to promote equality in the workplace and in the services the Council delivers. As such, we value diversity and welcome applications from all backgrounds. Barnet Council embraces all forms of flexible working (including part-time, compressed hours, and hybrid working) and is committed to offering employees a healthy work-life balance. Candidates are encouraged to talk about relevant requirements and preferences at interview. We can't promise to give you exactly what you want, but we do promise not to judge you for asking.

Barnet Council is a Disability Confident Committed Employer. We welcome and encourage job applications from candidates of all abilities. If you require any reasonable adjustments in the application or interview process, please contact the lead contact on this advert. We will make reasonable adjustments to make sure our disabled applicants and those with health conditions are supported throughout our recruitment process. We support the Access to Work scheme; further details are available at (url removed).

All posts with the council are subject to a probationary period of six months, during which time you will be required to demonstrate to the council's satisfaction your suitability for the position in which you will be employed. Due to the high number of applications received for some posts, we may close vacancies before the stated closing date if a sufficient number of applications are received; therefore, please apply as soon as possible. Please ensure you regularly check the email account (including junk mail folders) that you used to submit your application, as any further communication regarding your application will be sent electronically. Should you not hear from us within four working weeks of the closing date for this post, then regretfully, in this instance, you have not been shortlisted.
13/02/2026
Contractor

© 2008-2026 IT Job Board