Sorry, that job is no longer available. Here are some results that may be similar to the job you were looking for.

115 jobs found

Current search: databricks data engineer
Michael Page
Senior Azure Data Engineer - London
We are seeking a Senior Azure Data Engineer to join the analytics department within the financial services industry. This role focuses on designing, implementing, and maintaining data solutions using Azure technologies to support business decision-making and insights.

Client Details
Our client is a large organisation within the financial services industry, dedicated to providing innovative solutions and leveraging technology to drive business success. They are known for their commitment to excellence and their focus on delivering impactful data-driven strategies.

Description
  • Design and develop data pipelines and workflows using Azure technologies.
  • Implement and manage data storage solutions, ensuring optimal performance and security.
  • Collaborate with analytics teams to understand data requirements and deliver solutions accordingly.
  • Monitor and maintain the performance of Azure-based data systems.
  • Ensure data integrity and accuracy across all platforms.
  • Provide technical expertise on Azure data engineering best practices.
  • Optimise data processes for efficiency and scalability.
  • Troubleshoot and resolve data-related issues promptly.

Profile
A successful Azure Data Engineer should have:
  • Proven experience in data engineering within the financial services industry.
  • Strong expertise in Azure data technologies, including Data Factory, Databricks, and Synapse Analytics.
  • Proficiency in SQL and other data query languages.
  • Knowledge of data modelling, ETL processes, and data warehousing concepts.
  • Experience in working with large datasets and ensuring data quality.
  • Excellent problem-solving and analytical skills.
  • A degree in Computer Science, Data Science, or a related field.

Job Offer
  • Competitive salary ranging from £80,000 to £95,000 per annum.
  • Comprehensive benefits package.
  • Opportunities for professional growth within the financial services industry.
  • A supportive and innovative work environment.

If you are an experienced Senior Azure Data Engineer looking for your next permanent opportunity, we encourage you to apply and take the next step in your career!
11/03/2026
Full time
Synapri
Lead Platform Engineer
Lead Data Platform Engineer - Databricks - IaC - Terraform - Azure Data Factory - Data Lakehouse

The Data Platform Engineer designs, develops, automates, and maintains secure, scalable, and compliant data platforms that enable the firm to efficiently manage, analyse, and utilise data. The role ensures that data solutions are robust and reliable while meeting regulatory obligations and safeguarding client confidentiality.

Key Responsibilities
  • Design and architect scalable, secure, and compliant data platforms and solutions, producing technical documentation and securing approvals through governance bodies such as Architecture Review Boards.
  • Build and deliver robust data solutions using Databricks, PySpark, Spark SQL, Azure Data Factory, and Azure services.
  • Develop APIs and write efficient Python, PySpark, and SQL code to support data integration, processing, and automation.
  • Implement and manage CI/CD pipelines and automated deployments using Azure DevOps to enable reliable releases across environments.
  • Develop and maintain infrastructure-as-code (e.g. Terraform, ARM) to provision and manage cloud resources, including ADF pipelines, Databricks assets, and Unity Catalog components.
  • Monitor, troubleshoot, and optimise data platform performance, reliability, and costs, identifying bottlenecks and recommending improvements.
  • Create dashboards and observability tools to report on platform performance, usage, incidents, and operational KPIs.

Knowledge, Skills & Experience
  • Degree in Computer Science, Data Engineering, or a related field.
  • Proven experience designing and building cloud-based data platforms, ideally within Azure.
  • Strong hands-on expertise with Databricks, PySpark, Spark SQL, and Azure Data Factory.
  • Solid understanding of Data Lakehouse architecture and modern data platform design.
  • Proficiency in Python for data engineering, automation, and data processing.
  • Experience developing and integrating REST APIs for data services.
  • Strong DevOps experience, including CI/CD, automated testing, and release management for data platforms.
  • Experience with Infrastructure as Code tools such as Terraform or ARM templates.
  • Knowledge of data modelling, ETL/ELT pipelines, and data warehousing concepts.
  • Familiarity with monitoring, logging, and alerting tools (e.g. Azure Monitor).

Desirable
  • Experience with additional Azure services (e.g. Fabric, Azure Functions, Logic Apps).
  • Knowledge of cloud cost optimisation for data platforms.
  • Understanding of data governance and regulatory compliance (e.g. GDPR).
  • Experience working in regulated or professional services environments.
11/03/2026
Full time
VC Talent
Senior Data Engineer
A successful FTSE-listed organisation with a friendly, close-knit culture is looking for a Senior Data Engineer to help shape and deliver its evolving data platform. This is a hands-on, high-impact role in a smaller organisation where you will act as both a senior engineer and strategic data partner, designing solutions that support both operational and long-term business goals.

You will play an important role in the company's migration to Microsoft Fabric, while continuing to build and optimise its existing SQL Server-based data warehouse environment. While primarily a backend engineering role, you will also support reporting via SSRS/Microsoft Power BI, with an increasing focus on AI-driven capabilities over time. As the Senior Data Engineer, you will work closely with the Solutions Architect and collaborate with teams across the business, with occasional travel to France and Germany. This opportunity suits someone with in-depth SQL experience who is looking to work with modern tools like Microsoft Fabric.

Essential Skills
  • Strong SQL Server experience (T-SQL, SSIS, SSRS, stored procedures, functions, triggers)
  • Data warehouse architecture, build and maintenance
  • Excellent communication skills and the ability to gather requirements from non-technical stakeholders at all levels
  • Degree in Computer Science or a STEM subject
  • Comfortable working 4 days per week onsite in Vauxhall

Desirable
  • Microsoft Fabric, Azure Synapse, Azure Data Factory, Snowflake, Databricks
  • Python or any other relevant language
  • Experience with tools such as Power BI, Qlik or Tableau

The role offers a package of £64k+, plus a bonus of up to 15%, along with a generous pension, 28 days' holiday, private medical insurance, permanent health insurance, study support, and a modern office with an onsite gym. If your experience aligns, apply with an up-to-date CV as soon as possible, as this is expected to be a popular opportunity.
11/03/2026
Full time
Peregrine
Data Engineer
We are Data Services. Our mission is to unlock the value of data by delivering high-quality, reliable, and secure data services that are accessible, understandable, and actionable. We continuously evolve our offerings, leveraging modern cloud-based technologies and fostering strong partnerships to help our colleagues in the Bank navigate the complexities of a data-driven world and achieve their strategic objectives.

Active SC Clearance required.

Job Description
The world of data in central banking is evolving rapidly. With the rise of detailed data collection in financial regulation and the swift advancements in cloud-native data technologies, the demand for visionary data engineers is growing. We're seeking a senior Data Engineer to join our Data Engineering team and play a pivotal role in shaping the Bank's strategic cloud-first data platform. As a senior member of the team, you will play a key role in designing and delivering robust, scalable data solutions that support the Bank's core responsibilities around monetary policy, financial stability, and regulatory supervision. You'll contribute to technical design decisions, mentor engineers, and collaborate across teams to ensure our data infrastructure continues to evolve and meet future demands.

Role Responsibilities
  • Lead the design, development, and deployment of scalable, secure, and cost-effective distributed data solutions using Azure services (e.g. Azure Databricks, Azure Data Lake Storage, Azure Data Factory).
  • Architect and implement advanced data pipelines using Databricks, Delta Lake, Python and Spark, ensuring performance, reliability, and maintainability across cloud and on-prem environments.
  • Champion data quality, governance, and observability, ensuring data is accurate, timely, and fit for purpose for analytics, BI, and operational use cases.
  • Drive the modernisation of legacy systems, leading the migration of data infrastructure to Azure with minimal disruption and long-term scalability.
  • Act as a technical authority on Azure-native data engineering, guiding best practices and setting standards across the team.
  • Mentor and coach junior and mid-level engineers, fostering a culture of continuous learning, innovation, and technical excellence.
  • Collaborate with architects, analysts, and stakeholders to align data engineering efforts with strategic business goals and enterprise data strategy.
  • Evaluate and introduce emerging technologies, tools, and methodologies to enhance the Bank's data capabilities.
  • Own the end-to-end delivery of complex data solutions, from requirements gathering to production deployment and support.
  • Contribute to the development of reusable frameworks, templates, and patterns to accelerate delivery and ensure consistency across projects.

Minimum Criteria
  • Extensive experience with Azure services, including Azure Databricks, Azure Data Lake Storage, and Azure Data Factory.
  • Advanced proficiency in SQL, Python, and Spark (PySpark), with a strong focus on performance optimisation and distributed processing.
  • Proven experience in CI/CD practices using industry-standard tools (e.g. GitHub Actions, Azure DevOps).
  • Strong understanding of data architecture principles and cloud-native design patterns.

Essential Criteria
  • Demonstrated ability to lead technical delivery, mentor engineering teams, and collaborate with stakeholders to ensure alignment between data solutions and business strategy.
  • Proficiency in Linux/Unix environments and shell scripting.
  • Deep understanding of source control, testing strategies, and agile development practices.
  • Self-motivated with a strategic mindset and a passion for driving innovation in data engineering.

Desirable Criteria
  • Experience delivering data pipelines on Hortonworks/Cloudera on-prem and leading cloud migration initiatives.
  • Familiarity with Apache Airflow, data modelling, and metadata management.
  • Experience influencing enterprise data strategy and contributing to architectural governance.
10/03/2026
Full time
Meritus
Data Engineer
Contract Data Engineer - Azure / Databricks
Location: London (2 days onsite)
Rate: £550-£600 per day (Inside IR35)
Contract: 6 months

A leading UK financial institution is seeking an experienced Data Engineer to support the development and enhancement of a modern cloud-based data platform. This role will focus on building scalable data pipelines and supporting the evolution of a cloud-first data architecture.

Key Responsibilities
  • Design and develop scalable data pipelines using modern cloud technologies.
  • Build and optimise distributed data processing solutions using Databricks, Spark and Python.
  • Develop and maintain data integration workflows using Azure Data Factory.
  • Work with large datasets stored in Azure Data Lake environments.
  • Collaborate with architects, analysts and engineering teams to deliver reliable and secure data solutions.
  • Contribute to improving data quality, performance and operational monitoring across the platform.

Key Skills & Experience
  • Strong experience with Azure Databricks, Azure Data Factory and Azure Data Lake.
  • Advanced Python, SQL and Spark (PySpark) development experience.
  • Experience building and optimising ETL/data pipelines in cloud environments.
  • Knowledge of CI/CD and version control (Azure DevOps, GitHub or similar).
  • Experience working with large-scale distributed data processing systems.

Contract Details
  • 6-month initial contract
  • £550-£600 per day (Inside IR35)
  • Hybrid working: 2 days per week onsite in London

If you're an experienced Data Engineer with strong Azure and Databricks expertise and are available for a new contract, please apply or get in touch to discuss further.
10/03/2026
Contractor
Data Modeller
Robert Walters Manchester, Lancashire
Location: Manchester
Contract: Consultant
Work Setup: Hybrid - 2 days onsite (moving to 3 days in September)

Who We Are
We are a consultancy operating within Robert Walters, the world's most trusted talent solutions business. Across the globe, we deliver recruitment, outsourcing, and talent advisory services for businesses of all sizes, opening doors for people with diverse skills, ambitions, and backgrounds.

The Role
We have an exciting new opportunity for a Data Modeller to join Robert Walters as a Consultant. As a consultant, you will benefit from permanent employment with Robert Walters and will be deployed on an assignment within our clients' organisations. In return, we will provide you with the opportunity to develop your skills with ongoing training and professional support. This role offers an exciting opportunity to join a global business, providing top-tier service to our blue-chip clients.

What you'll do
  • Design, build and maintain scalable data pipelines and models in Databricks using Python to deliver reliable datasets for reporting and key business metrics.
  • Develop efficient, well-structured code while adhering to technical standards, reconciliation checks, and version control practices using Git and DevOps tools.
  • Partner with visualisation analysts to ensure data models are structured effectively for dashboards, reporting and insight generation.
  • Work within Agile delivery teams to scope work, contribute to sprint planning and deliver outputs within agreed timelines.
  • Engage with stakeholders to clarify requirements, provide progress updates and communicate technical concepts clearly to non-technical audiences.
  • Continuously develop knowledge of insurance data and emerging analytics technologies to improve data solutions and support business decision-making.

What you bring
  • Strong hands-on experience with Databricks, Python and Power BI, with the ability to contribute quickly in an established environment.
  • Background as a Data Modeller, Analytics Engineer, or Data Analyst with strong modelling experience.
  • Experience designing scalable data models and pipelines within cloud-based data platforms.
  • Proficiency with Git version control and development best practices.
  • Strong analytical mindset with the ability to interpret complex datasets and produce actionable insight.
  • Insurance or financial services experience preferred, with an understanding of business reporting and operational data.

What's Next?
If you are ready to take the next step, apply now. Successful applicants will be contacted directly by a recruiter to discuss the role further. We are committed to creating an inclusive recruitment experience: if you require support or adjustments to the recruitment process, our Adjustment Concierge Service is here to help, so please contact us at (see below) to discuss how we can support you. This position is being recruited on behalf of our client through our Outsourcing service line. Resource Solutions Limited, trading as Robert Walters, acts as an employment business and agency, partnering with top organisations to help them find the best talent. We welcome applications from all candidates and are committed to providing equal opportunities.
10/03/2026
Full time
Searchability NS&D
Data Scientist / AI Engineer
Searchability NS&D Cheltenham, Gloucestershire
Must have active enhanced DV (West) Clearance
Junior to Lead levels available
£45k to £95k DoE plus 15% clearance bonus
Must be willing to be full-time on-site in Cheltenham
Skills required in machine learning, GenAI, NLP, Customer Engagement/Consultancy

Who are we?
We are recruiting Junior, Senior and Lead Data Scientists with AI specialism and enhanced DV Clearance for a prestigious client to work on a portfolio of public and private sector projects. Our client is a global leader in technology, consulting, and engineering services, at the forefront of innovation to evolve the world of digital, cloud, and platforms. You'll experience excellent career progression opportunities to develop your skillset and personal profile in an inclusive culture.

What will the Data Scientist be doing?
Our client is seeking individuals with strong technical expertise in machine learning, GenAI, computer vision, and data science, alongside solid skills in solution architecture and software engineering to design and scale impactful solutions. This role involves working closely with clients to identify challenges, define solutions, communicate their value clearly, and lead teams to successful delivery. There are also opportunities to publish whitepapers and represent the organisation at conferences, all within an inclusive and diverse working environment.

Key Skills and Requirements:
- Proficient in AI techniques including machine learning, GenAI, NLP, deep learning, graph analytics, and time series analysis.
- Strong communicator with the ability to simplify complex concepts, manage stakeholders, and motivate and lead Agile teams to deliver robust outcomes.
- Experienced in securing work through RFI/RFPs, bids, and presentations across public and private sectors.
- Skilled in data science platforms (e.g. Databricks, AzureML) and cloud services (AWS, Azure, GCP), with knowledge of tools like Terraform.
- Experienced in deploying solutions using Docker, Kubernetes, and CI/CD tools.
To be Considered: Please either apply by clicking online or emailing me directly at . For further information please call me on or - I can make myself available outside of normal working hours to suit from 7 am until 10 pm. If unavailable, please leave a message and either myself or one of my colleagues will respond. By applying for this role, you give express consent for us to process & submit (subject to required skills) your application to our client in conjunction with this vacancy only. Also feel free to connect with me on LinkedIn, just search for Henry Clay-Davies. I look forward to hearing from you. KEY SKILLS: Data Science / Data Scientist / AI Engineer / AI / Machine Learning / ML / NLP / GenAI / Stakeholder Engagement / Customer Engagement / AWS / Azure / Cloud / Docker / Kubernetes / CI/CD / Deep Learning
10/03/2026
Full time
BMR Associates
DevOps Engineer - Azure, Permanent, Midlands
BMR Associates City, Birmingham
Azure, PaaS, App Service, Azure SQL Database, IaC, Terraform, Application Gateway, API Management

Due to continued growth, this leading-edge organisation is looking to the market for an Azure specialist to join its expanding infrastructure team. On this occasion they are actively seeking a DevOps Engineer who is ready to hit the ground running and has a proven track record of implementing Azure infrastructure and platform solutions from the ground up. This position is hybrid, and you will be required to work in the West Midlands offices 2 days per week.

To be considered you must have a strong Azure infrastructure and platform background and come with 3 years' demonstrable hands-on Azure DevOps engineering experience, specifically App Service, API Management, Azure SQL Database, Databricks, Storage Accounts and Service Bus, as well as strong knowledge of Infrastructure as Code using Terraform and scripting.

Working closely with key stakeholders, you will be responsible for translating the solution design to create, implement and maintain the Azure infrastructure and network solution, deploy new features and continuously improve performance and automation. At this level you will be expected to demonstrate superb communication skills, both verbal and written, along with the ability to work collaboratively in agile, cross-functional teams and develop excellent working relationships with key stakeholders at all levels.
10/03/2026
Full time
Hays Technology
Platform Engineer - Active SC, Databricks, Trivy, Azure DevOps
Hays Technology
Platform Engineer - Active SC, Databricks, Trivy, Azure DevOps
Up to 510 per day - Inside IR35
Remote
6 months

My client is an instantly recognisable consultancy that urgently requires a Platform Engineer with active SC clearance for an end client within the public sector.

Key Requirements:
- Proven commercial experience working as a Platform/DevOps Engineer within the public sector.
- Active SC Clearance.
- Strong commercial experience with Terraform for IaC, and with Databricks.
- Proven track record configuring and managing Azure DevOps CI/CD pipelines.
- Deep understanding of Azure cloud services and components.
- Practical experience with Docker containerisation.
- Knowledge of security scanning tooling (Trivy or similar).
- Scripting proficiency in Bash (Python is desirable).
- Solid understanding of Git-based version control, specifically within Azure DevOps.

Nice to have: Immediate availability.

Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers. By applying for this job you accept the T&Cs, Privacy Policy and Disclaimers, which can be found at (url removed)
10/03/2026
Contractor
Datatech
Senior Data Engineer - (Python & SQL)
Datatech
Senior Data Engineer (Python & SQL)
Location: London, with hybrid working Monday to Wednesday in the office
Salary: 70,000 to 85,000 depending on experience
Reference: J13026

An AI-first SaaS business that transforms high-quality first-party data into trusted, decision-ready insight at scale is looking for a Senior Data Engineer to join its growing data and engineering team. This role sits at the core of data engineering. You will work with data that is often imperfect and transform it into well-structured, reliable datasets that other teams can depend on. The focus is on engineering high-quality data foundations rather than analytics or cloud infrastructure alone. You will design and build clear, maintainable data pipelines using Python and SQL within a modern data and AI platform, with a strong focus on data quality, robustness, and long-term reliability. You will also play an important mentoring role within the team, supporting and guiding other data engineers and helping to raise engineering standards through thoughtful, hands-on leadership.

Why join
- A supportive and inclusive environment where different perspectives are welcomed and people are encouraged to contribute and be heard
- Clear progression with space to deepen your technical expertise and grow your confidence at a sustainable pace
- A team that values collaboration, good communication, and shared ownership over hero culture
- The opportunity to work on meaningful data engineering problems where quality genuinely matters

What you will be doing
- Designing and building cloud-based data and machine learning pipelines that prepare data for analytics, AI, and product use
- Writing clear, well-structured Python, PySpark, and SQL to transform and validate data from multiple upstream sources
- Taking ownership of data quality, consistency, and reliability across the pipeline lifecycle
- Shaping scalable data models that support a wide range of downstream use cases
- Working closely with Product, Engineering, and Data Science teams to understand data needs and constraints
- Mentoring and supporting other data engineers, sharing knowledge and encouraging good engineering practices
- Contributing to the long-term health of the data platform through thoughtful design and continuous improvement

What we are looking for
- Strong experience using Python and SQL to transform large, real-world datasets in production environments
- A deep understanding of data structures, data quality challenges, and how to design reliable transformation logic
- Experience working with modern data platforms such as Azure, GCP, AWS, Databricks, Snowflake, or similar
- Confidence working with imperfect data and making it fit for consumption downstream
- Experience supporting or mentoring other engineers through code reviews, pairing, or informal guidance
- Clear, thoughtful communication and a collaborative mindset

You do not need to meet every requirement listed. What matters most is strong, hands-on experience using Python and SQL to work confidently with complex, real-world data, apply sound engineering judgement, and help others grow through your experience. Right to work in the UK is required. Sponsorship is not available now or in the future.

Apply to find out more about the role. If you have a friend or colleague who may be interested, referrals are welcome. For each successful placement, you will be eligible for our general gift or voucher scheme. Datatech is one of the UK's leading recruitment agencies specialising in analytics and is the host of the critically acclaimed Women in Data event. For more information, visit (url removed)
09/03/2026
Full time
TRIA
Senior Data Engineer
TRIA
Senior Data Engineer - Sponsorship Offered
65,000 - 72,000 + 20% Bonus + Excellent Benefits
London, 2-3 days on-site

Our client is a leading global hospitality brand undergoing an exciting period of rapid growth and transformation. With significant investment in data and technology, they are building a world-class data platform to power decision-making across every area of the business - from supply chain and logistics to marketing, customer sales and in-store operations.

We are seeking an experienced Senior Data Engineer with deep expertise in Databricks to design, build, and optimise the client's data platform. This role will be pivotal in developing scalable data pipelines, enabling advanced analytics, and driving data quality and governance across the organisation. You'll work closely with data scientists, analysts, and business stakeholders to transform raw data into trusted, actionable insights that power critical business decisions.

Required Qualifications
- 6+ years of experience in data engineering
- 3+ years of hands-on experience with Azure Databricks
- Strong working knowledge of Azure
- Strong knowledge of data modelling, ETL/ELT design, and data lakehouse concepts

To apply for this role please email across your CV ASAP.
09/03/2026
Full time
IntaPeople
Data Architect
IntaPeople Nantgarw, Cardiff
Data Architect
Hybrid - RCT (South Wales)

IntaPeople are proud and excited to be appointed to recruit an experienced Data Architect for a Welsh-based not-for-profit sector client on an exclusive growth project. This is a very exciting opportunity to join their fast-growing data function in this newly created position. You will be joining the data team as one of the first handful of team members in this area of the business, which will work with external partners to build out the organisation's data capability offering.

As a Data Architect, you will be responsible for designing, building, and maintaining robust, scalable, and secure data pipelines and platforms that enable the organisation to make data-driven decisions at an enterprise level. Working closely with the Head of Data Engineering, you will help grow this data function through the recruitment of further data engineering resources, whilst working closely with solutions architects and software engineers. You will also have the opportunity to progress into a leadership role if this suits your aspirations and capabilities. You will shape, govern and assure the organisation's data architecture, defining, designing and maintaining strategic data models, standards, flows and governance structures that support organisational goals, ensure compliance, foster collaboration across business areas, and enable the organisation to make data-driven decisions.

Essential Skills
- Proven experience as a Senior Data Engineer or Data Architect (or similar/related role).
- Experience with enterprise-level datasets.
- Expertise and practical experience in designing and aligning data models across multiple subject areas, applying recognised patterns and industry standards.
- Familiarity with structured architectural approaches found in TOGAF (data architecture) or equivalent.
- Proven experience defining and evolving data governance, including data quality, metadata, lineage, and policy assurance across services.
- Strong capability in data profiling, source system analysis and identifying links across problem domains to define common, reusable solutions.
- Experience of communicating technical information and data to a non-technical audience, and working collaboratively with analysts, architects, and product owners to deliver data solutions that meet user and organisational needs.
- Ability to lead and mentor other team members.
- Demonstrable knowledge of data modelling and data warehousing within platforms such as Azure or AWS.
- Practical experience with Microsoft Azure services, including Azure Data Lake (Gen2), Synapse, Event Hubs, and Cosmos DB, within scalable cloud-based architectures.
- Robust understanding of data governance, data quality, and metadata management.

Desirable Skills
- Experience with Azure Data Factory, Databricks, or Apache Spark, following modern ETL/ELT principles.
- Experience in using Git, Azure DevOps, or GitHub Actions for version control, CI/CD, and collaborative data delivery.
- Experience with Big Data.
- Certification in data architecture or governance frameworks (e.g. TOGAF, DAMA, DCAM, EDMC).
- Experience of using programming languages such as Python, Scala and SQL.
- Welsh language skills.

Key Responsibilities (at a glance)
- Establish data strategies and data modelling internally within the data estate.
- Lead the design and oversight of enterprise-aligned data models and supporting data architecture, ensuring that all modelling approaches follow organisational standards and recognised patterns, and enable scalable, high-quality data flows across services.
- Provide expert architectural guidance to technical teams delivering cloud-based data platforms, ensuring that data integration, modelling, metadata and design decisions align with organisational and enterprise-wide standards.
- Work closely with other business leaders to maintain governance and compliance within the data estate.
- Work closely with data analysts, data engineering, enterprise and solution architects, DevOps, and business stakeholders through regular communication and collaborative planning to ensure data solutions are closely aligned with business objectives and effectively meet user needs.
- Contribute to the development and execution of the Data Strategy by maintaining thorough documentation of data processes, architectures, and workflows, ensuring all technical and process information is systematically recorded and updated, and that data initiatives deliver business value and are aligned with broader technology and organisational goals.
- Research emerging technologies and upcoming trends.
- Provide oversight to teams building data processing pipelines and integration patterns, ensuring their artefacts are consistent with data architecture principles and metadata strategies.
- Lead the introduction of foundational data management capabilities (governance, metadata standards, and quality controls) to improve trust, accessibility, and efficiency in an organisation that currently has limited data management capability and practices.
- Design, implement, and optimise physical data models that align with the pipeline architecture, ensuring efficient query performance, scalable storage, and robust integration, and delivering adaptable, resource-efficient data processing that meets the organisation's evolving analytical and operational demands.
- Manage the aspirations of a variety of stakeholders to enable successful project delivery, reconciling priorities that may differ or even conflict to meet business and project needs.

What you'll get in return (at a glance)
- A salary of circa £62,500 - £67,500 (depending on experience)
- 28 days annual leave + public bank holidays
- Hybrid working, based in their brand new, modern offices 1-2 days per week
- A flexible working environment
- Competitive Legal and General pension scheme (8% employer contribution)
- 4x death in service
- The opportunity to work on modern, industry-changing projects
- Progression and development opportunities
- Free rail travel throughout Wales and discounted rail travel throughout the UK
- Salary sacrifice schemes such as cycle to work and electric vehicle
- A chance to truly contribute to large-scale digitalisation projects within Wales

For more information click APPLY now, or for a confidential chat call Nathan Handley on (phone number removed). This role is commutable from Swansea, Bridgend, Pontypridd, Cardiff, Newport and surrounding areas.
06/03/2026
Full time
Data Architect Hybrid RCT (South Wales) IntaPeople are proud and excited to be appointed to recruit an experienced Data Architect for a Welsh-based not-for-profit sector client on an exclusive growth project. This is a very exciting opportunity to join their fast-growing Data function in this newly created position. You will be joining the data team as one of the first handful of team members in this area of the business which will work with external partners to build out the organisations data capability offering. As a Data Architect, you will be responsible for designing, building, and maintaining robust, scalable, and secure data pipelines and platform that enable them to make data -driven decisions at a enterprise level. Working closely with the Head of Data Engineering you will help grow out this data function with the recruitment of further data engineering resources whilst working closely with solutions architects and Software Engineers. You will also get the opportunity to progress into a leadership role if this suited the individuals desires and capabilities. You will shape, govern and assure the organisation s data architecture, defining, designing and maintaining strategic data models, standards, flows and governance structures that support organisational goals, ensure compliance, foster collaboration across business areas, and enable the organisation to make data-driven decisions Essential Skills Proven experience as a Senior Data Engineer or Data Architect (or similar/related role). Experience with Enterprise level Data sets. Expertise and practical experience in designing and aligning data models across multiple subject areas, applying recognised patterns and industry standards. Familiarity with structured architectural approaches found in TOGAF (data architecture) or equivalent. Proven experience defining and evolving data governance, including data quality, metadata, lineage, and policy assurance across services. 
Strong capability in data profiling, source system analysis and identifying links across problem domains to define common, reusable solutions. Experience of communicating technical information and data to a non-technical audience, and of working collaboratively with analysts, architects, and product owners to deliver data solutions that meet user and organisational needs. Ability to lead and mentor other team members. Demonstrable knowledge of data modelling and data warehousing within platforms such as Azure or AWS. Practical experience with Microsoft Azure services, including Azure Data Lake (Gen2), Synapse, Event Hubs, and Cosmos DB, within scalable cloud-based architectures. Robust understanding of data governance, data quality, and metadata management. Desirable skills Experience with Azure Data Factory, Databricks, or Apache Spark, following modern ETL/ELT principles. Experience using Git, Azure DevOps, or GitHub Actions for version control, CI/CD, and collaborative data delivery. Experience with Big Data. Certification in data architecture or governance frameworks (e.g. TOGAF, DAMA, DCAM, EDMC). Experience with programming languages such as Python, Scala and SQL. Welsh language skills. Key Responsibilities (at a glance): Establish data strategies and data modelling internally within the data estate. Lead the design and oversight of enterprise-aligned data models and supporting data architecture, ensuring that all modelling approaches follow organisational standards and recognised patterns, and enable scalable, high-quality data flows across services. Provide expert architectural guidance to technical teams delivering cloud-based data platforms, ensuring that data integration, modelling, metadata and design decisions align with organisational and enterprise-wide standards. Work closely with other business leaders to maintain governance and compliance within the data estate. 
Work closely with data analysts, data engineers, enterprise and solution architects, DevOps, and business stakeholders through regular communication and collaborative planning to ensure data solutions are closely aligned with business objectives and effectively meet user needs. Contribute to the development and execution of the Data Strategy by maintaining thorough documentation of data processes, architectures, and workflows, ensuring all technical and process information is systematically recorded and updated, and that data initiatives deliver business value and align with broader technology and organisational goals. Research emerging technologies and upcoming trends. Provide oversight to teams building data processing pipelines and integration patterns, ensuring their artefacts are consistent with data architecture principles and metadata strategies. Lead the introduction of foundational data management capabilities to improve trust, accessibility, and efficiency in an organisation that currently lacks data management practices, including governance, metadata standards, and quality controls. Design, implement, and optimise physical data models that align with the pipeline architecture, ensuring efficient query performance, scalable storage, and robust integration, and delivering adaptable, resource-efficient data processing that meets the organisation's evolving analytical and operational demands. Managing the aspirations of a variety of stakeholders to enable successful project delivery can be challenging, especially when their priorities differ or even conflict and require reconciliation to meet business and project needs. 
What you'll get in return (at a glance) A salary of circa £62,500 - £67,500 (depending on experience) 28 days annual leave + public bank holidays Hybrid working - based in their brand new, modern offices 1-2 days per week A flexible working environment Competitive Legal and General pension scheme (8% employer contribution) 4 x death in service The opportunity to work on modern, industry-changing projects Progression and development opportunities Free rail travel throughout Wales and discounted travel throughout the UK Salary sacrifice schemes such as cycle to work and electric vehicle A chance to truly contribute to large-scale digitalisation projects within Wales For more information click APPLY now or for a confidential chat call Nathan Handley on (phone number removed). This role is commutable from Swansea, Bridgend, Pontypridd, Cardiff and Newport or surrounding areas.
Experis
Enterprise Data Manager (Head of)
Experis
Enterprise Data Manager/Head of Data Hybrid: 2-3 days per week in the office (London) Permanent Paying up to £115k + Bonus Experis are delighted to be partnering with a well-established organisation as they continue to evolve and mature their enterprise data capability during a significant phase of data platform transformation. We are supporting them in the search for an Enterprise Data Manager to take ownership of the organisation's enterprise-wide data agenda. This is a senior, strategic role focused on data leadership, governance, stakeholder alignment and platform direction, sitting above the existing Data Engineering Manager and working across multiple business functions. This role provides enterprise-level oversight and cohesion across a complex, federated data landscape. The organisation is midway through modernising its data platform and needs an experienced data leader who can bring clarity, alignment and strategic direction across teams. This is a highly visible role operating across technology and business leadership, treating data as a strategic asset and ensuring the organisation is positioned to scale its data and AI capabilities responsibly. What You'll Be Doing Leading and evolving the enterprise data strategy, turning high-level intent into a clear, actionable roadmap. Providing enterprise leadership across the data ecosystem, bringing alignment between data engineering, architecture, governance and reporting functions. Acting as the senior authority on data across the organisation, influencing senior stakeholders and shaping data-driven decision making. Providing strategic oversight of the organisation's modern data platform including Azure, Databricks, Lakehouse architecture and early exploration of Microsoft Fabric. Guiding key decisions around sources of truth, data product lifecycle management and platform operating models. 
Managing the strategic delivery partnership with Telefónica Tech, ensuring strong collaboration, knowledge transfer and service delivery. Overseeing vendor-delivered workstreams and shaping the future sourcing strategy as the internal capability evolves. Supporting the organisation's data governance and AI governance foundations, ensuring strong controls before AI capability is scaled further. Operating across a federated data landscape, aligning multiple teams and reducing duplication while strengthening the organisation's overall data ecosystem. Experience Required Proven experience operating at Head of Data / Enterprise Data Manager / senior data leadership level. Strong track record working within large, complex or regulated organisations. Deep understanding of modern data platforms, data governance and enterprise data operating models. Experience managing external vendors, managed services and strategic delivery partners. Comfortable leading through organisational change, transformation and ambiguity. Experience working with Databricks, Lakehouse architectures or Microsoft Fabric. Background in regulated or data-rich industries such as finance, insurance, legal or media. If you'd like to learn more, please contact Jacob Ferdinand at
06/03/2026
Full time
Vermelo RPO
Innovation Senior Business Analyst
Vermelo RPO
Innovation Senior Business Analyst This is a flexible, hybrid role and can be based from any of our offices in Peterborough, Manchester, Chesterfield, Stoke or Sunderland. We also have largely remote options available. Must be able to travel on an ad-hoc basis. Role Purpose The Innovation Senior Business Analyst plays a key role in the Group's Innovation function, helping to shape and deliver the Technology Innovation and GenAI roadmap. This role acts as the critical bridge between business needs and technical solutions, identifying opportunities where emerging technologies - particularly Generative AI - can drive meaningful change. Working within a multidisciplinary team, the Innovation Senior Business Analyst will lead the discovery, analysis, and validation of innovation initiatives, ensuring they are aligned with business goals and deliver measurable value. Key Accountabilities & Responsibilities Facilitate workshops, interviews, and discovery sessions to understand the business value stream, pain points and opportunities Identify and shape a pipeline of innovation initiatives that potentially reduce waste and improve processes, productivity and quality Collaborate with engineers, architects, data teams, 3rd party providers, and business SMEs to detail individual PoC requirements and success criteria that will allow us to evaluate the feasibility of prototypes Work in a tight-knit team to design, build, test, and validate prototypes, ensuring they are feasible and clearly demonstrate business value Support the rollout and scaling of successful PoCs across business functions. Help build AI literacy across the organisation, supporting adoption of new technologies and co-developing new ways of working Operate within an Agile framework, contributing to backlog management, sprint planning, and iterative delivery Co-develop and enforce AI governance policies and protocols Skills, Experience & Knowledge Experience with Lean thinking and value stream mapping. 
Significant experience as a senior-level Business Analyst, including skills in process mapping, requirements gathering and performance analysis Strong analytical and problem-solving skills, with a keen eye for detail Appreciation of value creation, commercial priorities and business case analysis Experience working in fast-paced, digital environments and Agile delivery teams. Skilled in stakeholder engagement and facilitation. Comfortable operating at all levels of the business, influencing and gaining trust Understanding of data and technology, ideally AI/ML concepts, and their business applications. Comfortable working with ambiguity and shaping early-stage ideas into tangible outcomes. Preferred Familiarity with innovation accelerators and PoC frameworks. Experience with GenAI and an understanding of its potential impact on business. Exposure to tools and platforms such as Azure, GCP, LangChain, MLflow, Databricks, Kubernetes, and CI/CD pipelines. Experience in regulated industries such as insurance or financial services. Background in digital transformation, R&D, or emerging technology teams. What we offer in return: A collaborative and fast-paced work environment Health care cash plan Yearly bonus scheme 24 days annual leave plus Bank Holidays and the ability to buy additional leave (annual leave also increases with service) Life Assurance 4x annual salary Vibrant, modern offices About the business: Markerstudy is a leading provider of private insurance in the UK, insuring around 5% of the private cars on UK roads, 20% of commercial vehicles and over 30% of motorcycles, with total premium levels of circa £1.2bn. Markerstudy also has a large and growing direct presence in the market. It has acquired and successfully integrated Co-op Insurance Services in 2021, BGLi in 2022 and Atlanta in 2024.
05/03/2026
Full time
Hunter Bond
Lead DataOps Engineer - Big Data
Hunter Bond
My leading Tech client is looking for a talented and motivated individual to ensure the resilience, performance, and cost-effectiveness of their Azure-based data platform. This role is essential to their data ecosystem, combining platform reliability, incident response, SLA management, cost optimisation (FinOps), and deployment oversight. You will be the single point of contact for operational issues, driving rapid resolution during outages, leading communications with stakeholders, and shaping the processes that keep their platform running smoothly and efficiently. This is a newly created role in a growing business. A brilliant opportunity! The following skills/experience are required: Proven operational leadership for large-scale data platforms. Expertise in incident management, SLA enforcement, and stakeholder communication. Hands-on experience with Azure Synapse, Databricks, ADF, Power BI. Familiarity with CI/CD and automation. Strong FinOps mindset and cost management experience. Knowledge of monitoring and observability frameworks. Salary: Up to £90,000 + bonus + package Level: Lead Engineer Location: London (good work from home options available) If you are interested in this Lead DataOps Engineer (Big Data) position and meet the above requirements please apply immediately.
05/03/2026
Full time
Hunter Bond
Lead Big Data Ops Engineer
Hunter Bond
My leading Tech client is looking for a talented and motivated individual to ensure the resilience, performance, and cost-effectiveness of their Azure-based data platform. This role is essential to their data ecosystem, combining platform reliability, incident response, SLA management, cost optimisation (FinOps), and deployment oversight. You will be the single point of contact for operational issues, driving rapid resolution during outages, leading communications with stakeholders, and shaping the processes that keep their platform running smoothly and efficiently. This is a newly created role in a growing business. A brilliant opportunity! The following skills/experience are required: Proven operational leadership for large-scale data platforms. Expertise in incident management, SLA enforcement, and stakeholder communication. Hands-on experience with Azure Synapse, Databricks, ADF, Power BI. Familiarity with CI/CD and automation. Strong FinOps mindset and cost management experience. Knowledge of monitoring and observability frameworks. Salary: Up to £90,000 + bonus + package Level: Lead Engineer Location: London (good work from home options available) If you are interested in this Lead Big Data Ops Engineer position and meet the above requirements please apply immediately.
05/03/2026
Full time
Nigel Wright Group
Business Intelligence Developer
Nigel Wright Group Sunderland, Tyne And Wear
The Company NW Tech are delighted to be working with a financial services company in their search for a Business Intelligence Developer. The Role Joining a newly formed team, you will take ownership of transforming ingested data from the staging layer into governed, high-performance star schema models within the reporting layer using Azure Databricks and Azure SQL. Other key responsibilities include: Data Modelling - Design and maintain star schema models, ensuring fact and dimension structures are aligned to business definitions and scalable for future growth. Reporting Development and Delivery - Contribute directly to the development and enhancement of Tableau dashboards and certified data sources. Data Governance - Play a critical role in centralising KPI logic and improving consistency across the reporting estate. Platform Management - Support the management and optimisation of Tableau data sources, including performance tuning and refresh reliability. Collaboration - Strong stakeholder engagement will be required, including the ability to explain modelling decisions to non-technical audiences. The Requirements This is a brilliant opportunity to work in a fast-paced environment which also provides clear pathways for future progression. The role provides a unique opportunity to shape the direction, foundations and long-term success of the BI function. Key requirements include: Proven experience in Business Intelligence, Analytics Engineering or other data-focused roles Strong proficiency in SQL Hands-on experience working with Python and Spark, ideally within Azure Databricks Understanding of data warehouse principles Experience within a regulated environment would be desirable
04/03/2026
Full time
Corriculo Ltd
Python Developer, NumPy, Pandas, COR7433B
Corriculo Ltd
Python Developer, NumPy, Pandas, COR7433B This is a great time for a junior-to-mid-level Python Developer to consider joining an established Oxfordshire-based scale-up in the Medtech sector. The Python Developer will be joining a dedicated R&D team that leverages machine and deep learning for the computer vision and signal processing algorithms that underpin an innovative patient-monitoring platform, deployed in hundreds of hospitals across the UK and US. Using existing Python skills to support the research, development, deployment and monitoring of deep-learning-enabled products in hospitals, the Python Developer will work with a variety of technologies including the Python scientific stack (NumPy, Pandas, SciPy, etc.), frameworks such as OpenCV, PyTorch, TensorFlow and JAX, data pipeline tools such as Airflow or Prefect and data processing tools such as Spark or Databricks. The Company The Python Developer will be joining a company whose platform has been scientifically proven to deliver safer, high-quality and efficient patient care, at an exciting time of growth and expansion into new markets. As a well-funded scale-up, they offer an entrepreneurial team spirit, where you'll be contributing directly to the team's success. Working predominantly remotely, you'd ideally be onsite a couple of times each month, although there will be other opportunities for meeting up as a team off-site. Benefits Predominantly remote working 25 days' holiday, with the ability to purchase more Private health insurance Personal Learning & Development budget Wellbeing days What's required to be successful in this role? Demonstrable programming ability in Python, preferably within a data context A STEM-related degree Some experience with the Python scientific stack (NumPy, Pandas, SciPy, etc.) is preferred Experience with Docker, Git and Linux So What's Next? 
If you are an experienced Python Developer looking for your next technical challenge and the opportunity to further develop your skills, apply now for immediate consideration! Software Developer, Python Developer, Software Engineer, Python Corriculo Ltd acts as an employment agency and an employment business.
04/03/2026
Full time
Michael Page Technology
Senior Data Engineer - Databricks
Michael Page Technology Basingstoke, Hampshire
This is an exciting opportunity for a Senior Data Engineer to own the Azure/Databricks data & AI platform end-to-end, from architecture and pipelines through governance, data quality, observability and ML enablement, so the business gets trusted, timely and cost-efficient data and models.

Client Details
The organisation is a well-established, medium-sized firm within the financial services industry, focused on providing reliable and innovative solutions to its clients.

Description
  • Develop and oversee Azure-based solutions to support analytics functions.
  • Design and evolve the Azure/Databricks modern data platform architecture.
  • Build, optimise and maintain scalable ETL/ELT pipelines and data models.
  • Implement data governance, lineage and cataloguing using Purview/Unity Catalog.
  • Establish data quality frameworks, monitoring and observability across pipelines.
  • Manage orchestration, platform operations and ITIL-aligned incident/change processes.
  • Ensure strong data security, access controls and regulatory compliance.
  • Support and enable machine learning and AI solutions within Databricks.
  • Monitor and optimise cloud and compute costs using Azure FinOps tooling.
  • Provide guidance and mentorship to the analytics team on Azure best practices.

Profile
A successful Senior Data Engineer will have deep, hands-on experience with Databricks and Azure, with the ability to own platform architecture, delivery and operations end-to-end (including DevOps, monitoring, reliability and cost control). They will be fluent across data engineering, governance, ML enablement and ITIL-aligned change and incident management.
  • Proven expertise in delivering Azure and Databricks data platform solutions.
  • Strong background in designing and optimising complex ETL/ELT pipelines.
  • Hands-on experience with data governance, lineage and cataloguing tools.
  • Proven ability to implement data quality, monitoring and observability practices.
  • Experience leading platform operations, including incident and change management.
  • Demonstrated capability in mentoring others and collaborating with diverse stakeholders.
  • Strong leadership skills with the ability to guide and develop teams.
  • Excellent analytical and problem-solving abilities.

Job Offer
  • Competitive salary up to £70,000 + bonus & benefits.
  • Standard benefits package provided.
  • Permanent position within a reputable organisation.
  • Chance to develop and lead innovative Azure-driven projects.

Take the next step in your career by applying for this Senior Data Engineer position in Basingstoke today. Join a trusted organisation and make a significant impact.
04/03/2026
Full time
The Portfolio Group
AI Platform Engineer
The Portfolio Group City, London
AI Platform Engineer
London | Excellent Salary + Benefits

Join an award-winning, internationally recognised B2B consultancy as an AI Platform Engineer, owning the cloud-native platform that underpins conversational AI and generative AI products at scale. Sitting at the core of AI delivery, you will design, build and operate the runtime, infrastructure and operational layers supporting RAG pipelines, LLM orchestration, vector search and evaluation workflows across AWS and Databricks. Working closely with senior AI engineers and product teams, you'll ensure AI systems are scalable, observable, secure and cost-efficient, turning experimental AI into reliable, production-grade capabilities. Further responsibilities are detailed below:
  • Own and evolve the AI platform powering conversational assistants and generative AI products.
  • Build, operate and optimise RAG and LLM-backed services, improving latency, reliability and cost.
  • Design and run cloud-native AI services across AWS and Databricks, including ingestion and embedding pipelines.
  • Scale and operate vector search infrastructure (Weaviate, OpenSearch, Algolia, AWS Bedrock Knowledge Bases).
  • Implement strong observability, CI/CD, security and governance across AI workloads.
  • Enable future architectures such as multi-model orchestration and agentic workflows.

Required Skills & Experience
  • Strong experience designing and operating cloud-native platforms on AWS (Lambda, API Gateway, DynamoDB, S3, CloudWatch).
  • Hands-on experience with Databricks and large-scale data or embedding pipelines.
  • Proven experience building and operating production AI systems, including RAG pipelines, LLM-backed services, and vector search (Weaviate, OpenSearch, Algolia).
  • Proficiency in Python, with experience deploying containerised services on Kubernetes using Terraform.
  • Solid understanding of distributed systems, cloud architecture and API design, with a focus on scalability and reliability.
  • Demonstrable ownership of observability, performance, cost efficiency and operational robustness in production environments.

Why Join?
You'll own the foundational AI platform behind a growing suite of generative AI products, working with senior AI leaders on systems used by real customers at scale. This role offers deep technical ownership, long-term impact and an excellent compensation package within a market-leading organisation.

INDAMS
Portfolio Payroll Ltd is acting as an Employment Agency in relation to this vacancy.
02/03/2026
Full time
© 2008-2026 IT Job Board