Sorry, that job is no longer available. Here are some results that may be similar to the job you were looking for.

33 jobs found

Current search: machine learning engineer databricks
Akkodis
Lead Data Platform Specialist Remote £85k + bonus
Akkodis City, Manchester
Lead Data Platform Specialist - Up to £85k + c.15% Bonus - Fully Remote - Condensed & flexible hours available (4-day working week available)

My client, a nationwide organisation with a reputation for excellence and a supportive, inclusive culture, is seeking a Lead Data Platform Engineer to join their Data Engineering and Machine Learning team. This is a high-impact senior role, ideal for someone with deep experience in modern cloud data platforms who is looking to shape and deliver a scalable, secure, and innovative data platform. Strong experience working with Databricks is required.

You'll play a pivotal role in designing, building, and optimising their Azure-based Data Lakehouse, with a focus on Databricks, PySpark, Spark SQL, and Azure Data Factory. This isn't just about coding - you'll also provide architectural guidance, mentor engineers, and ensure solutions are scalable, secure, and aligned with business needs. Hands-on experience with CI/CD, automation, and infrastructure-as-code (Terraform, ARM templates) is essential. Experience in machine learning platforms or ML engineering is a bonus.

Key Responsibilities:
  • Build and maintain the Data Lakehouse platform in a secure Azure environment
  • Develop automation for cluster management, integration runtimes, and networking
  • Lead architectural design and ensure platform scalability, reliability, and governance
  • Write efficient, maintainable code in PySpark, Python, and SQL
  • Implement CI/CD pipelines and cloud infrastructure via Terraform/ARM
  • Collaborate with data engineers, architects, and business stakeholders
  • Mentor and coach engineers, fostering a culture of learning and excellence

Essential Skills:
  • Deep experience in Databricks, Azure Data Factory, and Lakehouse architecture
  • Strong solution architecture and data platform engineering skills
  • DevOps and automation expertise, including CI/CD, monitoring, and code quality
  • Infrastructure-as-code (Terraform or ARM templates) for cloud resource provisioning
  • Excellent communication and mentoring skills

It's not often a role comes along in a team like this, where you get the chance to flex or condense hours, receive a strong salary with a bonus, potential for growth, independence and autonomy, and a clear pathway of progression. You'll get private medical options, 25 days' holiday plus bank holidays, and the chance to buy/sell holiday. This is a rare opportunity to join a forward-thinking team at a leading organisation, working fully remotely with flexibility and excellent benefits. You'll be shaping the future of their data platform while collaborating with a talented and diverse team. To apply, please submit your CV as we are looking to move quickly on this role.

Modis International Ltd acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers in the UK. Modis Europe Ltd provides a variety of international solutions that connect clients to the best talent in the world. For all positions based in Switzerland, Modis Europe Ltd works with its licensed Swiss partner Accurity GmbH to ensure that candidate applications are handled in accordance with Swiss law. Both Modis International Ltd and Modis Europe Ltd are Equal Opportunities Employers. By applying for this role, your details will be submitted to Modis International Ltd and/or Modis Europe Ltd. Our Candidate Privacy Information Statement, which explains how we will use your information, is available on the Modis website.
03/03/2026
Full time
Amorsa Ltd.
Senior-level Lead Databricks Engineer
Amorsa Ltd.
We are seeking a senior-level Lead Databricks Engineer to drive a large-scale Azure-to-Databricks transformation within a complex public sector environment. The role combines deep technical architecture capability with strong stakeholder engagement and governance oversight. You will lead discovery, architecture design, migration planning, security hardening, and post-migration optimisation for an enterprise Azure Data Science Platform and Databricks-based MLS (Machine Learning Service) environment. Strong stakeholder management is essential; this is a transformation journey, not just a technical implementation.

Discovery & Assessment:
  • Produce a comprehensive Discovery and Assessment Report
  • Conduct Azure tenant and subscription reviews
  • Identify configuration drift, governance gaps, and security risks
  • Assess workloads, integrations, and dependencies

Architecture & Design:
  • Develop a Target Architecture Design Document (Databricks-based MLS)
  • Define landing zones, network topology, and security architecture
  • Design a governance model using Unity Catalog
  • Produce high- and low-level architecture documentation
  • Define a runtime upgrade and environment standardisation strategy

Migration Planning & Execution:
  • Deliver a Migration Plan and Runbook
  • Lead migration of: Azure SQL Databases, Storage Accounts, Virtual Machines (assessment and right-sizing), App Services and Function Apps, Azure Data Gateway/Power BI Gateway
  • Plan and execute: Databricks workspace setup, cluster and pool configuration, Unity Catalog implementation, service principal reconfiguration, app registration provisioning (AD/Entra ID)

Cloud Infrastructure & Networking - hands-on expertise required in:
  • Azure Portal environment review and governance setup
  • VNet architecture: subnets, peering, NSGs, Private Endpoints, Private DNS Zones
  • Security & compliance; monitoring, optimisation & governance

Databricks Expertise Required - deep, hands-on experience with:
  • Azure Databricks and Unity Catalog
  • Workspace, cluster, and pool management
  • Runtime upgrades and lifecycle management
  • Private Link connectivity and secure enterprise integration
  • CI/CD integration (e.g., Azure DevOps)
  • Service principal RBAC elevation via automation scripting

Azure Expertise Required - advanced experience across:
  • Microsoft Azure and Microsoft Entra ID (formerly Azure AD)
  • Azure SQL Database migration
  • App Registrations and Service Principals
  • Key Vault (IAM-based model)
  • Application Gateway & WAF
  • Azure Monitor & Log Analytics
  • Subscription- and tenant-level governance

Stakeholder Management Expectations:
  • Engage senior technical and non-technical stakeholders
  • Translate technical architecture into executive-level reporting
  • Lead workshops and architecture reviews
  • Manage transformation roadmaps
  • Support procurement, risk, and compliance conversations

Public sector engagement experience is strongly preferred. Desirable experience: AI/ML platform architecture within Databricks MLS, public sector governance frameworks, large-scale cloud modernisation programmes, enterprise data platform design, security-cleared environments. Security Clearance would be a distinct advantage. Knowledge of AI and MLS with Databricks would make you a standout candidate.
02/03/2026
Contractor
Fruition Group
Solution Architect - AI & Automation
Fruition Group
Job Title: Solutions Architect (AI and Automation)
Location: Remote (occasional travel to London required)
Salary: £75,000 - £85,000 (depending on location)

Would you like to work for an organisation that plays a pivotal role in safeguarding and improving the quality of health and social care services across England? My client plays a vital role in improving outcomes across health and social care by using data, technology, and insight to drive meaningful change. With a strong focus on innovation and digital transformation, they are investing in modern cloud platforms, artificial intelligence, and advanced analytics to become a truly intelligence-led regulator.

Solutions Architect Responsibilities:
  • Design scalable, secure, and ethical AI and data solutions using Microsoft Azure technologies.
  • Translate business requirements into end-to-end solution architecture across AI, machine learning, and data platforms.
  • Maintain architectural blueprints ensuring interoperability, resilience, governance, and regulatory compliance.
  • Lead architecture design reviews and provide technical leadership to delivery teams using Agile, DevOps, and CI/CD practices.
  • Mentor technical teams and promote responsible AI practices, including transparency, fairness, and security.

Solutions Architect Requirements:
  • Extensive experience delivering solutions on the Microsoft Azure Data Platform (Synapse Analytics, Data Lake, Databricks, Azure ML, Power BI).
  • Expertise in modern data engineering approaches such as ETL/ELT pipelines, streaming, APIs, and event-driven architecture.
  • Experience with cloud-native architectures, microservices, and SaaS integrations.
  • Knowledge of data governance, information security, and AI ethics within regulated industries.
  • Excellent stakeholder engagement and leadership skills.

We are an equal opportunities employer and welcome applications from all suitably qualified persons regardless of their race, sex, disability, religion/belief, sexual orientation or age.
02/03/2026
Full time
Data Engineer
Youngs Employment Services
Data Engineer
London + 2 or 3 days working from home
Circa £60,000 - £70,000 + Excellent Benefits Package

A fantastic opportunity is available for a Data Engineer who enjoys working in a fast-paced, collaborative, team-oriented work environment. Our client has been expanding at a remarkable pace and has transformed their technical landscape with leading-edge solutions. Having implemented a new MS Fabric-based data platform, the need is now to scale up and deliver data-driven insights and strategies right across the business globally. The Data Engineer will be joining a close-knit team that is the hub of our client's global data & analytics operation. Previous experience with MS Fabric would be beneficial but is by no means essential. Interested candidates must have experience in a similar role with MS Azure Data Platforms, Synapse, Databricks, or other cloud platforms such as AWS, GCP, Snowflake, etc.

Key Responsibilities will include:
  • Design, implement, and optimise end-to-end solutions using Fabric components:
    o Data Factory (pipelines, orchestration)
    o Data Engineering (Lakehouse, notebooks, Apache Spark)
    o Data Warehouse (SQL endpoints, schemas, MPP performance tuning)
    o Real-Time Analytics (KQL databases, event ingestion)
  • Manage and enhance OneLake architecture, Delta Lake tables, security policies, and data governance within Fabric.
  • Build scalable, reusable data assets and engineering patterns that support analytics, reporting, and machine learning workloads.
  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver effective solutions.
  • Troubleshoot and resolve data-related issues in a timely manner.

Key Experience, Skills and Knowledge:
  • Proven 2+ years' experience as a Data Engineer or in a similar role, with a strong focus on PySpark, SQL and Microsoft Azure data platforms; Power BI an advantage.
  • Proficiency in development languages suitable for intermediate-level data engineers, such as:
    o Python / PySpark: widely used for data manipulation, analysis, and scripting.
    o SQL: essential for querying and managing relational databases.
  • Understanding of D365 F&O data structures is highly desirable.
  • Strong problem-solving skills and attention to detail.
  • Excellent communication and collaboration abilities.

This is a hybrid role based in Central/West London with the flexibility to work from home 2 or 3 days per week. Salary will be dependent on experience and is expected to be in the region of £60,000 - £70,000 plus an attractive benefits package including a bonus scheme. For further information, please send your CV to Wayne Young at Young's Employment Services Ltd. YES is operating as both a Recruitment Agency and a Recruitment Business.
27/02/2026
Full time
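For readers gauging the "data manipulation, analysis, and scripting" bar this advert sets, here is a minimal sketch of the kind of data-quality gate a data engineer might script before loading rows into a warehouse table. The schema, field names, and rules are hypothetical illustrations, not taken from the advert.

```python
# Minimal data-quality gate: split incoming rows into valid and rejected
# sets before loading. All field names and rules here are hypothetical.

REQUIRED_FIELDS = {"order_id", "customer_id", "amount"}

def validate_rows(rows):
    """Return (valid, rejected) lists based on simple quality rules."""
    valid, rejected = [], []
    for row in rows:
        missing = REQUIRED_FIELDS - row.keys()
        if missing or row.get("amount") is None or row["amount"] < 0:
            rejected.append(row)
        else:
            valid.append(row)
    return valid, rejected

rows = [
    {"order_id": 1, "customer_id": "a1", "amount": 19.99},
    {"order_id": 2, "customer_id": "a2", "amount": -5.00},  # negative amount
    {"order_id": 3, "customer_id": "a3"},                   # missing amount
]
valid, rejected = validate_rows(rows)
print(len(valid), len(rejected))  # → 1 2
```

In a real Fabric or Databricks pipeline the same rule set would typically be expressed over Spark DataFrames rather than Python lists, but the shape of the check is the same.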
Cathcart Technology
Data Engineer
Cathcart Technology
I'm working with a world-class, product-led technology company in Edinburgh to help them find a Data Engineer to join their growing team (hybrid working, typically 1-2 days in the office). This is an opportunity to join a business operating at serious scale, building data systems that power products used by millions of customers. You'll be joining a high-performing data engineering team where data is central to how the organisation makes decisions. The team is responsible for building and maintaining both batch and streaming pipelines that support analytics, machine learning and key business reporting. It's a fully hands-on role in a modern, cloud-first environment, working on scalable, production-grade data solutions.

As a Data Engineer, you'll design, develop and maintain reliable data pipelines and infrastructure, with a strong focus on data quality, performance and clean engineering standards. You'll collaborate closely with analysts, data scientists and fellow engineers to deliver high-quality, consumable datasets while contributing to best practices across the wider platform. You'll be working heavily with Python, SQL and Spark, using tools such as Databricks, Airflow, dbt and Kafka within AWS. Experience with modern data stacks is important, along with a solid understanding of data warehousing concepts, ETL/ELT processing, dimensional modelling and both batch and real-time ingestion patterns. Given the scale they operate at, reliability and performance are critical. Experience building robust pipelines, working with orchestration and monitoring tools, and contributing to well-tested, scalable data solutions will be highly valuable. The organisation has grown significantly over the past few years and continues to scale, meaning there is genuine scope for progression as the data function continues to expand. Their office is based in central Edinburgh and offers a great environment for collaboration when onsite.

In return, they're offering a competitive salary and an excellent overall benefits package which includes a bonus and unlimited holidays. Hybrid working is standard (ideally 1-2 days in the office). If you're keen to join a fast-growing, data-driven organisation where you can work on systems operating at real scale, please apply or get in touch with Matthew MacAlpine at Cathcart Technology for a chat. Cathcart Technology is acting as an Employment Agency in relation to this vacancy.
25/02/2026
Full time
E-SOLUTIONS IT SERVICES UK LTD
Software Development Engineer in Test (SDET)
E-SOLUTIONS IT SERVICES UK LTD
Role: SDET (Software Development Engineer in Test)

We are looking for a software engineer with experience in a Test Engineering role and hands-on experience in Python, AWS, and testing tools such as pytest and Playwright.

Key Responsibilities
  • Design and build high-performance tools and services to validate the reliability, performance, and correctness of ML data pipelines and AI infrastructure.
  • Develop platform-level test solutions and automation frameworks using Python, Terraform, and modern cloud-native practices.
  • Contribute to the platform's CI/CD pipeline by integrating automated testing, resilience checks, and observability hooks at every stage.
  • Lead initiatives that drive testability, platform resilience, and validation-as-code across all layers of the ML platform stack.
  • Collaborate with engineering, MLOps, and infrastructure teams to embed quality engineering deeply into platform components.
  • Build reusable components that support scalability, modularity, and self-service quality tooling.
  • Mentor junior engineers and influence technical standards across the Test Engineering Program.

Required Qualifications
  • Bachelor's or master's degree in Computer Science, Engineering, or a related technical field.
  • 8+ years of hands-on software development experience, including large-scale backend systems or platform engineering.
  • Expert in Python with a strong understanding of object-oriented programming, testing frameworks, and automation libraries.
  • Experience building or validating platform infrastructure, with hands-on knowledge of CI/CD systems such as GitHub Actions, Jenkins, or similar tools.
  • Solid experience with AWS services (Lambda, S3, ECS/EKS, Step Functions, CloudWatch).
  • Proficient in Infrastructure as Code using Terraform to manage and provision cloud infrastructure.
  • Strong understanding of software engineering best practices: code quality, reliability, performance optimisation, and observability.

Preferred Qualifications
  • Exposure to machine learning workflows, model lifecycle management, or data engineering platforms.
  • Experience with distributed systems, event-driven architectures (e.g., Kafka), and big data platforms (e.g., Spark, Databricks).
  • Familiarity with banking or financial domain use cases, including data governance and compliance-focused development.
  • Knowledge of platform security, monitoring, and resilient architecture patterns.
24/02/2026
Full time
Role: SDET (Software Development Engineer in Test) We are looking for the Software engineer who have experience in working in Test Engineering role and hands-on experience in Python, AWS and testing tools like pytest, playwright . Key Responsibilities • Design and build high-performance tools and services to validate the reliability, performance, and correctness of ML data pipelines and AI infrastructure.• Develop platform-level test solutions and automation frameworks using Python, Terraform, and modern cloud-native practices.• Contribute to the platform's CI/CD pipeline by integrating automated testing, resilience checks, and observability hooks at every stage.• Lead initiatives that drive testability, platform resilience, and validation as code across all layers of the ML platform stack.• Collaborate with engineering, MLOps, and infrastructure teams to embed quality engineering deeply into platform components.• Build reusable components that support scalability, modularity, and self-service quality tooling.• Mentor junior engineers and influence technical standards across the Test Engineering Program. Required Qualifications • Bachelor's or master's degree in computer science, Engineering, or a related technical field.• 8+ years of hands-on software development experience, including large-scale backend systems or platform engineering.• Expert in Python with a strong understanding of object-oriented programming, testing frameworks, and automation libraries.• Experience building or validating platform infrastructure, with hands-on knowledge of CI/CD systems, GitHub Actions, Jenkins, or similar tools.• Solid experience with AWS services (Lambda, S3, ECS/EKS, Step Functions, CloudWatch).• Proficient in Infrastructure as Code using Terraform to manage and provision cloud infrastructure.• Strong understanding of software engineering best practices: code quality, reliability, performance optimization, and observability. 
Preferred Qualifications
• Exposure to machine learning workflows, model lifecycle management, or data engineering platforms.
• Experience with distributed systems, event-driven architectures (e.g., Kafka), and big data platforms (e.g., Spark, Databricks).
• Familiarity with banking or financial domain use cases, including data governance and compliance-focused development.
• Knowledge of platform security, monitoring, and resilient architecture patterns.
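The role above centres on pytest-style validation of data pipelines. As a flavour of what that looks like in practice, here is a minimal sketch; the pipeline function, field names, and test data are all invented for illustration, not taken from the employer's codebase.

```python
# Minimal pytest-style sketch of a data-pipeline validation check.
# The transform function and its schema are hypothetical examples.

def transform(records):
    """Toy transform: drop rows with a missing id, normalise names."""
    return [
        {"id": r["id"], "name": r["name"].strip().lower()}
        for r in records
        if r.get("id") is not None
    ]

def test_transform_drops_missing_ids():
    raw = [{"id": 1, "name": " Alice "}, {"id": None, "name": "Bob"}]
    out = transform(raw)
    assert out == [{"id": 1, "name": "alice"}]

# pytest discovers and runs test_* functions automatically;
# calling the test directly here shows the same behaviour.
test_transform_drops_missing_ids()
```

In a real SDET setting, checks like this would run inside the CI/CD pipeline on every change, alongside infrastructure and resilience tests.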
London Borough of Barnet
Insight & Intelligence Manager (18 Months FTC)
London Borough of Barnet Barnet, London
Directorate: Strategy & Innovation Contract Type: 18 Months Fixed Term Contract Hours: 36 Salary: 62,766 - 69,984 Location: Colindale Closing Date: Midnight March 9th 2026

About Barnet Council
Barnet is a borough with much to be proud of. Our excellent schools, vibrant town centres, vast green spaces and diverse communities all help make it a great place to live and work. As a council we want to build on these strengths as we move into the future. We are growing and developing as an organisation to meet the challenges facing our borough, and we are committed to working with partner organisations and residents to make Barnet even better. As an organisation, our staff are committed to Our Values: Learning to Improve, Caring, Inclusive, Collaborative - which drive everything we do.

About the role
This is an exciting time to join Barnet as we grow our Digital, Data and Technology (DDaT) capabilities and accelerate our digital transformation journey. We're investing in smarter services, better use of data and modern technology, and you'll play a key part in shaping this future. We're looking for an exceptional Insight & Intelligence Manager to play a leading role in shaping how we use data, analytics and emerging technologies to improve outcomes for our residents. This is a key leadership role within our Insight & Intelligence Hub, responsible for driving data-informed decision-making across the organisation, developing advanced analytical solutions, and guiding a talented team of analysts and data scientists. You'll blend technical excellence with strategic insight, helping Barnet unlock the full value of its data through innovative approaches, strong stakeholder relationships, and a commitment to building organisational data literacy. This is a hybrid role. You will be expected to attend monthly in-person team days in our Colindale office.
We also come into the office to meet service stakeholders, work together on collaboration, discovery and user testing sessions and department days. Please click here to download the Job description for this role.

You're an experienced analytics or data science professional with a track record of delivering high-impact insight. You combine technical depth with the ability to translate complex concepts into actionable recommendations for senior leaders. You'll bring:
- Expertise in advanced analytics, data science, machine learning and data engineering.
- Experience designing secure, scalable data solutions using platforms such as Azure, Data Factory, Synapse, Databricks or Fabric.
- Strong proficiency in Python, SQL and/or R, and confidence in developing data pipelines and operationalising models.
- Excellent communication and stakeholder engagement skills, with the ability to influence decisions and tell compelling data stories.
- Experience leading, coaching or developing analysts or data scientists.
- A passion for driving data culture, improving data literacy, and championing ethical, secure use of data.

In this role, you will:
- Lead the design and delivery of sophisticated analytical, data science and AI projects.
- Build, maintain and optimise data pipelines, analytical models and data architecture.
- Develop and operationalise machine learning solutions, applying MLOps best practice.
- Produce high-quality, accessible dashboards and tools using platforms such as Power BI.
- Work closely with senior leaders to shape strategic insight, policy development and service transformation.
- Champion best practice in data quality, governance, privacy and information security.
- Develop and mentor a team of analysts and data scientists, supporting high performance and continuous learning.
- Strengthen cross-council collaboration, working with ICT, service leads, data engineers and external partners.
- Support delivery of the Council's DDaT Strategies.

If you thrive in a collaborative environment, enjoy solving complex problems, and want to make a tangible difference to public services, we'd love to hear from you.

What we offer
- 31 days annual leave, plus public and bank holidays
- Access to the Local Government Pension Scheme, which provides a valuable guaranteed income in your retirement together with security for your dependents
- Work-life balance options may include hybrid working, flexitime, job share, home working, part-time
- A vast range of lifestyle discounts from major retailers, supermarkets, energy suppliers and more
- Broad range of payroll benefits including cycle to work, eye care vouchers, travel and gym membership
- Excellent training and development opportunities
- Employee well-being training programmes including confidential employee assistance

How to apply
Read the job description and person specification before clicking 'Apply' to commence the online application form. If you would like any further information about the role before applying, please contact James Rapkin, Head of Organisational Insight & Intelligence.

Barnet Council is committed to safeguarding and promoting the welfare of children, young people, and vulnerable adults and expects all staff and volunteers to share this commitment. Barnet operates stringent safer recruitment procedures; these may include AI Detection Screening, Biometric ID/Right to Work Checks, Qualification and Registration Checks, Up to 6 years of Employment Data and Insights to Accelerate Screening (Konfir), Up to 5 years of Employment History References, DBS (Disclosure & Barring Service) Checks, Credit Checks and Social Media, Sanctions and Occupational Health Screening. To deliver Barnet Council's commitment to equality of opportunity in the provision of services, all staff are expected to promote equality in the workplace and in the services the Council delivers.
As such we value diversity and welcome applications from all backgrounds. Barnet Council embraces all forms of flexible working (including part-time, compressed hours, and hybrid working) and is committed to offering employees a healthy work-life balance. Candidates are encouraged to talk about relevant requirements and preferences at interview. We can't promise to give you exactly what you want, but we do promise not to judge you for asking. Barnet Council is a Disability Confident Committed Employer. We welcome and encourage job applications from people of all abilities. If you require any reasonable adjustments in the application or interview, please contact the lead contact on this advert. We will make reasonable adjustments to make sure our disabled applicants and those with health conditions are supported throughout our recruitment process. We support the Access to Work scheme; further details are available at (url removed). All posts with the council are subject to a probationary period of six months, during which time you will be required to demonstrate to the council's satisfaction your suitability for the position in which you will be employed. Due to the high number of applications that are received for some posts, we may close vacancies before the stated closing date if a sufficient number of applications are received. Therefore, please apply as soon as possible. Please ensure you regularly check the email account (including junk mail folders) that you use to submit your application, as any further communication regarding your application will be sent electronically. Should you not hear from us within four working weeks of the closing date for this post, then regretfully, in this instance, you have not been shortlisted.
13/02/2026
Contractor
London Borough of Barnet
Data Engineer (18 Months FTC)
London Borough of Barnet Barnet, London
Directorate: Strategy & Innovation Contract Type: 18 Months Fixed Term Contract Hours: 36 Salary: 48,003 - 53,172 Location: Colindale Closing Date: Midnight March 9th 2026

About Barnet Council
Barnet is a borough with much to be proud of. Our excellent schools, vibrant town centres, vast green spaces and diverse communities all help make it a great place to live and work. As a council we want to build on these strengths as we move into the future. We are growing and developing as an organisation to meet the challenges facing our borough, and we are committed to working with partner organisations and residents to make Barnet even better. As an organisation, our staff are committed to Our Values: Learning to Improve, Caring, Inclusive, Collaborative - which drive everything we do.

About the role
This is an exciting time to join Barnet as we grow our Digital, Data and Technology (DDaT) capabilities and accelerate our digital transformation journey. We're investing in smarter services, better use of data and modern technology, and you'll play a key part in shaping this future. We're looking for a Data Engineer to join our Insight & Intelligence Hub, working with a talented team to help design, build and maintain the data infrastructure that powers our analytics, AI and insight functions. You'll work across the organisation to integrate complex datasets, improve data quality, and create robust data pipelines that enable our analysts and data scientists to deliver high-value insight for services and residents. This is a hands-on technical role where you'll work with modern cloud technologies, large-scale data systems and cutting-edge tools to support predictive analytics, machine learning initiatives and enterprise reporting. If you're excited by engineering challenges, modern data platforms and solving real-world problems, you'll be a great fit. This is a hybrid role. You will be expected to attend monthly in-person team days in our Colindale office.
We also come into the office to meet service stakeholders, work together on collaboration, discovery and user testing sessions and department days. Please click here to download the Job description for this role.

About you
You're an experienced and adaptable data engineering professional who enjoys creating well-designed, efficient and reliable data solutions. You're comfortable working with complex systems, experimenting with new tools and collaborating with colleagues across technical and non-technical teams. You will bring:
- Strong hands-on experience with modern data engineering tools and practices, including ETL/ELT, data warehousing and data modelling.
- Expertise in cloud data platforms, ideally Azure services such as Data Factory, Synapse, Databricks, Spark or Snowflake.
- Proficiency in Python, SQL and other relevant languages for building pipelines and enabling automation.
- Experience building and managing production data services, integrating data from multiple systems, and improving data quality.
- Understanding of data governance, security, information management and UK GDPR.
- The ability to communicate complex technical concepts clearly for non-specialist audiences.
- A collaborative and proactive approach, with confidence leading projects, supporting partners and driving continuous improvement.

If you enjoy solving complex data challenges and playing a leading role in a growing data function, this is an excellent opportunity. In this role, you will:
- Develop and maintain the council's Microsoft Azure Data Platform, supporting cloud-ready, scalable analytics.
- Build and optimise data pipelines, APIs and integrations across on-premise, cloud and supplier-hosted systems.
- Extract, clean, transform and combine data from a wide range of operational systems to create high-quality datasets for insight and machine learning.
- Work with data scientists to design and maintain pipelines for AI/ML models, including supporting feature engineering, labelling and large-scale data flows.
- Automate manual data processes to improve the speed, reliability and accessibility of insight.
- Lead data engineering projects, including working with external suppliers and supporting more junior colleagues.
- Implement and maintain data lakehouse architectures, master data management and metadata repositories.
- Ensure that data workflows comply with security, privacy and governance standards, including anonymisation where required.
- Optimise the performance of code, pipelines and data systems, resolving issues, bugs and outages.
- Contribute to the development of data strategy, standards and reusable engineering frameworks across the Hub.
- Work with colleagues across ICT, Connected Places and service teams to join up data and support emerging technologies (e.g., IoT, geospatial, AI).
- Support the creation of visualisations and reusable analytical products that enhance decision-making.

You'll be part of a forward-looking team that values continuous learning, experimentation and innovation, helping Barnet fully unlock the power of its data.
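The anonymisation duty listed above often comes down to replacing a direct identifier with a salted one-way hash before data leaves the pipeline. A minimal sketch follows; the field name, salt, and records are invented for illustration, and in practice the salt would come from a secret store rather than the source code.

```python
import hashlib

# Hypothetical anonymisation step: replace a direct identifier with a
# salted SHA-256 pseudonym. Field names and the salt are illustrative.
SALT = b"example-salt"  # real pipelines would load this from a secret store

def anonymise(record):
    digest = hashlib.sha256(SALT + record["nino"].encode()).hexdigest()
    out = dict(record)
    out["nino"] = digest[:16]  # truncated hash as a pseudonymous join key
    return out

rows = [{"nino": "QQ123456C", "service": "housing"}]
anon = [anonymise(r) for r in rows]

# The identifier is gone, but the same input always maps to the same
# pseudonym, so records can still be linked across datasets.
assert anon[0]["nino"] != "QQ123456C"
assert anonymise(rows[0]) == anon[0]
```

Using the same salt across datasets preserves linkability for analysis; rotating it per-dataset would break linkability when that is the stronger privacy requirement.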
What we offer
- 31 days annual leave, plus public and bank holidays
- Access to the Local Government Pension Scheme, which provides a valuable guaranteed income in your retirement together with security for your dependents
- Work-life balance options may include hybrid working, flexitime, job share, home working, part-time
- A vast range of lifestyle discounts from major retailers, supermarkets, energy suppliers and more
- Broad range of payroll benefits including cycle to work, eye care vouchers, travel and gym membership
- Excellent training and development opportunities
- Employee well-being training programmes including confidential employee assistance

How to apply
Read the job description and person specification before clicking 'Apply' to commence the online application form. If you would like any further information about the role before applying, please contact James Rapkin, Head of Organisational Insight & Intelligence.

Barnet Council is committed to safeguarding and promoting the welfare of children, young people, and vulnerable adults and expects all staff and volunteers to share this commitment. Barnet operates stringent safer recruitment procedures; these may include AI Detection Screening, Biometric ID/Right to Work Checks, Qualification and Registration Checks, Up to 6 years of Employment Data and Insights to Accelerate Screening (Konfir), Up to 5 years of Employment History References, DBS (Disclosure & Barring Service) Checks, Credit Checks and Social Media, Sanctions and Occupational Health Screening. To deliver Barnet Council's commitment to equality of opportunity in the provision of services, all staff are expected to promote equality in the workplace and in the services the Council delivers. As such we value diversity and welcome applications from all backgrounds. Barnet Council embraces all forms of flexible working (including part-time, compressed hours, and hybrid working) and is committed to offering employees a healthy work-life balance.
Candidates are encouraged to talk about relevant requirements and preferences at interview. We can't promise to give you exactly what you want, but we do promise not to judge you for asking. Barnet Council is a Disability Confident Committed Employer. We welcome and encourage job applications from people of all abilities. If you require any reasonable adjustments in the application or interview, please contact the lead contact on this advert. We will make reasonable adjustments to make sure our disabled applicants and those with health conditions are supported throughout our recruitment process. We support the Access to Work scheme; further details are available at (url removed). All posts with the council are subject to a probationary period of six months, during which time you will be required to demonstrate to the council's satisfaction your suitability for the position in which you will be employed. Due to the high number of applications that are received for some posts, we may close vacancies before the stated closing date if a sufficient number of applications are received. Therefore, please apply as soon as possible. Please ensure you regularly check the email account (including junk mail folders) that you use to submit your application, as any further communication regarding your application will be sent electronically. Should you not hear from us within four working weeks of the closing date for this post, then regretfully, in this instance, you have not been shortlisted.
13/02/2026
Contractor
Akkodis
Lead Data Platform Engineer Remote £85k + 4 day week
Akkodis Bristol, Gloucestershire
Lead Data Platform Specialist - Up to £85k + c.15% Bonus - Fully Remote - Condensed and flexible hours available (4-day working week or 9-day fortnight available). My client, a nationwide organisation with a reputation for excellence and a supportive, inclusive culture, is seeking a Lead Data Platform Engineer to join their Data Engineering and Machine Learning team. This is a high-impact, senior role, ideal for someone with deep experience in modern cloud data platforms who is looking to shape and deliver a scalable, secure, and innovative data platform. You need strong experience working with Databricks. You'll play a pivotal role in designing, building, and optimising their Azure-based Data Lakehouse, with a focus on Databricks, PySpark, Spark SQL, and Azure Data Factory. This isn't just about coding - you'll also provide architectural guidance, mentor engineers, and ensure solutions are scalable, secure, and aligned with business needs. Hands-on experience with CI/CD, automation, and infrastructure-as-code (Terraform, ARM templates) is essential. Experience with machine learning platforms or ML engineering is a bonus. 
Key Responsibilities:
- Build and maintain the Data Lakehouse platform in a secure Azure environment
- Develop automation for cluster management, integration runtimes, and networking
- Lead architectural design and ensure platform scalability, reliability, and governance
- Write efficient, maintainable code in PySpark, Python, and SQL
- Implement CI/CD pipelines and cloud infrastructure via Terraform/ARM
- Collaborate with data engineers, architects, and business stakeholders
- Mentor and coach engineers, fostering a culture of learning and excellence
Essential Skills:
- Deep experience in Databricks, Azure Data Factory, and Lakehouse architecture
- Strong solution architecture and data platform engineering skills
- DevOps and automation expertise, including CI/CD, monitoring, and code quality
- Infrastructure-as-code (Terraform or ARM templates) for cloud resource provisioning
- Excellent communication and mentoring skills
It's not often a role comes along in a team like this, where you get the chance to flex or condense your hours, receive a strong salary with a bonus, and enjoy growth, independence, autonomy, and a clear pathway of progression. You'll get private medical options, 25 days holiday plus bank holidays, and the chance to buy or sell holiday. This is a rare opportunity to join a forward-thinking team at a leading organisation, working fully remotely with flexibility and excellent benefits. You'll be shaping the future of their data platform while collaborating with a talented and diverse team. To apply, please submit your CV, as we are looking to move quickly on this role. Modis International Ltd acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers in the UK. Modis Europe Ltd provides a variety of international solutions that connect clients to the best talent in the world. 
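For candidates weighing up the hands-on side of the role, "efficient, maintainable code in PySpark, Python, and SQL" typically means lakehouse-style raw-to-curated transformations. A minimal pure-Python sketch of that pattern (in production this would be PySpark writing to Delta tables on Databricks; the field names and upsert rule here are illustrative assumptions, not taken from the advert):

```python
from datetime import datetime

def to_silver(bronze_rows):
    """Clean raw 'bronze' records into a curated 'silver' table:
    drop rows missing a key, normalise types, and deduplicate on id."""
    silver = {}
    for row in bronze_rows:
        if not row.get("id"):
            continue  # reject records without a primary key
        # later records overwrite earlier ones, mimicking a MERGE/upsert
        silver[row["id"]] = {
            "id": row["id"],
            "amount": float(row.get("amount", 0)),
            "ts": datetime.fromisoformat(row["ts"]),
        }
    return list(silver.values())

rows = [
    {"id": "a", "amount": "10.5", "ts": "2024-01-01T00:00:00"},
    {"id": None, "amount": "3", "ts": "2024-01-01T00:00:00"},
    {"id": "a", "amount": "12.0", "ts": "2024-01-02T00:00:00"},
]
print(len(to_silver(rows)))  # 1: invalid row dropped, duplicate upserted
```

In Spark the same dedup-and-overwrite step would be expressed as a `MERGE INTO` against a Delta table rather than an in-memory dict.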
For all positions based in Switzerland, Modis Europe Ltd works with its licensed Swiss partner Accurity GmbH to ensure that candidate applications are handled in accordance with Swiss law. Both Modis International Ltd and Modis Europe Ltd are Equal Opportunities Employers. By applying for this role, your details will be submitted to Modis International Ltd and/or Modis Europe Ltd. Our Candidate Privacy Information Statement, which explains how we will use your information, is available on the Modis website.
10/02/2026
Full time
Specialist Solutions Architect - DS/ML/AI/GenAI with Big Data
Databricks Inc.
Overview
Skills: Data Science, Machine Learning, AI, LLM, GenAI, and Spark or Big Data
As a Specialist Solutions Architect (SSA) - ML Engineering, you will be the trusted technical ML expert for both Databricks customers and the Field Engineering organisation. You will work with Solution Architects to guide customers in architecting production-grade ML applications on Databricks, while aligning their technical roadmap with the evolving Databricks Data Intelligence Platform. You will continue to strengthen your technical skills by applying the latest technologies in GenAI, LLMOps, and ML, while expanding your impact through mentorship and establishing yourself as an ML expert.
What you will do
- Architect production-level ML workloads for customers using our unified platform, including end-to-end ML pipelines, training/inference optimisation, integration with cloud-native services, and MLOps
- Provide advanced technical support to Solution Engineers during the technical sale - from feature engineering, training, tracking, and serving to model monitoring, all within a single platform - and participate in the larger ML SME community within Databricks
- Collaborate with the product and engineering teams to represent the voice of the customer, define priorities, and influence the product roadmap, helping with the adoption of Databricks' ML offerings
- Build and grow customer data science workloads and apply MLOps best practices to productionise these workloads across a variety of domains
- Serve as the trusted technical advisor for customers developing GenAI solutions, such as RAG architectures over enterprise knowledge repositories, querying structured data with natural language, content generation, and monitoring
What we look for
5+ years of experience in a customer-facing technical role. 
Pre-sales or post-sales experience working with external clients across a variety of industry markets.
Data Science/ML Skills
- 5+ years of hands-on industry ML experience in at least one of the following:
  - ML Engineer: developing production-grade cloud (AWS/Azure/GCP) infrastructure that supports the deployment of ML applications, including drift monitoring
  - Data Scientist: experience with the latest techniques in natural language processing, including vector databases, fine-tuning LLMs, and deploying LLMs with tools such as HuggingFace, LangChain, and OpenAI
- Experience communicating and teaching technical concepts to non-technical and technical audiences alike
- Passion for collaboration, lifelong learning, and driving our values through ML
- Experience in a customer-facing pre-sales or post-sales role
- Experience working with Apache Spark to process large-scale distributed datasets
- Can meet expectations for technical training and role-specific outcomes within 3 months of hire
- Can travel up to 30% when needed
About Databricks
Databricks is the data and AI company. More than 10,000 organizations worldwide - including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500 - rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of the lakehouse, Apache Spark, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook.
Benefits
At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region, please visit
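The "drift monitoring" in the ML Engineer profile above is commonly implemented as a distribution comparison between the training sample and live traffic. A minimal sketch using the population stability index (a generic statistical technique, not a Databricks-specific API; the 0.25 cut-off is a conventional rule of thumb, not a standard):

```python
import math

def psi(expected, actual, bins=4):
    """Population Stability Index between training data and live data.
    PSI < 0.1 is usually read as stable, > 0.25 as significant drift."""
    lo, hi = min(expected), max(expected)
    edges = [lo + (hi - lo) * i / bins for i in range(1, bins)]

    def frac(xs):
        # histogram each sample into the bins defined by the training data
        counts = [0] * bins
        for x in xs:
            counts[sum(x > e for e in edges)] += 1
        # floor at a tiny value so empty bins don't blow up the log
        return [max(c / len(xs), 1e-6) for c in counts]

    e, a = frac(expected), frac(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))

train = [float(i % 10) for i in range(1000)]  # uniform over 0..9
live = [9.0] * 1000                           # collapsed to one value
print(psi(train, train) < 1e-9, psi(train, live) > 0.25)
```

In a Databricks deployment the same check would run as a scheduled job over feature tables, alerting when the index crosses the chosen threshold.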
09/02/2026
Full time
Lead Data Scientist, Machine Learning Engineer 2025
Aimpoint Digital
Overview
Aimpoint Digital is a premier analytics consulting firm with a mission to drive business value for clients through expertise in data strategy, data analytics, decision sciences, and data engineering and infrastructure. This position is within our decision sciences practice, which focuses on delivering solutions via machine learning and statistical modelling.
What you will do
As part of Aimpoint Digital, you will focus on enabling clients to get the most out of their data. You will work with all levels of the client organization to build value-driving solutions that extract insights, and then train them on how to manage and maintain these solutions. Typical solutions will utilize machine learning, artificial intelligence, statistical analysis, automation, optimization, and/or data visualizations. As a Lead Data Scientist, you will be expected to work independently on client engagements, take part in the development of our practice, aid in business development, and contribute innovative ideas and initiatives to our company.
As a Lead Data Scientist you will:
- Become a trusted advisor, working with clients to design end-to-end analytical solutions
- Work independently to solve complex data science use-cases across various industries
- Design and develop feature engineering pipelines, build ML & AI infrastructure, deploy models, and orchestrate advanced analytical insights
- Write code in SQL, Python, and Spark following software engineering best practices
- Collaborate with stakeholders and customers to ensure successful project delivery
Who we are looking for
We are looking for collaborative individuals who want to drive value, work in a fast-paced environment, and solve real business problems. You are a coder who writes efficient, optimized code leveraging key Databricks features. You are a problem-solver who can deliver simple, elegant solutions as well as cutting-edge solutions that, regardless of complexity, your clients can understand, implement, and maintain. 
You genuinely think about the end-to-end machine learning pipeline as you generate robust solutions. You are both a teacher and a student, as we enable our clients, upskill our teammates, and learn from one another. You want to drive impact for your clients, and you do so through thoughtfulness, prioritization, and seeing a solution through from brainstorming to deployment. In particular, you have these traits:
- Degree in Computer Science, Engineering, Mathematics, or equivalent experience
- Experience building high-quality data science models to solve a client's business problems
- Experience managing stakeholders and collaborating with customers
- Strong written and verbal communication skills
- Ability to manage an individual workstream independently
- 3+ years of experience developing and deploying ML models on any platform (Azure, AWS, GCP, Databricks, etc.)
- Ability to apply data science methodologies and principles to real-life projects
- Expertise in software engineering concepts and best practices
- Self-starter with excellent communication skills, able to work independently and lead projects, initiatives, and/or people
- Willingness to travel
Want to stand out?
- Consulting experience
- Databricks Machine Learning Associate or Machine Learning Professional certification
- Familiarity with traditional machine learning tools such as Python, SKLearn, XGBoost, SparkML, etc.
- Experience with deep learning frameworks like TensorFlow or PyTorch
- Knowledge of ML model deployment options (e.g., Azure Functions, FastAPI, Kubernetes) for real-time and batch processing
- Experience with CI/CD pipelines (e.g., DevOps pipelines, GitHub Actions)
- Knowledge of infrastructure as code (e.g., Terraform, ARM templates, Databricks Asset Bundles)
- Understanding of advanced machine learning techniques, including graph-based processing, computer vision, natural language processing, and simulation modeling
- Experience with generative AI and LLMs, and tools such as LlamaIndex and LangChain
- Understanding of MLOps or LLMOps
- Familiarity with Agile methodologies, preferably Scrum
We are actively seeking candidates for full-time, remote work within the US, UK or Colombia.
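The "feature engineering pipelines" this role describes generally follow scikit-learn's fit/transform contract: learn statistics from training data, then apply them unchanged to new data. A toy stand-in showing that contract (in practice you would use sklearn's own StandardScaler inside a Pipeline; this class is a hypothetical illustration, not the real library):

```python
class StandardScaler:
    """Minimal z-score scaler mirroring the fit/transform contract
    used by scikit-learn (a toy stand-in, not the real sklearn class)."""

    def fit(self, xs):
        # learn mean and standard deviation from the training sample only
        self.mean = sum(xs) / len(xs)
        var = sum((x - self.mean) ** 2 for x in xs) / len(xs)
        self.std = var ** 0.5 or 1.0  # guard against zero variance
        return self

    def transform(self, xs):
        # apply the *training* statistics, never refit on serving data
        return [(x - self.mean) / self.std for x in xs]

scaler = StandardScaler().fit([2.0, 4.0, 6.0])
print(scaler.transform([4.0]))  # the training mean maps to 0.0
```

Keeping fit and transform separate is what prevents training/serving skew, which is why the same contract reappears in feature stores and MLOps tooling.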
08/02/2026
Full time
ComputerWorld Personnel Ltd
Machine Learning Engineer
ComputerWorld Personnel Ltd Bristol, Gloucestershire
Overview
Machine Learning Engineer - Remote - £50-£70k + excellent benefits
We're looking for a skilled Machine Learning Engineer to join a newly established team working on an exciting data platform project. This is a hands-on role where you'll help build and maintain the infrastructure that supports machine learning models in a live environment.
Responsibilities
- Developing and maintaining API services using Azure and Databricks
- Managing caching layers with Azure Cache for Redis
- Using Delta Live Tables for data processing and analytics
- Integrating with cloud-based data storage solutions like Snowflake
- Collaborating with cross-functional teams in an agile environment
- Supporting analytics, model deployment, and data-driven decision tools
- Conducting performance, load, and end-to-end testing
- Writing pipeline code and managing deployments via Azure DevOps (GitHub)
Qualifications
- Solid experience in MLOps, particularly with Azure and Databricks
- Familiarity with Postgres, Redis, and Snowflake
- Understanding of Delta Lake architecture, Docker, and container services
- Experience building and orchestrating APIs
- Strong problem-solving and communication skills
- Bonus: exposure to Azure Functions, Containers, or Insights
Benefits
- 25 days holiday, plus bank holidays
- Annual discretionary bonus
- Enhanced pension scheme
- Flexible working and flexi-time options
- Healthcare cash plan
- Electric vehicle salary sacrifice scheme
- Discounts scheme
- Wellbeing app
- Enhanced maternity and paternity leave
- Life assurance (4x salary)
- Discounts on insurance
- Cycle to Work scheme
- Employee referral scheme
If you are interested in this position, please click 'apply'. Hunter Selection Limited is a recruitment consultancy with offices UK-wide, specialising in permanent and contract roles within the Engineering & Manufacturing, IT & Digital, Science & Technology and Service & Sales sectors. 
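The caching-layer responsibility above usually means a cache-aside pattern in front of model inference: check the cache, compute on a miss, store with a time-to-live. A minimal in-process sketch of the idea (in production this would be the redis-py client talking to Azure Cache for Redis; TTLCache and get_prediction are illustrative names invented here):

```python
import time

class TTLCache:
    """In-memory cache with expiry, mimicking Redis SETEX/GET semantics."""

    def __init__(self):
        self._store = {}

    def set(self, key, value, ttl_seconds):
        self._store[key] = (value, time.monotonic() + ttl_seconds)

    def get(self, key):
        hit = self._store.get(key)
        if hit is None:
            return None
        value, expires = hit
        if time.monotonic() >= expires:
            del self._store[key]  # lazy eviction on read, as Redis does
            return None
        return value

cache = TTLCache()

def get_prediction(features, compute):
    """Cache-aside: return a cached score, or compute and cache it."""
    key = str(features)
    cached = cache.get(key)
    if cached is not None:
        return cached
    score = compute(features)
    cache.set(key, score, ttl_seconds=60)
    return score
```

The TTL bounds how stale a served score can be, which is the usual trade-off when caching model output rather than recomputing it per request.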
Please note that as we receive a high volume of applications, we can only respond to applicants whose skills and qualifications are suitable for this position. No terminology in this advert is intended to discriminate against any of the protected characteristics that fall under the Equality Act 2010. For the purposes of the Conduct Regulations 2003, when advertising permanent vacancies we are acting as an Employment Agency, and when advertising temporary/contract vacancies we are acting as an Employment Business.
07/02/2026
Full time
ComputerWorld Personnel Ltd
Remote ML Engineer - Azure & Databricks ML Ops
ComputerWorld Personnel Ltd Bristol, Gloucestershire
A leading technology recruitment consultancy is seeking a skilled Machine Learning Engineer for a remote role with a salary of £50-£70k. The successful candidate will work on developing and maintaining API services for a data platform, managing caching layers and cloud integrations. Responsibilities include collaboration in an agile environment, performance testing, and supporting analytics and model deployment. Ideal candidates should have solid ML Ops experience with Azure and Databricks, alongside strong problem-solving and communication skills.
07/02/2026
Full time
Principal Consultant - Azure AI Engineer / Lead
Genpact
Overview
With us, you'll learn fast, work smart, and make a difference. You'll build a career that matters.

Job Description: Principal Consultant - Azure AI Engineer / Lead (ITO098519)

Ready to build the future with AI? At Genpact, we don't just keep up with technology; we set the pace. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what's possible, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at and on LinkedIn, YouTube, and Facebook.

Role
Inviting applications for the role of Principal Consultant - Azure AI Engineer / Lead.

Responsibilities
- Design & Development: Architect and develop end-to-end AI/ML solutions using Azure AI services (e.g., Azure Machine Learning, Azure Cognitive Services, Azure Databricks, Azure Synapse Analytics).
- Model Deployment & Management: Implement robust MLOps practices for CI/CD, monitoring, and retraining of machine learning models in Azure.
- Data Pipelining: Build and optimize scalable data pipelines for ingesting, transforming, and preparing large datasets for AI model training and inference.
- Performance Optimization: Optimize AI models and their serving infrastructure for performance, cost-efficiency, and scalability.
- Collaboration: Work with data scientists to operationalize models, provide feedback on model design for production readiness, and ensure seamless integration with existing systems.
- Troubleshooting & Support: Diagnose and resolve issues related to AI/ML deployments, data pipelines, and Azure infrastructure.
- Security & Compliance: Ensure AI solutions adhere to best practices for data security, privacy, and compliance within the Azure ecosystem.
- Documentation: Create comprehensive documentation for AI solution architectures, deployment procedures, and operational guidelines.
- Stay Current: Keep abreast of the latest advancements in Azure AI services, machine learning technologies, and MLOps trends.

Minimum Qualifications
- Bachelor's or master's degree in computer science, engineering, data science, or a related quantitative field.
- Relevant years of experience in developing and deploying AI/ML solutions, with a strong focus on Microsoft Azure.
- Proficiency in Python and experience with relevant AI/ML libraries (e.g., scikit-learn, TensorFlow, PyTorch, Keras).
- Demonstrated experience with Azure Machine Learning, including MLOps concepts, model registration, deployment (AKS, ACI), and monitoring.
- Hands-on experience with Azure data services such as Azure Data Lake Storage, Azure Data Factory, Azure Databricks, or Azure Synapse Analytics.
- Solid understanding of software engineering principles, including version control (Git), testing, and code review.
- Experience with containerization technologies (Docker, Kubernetes).
- Strong problem-solving skills and the ability to work independently and as part of a team.
- Excellent communication and interpersonal skills.

Preferred Qualifications / Skills
- Azure certifications (e.g., Azure AI Engineer Associate, Azure Data Scientist Associate).
- Experience with Azure Cognitive Services (e.g., Language, Vision, Speech) or Azure Bot Service.
- Familiarity with streaming data processing frameworks (e.g., Apache Kafka, Azure Event Hubs, Azure Stream Analytics).
- Experience with CI/CD pipelines (e.g., Azure DevOps, GitHub Actions).
- Understanding of distributed computing concepts and big data technologies.
- Prior experience with sequential data processing, time-series analysis, or natural language processing (NLP).

Why join Genpact?
- Lead AI-first transformation: build and scale AI solutions that redefine industries
- Make an impact: drive change for global enterprises and solve business challenges that matter
- Accelerate your career: gain hands-on experience, world-class training, mentorship, and AI certifications to advance your skills
- Grow with the best: learn from top engineers, data scientists, and AI experts in a dynamic, fast-moving workplace
- Committed to ethical AI: work in an environment where governance, transparency, and security are at the core of everything we build
- Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress

Come join the 140,000+ coders, tech shapers, and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation.
Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
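The model registration and versioning workflow the posting describes can be previewed with a toy in-memory registry. This is a sketch only; `ModelRegistry` and its methods are illustrative names, not Azure Machine Learning's actual API (which registers models via its workspace SDK):

```python
from dataclasses import dataclass, field


@dataclass
class ModelRegistry:
    """Toy stand-in for a model registry: each register() call creates a
    new immutable version, mirroring the MLOps pattern of never mutating
    a deployed artifact in place."""
    _versions: dict = field(default_factory=dict)

    def register(self, name: str, artifact: bytes) -> int:
        versions = self._versions.setdefault(name, [])
        versions.append(artifact)
        return len(versions)  # version numbers start at 1

    def get(self, name: str, version: int = -1) -> bytes:
        versions = self._versions[name]
        if version == -1:            # default: latest version
            return versions[-1]
        return versions[version - 1]  # explicit historical version


registry = ModelRegistry()
v1 = registry.register("churn-model", b"weights-v1")
v2 = registry.register("churn-model", b"weights-v2")
```

Retraining then means registering a new version and rolling deployments forward (or back) by version number, which is what makes monitored CI/CD of models tractable.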
07/02/2026
Full time
GCS
Data Scientist - Fully Remote - Consultancy - AI / Agentic AI
GCS
Data Scientist for a global consultancy with varied clients. Please only apply if you are eligible for security clearance (5+ years in the UK).

Role: Data Scientist - AI / Agentic AI
Type: Permanent
Location: Remote, with occasional office visits in the UK
Salary: £60,000

Role
The Data Scientist will join a growing analytics and AI function, contributing to the development of intelligent solutions that support business decision-making and automation. This role is ideal for someone with solid Python skills, hands-on exposure to AI or Agentic AI projects, and a strong grounding in machine learning fundamentals. You'll work closely with internal teams and external clients, translating technical insights into clear, actionable outcomes.

Key Responsibilities
- Build, train, and evaluate machine learning models to support a range of analytical and AI-driven initiatives.
- Contribute to the development of Agentic AI solutions, including workflow automation, reasoning systems, and LLM-based tools.
- Write clean, efficient Python code for data processing, modelling, and experimentation.
- Extract, manipulate, and analyse data using SQL across structured data environments.
- Support feature engineering, model optimisation, and validation activities.
- Collaborate with cross-functional teams to understand requirements and deliver insights.
- Communicate findings clearly to technical and non-technical stakeholders.
- Participate in client-facing discussions, demos, and solution walkthroughs.

Required Skills & Experience
- Strong proficiency in Python for data science and machine learning.
- Demonstrable experience working on AI or Agentic AI projects (critical requirement).
- Solid understanding of machine learning fundamentals, including supervised and unsupervised techniques.
- Good working knowledge of SQL for data extraction and analysis.
- Approximately 5 years of hands-on experience in data science or a closely related field.

Nice-to-Have Skills
- Experience with Databricks for data engineering or ML workflows.
- Familiarity with Azure ML, MLflow, or similar MLOps tooling.
- Exposure to feature engineering best practices and model lifecycle management.

Soft Skills
- Strong communication skills with the ability to explain complex concepts simply.
- Comfortable in client-facing environments, including presentations and requirement-gathering sessions.
- Proactive, curious, and eager to learn emerging AI technologies.

GCS is acting as an Employment Agency in relation to this vacancy.
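The SQL extraction-and-analysis work described above can be rehearsed entirely with Python's standard-library sqlite3 module; the table and column names below are invented for illustration, not taken from any client environment:

```python
import sqlite3

# In-memory database with a made-up orders table for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("alice", 120.0), ("bob", 80.0), ("alice", 60.0)],
)

# Typical extraction query: aggregate spend per customer, largest first.
rows = conn.execute(
    "SELECT customer, SUM(amount) AS total "
    "FROM orders GROUP BY customer ORDER BY total DESC"
).fetchall()
# rows -> [('alice', 180.0), ('bob', 80.0)]
```

The same GROUP BY / aggregate pattern carries over directly to the structured data environments the role mentions (Postgres, Snowflake, Databricks SQL), only the connection object changes.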
06/02/2026
Full time
Reed Technology
Data Architect
Reed Technology City, Derby
Data Architect
Hybrid - Staffordshire (2-3 days a week ideally, but flexible for the right person)
£70-90,000 based on experience

About the Organisation
Join a nationally recognised organisation that is significantly investing in its data and AI capabilities as part of a multi-year digital transformation. With a modern data strategy in place and strong leadership backing, the business is building scalable, cloud-based platforms to unlock advanced analytics, automation and AI-driven decision-making. You'll join a growing, highly collaborative data function with the freedom to influence architecture, shape long-term data foundations and work across innovative data, AI and machine learning initiatives.

The Role
As a Data Architect, you'll take a hands-on role designing, governing and delivering modern data architectures that support the organisation's strategic goals. You will work closely with data engineers, analysts, data scientists and business stakeholders to ensure data platforms are scalable, secure and structured for future innovation. This is an opportunity to work across cloud, integration, modelling and governance disciplines, helping define the building blocks of a modern enterprise data ecosystem.

What you'll be doing:
- Designing and delivering enterprise data models, data architectures and end-to-end data solutions
- Shaping roadmaps for data platforms, data integration, data processing and analytics capability
- Collaborating with engineers and data science teams to deliver high-quality, trusted datasets
- Ensuring all solutions align with governance, compliance, quality and security standards
- Contributing to data standards, best practices and architectural guidelines
- Supporting modern data engineering practices and helping evolve the organisation's data maturity

What my client is looking for:
- Proven experience as a Data Architect in a complex or data-driven environment
- Strong knowledge of data modelling, data processing, integration patterns and database design
- Hands-on experience with modern cloud data platforms - Azure, Databricks or similar
- Ability to translate business requirements into scalable, robust data solutions
- A collaborative communicator with a passion for data, innovation and continuous improvement

Technology You'll Work With
- Cloud & Data Platforms: Azure Synapse, Azure Data Lake, Azure Data Factory
- Data Modelling & Integration: SQL, ETL tools, data pipelines and orchestration
- Architecture & Governance: enterprise data models, data catalogues, metadata management
- Ways of Working: agile delivery, architectural documentation, stakeholder workshops

What's On Offer
- (phone number removed) base salary, based on experience
- Strong long-term career progression
- Significant investment in modern data and AI platforms

Interested? If you're a Data Architect who wants to build modern data foundations, contribute to major transformation programmes and work with the latest cloud and AI technologies, this role offers a chance to make a real impact.
04/02/2026
Full time
Senior Solutions Architect (Africa/MEA Fly-in)
Databricks Inc.
At Databricks, our core values are at the heart of everything we do; creating a culture of proactiveness and a customer-centric mindset guides us to create a unified platform that makes data science and analytics accessible to everyone. We aim to inspire our customers to make informed decisions that push their business forward. We provide a user-friendly and intuitive platform that makes it easy to turn insights into action and fosters a culture of creativity, experimentation, and continuous improvement.

As a Senior Solutions Architect in the IMEA Pre-Sales team (MEA - Africa fly-in), you will be an essential part of this mission, using your technical expertise to demonstrate how our Data Intelligence Platform can help customers solve their complex data challenges. You'll work with a collaborative, customer-focused team that values innovation and creativity, using your skills to create customised solutions to help our customers achieve their goals and guide their businesses forward. Join us in our quest to change how people work with data and make a better world!

The impact you will have:
- Create impactful and successful relationships with customer accounts focused on South Africa, and Africa more broadly, providing technical and business value to Databricks customers in collaboration with the extended team.
- Act as a trusted advisor to C-level executives, defining long-term Data & AI strategies, modernising platforms, and aligning technical vision with business outcomes to drive measurable impact and innovation.
- Lead strategic engagements across Africa, shaping Data & AI transformation roadmaps, fostering innovation partnerships, and driving account growth in alignment with the Africa vision.
- Scale best practices in your field by authoring reference architectures, how-tos, and demo applications, and help build the Databricks community in your region by leading workshops, seminars, and meet-ups.
- Grow your knowledge and expertise to the level of a technical and/or industry specialist.

What we look for:
- 8+ years of experience engaging in customer interactions for technical pre-sales and complex sales lifecycles; experience and expertise in enterprise engagements is highly preferable for the role.
- Experience in use-case discovery, scoping, and delivering complex solution architecture designs to multiple audiences, requiring an ability to switch context and/or levels of technical depth.
- Ability to provide technical solutions for specialised customer needs, navigate a competitive landscape and influence C-level executives by developing relationships and orchestrating teams to achieve long-term customer success.
- Hands-on expertise with complex big data architecture design for public cloud platform solutions, focusing on use cases in data warehousing and data engineering architecture and implementation. Data science and machine learning skills will be advantageous.
- Prior experience with coding in a core programming language (e.g., Python, SQL) and willingness to learn Apache Spark. Experience and skills on the Databricks platform will be highly advantageous for the role!

Key Notes:
- The role can be located in London or Paris (i.e., within a commutable distance of these office locations for a hybrid schedule).
- You will need to be flexible and willing to travel to South Africa (and sometimes other countries in Africa) for customer visits on a regular basis (up to 2 weeks per month).
- Willingness to manage different conversations with business and technical stakeholders, enable partners, and support internal events in the MEA region.
- Excellent communication skills in English required as a minimum.

About Databricks
Databricks is the data and AI company. More than 10,000 organizations worldwide, including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500, rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of the lakehouse, Apache Spark, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook.

Benefits
At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region, please visit

Our Commitment to Diversity and Inclusion
At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards. Individuals looking for employment at Databricks are considered without regard to age, color, disability, ethnicity, family or marital status, gender identity or expression, language, national origin, physical and mental ability, political affiliation, race, religion, sexual orientation, socio-economic status, veteran status, and other protected characteristics.

Compliance
If access to export-controlled technology or source code is required for performance of job duties, it is within Employer's discretion whether to apply for a U.S. government license for such positions, and Employer may decline to proceed with an applicant on this basis alone.
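For candidates taking up the "willingness to learn Apache Spark" point, the core transformation model (flatMap, map, reduceByKey over distributed collections) can be previewed in plain Python with no Spark dependency; this is a local sketch of the classic word count, not Databricks code:

```python
from collections import Counter

# Input lines, standing in for a distributed text dataset.
lines = ["data and ai", "data engineering", "ai platform"]

# flatMap equivalent: split every line into words, flattening the result.
words = [w for line in lines for w in line.split()]

# map + reduceByKey equivalent: count occurrences per word.
counts = Counter(words)
```

In PySpark's RDD API the same pipeline reads `rdd.flatMap(str.split).map(lambda w: (w, 1)).reduceByKey(operator.add)`; the difference is that Spark partitions the data and runs each stage across a cluster.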
04/02/2026
Full time
At Databricks, our core values are at the heart of everything we do; creating a culture of proactiveness and a customer-centric mindset guides us to create a unified platform that makes data science and analytics accessible to everyone. We aim to inspire our customers to make informed decisions that push their business forward. We provide a user-friendly and intuitive platform that makes it easy to turn insights into action and fosters a culture of creativity, experimentation, and continuous improvement. As a Senior Solutions Architect in the IMEA Pre-Sales team (MEA - Africa Fly-in), you will be an essential part of this mission, using your technical expertise to demonstrate how our Data Intelligence Platform can help customers solve their complex data challenges. You'll work with a collaborative, customer-focused team that values innovation and creativity, using your skills to create customised solutions to help our customers achieve their goals and guide their businesses forward. Join us in our quest to change how people work with data and make a better world! The impact you will have: Create impactful and successful relationships with customer accounts focused on South Africa, and across Africa more broadly, providing technical and business value to Databricks customers in collaboration with the extended team. Act as a trusted advisor to C-level executives, defining long-term Data & AI strategies, modernising platforms, and aligning technical vision with business outcomes to drive measurable impact and innovation. Lead strategic engagements across Africa, shaping Data & AI transformation roadmaps, fostering innovation partnerships, and driving account growth in alignment with the Africa Vision. Scale best practices in your field by authoring reference architectures, how-tos, and demo applications, and help build the Databricks community in your region by leading workshops, seminars, and meet-ups. 
Grow your knowledge and expertise to the level of a technical and/or industry specialist. What we look for: 8+ years of experience engaging in customer interactions for technical pre-sales and complex sales lifecycles. Experience and expertise in enterprise engagements will be highly preferable for the role. Experienced in use case discovery, scoping, and delivering complex solution architecture designs to multiple audiences, requiring an ability to switch context and/or levels of technical depth. Ability to provide technical solutions for specialised customer needs, navigate a competitive landscape and influence C-level executives by developing relationships and orchestrating teams to achieve long-term customer success. Hands-on expertise with complex Big Data architecture design for public cloud platform solutions, focusing on use cases in Data Warehousing and Data Engineering architecture and implementation. Data Science and Machine Learning skills will be advantageous. Prior experience with coding in a core programming language (e.g., Python, SQL) and willingness to learn Apache Spark. Experience and skills on the Databricks platform will be highly advantageous for the role! Key Notes: Location for the role can be in London or Paris (i.e., within a commutable distance of these office locations for a hybrid schedule). You will need to be flexible and willing to travel to South Africa (and sometimes other countries in Africa) for customer visits on a regular basis (i.e., up to 2 weeks per month). Willingness to manage conversations with both business and technical stakeholders, enable partners, and support internal events in the MEA region. Excellent communication skills in English required as a minimum. About Databricks Databricks is the data and AI company. 
More than 10,000 organizations worldwide - including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500 - rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of Lakehouse, Apache Spark, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook. Benefits At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region, please visit Our Commitment to Diversity and Inclusion At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards. Individuals looking for employment at Databricks are considered without regard to age, color, disability, ethnicity, family or marital status, gender identity or expression, language, national origin, physical and mental ability, political affiliation, race, religion, sexual orientation, socio-economic status, veteran status, and other protected characteristics. Compliance If access to export-controlled technology or source code is required for performance of job duties, it is within Employer's discretion whether to apply for a U.S. government license for such positions, and Employer may decline to proceed with an applicant on this basis alone.
HUNTER SELECTION
Machine Learning Engineer
HUNTER SELECTION Bristol, Gloucestershire
Machine Learning Engineer - Remote - £50k-£70k + excellent benefits We're looking for a skilled Machine Learning Engineer to join a newly established team working on an exciting data platform project. This is a hands-on role where you'll help build and maintain the infrastructure that supports machine learning models in a live environment. What you'll be doing as the Machine Learning Engineer: Developing and maintaining API services using Azure and Databricks Managing caching layers with Azure Cache (Redis) Using Delta Live Tables for data processing and analytics Integrating with cloud-based data storage solutions like Snowflake Collaborating with cross-functional teams in an agile environment Supporting analytics, model deployment, and data-driven decision tools Conducting performance, load, and end-to-end testing Writing pipeline code and managing deployments via Azure DevOps (GitHub) What we're looking for from the Machine Learning Engineer: Solid experience in MLOps, particularly with Azure and Databricks Familiarity with Postgres, Redis, and Snowflake Understanding of Delta Lake architecture, Docker, and container services Experience building and orchestrating APIs Strong problem-solving and communication skills Bonus: exposure to Azure Functions, Containers, or Insights Benefits for the Machine Learning Operations Engineer: 25 days holiday (rising with service), plus bank holidays Annual discretionary bonus Enhanced pension scheme Flexible working and flexi-time options Healthcare cash plan Electric vehicle salary sacrifice scheme Discounts scheme Wellbeing app Enhanced maternity and paternity leave Life assurance (4x salary) Discounts on insurance Cycle to Work scheme Employee referral scheme If you are interested in this position please click 'apply'. Hunter Selection Limited is a recruitment consultancy with offices UK-wide, specialising in permanent & contract roles within Engineering & Manufacturing, IT & Digital, Science & Technology and Service & Sales sectors. 
Please note that, as we receive a high volume of applications, we can only respond to applicants whose skills & qualifications are suitable for this position. No terminology in this advert is intended to discriminate against any of the protected characteristics that fall under the Equality Act 2010. For the purposes of the Conduct Regulations 2003, when advertising permanent vacancies we are acting as an Employment Agency, and when advertising temporary/contract vacancies we are acting as an Employment Business.
04/02/2026
Full time
Tenth Revolution Group
Senior AWS Data Engineer - London - £125,000
Tenth Revolution Group
Senior AWS Data Engineer - London - £125,000 Please note - this role will require you to work from the London-based office. You must have the unrestricted right to work in the UK to be eligible for this role. This organisation is not able to offer sponsorship. An exciting opportunity to join a greenfield initiative focused on transforming how market data is accessed and utilised. As a Senior AWS Data Engineer, you'll play a key role in designing and building a cutting-edge data platform using technologies like Databricks, Snowflake, and AWS Glue. Key Responsibilities: Build and maintain scalable data pipelines, warehouses, and lakes. Design secure, high-performance data architectures. Develop processing and analysis algorithms for complex data sets. Collaborate with data scientists to deploy machine learning models. Contribute to strategy, planning, and continuous improvement. Required Experience: Hands-on experience with AWS data tools: Glue, PySpark, Athena, Iceberg, Lake Formation. Strong Python and SQL skills for data processing and analysis. Deep understanding of data governance, quality, and security. Knowledge of market data and its business applications. Desirable Experience: Experience with Databricks and Snowflake. Familiarity with machine learning and data science concepts. Strategic thinking and ability to influence cross-functional teams. This role offers the chance to work across multiple business areas, solve complex data challenges, and contribute to long-term strategic goals. You'll be empowered to lead, collaborate, and innovate in a dynamic environment. To apply for this role, please submit your CV or contact David Airey. Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK, offering more opportunities across the country than any other recruitment agency. We're the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.
03/10/2025
Full time
© 2008-2026 IT Job Board