Sorry, that job is no longer available. Here are some results that may be similar to the job you were looking for.

40 jobs found

Current search: data engineer databricks and aws
Databricks Engineer - £400PD - Hybrid - SC Clearance required
Tenth Revolution Group, Newcastle Upon Tyne, Tyne and Wear
Databricks Engineer - £400PD - Hybrid

We are seeking a skilled Databricks Data Engineer to design, build, and optimize scalable data pipelines and analytics platforms. In this role, you will work closely with data scientists, analysts, and product teams to deliver reliable, high-performance data solutions using Databricks and modern cloud technologies.

Key Responsibilities
• Design, develop, and maintain scalable data pipelines using Databricks, Apache Spark, and Delta Lake
• Build and optimize ETL/ELT workflows for batch and streaming data
• Develop data models to support analytics, reporting, and machine learning use cases
• Ensure data quality, reliability, and performance across data platforms
• Integrate data from multiple sources including APIs, databases, and cloud storage
• Collaborate with data scientists and analysts to enable advanced analytics and ML workloads
• Implement monitoring, logging, and cost-optimization best practices
• Follow data governance, security, and compliance standards
• Document data pipelines, architectures, and best practices

Required Qualifications
• 3+ years of experience in data engineering or a similar role
• Strong experience with Databricks and Apache Spark
• Proficiency in Python and SQL
• Experience with Delta Lake, data modelling, and performance tuning
• Hands-on experience with cloud platforms (AWS, Azure, or GCP)
• Familiarity with data orchestration tools (e.g. Airflow, Azure Data Factory)
• Solid understanding of data warehousing and big data concepts

Preferred Qualifications
• Experience with real-time/streaming data (Spark Structured Streaming, Kafka, etc.)
• Knowledge of CI/CD for data pipelines
• Experience supporting machine learning pipelines
• Databricks or cloud platform certifications

To apply for this role, please submit your CV or contact Dillon Blackburn (see below). Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK, offering more opportunities across the country than any other recruitment agency. We're the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.
08/01/2026
Contractor
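For context on the kind of pipeline work this advert describes, below is a minimal sketch of a batch job on Databricks using PySpark and Delta Lake. The source path, column names, and target table are illustrative assumptions, not details from the role.

```python
from pyspark.sql import SparkSession, functions as F

# On Databricks a SparkSession is provided as `spark`; getOrCreate() also
# works locally if the delta-spark package is configured.
spark = SparkSession.builder.appName("orders_daily_etl").getOrCreate()

# Illustrative raw source: an assumption for the example.
raw = spark.read.json("s3://example-bucket/raw/orders/2026-01-08/")

cleaned = (
    raw.dropDuplicates(["order_id"])                       # basic quality gate
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount") > 0)                        # reject invalid rows
)

# Append into a Delta table partitioned by date for downstream analytics.
(cleaned.write.format("delta")
        .mode("append")
        .partitionBy("order_date")
        .saveAsTable("analytics.orders_cleaned"))
```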
Data Engineer Manager
Youngs Employment Services
Data Engineer Manager
Hybrid - London with 2/3 days WFH
Circa £85,000 - £95,000 + Attractive Bonus & Benefits

A hands-on Data Engineer Manager is required for this exciting, newly created position with a prestigious and rapidly expanding business in West London. It would suit someone with formal management experience, or potentially a Lead/Senior Engineer looking to take on more managerial responsibility.

The Data Engineer Manager will play a pivotal role at the heart of our client's data & analytics operation. Having implemented a new MS Fabric-based data platform, the need now is to scale up and meet the demand to deliver data-driven insights and strategies right across the business globally. There'll be a hands-on element to the role as you'll be troubleshooting, reviewing code, steering the team through deployments and acting as the escalation point for data engineering. Our client can offer an excellent career development opportunity and a vibrant, creative and collaborative work environment. This is a hybrid role based in Central/West London with the flexibility to work from home 2 or 3 days per week.

Key responsibilities include:
• Define and take ownership of the roadmap for the ongoing development and enhancement of the data platform.
• Design, implement, and oversee scalable data pipelines and ETL/ELT processes within MS Fabric, leveraging expertise in Azure Data Factory, Databricks, and other Azure services.
• Advocate for engineering best practices and ensure long-term sustainability of systems.
• Integrate principles of data quality, observability, and governance throughout all processes.
• Participate in recruiting, mentoring, and developing a high-performing data organization.
• Demonstrate pragmatic leadership by aligning multiple product workstreams to achieve a unified, robust, and trustworthy data platform that supports production services such as dashboards, new product launches, analytics, and data science initiatives.
• Develop and maintain comprehensive data models, data lakes, and data warehouses (e.g. utilizing Azure Synapse).
• Collaborate with data analysts, analytics engineers, and various stakeholders to fulfil business requirements.

Key experience, skills and knowledge:
• Experience leading data or platform teams in a production environment as a Senior Data Engineer, Tech Lead, Data Engineering Manager, etc.
• Proven success with modern data infrastructure: distributed systems, batch and streaming pipelines.
• Hands-on knowledge of tools such as Apache Spark, Kafka, Databricks, dbt or similar.
• Experience building, defining, and owning data models, data lakes, and data warehouses.
• Programming proficiency in the likes of Python, PySpark, SQL, Scala or Java.
• Experience operating in a cloud-native environment such as Azure, AWS, GCP, etc. (Fabric experience would be beneficial but is not essential).
• Excellent stakeholder management and communication skills.
• A strategic mindset, with a practical approach to delivery and prioritisation.
• Exposure to data science concepts and techniques is highly desirable.
• Strong problem-solving skills and attention to detail.

Salary is dependent on experience and expected to be in the region of £85,000 - £95,000 plus an attractive bonus scheme and benefits package. For further information, please send your CV to Wayne Young at Young's Employment Services Ltd.
YES are operating as both a Recruitment Agency and Recruitment Business.
07/01/2026
Full time
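The spec above asks for batch and streaming pipeline experience with tools like Spark and Kafka. As a hedged illustration of the streaming side, here is a minimal Spark Structured Streaming job that reads JSON events from Kafka into a Delta table; the broker address, topic, schema, and table name are all assumptions.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

spark = SparkSession.builder.appName("events_stream").getOrCreate()

# Assumed payload schema for the Kafka topic.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("user_id", StringType()),
    StructField("amount", DoubleType()),
])

events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
         .option("subscribe", "events")                      # placeholder topic
         .load()
         .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
         .select("e.*")
)

# Checkpointing makes the stream restartable; Delta gives exactly-once writes.
query = (events.writeStream.format("delta")
               .option("checkpointLocation", "/tmp/checkpoints/events")
               .toTable("analytics.events"))
query.awaitTermination()
```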
Boston Consulting Group
AI Software Engineer/Platform Architect - BCG X
Locations: Stockholm, Copenhagen V, Berlin, München, London

Who We Are
Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation: inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures, and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive.

We Are BCG X
We're a diverse team of more than 3,000 tech experts united by a drive to make a difference. Working across industries and disciplines, we combine our experience and expertise to tackle the biggest challenges faced by society today. We go beyond what was once thought possible, creating new and innovative solutions to the world's most complex problems. Leveraging BCG's global network and partnerships with leading organizations, BCG X provides a stable ecosystem for talent to build game-changing businesses, products, and services from the ground up, all while growing their career. Together, we strive to create solutions that will positively impact the lives of millions.

What You'll Do
Our BCG X teams own the full analytics value chain end to end: framing new business challenges, designing innovative algorithms, implementing and deploying scalable solutions, and enabling colleagues and clients to fully embrace AI. Our product offerings span from fully custom builds to industry-specific, leading-edge AI software solutions. As a (Senior) AI Software Engineer you'll be part of our rapidly growing engineering team and help to build the next generation of AI solutions. You'll have the chance to partner with clients in a variety of BCG regions and industries, and on key topics like climate change, enabling them to design, build, and deploy new and innovative solutions. Additional responsibilities will include developing and delivering thought leadership in scientific communities and papers as well as leading conferences on behalf of BCG X. We are looking for talented individuals with a passion for software development, large-scale data analytics and transforming organizations into AI-led innovative companies.

Successful candidates possess the following:
• 4+ years of experience in a technology consulting environment
• Apply software development practices and standards to develop robust and maintainable software
• Actively involved in every part of the software development life cycle
• Experienced at guiding non-technical teams and consultants in best practices for robust software development
• Optimize and enhance computational efficiency of algorithms and software design
• Motivated by a fast-paced, service-oriented environment and interacting directly with clients on new features for future product releases
• Enjoy collaborating in teams to share software design and solution ideas
• A natural problem-solver and intellectually curious across a breadth of industries and topics
• Master's degree or PhD in a relevant field of study; please provide all academic certificates showing the final grades (A-level, Bachelor, Master, PhD)

Additional tasks: designing and building data & AI platforms for our clients. Such platforms provide data and (Gen)AI capabilities to a wide variety of consumers and use cases across the client organization, and are often part of large (AI) transformational journeys BCG does for its clients. The work often involves the following engineering disciplines:
• Cloud Engineering
• Data Engineering (not building pipelines, but designing and building the framework)
• DevOps
• MLOps/LLMOps

We often work with the following technologies:
• Azure, AWS, GCP
• Airflow, dbt, Databricks, Snowflake, etc.
• GitHub, Azure DevOps and related developer tooling and CI/CD platforms; Terraform or other Infra-as-Code
• MLflow, AzureML or similar for MLOps; LangSmith, Langfuse and similar for LLMOps

The difference to our "AI Engineer" role is: do you "use/consume" these technologies, or are you the one that "provides" them to the rest of the organization?

What You'll Bring
Technologies and programming languages: Python. Experience with additional programming languages is a plus.

Additional info
BCG offers a comprehensive benefits program, including medical, dental and vision coverage, telemedicine services, life, accident and disability insurance, parental leave and family planning benefits, caregiving resources, mental health offerings, a generous retirement program, financial guidance, paid time off, and more. Boston Consulting Group is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity/expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws. BCG is an E-Verify Employer.
05/01/2026
Full time
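The ad lists MLflow among its MLOps technologies. As a generic, hedged sketch of the experiment-tracking pattern (not BCG's setup), here is a minimal MLflow run that logs parameters, a metric, and a model artifact; the experiment name and model choice are assumptions.

```python
import mlflow
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Illustrative experiment name; in practice this points at a tracking server.
mlflow.set_experiment("demo-platform-experiment")

X, y = make_classification(n_samples=1_000, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

with mlflow.start_run():
    params = {"n_estimators": 200, "max_depth": 8}
    model = RandomForestClassifier(**params, random_state=0).fit(X_train, y_train)

    mlflow.log_params(params)                 # hyperparameters for the run
    mlflow.log_metric("accuracy", accuracy_score(y_test, model.predict(X_test)))
    mlflow.sklearn.log_model(model, "model")  # versioned artifact for serving
```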
Huxley Associates
Python Data Engineer - Hedge Fund
Python Data Engineer - Multi-Strategy Hedge Fund
Location: London (hybrid, 2 days per week on-site)
Type: Full-time

About the Role
A leading multi-strategy hedge fund is seeking a highly skilled Python Data Engineer to join its technology and data team. This is a hands-on role focused on building and optimising data infrastructure that powers quantitative research, trading strategies, and risk management.

Key Responsibilities
• Develop and maintain scalable Python-based ETL pipelines for ingesting and transforming market data from multiple sources.
• Design and manage cloud-based data lake solutions (AWS, Databricks) for large volumes of structured and unstructured data.
• Implement rigorous data quality, validation, and cleansing routines to ensure accuracy of financial time-series data.
• Optimise workflows for low latency and high throughput, critical for trading and research.
• Collaborate with portfolio managers, quantitative researchers, and traders to deliver tailored data solutions for modelling and strategy development.
• Contribute to the design and implementation of the firm's security master database.
• Analyse datasets to extract actionable insights for trading and risk management.
• Document system architecture, data flows, and technical processes for transparency and reproducibility.

Requirements
• Strong proficiency in Python (pandas, NumPy, PySpark) and ETL development.
• Hands-on experience with AWS services (S3, Glue, Lambda) and Databricks.
• Solid understanding of financial market data, particularly time series.
• Knowledge of data quality frameworks and performance optimisation techniques.
• Degree in Computer Science, Engineering, or a related field.

Preferred Skills
• SQL and relational database design experience.
• Exposure to quantitative finance or trading environments.
• Familiarity with containerisation and orchestration (Docker, Kubernetes).

What We Offer
• Competitive compensation and performance-based bonus.
• Hybrid working model: 2 days per week on-site in London.
• Opportunity to work on mission-critical data systems for a global hedge fund.
• Collaborative, high-performance culture with direct exposure to front-office teams.

To avoid disappointment, apply now! To find out more about Huxley, please visit (url removed). Huxley, a trading division of SThree Partnership LLP, is acting as an Employment Business in relation to this vacancy. Registered office: 8 Bishopsgate, London, EC2N 4BQ, United Kingdom. Partnership Number OC(phone number removed) England and Wales.
03/01/2026
Full time
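As an illustration of the time-series quality checks this role centres on, here is a small pandas sketch that validates a daily price series. The column names and thresholds are assumptions for the example, not the fund's actual rules.

```python
import numpy as np
import pandas as pd

def validate_price_series(prices: pd.DataFrame) -> pd.DataFrame:
    """Basic quality gates for a daily close-price series.

    Expects columns 'date' and 'close'; both names are illustrative.
    """
    df = prices.sort_values("date").drop_duplicates("date").copy()

    # Reindex to business days so missing sessions become explicit NaNs.
    df = df.set_index(pd.DatetimeIndex(df["date"])).asfreq("B")

    # Flag non-positive prices and implausible one-day moves (>25% here).
    df["bad_price"] = df["close"] <= 0
    df["suspect_jump"] = df["close"].pct_change().abs() > 0.25

    # Fill short gaps only; longer gaps are left for manual review.
    df["close"] = df["close"].ffill(limit=2)
    return df

# Example usage with synthetic data.
ts = pd.DataFrame({
    "date": pd.date_range("2026-01-01", periods=10, freq="B"),
    "close": np.linspace(100, 105, 10),
})
print(validate_price_series(ts)[["close", "bad_price", "suspect_jump"]].head())
```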
Nine Twenty
Data Engineer
Data Engineer

An established technology consultancy is looking to hire an experienced Data Engineer to work on large-scale, customer-facing data projects while also contributing to the development of internal data services. This role blends hands-on engineering with architecture design and technical advisory work, offering exposure to enterprise clients and modern cloud platforms. You will play a key role in designing and delivering cloud-native data platforms, working closely with engineering teams, stakeholders, and customers from initial design through to production release. The role offers variety, autonomy, and the opportunity to work with leading-edge data technologies across Azure and AWS.

The role
As a Data Engineer, you will be responsible for designing, building, and maintaining scalable data platforms and pipelines. You will support and lead technical workshops, contribute to architecture decisions, and act as a trusted technical partner on complex data initiatives. Key responsibilities include:
• Designing and building scalable data platforms and ETL/ELT pipelines in Azure and AWS
• Implementing serverless, batch, and streaming data architectures
• Working hands-on with Spark, Python, Databricks, and SQL-based analytics platforms
• Designing lakehouse-style architectures and analytical data models
• Feeding behavioural and analytical data back into production systems
• Supporting architecture reviews, design sessions, and technical workshops
• Collaborating with engineering, analytics, and commercial teams
• Advising customers throughout the full project lifecycle
• Contributing to internal data services, standards, and best practices

What we are looking for

Essential experience:
• Proven experience as a Data Engineer working with large-scale data platforms
• Strong hands-on experience in either Azure or AWS, with working knowledge of the other
• Azure experience with lakehouse concepts, Data Factory, Synapse and/or Fabric
• AWS experience with Redshift, Lambda, and SQL-based analytics services
• Strong Python skills and experience using Apache Spark
• Hands-on experience with Databricks
• Experience designing and maintaining ETL/ELT pipelines
• Solid understanding of data modelling techniques
• Experience working in cross-functional teams on cloud-based data platforms
• Ability to work with SDKs and APIs across cloud services
• Strong communication skills and a customer-focused approach

Desirable experience:
• Data migrations and platform modernisation projects
• Implementing machine learning models using Python
• Consulting or customer-facing engineering roles
• Feeding analytics insights back into operational systems

Certifications (beneficial but not required):
• AWS Solutions Architect Associate
• Azure Solutions Architect Associate
• AWS Data Engineer Associate
• Azure Data Engineer Associate

What's on offer
• The opportunity to work on modern cloud and data projects using leading technologies
• A collaborative engineering culture with highly skilled colleagues
• Structured learning paths and access to training and certifications
• Certification exam fees covered and certification-related bonuses
• Competitive salary and comprehensive benefits package
• A supportive and inclusive working environment with regular knowledge sharing and team events

This role would suit a Data Engineer who enjoys combining deep technical work with customer interaction and wants to continue developing their expertise across cloud and data platforms. If you would like to find out more, then please get in contact with Jack at (url removed).
03/01/2026
Full time
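To make the "serverless" architectures this spec mentions concrete, here is a hedged sketch of an AWS Lambda handler reacting to S3 object-created events; the bucket layout, file format, and what happens downstream are assumptions for illustration only.

```python
import csv
import io
import json

import boto3

s3 = boto3.client("s3")

def handler(event, context):
    """Triggered by S3 ObjectCreated notifications (standard event shape)."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        rows = list(csv.DictReader(io.StringIO(body.decode("utf-8"))))

        # In a real pipeline this might load into Redshift or a lakehouse
        # table; here we just log a summary for illustration.
        print(json.dumps({"object": f"s3://{bucket}/{key}", "rows": len(rows)}))
```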
AI/MLOps Platform Engineer
Barclays Bank Plc City, Glasgow
Join us in shaping the future of AI at Barclays. We're launching an exciting new initiative at Barclays to design, build, and scale next-generation platform components that empower developers, including Quants and Strats, to create high-performance, AI-driven applications. As an AI/MLOps Platform Engineer, you'll play a pivotal role in this transformation, working hands-on to develop the infrastructure and tooling that supports the full lifecycle of machine learning and generative AI workloads. This is more than an engineering role: it's an opportunity to influence technical direction, collaborate across diverse teams, and help define how AI and GenAI are delivered at scale.

To be successful as an AI/MLOps Platform Engineer at this level, you should have:
• Strong Python engineering skills, especially in backend systems and infrastructure.
• Deep AWS expertise, including services like SageMaker, Lambda, ECS, Step Functions, S3, IAM, KMS, CloudFormation, and Bedrock.
• Proven experience building and scaling MLOps platforms and supporting GenAI workloads in production.
• A strong understanding of secure software development, cloud cost optimization, and platform observability.
• The ability to communicate complex technical concepts clearly to both technical and non-technical audiences.
• Demonstrated leadership in setting technical direction while remaining hands-on.

Some other highly valued skills may include:
• Experience with MLOps platforms such as Databricks or SageMaker, and familiarity with hybrid cloud strategies (Azure, on-prem Kubernetes).
• A strong understanding of AI infrastructure for scalable model serving, distributed training, and GPU orchestration.
• Expertise in Large Language Models (LLMs) and Small Language Models (SLMs), including fine-tuning and deployment for enterprise use cases.
• Hands-on experience with Hugging Face libraries and tools for model training, evaluation, and deployment.
• Knowledge of agentic frameworks (e.g. LangChain, AutoGen) and Model Context Protocol (MCP) for building autonomous AI workflows and interoperability.
• Awareness of emerging trends in GenAI platforms, open-source MLOps, and cloud-native AI solutions.

You may be assessed on the key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role can be based out of our Glasgow or Canary Wharf office.

Purpose of the role
To design, develop and improve software, utilising various engineering methodologies, that provides business, platform, and technology capabilities for our customers and colleagues.

Accountabilities
• Development and delivery of high-quality software solutions using industry-aligned programming languages, frameworks, and tools, ensuring that code is scalable, maintainable, and optimized for performance.
• Cross-functional collaboration with product managers, designers, and other engineers to define software requirements, devise solution strategies, and ensure seamless integration and alignment with business objectives.
• Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing.
• Staying informed of industry technology trends and innovations, and actively contributing to the organization's technology communities to foster a culture of technical excellence and growth.
• Adherence to secure coding practices to mitigate vulnerabilities, protect sensitive data, and ensure secure software solutions.
• Implementation of effective unit testing practices to ensure proper code design, readability, and reliability.

Assistant Vice President expectations
To advise and influence decision-making, contribute to policy development and take responsibility for operational effectiveness, collaborating closely with other functions and business divisions. Lead a team performing complex tasks, using well-developed professional knowledge and skills to deliver work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, appraise performance relative to objectives, and determine reward outcomes. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others. An individual contributor will instead lead collaborative assignments and guide team members through structured assignments, identify the need for the inclusion of other areas of specialisation to complete assignments, and identify new directions for assignments and/or projects, combining cross-functional methodologies or practices to meet required outcomes. They will consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues; identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda; and take ownership for managing risk and strengthening controls in relation to the work done. They will perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function, and collaborate with other areas of work and business-aligned support areas to keep up to speed with business activity and the business strategy. They will engage in complex analysis of data from multiple internal and external sources (such as procedures and practices in other areas, teams, and companies) to solve problems creatively and effectively, and communicate complex information; 'complex' information could include sensitive information or information that is difficult to communicate because of its content or its audience. They will influence or convince stakeholders to achieve outcomes.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship - our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset - to Empower, Challenge and Drive - the operating manual for how we behave.
02/01/2026
Full time
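As a small illustration of the AWS GenAI tooling named above, here is a hedged sketch of calling a foundation model through Amazon Bedrock with boto3. The region and model ID are placeholders; available models and quotas differ per account.

```python
import boto3

# "bedrock-runtime" is the boto3 service for model invocation (distinct from
# the "bedrock" control-plane client). Region is an illustrative assumption.
client = boto3.client("bedrock-runtime", region_name="eu-west-2")

response = client.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # placeholder model ID
    messages=[{
        "role": "user",
        "content": [{"text": "Summarise why checkpointing matters in streaming ETL."}],
    }],
    inferenceConfig={"maxTokens": 256, "temperature": 0.2},
)

# The Converse API returns the reply under output.message.content.
print(response["output"]["message"]["content"][0]["text"])
```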
Data Science Engineer
DGH Recruitment Ltd City, London
Data Science Engineer

My client is recruiting for a Data Science Engineer to design, develop, and deliver AI and analytics solutions aligned with the organisation's Data & AI strategy.

Key Responsibilities
• End-to-end development of AI/ML solutions.
• MLOps practices: CI/CD, model monitoring, retraining.
• Use of open-source and enterprise tools (LangChain, Azure OpenAI, Databricks).
• Generative AI features: embeddings, RAG, AI agents.
• Clean, testable code with modern engineering practices.
• Alignment with enterprise architecture and governance.
• Collaboration with architects and stakeholders.
• Lifecycle management of models.
• Piloting emerging technologies.

Experience & Skills
• 2-4 years in production-level AI/ML delivery.
• Legal/professional services experience is a plus.
• AI/ML frameworks: PyTorch, TensorFlow, LangChain.
• Cloud: Azure (preferred), AWS, GCP.
• MLOps: CI/CD, model lifecycle, monitoring.
• Generative AI: LLMs, RAG, chat agents.
• Data engineering alignment: ETL, governance.
• Strong coding, communication, and collaboration skills.
• Strategic thinking, problem-solving, and stakeholder engagement.

In accordance with the Employment Agencies and Employment Businesses Regulations 2003, this position is advertised based upon DGH Recruitment Limited having first sought approval of its client to find candidates for this position. DGH Recruitment Limited acts as both an Employment Agency and Employment Business.
31/12/2025
Full time
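For readers unfamiliar with the retrieval-augmented generation (RAG) pattern this role mentions, here is a library-agnostic sketch of the retrieval step using cosine similarity over embeddings. The embedding function is a stub and every name here is an illustrative assumption, not the client's stack.

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    """Stub: in practice this calls an embedding model (e.g. an Azure OpenAI
    embeddings deployment). Random unit vectors stand in for the demo."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.standard_normal(384)
    return v / np.linalg.norm(v)

documents = [
    "Databricks jobs can be scheduled with workflows.",
    "RAG grounds LLM answers in retrieved passages.",
    "Model monitoring tracks drift after deployment.",
]
doc_vectors = np.stack([embed(d) for d in documents])

def retrieve(query: str, k: int = 2) -> list[str]:
    # Cosine similarity reduces to a dot product on unit vectors.
    scores = doc_vectors @ embed(query)
    return [documents[i] for i in np.argsort(scores)[::-1][:k]]

# Retrieved passages are pasted into the prompt so the chat model answers
# from grounded context rather than from memory alone.
context = retrieve("What is RAG?")
prompt = "Answer using only this context:\n" + "\n".join(context) + "\nQ: What is RAG?"
print(prompt)
```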
Data Technical Lead
Deerfoot Recruitment Solutions Limited City, London
Data Engineering Technical Lead
Global Investment Bank
London - Hybrid
Permanent - Excellent Package + Benefits

We are working with one of the world's leading banking groups, with whom we have partnered for 15 years. We are seeking an experienced Data Architect / EDM Developer / Data Engineering Lead to join their International Technology team in London. You will be a key part of the Architecture, Middleware, Data & Enterprise Services (AMD) division, driving data engineering, integration and automation initiatives across our client's EMEA banking and securities entities. This is a hands-on leadership role, combining technical expertise with mentoring and team leadership.

Key Responsibilities
• Architect, design and deliver enterprise-wide EDM and data solutions.
• Lead and mentor EDM developers, ensuring high-quality, cost-effective delivery.
• Drive data innovation, automation and best practices across EMEA.
• Translate business requirements into functional and technical designs.
• Ensure compliance with SDLC, governance, and risk policies.

Skills & Experience - Essential
• Strong SQL Server or Snowflake skills.
• Advanced knowledge of low-code/no-code data engineering/ETL tools, ideally Markit EDM (v19.2+) or similar (e.g. Informatica).
• Proven delivery experience in the Financial Services/Banking sector.
• Deep understanding of SDLC, systems integration, and data warehousing.
• Ability to gather requirements and liaise effectively with business stakeholders.

Desirable Skills
• Cloud (AWS/Azure), Python, PowerShell, APIs.
• Data pipelines, lineage, automation.
• BI tools (Power BI, Tableau, SSRS).
• Modern data architectures (lakehouse, data mesh).
• CI/CD, GitHub, Control-M, dbt/Databricks.

This is an opportunity to join a global top-5 bank with long-term stability, world-class resources, and clear career progression routes. Related titles: Enterprise Data Architect, EDM Developer, Data Engineering Lead, Data Architect, ETL Developer, Data Solutions Architect, Senior Data Engineer (Financial Services). Apply today for full details.

Deerfoot Recruitment Solutions Ltd is a leading independent tech recruitment consultancy in the UK. For every CV sent to clients, we donate £1 to The Born Free Foundation. We are a Climate Action Workforce in partnership with Ecologi. If this role isn't right for you, explore our referral reward program with payouts at interview and placement milestones; visit our website for details. Deerfoot Recruitment Solutions Ltd is acting as an Employment Agency in relation to this vacancy.
30/12/2025
Full time
Software Development Manager
Hays Technology City, London
As a Software Development Manager, you will lead the Global Content Delivery team, ideally with previous experience of working in Financial Services. This is a "player/coach" role, where you will both lead and contribute hands-on to the overall success of the team. The Software Development Manager will be responsible for transforming and optimising the delivery of software applications and data solutions, leading a highly motivated team to build innovative, scalable, and resilient software and data solutions that empower the organisation. You will manage partnerships with business leaders, stakeholders, and IT to develop and promote content delivery data solutions, and translate stakeholder needs into technical solutions. As a player/coach, you will shape the product, architecture, software design and engineering, with a focus on intuitive front-end creation, robust backend frameworks and strong data design, and will lead a globally dispersed team.
• Experience with AWS, Databricks, dbt, PySpark, React, JavaScript, Terraform.
• Experience with AI/ML/GenAI/LLMs.

What you need to do now
If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now. If this job isn't quite right for you, but you are looking for a new position, please contact us for a confidential discussion about your career. Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers. By applying for this job you accept the T&Cs, Privacy Policy and Disclaimers which can be found at (url removed).
29/12/2025
Full time
Cathcart Technology
Lead Data Engineer
Cathcart Technology Edinburgh, Midlothian
I'm working with a world-class technology company in Edinburgh to help them find a Lead Data Engineer to join their team (hybrid working, but there is flex on this for the right person). This is your chance to take the technical lead on complex, large-scale data projects that power real-world products used by millions of people. The organisation has been steadily growing for a number of years and has become a market leader in its field, so it's genuinely a really exciting time to join! You'll be joining a forward-thinking team that's passionate about doing things properly, with a modern tech stack, a cloud-first approach, and a genuine commitment to engineering excellence. As Lead Data Engineer, you'll be hands-on in designing and building scalable data platforms and pipelines that enable advanced analytics, machine learning, and business-critical insights. You'll shape the technical vision, set best practices, and make key architectural decisions that define how data flows across the organisation. You won't be working in isolation either, as collaboration is at the heart of this role. You'll work closely with engineers, product managers, and data scientists to turn ideas into high-performing, production-ready systems. You'll also play a big part in mentoring others, driving standards across the team, and influencing the overall data strategy. The ideal person for this role will have a strong background in data engineering, with experience building modern data solutions using technologies like Kafka, Spark, Databricks, dbt, and Airflow. You'll know your way around cloud platforms (AWS, GCP, or Azure) and be confident coding in Python, Java, or Scala. Most importantly, you'll understand what it takes to design data systems that are scalable, reliable and built for the long haul. In return, they are offering a competitive salary (happy to discuss prior to application) and great benefits, which include uncapped holidays and multiple bonuses! Their office in central Edinburgh is only a short walk from Haymarket train station. The role is hybrid (ideally 1 or 2 days in office); however, they can be flexible on this for the right candidate. If you're ready to step into a role where your technical leadership will have a visible impact and where you can build data systems that continue to scale, then please apply or contact Matthew MacAlpine at Cathcart Technology. Cathcart Technology is acting as an Employment Agency in relation to this vacancy.
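For illustration only, here is a minimal sketch of the kind of pipeline the Kafka/Spark/Databricks stack above implies: reading a Kafka topic with Spark Structured Streaming and landing it in Delta Lake. The broker address, topic name, schema, and paths are hypothetical placeholders, and the snippet assumes the Kafka and Delta connector packages are available to the Spark session.

```python
# Hypothetical sketch: Kafka -> Spark Structured Streaming -> Delta Lake.
# Broker, topic, schema, and paths are placeholders, not a real deployment.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("events-stream").getOrCreate()

schema = StructType([
    StructField("user_id", StringType()),
    StructField("action", StringType()),
    StructField("ts", TimestampType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
       .option("subscribe", "events")                      # placeholder topic
       .load())

# Kafka delivers bytes; cast the value and parse the JSON payload.
events = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(from_json(col("json"), schema).alias("e"))
          .select("e.*"))

# Stream into a Delta table with checkpointing for fault tolerance.
(events.writeStream
 .format("delta")
 .option("checkpointLocation", "/tmp/checkpoints/events")  # placeholder path
 .start("/tmp/delta/events"))                              # placeholder path
```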
24/12/2025
Full time
The Portfolio Group
AI Platform Engineer
The Portfolio Group City, London
AI Platform Engineer London Excellent Salary + Benefits Join an award-winning, internationally recognised B2B consultancy as an AI Platform Engineer, owning the cloud-native platform that underpins conversational AI and generative AI products at scale. Sitting at the core of AI delivery, you will design, build, and operate the runtime, infrastructure, and operational layers supporting RAG pipelines, LLM orchestration, vector search, and evaluation workflows across AWS and Databricks. Working closely with senior AI engineers and product teams, you'll ensure AI systems are scalable, observable, secure, and cost-efficient, turning experimental AI into reliable, production-grade capabilities. Further responsibilities are detailed below: Own and evolve the AI platform powering conversational assistants and generative AI products. Build, operate, and optimise RAG and LLM-backed services, improving latency, reliability, and cost. Design and run cloud-native AI services across AWS and Databricks, including ingestion and embedding pipelines. Scale and operate vector search infrastructure (Weaviate, OpenSearch, Algolia, AWS Bedrock Knowledge Bases). Implement strong observability, CI/CD, security, and governance across AI workloads. Enable future architectures such as multi-model orchestration and agentic workflows. Required Skills & Experience Strong experience designing and operating cloud-native platforms on AWS (Lambda, API Gateway, DynamoDB, S3, CloudWatch). Hands-on experience with Databricks and large-scale data or embedding pipelines. Proven experience building and operating production AI systems, including RAG pipelines, LLM-backed services, and vector search (Weaviate, OpenSearch, Algolia). Proficiency in Python, with experience deploying containerised services on Kubernetes using Terraform. Solid understanding of distributed systems, cloud architecture, and API design, with a focus on scalability and reliability. Demonstrable ownership of observability, performance, cost efficiency, and operational robustness in production environments. Why Join? You'll own the foundational AI platform behind a growing suite of generative AI products, working with senior AI leaders on systems used by real customers at scale. This role offers deep technical ownership, long-term impact, and an excellent compensation package within a market-leading organisation. INDAM
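As a vendor-neutral illustration of the RAG request path described above, the sketch below wires retrieval into generation. The three helpers are hypothetical stand-ins for a real embedding model, a vector store such as Weaviate or OpenSearch, and an LLM endpoint; none of them names an actual API.

```python
# Hypothetical RAG request path. embed(), vector_search() and generate()
# are stand-ins for real embedding, vector-store, and LLM calls.
from typing import List

def embed(text: str) -> List[float]:
    """Stand-in: call an embedding model and return a vector."""
    raise NotImplementedError

def vector_search(query_vec: List[float], k: int = 5) -> List[str]:
    """Stand-in: k-nearest-neighbour lookup in a vector store."""
    raise NotImplementedError

def generate(prompt: str) -> str:
    """Stand-in: call an LLM endpoint with the grounded prompt."""
    raise NotImplementedError

def answer(question: str) -> str:
    # Embed the question, retrieve supporting chunks, then ground
    # the generation strictly in the retrieved context.
    context = "\n".join(vector_search(embed(question)))
    prompt = (f"Answer using only this context:\n{context}\n\n"
              f"Q: {question}\nA:")
    return generate(prompt)
```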
23/12/2025
Full time
Datatech
Data Engineer
Datatech
Data Engineer, Remote Modern Cloud Data Stack £45,000 PA DOE This is a high-visibility opportunity in an ambitious, values-led organisation refreshing its data strategy and modernising its intelligence platform. You'll be trusted early, work closely with stakeholders, and build the foundations that drive better insight, smarter decisions, and meaningful impact, using data for good. It's ideal for someone early in their journey with 2+ years' experience, ready to step up. You'll join a supportive, encouraging environment with real runway to grow technically and start developing leadership skills as your ownership and influence increase across the business. What you'll do Help shape and deliver a refreshed data strategy and modern intelligence platform Build reliable, scalable ELT/ETL pipelines into a cloud data warehouse (Snowflake, Databricks, or similar) Build and optimise core data models and transformations: dimensional, analytics-ready, built to last Create trusted data products that enable self-service analytics across the organisation Improve data quality, monitoring, performance, and cost efficiency Partner with analysts, BI, and non-technical teams to turn questions into robust data assets Contribute to standards, best practice, and reusable engineering frameworks Support responsible AI tooling, including programmatic LLM workflows where relevant What you'll bring 2+ years' experience in data engineering within a modern stack Strong SQL and a solid modelling foundation Python (preferred) or similar for pipeline development and automation Cloud experience with AWS, Azure, or GCP Familiarity with orchestration and analytics engineering tools (dbt, Airflow, or similar) Strong habits around governance, security, documentation, version control (Git), and CI/CD The kind of person who thrives here Confident, curious, and motivated. You care about doing things properly, you enjoy being visible and trusted in the business, and you're passionate about using data to create positive outcomes. Excited? APPLY NOW No Sponsorship - Post Grad Visa
18/12/2025
Full time
Stott and May
Principal Data Engineer
Stott and May
Principal Data Engineer - Hybrid (London/Winchester) We're seeking a hands-on Principal Data Engineer to design and deliver enterprise-scale, cloud-native data platforms that power analytics, reporting, and real-time decision-making. This is a strategic technical leadership role where you'll shape architecture, mentor engineers, and deliver end-to-end solutions across a modern AWS/Databricks stack. What you'll do Lead the design of scalable, secure data architectures on AWS. Build and optimise ETL/ELT pipelines for batch and streaming data. Deploy and manage Apache Spark jobs on Databricks and Delta Lake. Write production-grade Python and SQL for large-scale data transformations. Drive data quality, governance, and automation through CI/CD and IaC. Collaborate with data scientists, analysts, and business stakeholders. Mentor and guide data engineering teams. What we're looking for Proven experience in senior/principal data engineering roles. Expertise in AWS, Databricks, Apache Spark, Python, and SQL. Strong background in cloud-native data platforms, real-time processing, and data lakes. Hands-on experience with tools such as Airflow, Kafka, Docker, GitLab CI/CD. Excellent stakeholder engagement and leadership skills. What's on offer £84,000 salary + 10% bonus 6% pension contribution Private medical & flexible benefits package 25 days annual leave (plus buy/sell options) Hybrid working - travel to London or Winchester once/twice per week Join a company at the forefront of media, connectivity, and smart technology, where your work directly powers millions of daily connections across the UK.
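A minimal sketch of the batch side of such a platform, assuming hypothetical paths and column names: read raw files, apply a basic quality gate, write a partitioned Delta table, then compact it. The OPTIMIZE statement is a Databricks-specific Delta feature.

```python
# Hypothetical batch ETL on Databricks/Delta; paths and columns are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, to_date

spark = SparkSession.builder.appName("daily-batch").getOrCreate()

orders = (spark.read.json("/mnt/raw/orders/")              # placeholder source
          .withColumn("order_date", to_date(col("created_at")))
          .filter(col("amount") > 0))                      # basic quality gate

(orders.write.format("delta")
 .mode("append")
 .partitionBy("order_date")                                # prune by date at read time
 .save("/mnt/curated/orders"))                             # placeholder target

# Compact small files for read performance (Databricks-specific Delta command).
spark.sql("OPTIMIZE delta.`/mnt/curated/orders`")
```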
06/10/2025
Full time
Tenth Revolution Group
Senior AWS Data Engineer - London - £125,000
Tenth Revolution Group
Senior AWS Data Engineer - London - £125,000 Please note - this role will require you to work from the London-based office. You must have the unrestricted right to work in the UK to be eligible for this role. This organisation is not able to offer sponsorship. An exciting opportunity to join a greenfield initiative focused on transforming how market data is accessed and utilised. As a Senior AWS Data Engineer, you'll play a key role in designing and building a cutting-edge data platform using technologies like Databricks, Snowflake, and AWS Glue. Key Responsibilities: Build and maintain scalable data pipelines, warehouses, and lakes. Design secure, high-performance data architectures. Develop processing and analysis algorithms for complex data sets. Collaborate with data scientists to deploy machine learning models. Contribute to strategy, planning, and continuous improvement. Required Experience: Hands-on experience with AWS data tools: Glue, PySpark, Athena, Iceberg, Lake Formation. Strong Python and SQL skills for data processing and analysis. Deep understanding of data governance, quality, and security. Knowledge of market data and its business applications. Desirable Experience: Experience with Databricks and Snowflake. Familiarity with machine learning and data science concepts. Strategic thinking and ability to influence cross-functional teams. This role offers the chance to work across multiple business areas, solve complex data challenges, and contribute to long-term strategic goals. You'll be empowered to lead, collaborate, and innovate in a dynamic environment. To apply for this role please submit your CV or contact David Airey. Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK offering more opportunities across the country than any other recruitment agency. We're the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.
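For a sense of the Glue/PySpark work involved, a hedged sketch of a catalog-driven job follows; the database, table, and bucket names are hypothetical, and a real job would run inside the Glue runtime.

```python
# Hypothetical AWS Glue job: catalog table -> aggregate -> S3.
# Database, table, and bucket names are placeholders.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue = GlueContext(SparkContext.getOrCreate())
job = Job(glue)
job.init(args["JOB_NAME"], args)

# Read via the Glue Data Catalog rather than hard-coded paths.
trades = glue.create_dynamic_frame.from_catalog(
    database="market_data",            # placeholder database
    table_name="raw_trades",           # placeholder table
).toDF()

daily = trades.groupBy("symbol", "trade_date").avg("price")

(daily.write.mode("overwrite")
 .parquet("s3://example-bucket/curated/daily_prices/"))  # placeholder bucket

job.commit()
```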
03/10/2025
Full time
Greencore
Senior Data Engineer
Greencore Worksop, Nottinghamshire
Why Greencore? We're a leading manufacturer of convenience food in the UK and our purpose is to make everyday taste better! We're a vibrant, fast-paced leading food manufacturer, employing 13,300 colleagues across 16 manufacturing units and 17 distribution depots across the UK. We supply all the UK's food retailers with everything from sandwiches, soups and sushi to cooking sauces, pickles and ready meals, and in FY24, we generated revenues of £1.8bn. Our vast direct-to-store (DTS) distribution network, comprising 17 depots nationwide, enables us to make over 10,500 daily deliveries of our own chilled and frozen produce and that of third parties. Why is this exciting for your career as a Senior Data Engineer? The MBE Programme presents a huge opportunity for colleagues across the technology function to play a central role in the design, shape, delivery and execution of an enterprise-wide digital transformation programme. The complexity of the initiative, within a FTSE 250 business, will allow for large-scale problem solving, group-wide impact assessment and supporting the delivery of an enablement project to future-proof the business. Why we embarked on Making Business Easier? Over time, processes have become increasingly complex, increasing both the risk and cost they pose, whilst restricting our agility. At the same time, our customers and the market expect more from us than ever before. Making Business Easier forms a fundamental foundation for our commercial and operational excellence agendas, whilst supporting the effective management of our cost base in the future. The MBE Programme will streamline and simplify core processes, provide easier access to quality business data and will invest in the right technology to enable these processes. What you'll be doing: As a Senior Data Engineer, you will play a key role in shaping and delivering enterprise-wide data solutions that translate complex business requirements into scalable, high-performance data platforms. In this role, you will help define and guide the structure of data systems, focusing on seamless integration, accessibility, and governance, while optimising data flows to support both analytics and operational needs. Collaborating closely with business stakeholders, data engineers, and analysts, you will ensure that data platforms are robust, efficient, and adaptable to evolving business priorities.
You will also support the usage, alignment, and consistency of data models, and will therefore have a wide-ranging role across many business projects and deliverables: Shape and implement data solutions that align with business objectives and leverage both cloud and on-premise technologies Translate complex business needs into scalable, high-performing data solutions Support the development and application of best practices in data governance, security, and system design Collaborate closely with business stakeholders, product teams, and engineers to design and deliver effective, integrated data solutions Optimise data flows and pipelines to enable a wide range of analytical and operational use cases Promote data consistency across transactional and analytical systems through well-designed integration approaches Contribute to the design and ongoing improvement of data platforms - including data lakes, data warehouses, and other distributed storage environments - focused on efficiency, scalability, and ease of maintenance Mentor and support junior engineers and analysts in applying best practices in data engineering and solution design What you'll need: 5+ years of Data Engineering experience, with expertise in Azure data services and/or Microsoft Fabric Strong expertise in designing scalable data platforms and managing cloud-based data ecosystems Proven track record in data integration, ETL processes, and optimising large-scale data systems Expertise in cloud-based data platforms (AWS, Azure, Google Cloud) and distributed storage solutions Proficiency in Python, PySpark, SQL, NoSQL, and data processing frameworks (Spark, Databricks) Expertise in ETL/ELT design and orchestration in Azure, as well as pipeline performance tuning & optimisation Competent in integrating relational, NoSQL, and streaming data sources Management of CI/CD pipelines & Git-based workflows Good knowledge of data governance, privacy regulations, and security best practices Experience with modern data architectures, including data lakes, data mesh, and event-driven data processing Strong problem-solving and analytical skills to translate complex business needs into scalable data solutions Excellent communication and stakeholder management to align business and technical goals High attention to detail and commitment to data quality, security, and governance Ability to mentor and guide teams, fostering a culture of best practices in data architecture Power BI and DAX for data visualisation (desirable) Knowledge of Azure Machine Learning and AI services (desirable) Experience with streaming platforms like Event Hub or Kafka Familiarity with cloud cost optimisation techniques (desirable) What you'll get: Competitive salary and job-related benefits 25 days holiday allowance plus bank holidays Car Allowance Annual Target Bonus Pension up to 8% matched PMI Cover: Individual Life insurance up to 4x salary Company share save scheme Greencore Qualifications Exclusive Greencore employee discount platform Access to a full Wellbeing Centre platform
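To make the data quality and governance responsibilities concrete, here is a small hedged sketch of the kind of quality gate a pipeline like this might run before publishing a table; the table, columns, and thresholds are hypothetical.

```python
# Hypothetical data-quality gate for a Spark pipeline: fail the load when
# null keys or duplicates exceed a threshold. Names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("dq-gate").getOrCreate()
df = spark.read.table("curated.deliveries")        # placeholder table

total = df.count()
null_ids = df.filter(col("depot_id").isNull()).count()
dupes = total - df.dropDuplicates(["delivery_id"]).count()

# Block the publish step rather than silently shipping bad data.
if total == 0 or null_ids / total > 0.01 or dupes > 0:
    raise ValueError(
        f"DQ gate failed: rows={total}, null_ids={null_ids}, dupes={dupes}")
```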
03/10/2025
Full time
Greencore
Senior Data Engineer
Greencore Scofton, Nottinghamshire
Why Greencore? We're a leading manufacturer of convenience food in the UK and our purpose is to make everyday taste better! We're a vibrant, fast-paced leading food manufacturer, employing 13,300 colleagues across 16 manufacturing units and 17 distribution depots across the UK. We supply all the UK's food retailers with everything from sandwiches, soups and sushi to cooking sauces, pickles and ready meals, and in FY24, we generated revenues of £1.8bn. Our vast direct-to-store (DTS) distribution network, comprising 17 depots nationwide, enables us to make over 10,500 daily deliveries of our own chilled and frozen produce and that of third parties. Why is this exciting for your career as a Senior Data Engineer? The MBE Programme presents a huge opportunity for colleagues across the technology function to play a central role in the design, shape, delivery and execution of an enterprise-wide digital transformation programme. The complexity of the initiative, within a FTSE 250 business, will allow for large-scale problem solving, group-wide impact assessment and supporting the delivery of an enablement project to future-proof the business. Why we embarked on Making Business Easier? Over time, processes have become increasingly complex, increasing both the risk and cost they pose, whilst restricting our agility. At the same time, our customers and the market expect more from us than ever before. Making Business Easier forms a fundamental foundation for our commercial and operational excellence agendas, whilst supporting the effective management of our cost base in the future. The MBE Programme will streamline and simplify core processes, provide easier access to quality business data and will invest in the right technology to enable these processes. What you'll be doing: As a Senior Data Engineer, you will play a key role in shaping and delivering enterprise-wide data solutions that translate complex business requirements into scalable, high-performance data platforms. In this role, you will help define and guide the structure of data systems, focusing on seamless integration, accessibility, and governance, while optimising data flows to support both analytics and operational needs. Collaborating closely with business stakeholders, data engineers, and analysts, you will ensure that data platforms are robust, efficient, and adaptable to evolving business priorities.
You will also support the usage, alignment, and consistency of data models, and will therefore have a wide-ranging role across many business projects and deliverables: Shape and implement data solutions that align with business objectives and leverage both cloud and on-premise technologies Translate complex business needs into scalable, high-performing data solutions Support the development and application of best practices in data governance, security, and system design Collaborate closely with business stakeholders, product teams, and engineers to design and deliver effective, integrated data solutions Optimise data flows and pipelines to enable a wide range of analytical and operational use cases Promote data consistency across transactional and analytical systems through well-designed integration approaches Contribute to the design and ongoing improvement of data platforms - including data lakes, data warehouses, and other distributed storage environments - focused on efficiency, scalability, and ease of maintenance Mentor and support junior engineers and analysts in applying best practices in data engineering and solution design What you'll need: 5+ years of Data Engineering experience, with expertise in Azure data services and/or Microsoft Fabric Strong expertise in designing scalable data platforms and managing cloud-based data ecosystems Proven track record in data integration, ETL processes, and optimising large-scale data systems Expertise in cloud-based data platforms (AWS, Azure, Google Cloud) and distributed storage solutions Proficiency in Python, PySpark, SQL, NoSQL, and data processing frameworks (Spark, Databricks) Expertise in ETL/ELT design and orchestration in Azure, as well as pipeline performance tuning & optimisation Competent in integrating relational, NoSQL, and streaming data sources Management of CI/CD pipelines & Git-based workflows Good knowledge of data governance, privacy regulations, and security best practices Experience with modern data architectures, including data lakes, data mesh, and event-driven data processing Strong problem-solving and analytical skills to translate complex business needs into scalable data solutions Excellent communication and stakeholder management to align business and technical goals High attention to detail and commitment to data quality, security, and governance Ability to mentor and guide teams, fostering a culture of best practices in data architecture Power BI and DAX for data visualisation (desirable) Knowledge of Azure Machine Learning and AI services (desirable) Experience with streaming platforms like Event Hub or Kafka Familiarity with cloud cost optimisation techniques (desirable) What you'll get: Competitive salary and job-related benefits 25 days holiday allowance plus bank holidays Car Allowance Annual Target Bonus Pension up to 8% matched PMI Cover: Individual Life insurance up to 4x salary Company share save scheme Greencore Qualifications Exclusive Greencore employee discount platform Access to a full Wellbeing Centre platform
02/10/2025
Full time
Guidant Global
IT Data and Analytics Senior Development Operations Engineer
Guidant Global Reading, Oxfordshire
Base Location: Reading / Havant / Perth Salary: £600 per day Working Pattern: 40 hours per week / Full time Embark on a transformative career journey with SSE, an energy company where innovation meets impact at the heart of the IT sector. As a pivotal player in our forward-thinking team, you'll harness cutting-edge technology to drive change and propel the UK towards its ambitious net-zero targets. Your expertise will not only shape the future of energy but also carve a sustainable world for generations to come. Join us and be at the forefront of the green revolution, where every line of code contributes to a cleaner, brighter future. Key Responsibilities: Provide technical leadership and oversight to the group Data & Analytics platform team. Ensure the reliability, security and scalability of analytics platform services. Deliver full automation of the deployment of Data & Analytics platform services via infrastructure as code. Help to set development standards, configure operational support processes and provide technical assurance. Provide support to Data & Analytics platform users and internal development teams interacting with the Data & Analytics platform services. What do you need? Extensive experience of deploying Azure and ideally AWS cloud resources, and full conversance with agile and DevOps development methodology. Extensive experience in using Terraform to deploy cloud resources as infrastructure as code. Excellent understanding of CI/CD principles and experience with related tools (e.g. Azure DevOps, GitHub Actions). Strong knowledge of scripting languages such as PowerShell, Python and Azure CLI, and proven experience with automation runbooks, VM maintenance scripts and SQL. Strong understanding of cloud access control and governance, such as RBAC and IAM. Strong knowledge of cloud networking in Azure, such as private endpoints, firewalls, NSGs, NAT gateways and route tables. Good knowledge of Microsoft Entra ID, including managing app registrations, enterprise apps, AD groups, managed identities and Privileged Identity Management. Proven experience in IaaS, such as virtual machines - both Windows and Linux. Familiarity with server patching and maintenance. Strong understanding of security best practices within Azure and ideally AWS. Experience of configuring cloud data services (preferably Databricks) in Azure and ideally AWS. Excellent communication and collaboration skills, with the ability to work across multiple technical and non-technical teams. What happens now? After submitting your application for the Data and Analytics Senior Development Operations Engineer role, we understand you're eager to hear back. We value your time and interest, and if your application is successful, you will be contacted directly by the team within 2 working days. We appreciate your patience and look forward to the possibility of welcoming you aboard.
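As a small hedged example of the Python automation this role calls for, the sketch below inventories virtual machines with the Azure SDK for Python; the subscription ID is a placeholder and credentials are resolved from the environment.

```python
# Hypothetical sketch: inventory Azure VMs with the Azure SDK for Python.
# The subscription ID is a placeholder; DefaultAzureCredential picks up
# whatever identity the environment provides (CLI login, managed identity).
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

credential = DefaultAzureCredential()
compute = ComputeManagementClient(credential, "<subscription-id>")

for vm in compute.virtual_machines.list_all():
    print(vm.name, vm.location)
```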
01/10/2025
Contractor
Noir
Lead Data Engineer Databricks - Leeds
Noir Leeds, Yorkshire
Lead Data Engineer (Databricks) - Leeds (Lead Data Engineer, Team Lead, Technical Lead, Senior Data Engineer, Data Engineer, Python, PySpark, SQL, Big Data, Databricks, R, Machine Learning, AI, Agile, Scrum, TDD, BDD, CI/CD, SOLID principles, GitHub, Azure DevOps, Jenkins, Terraform, AWS CDK, AWS CloudFormation, Azure, Lead Data Engineer, Team Lead, Technical Lead, Senior Data Engineer, Data Engineer) Our client is a global innovator and world leader with one of the most recognisable names within technology. They are looking for a Lead Data Engineer with significant Databricks experience as well as leadership responsibility to run an exceptional Agile engineering team and provide technical leadership through coaching and mentorship. We are seeking a Lead Data Engineer capable of leading client delivery, ensuring the highest standards. This will include working with architects, creating automated tests, instilling a culture of continuous improvement and setting standards for the team. You will be responsible for building a greenfield modern data platform using cutting-edge technologies, architecting big data solutions and developing complex enterprise data ETL and ML pipelines and projections. The successful candidate will have strong Python, PySpark and SQL experience, possess a clear understanding of Databricks, and have a passion for Data Science (R, Machine Learning and AI). Database experience with SQL and NoSQL - Aurora, MS SQL Server, MySQL - is expected, as well as significant Agile and Scrum exposure along with SOLID principles. Continuous Integration tools, Infrastructure as Code and strong cloud platform knowledge, ideally with AWS, are also key. We are keen to hear from talented Lead Data Engineer candidates from all backgrounds. This is a truly amazing opportunity to work for a prestigious brand that will do wonders for your career. They invest heavily in training and career development with unlimited career progression for top performers. Location: Leeds Salary: £55k - £70k + Pension + Benefits To apply for this position please send your CV to Nathan Warner at Noir Consulting. (Lead Data Engineer, Team Lead, Technical Lead, Senior Data Engineer, Data Engineer, Python, PySpark, SQL, Big Data, Databricks, R, Machine Learning, AI, Agile, Scrum, TDD, BDD, CI/CD, SOLID principles, GitHub, Azure DevOps, Jenkins, Terraform, AWS CDK, AWS CloudFormation, Azure, Lead Data Engineer, Team Lead, Technical Lead, Senior Data Engineer, Data Engineer) NOIRUKTECHREC NOIRUKREC
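As a toy illustration of the "ML pipelines" side of the role (not the client's actual method), a minimal scikit-learn pipeline is sketched below with synthetic placeholder data.

```python
# Toy ML pipeline sketch (scikit-learn) with synthetic placeholder data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.random((100, 3))                 # placeholder features
y = (X[:, 0] > 0.5).astype(int)          # placeholder labels

model = Pipeline([
    ("scale", StandardScaler()),         # normalise features
    ("clf", LogisticRegression()),       # simple baseline classifier
])
model.fit(X, y)
print("train accuracy:", model.score(X, y))
```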
01/10/2025
Full time
Experis IT
Databricks Engineer
Experis IT
Databricks Engineer London - hybrid - 3 days per week on-site 6 months+ UMBRELLA only - Inside IR35 Key Responsibilities Design, develop, and maintain ETL/ELT pipelines using Airflow for orchestration and scheduling. Build and manage data transformation workflows in DBT running on Databricks. Optimize data models in Delta Lake for performance, scalability, and cost efficiency. Collaborate with analytics, BI, and data science teams to deliver clean, reliable datasets. Implement data quality checks (dbt tests, monitoring) and ensure governance standards. Manage and monitor Databricks clusters & SQL Warehouses to support workloads. Contribute to CI/CD practices for data pipelines (version control, testing, deployments). Troubleshoot pipeline failures, performance bottlenecks, and scaling challenges. Document workflows, transformations, and data models for knowledge sharing. Required Skills & Qualifications 3-6 years of experience as a Data Engineer (or similar). Hands-on expertise with: DBT (dbt-core, dbt-databricks adapter, testing & documentation). Apache Airflow (DAG design, operators, scheduling, dependencies). Databricks (Spark, SQL, Delta Lake, job clusters, SQL Warehouses). Strong SQL skills and understanding of data modelling (Kimball, Data Vault, or similar). Proficiency in Python for scripting and pipeline development. Experience with CI/CD tools (eg, GitHub Actions, GitLab CI, Azure DevOps). Familiarity with cloud platforms (AWS, Azure, or GCP). Strong problem-solving skills and ability to work in cross-functional teams. All profiles will be reviewed against the required skills and experience. Due to the high number of applications we will only be able to respond to successful applicants in the first instance. We thank you for your interest and the time taken to apply!
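A hedged sketch of the Airflow-orchestrated dbt workflow this role describes: one DAG that runs the models and then the tests. The project path and target name are hypothetical.

```python
# Hypothetical Airflow DAG: run dbt models against Databricks, then dbt tests.
# Project path and --target profile name are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_databricks_daily",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/analytics && dbt run --target databricks",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/analytics && dbt test --target databricks",
    )
    dbt_run >> dbt_test  # tests only run after a successful build
```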
01/10/2025
Contractor
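For a hedged sense of the Airflow-plus-dbt orchestration this ad names, here is a minimal DAG sketch. The schedule, owner, retry policy, and project directory are assumptions for illustration, and the dbt-databricks connection is presumed already configured in profiles.yml.

# Minimal Airflow DAG sketch orchestrating dbt runs against Databricks.
# Paths, schedule, and project layout are hypothetical assumptions.
from datetime import datetime, timedelta
from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "data-engineering",         # hypothetical owner
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="dbt_databricks_daily",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",                   # assumed cadence (Airflow 2.4+ keyword)
    catchup=False,
    default_args=default_args,
) as dag:
    # Build dbt models; the dbt-databricks adapter handles the connection.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",   # hypothetical path
    )

    # Run dbt tests as a data quality gate after the models build.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )

    dbt_run >> dbt_test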
Harnham - Data & Analytics Recruitment
Infrastructure Engineer
Harnham - Data & Analytics Recruitment
Infrastructure Engineer
AI Startup | Remote (one week per month in Barcelona)
Up to £180,000 + Equity

Want to help power some of the most advanced AI models being built today? I'm hiring for an Infrastructure Engineer to join a cutting-edge startup developing next-generation AI for tabular data, a technology with the potential to transform industries like finance, healthcare, and advertising. You'll work alongside researchers and engineers from the likes of DeepMind and other top AI labs, building cloud infrastructure from the ground up to support high-performance, GPU-intensive workloads at scale.

Why You'll Love This Role
  • Highly competitive salary + meaningful equity package
  • 100% remote within Europe
  • Work with world-class AI experts on cutting-edge problems
  • Join at an early stage with big growth ahead
  • Access to the latest tools, GPUs, and research infrastructure

What You'll Be Doing
  • Design and implement multi-cloud infrastructure (AWS + GCP) optimised for ML workloads
  • Build and manage Kubernetes clusters for GPU training, serving, and SaaS hosting
  • Implement GitOps deployment workflows using ArgoCD
  • Create and manage infrastructure as code with Terraform
  • Set up CI/CD pipelines for infrastructure and application deployment
  • Implement monitoring, observability, and cloud cost optimisation (FinOps)
  • Collaborate with ML engineers to fine-tune infrastructure for large-scale model training

What You'll Bring
  • 5+ years in cloud infrastructure/DevOps roles
  • Deep Kubernetes expertise, including GPU workload optimisation
  • Strong AWS and GCP experience
  • Proven skills with Terraform, GitOps tools, and CI/CD (GitHub Actions preferred)
  • Proficiency in Python and scripting for automation
  • Solid understanding of cloud networking, security, and cross-cloud connectivity
  • Experience in monitoring, observability, and cost optimisation

Nice to Have
  • Experience with ML tooling (MLflow, Kubeflow)
  • Knowledge of FastAPI, Databricks, or Snowflake
  • Exposure to SRE practices or cloud security certifications
  • Familiarity with Prometheus, Grafana, or Datadog

Interested? If you want to be part of a world-class AI team at an early stage, where your infrastructure decisions will directly shape the company's success, apply today or reach out for a confidential chat.
01/09/2025
Full time
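To ground the GPU-workload side of this role, here is a small, hedged Python sketch using the official kubernetes client to inventory schedulable GPUs across a cluster. The "nvidia.com/gpu" resource name reflects a typical NVIDIA device-plugin setup, an assumption rather than a detail from this startup.

# Hedged sketch: list allocatable GPUs per node with the kubernetes Python client.
# Assumes GPUs are exposed as "nvidia.com/gpu" (typical NVIDIA device-plugin
# behaviour, not confirmed here) and that a kubeconfig is available locally.
from kubernetes import client, config

def gpu_inventory() -> dict[str, int]:
    config.load_kube_config()   # use config.load_incluster_config() inside a pod
    v1 = client.CoreV1Api()
    inventory = {}
    for node in v1.list_node().items:
        allocatable = node.status.allocatable or {}
        gpus = int(allocatable.get("nvidia.com/gpu", "0"))
        if gpus:
            inventory[node.metadata.name] = gpus
    return inventory

if __name__ == "__main__":
    for name, count in gpu_inventory().items():
        print(f"{name}: {count} GPU(s) allocatable")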
