  • Home
  • Find IT Jobs
  • Register CV
  • Career Advice
  • Contact us
  • Employers
    • Register as Employer
    • Pricing Plans
  • Recruiting? Post a job
Sorry, that job is no longer available. Here are some results that may be similar to the job you were looking for.

90 jobs found

Current search: data analytics gen ai engineers x 2
Product Analyst / Data Scientist
Harnham - Data & Analytics Recruitment Leicester, Leicestershire
Are you a Product Analyst who loves understanding user behaviour, running experiments, and helping product teams build better digital experiences? I'm hiring for a Product Analyst role at a well-established consumer platform offering discounts and perks to millions of UK users. The business is scaling internationally and evolving into a more tech-led organisation, giving you massive data, real ownership, and exposure to strategic product work. You'll sit within the central Data function and partner closely with Product Managers to analyse user journeys, run A/B tests, and provide the insights that shape product decisions. This is a hands-on, impact-driven role in a modern data environment with huge opportunities to influence the product roadmap.

What you'll be doing
  • Apply quantitative analysis and storytelling to understand how users interact with the platform and what drives behaviour.
  • Use data proactively to uncover opportunities, size problems, and generate hypotheses for testing.
  • Design, run, and analyse A/B tests and experiments in partnership with product teams.
  • Define and track meaningful product metrics; ensure consistent measurement across teams.
  • Build and maintain core data products enabling self-serve insights and faster product decisions.
  • Conduct deep dives into user journeys, drop-off points, behaviour segments, funnel performance, and platform trends.
  • Collaborate cross-functionally with engineers, data teams, product, commercial, and marketing stakeholders.
  • Contribute to the Insights Hub and documentation repositories, keeping analytical knowledge up to date.

What you bring
Must-haves:
  • Strong SQL skills (non-negotiable).
  • Hands-on experience with product analytics in a tech or consumer-app environment, ideally at companies like Monzo, Deliveroo, Just Eat, marketplaces, or membership/loyalty platforms.
  • Experience running and evaluating A/B tests and experimentation frameworks.
  • Ability to translate business problems into analytical tasks and communicate clear, actionable insights.
  • Strong storytelling ability, turning numbers into narratives.
Good to have:
  • Experience working with large-scale customer behaviour datasets.
  • Experience in consumer tech, fintech, loyalty platforms, or other high-traffic digital products.
  • Python/R/dbt exposure (not required).

Industry background: while open, the strongest fits tend to come from tech-first consumer products where experimentation, app behaviour insights, and funnel optimisation are standard ways of working.

The culture & offer
  • A modern, data-mature environment with over four million UK users and expanding globally.
  • A product-led organisation investing heavily in experimentation and user behaviour analytics.
  • Private equity backing driving international expansion and new capabilities.
  • FTC with benefits, strong likelihood of a multi-month extension.
  • Salary up to £85k depending on experience.
  • Offices in London and Leicester; ideally 1-2 days per week but flexible.
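The A/B testing duties above come down to judging whether a variant genuinely outperforms control. As a hedged illustration (the ad doesn't name the platform's experimentation stack, and the counts below are invented), a two-proportion z-test on conversion counts can be sketched in plain Python:

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates.

    conv_*: number of converting users; n_*: users exposed.
    Returns (z statistic, two-sided p-value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate under H0
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via erf; p-value is the two-tailed area beyond |z|
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical experiment: 10.0% control vs 13.0% variant conversion
z, p = two_proportion_ztest(100, 1000, 130, 1000)
print(f"z={z:.2f}, p={p:.4f}")  # p < 0.05 here, so the uplift is significant
```

In practice a production experimentation framework adds sequential-testing corrections and guardrail metrics, but the core comparison is this one.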
01/04/2026
Full time
Analytics Engineer
Experis
Role: Analytics Engineer
Location: London / Birmingham / Bristol - any location (hybrid: 3 days onsite, 2 days remote)
Duration: 6 months
Day rate: 450 - 600, Inside IR35

Role Overview
We are seeking an experienced Analytics Engineer to design and build scalable analytical data models that support business intelligence, reporting and commercial analytics. The role sits within a multidisciplinary data team responsible for delivering trusted analytical data products used across commercial and marketing teams. The ideal candidate will combine strong analytical thinking with advanced SQL engineering capability, and will have experience designing analytics-ready datasets used by BI tools or semantic layers. This is not a pipeline engineering role; we are looking for someone experienced in building analytical data models that define consistent business metrics and enable self-service analytics.

Key Responsibilities
Analytical Data Modelling
  • Design and implement scalable analytical data models in SQL used by BI tools and analytics platforms.
  • Build datasets that support consistent business metrics, reporting and analysis.
  • Implement modelling approaches such as star schemas, denormalised analytical tables and reusable metric layers.
Data Analysis & Profiling
  • Profile complex datasets to understand data structure, quality and business meaning.
  • Investigate and interpret source data to inform robust analytical modelling decisions.
  • Translate business questions into well-structured analytical datasets.
SQL Engineering
  • Develop robust SQL transformations to convert raw source data into trusted analytical assets.
  • Ensure analytical models are scalable, performant and maintainable within a cloud data warehouse.
  • Optimise SQL logic for performance and efficient data processing.
Collaboration
  • Work closely with analysts, visualisation developers, data engineers and business stakeholders.
  • Contribute to the development of reusable data assets and consistent analytical definitions.
  • Support the evolution of the organisation's analytics data layer and self-service reporting capability.

Essential Skills
  • Advanced SQL skills with experience engineering complex analytical transformations.
  • Proven experience building analytical data models used by BI tools or reporting platforms.
  • Experience designing analytics-ready datasets rather than ingestion pipelines.
  • Strong experience with cloud data warehouse platforms (preferably Google BigQuery / GCP).
  • Strong data analysis and data profiling capability with the ability to interpret complex datasets.
  • Experience implementing analytical modelling approaches such as star schemas or wide tables.

Desirable Skills
  • Experience working with semantic layers or metrics layers (e.g. Looker / LookML).
  • Experience designing consistent business metrics used across reporting and analytics.
  • Python experience for data analysis, automation or advanced analytics workflows.
  • Exposure to AI-enabled analytics tools or modern data workflows.
  • Experience working in commercial or marketing analytics environments.
  • Telecommunications or subscription business experience would be advantageous.

Candidate Profile
  • Strong analytical mindset combined with engineering discipline.
  • Comfortable working with complex business data and translating it into analytical structures.
  • Experienced in building analytics-ready datasets used by BI tools and reporting platforms.
  • Collaborative and comfortable working within cross-functional data teams.
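The modelling approaches this role centres on (star schemas, denormalised analytical tables) can be shown with a toy example. The sketch below uses SQLite purely for portability — the ad's actual warehouse is BigQuery — and every table and column name is invented for illustration:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# A minimal star schema: one fact table keyed to two dimension tables.
cur.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, month TEXT);
CREATE TABLE fact_sales  (product_id INTEGER, date_id INTEGER, revenue REAL);

INSERT INTO dim_product VALUES (1, 'Broadband'), (2, 'Mobile');
INSERT INTO dim_date    VALUES (10, '2026-01'), (11, '2026-02');
INSERT INTO fact_sales  VALUES (1, 10, 100.0), (1, 11, 150.0), (2, 10, 80.0);
""")

# Denormalised, analytics-ready wide table: joins resolved once and metrics
# defined consistently, so BI tools can query it without re-joining sources.
cur.execute("""
CREATE TABLE sales_wide AS
SELECT d.month, p.category, SUM(f.revenue) AS total_revenue
FROM fact_sales f
JOIN dim_product p ON p.product_id = f.product_id
JOIN dim_date    d ON d.date_id    = f.date_id
GROUP BY d.month, p.category
""")

rows = cur.execute(
    "SELECT month, category, total_revenue FROM sales_wide ORDER BY month, category"
).fetchall()
print(rows)
```

The point of the wide table is exactly what the ad calls "consistent business metrics": every report reads `total_revenue` from one definition instead of re-deriving the join and aggregation.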
31/03/2026
Contractor
Senior Data Engineer
83Zero Ltd
Company Overview
We are working with an innovative organisation that recognises the increasing complexity of project delivery. Since 2013, our client has been helping companies of all sizes improve the way projects are delivered. Their mission is to become the number one provider of innovative project solutions, driven by a community of experienced, caring, and passionate professionals who are committed to improving project delivery.

Why Join Our Client?
Our client is currently in an exciting phase of growth, making this an excellent time to join their journey. They are building something special: scaling the business while maintaining a strong people-first approach. Investment in their teams is a key priority, creating an environment where development is encouraged and individuals are supported to grow with the organisation. Their culture sets them apart from other consulting practices, and they are looking to build a team that is equally ambitious.

Position Overview
Our client is seeking a Senior Data Engineer who thrives on building scalable, cloud-first data systems. In this role, you will design and manage data pipelines that support analytics, AI, and automation across complex infrastructure programmes. Your work will play a key part in enabling data-driven transformation across critical UK industries.

Core Responsibilities
  • Design, build, and optimise data pipelines using Azure Data Factory, Synapse, and Databricks
  • Develop and maintain ETL/ELT workflows to ensure high data quality and reliability
  • Collaborate with analysts and AI engineers to deliver robust and reusable data products
  • Manage data lakes and warehouses using formats such as Delta Lake and Parquet
  • Implement best practices for data governance, performance, and security
  • Continuously evaluate and adopt new technologies to evolve the organisation's data platform
  • Provide technical guidance to junior engineers and contribute to team capability building

Technical Stack
Core: Azure Data Factory, Azure Synapse Analytics, Azure Data Lake Storage Gen2, SQL Server, Databricks
Enhancements: Python (PySpark, Pandas), CI/CD (Azure DevOps), Infrastructure as Code (Terraform, Bicep), REST APIs, GitHub Actions
Desirable: Microsoft Fabric, Delta Live Tables, Power BI dataset automation, DataOps practices

What You'll Bring
  • Professional experience in data engineering or cloud data development
  • Strong understanding of data architecture, APIs, and modern data pipeline design
  • Hands-on experience within Microsoft's Azure ecosystem, with an interest in emerging technologies such as Fabric, AI-enhanced ETL, and real-time data streaming
  • Proven ability to lead technical workstreams and mentor junior team members
  • A strong alignment with the organisation's IDEAL values: Integrity, Drive, Empathy, Adaptability, and Loyalty

Ready to Apply?
This is a fantastic opportunity to join a forward-thinking organisation at a key stage of growth, working on impactful projects across critical industries. If you're looking to take the next step in your career within a collaborative and innovative environment, we'd love to hear from you.
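The "high data quality and reliability" responsibility usually translates into validation gates between pipeline stages. As a platform-agnostic sketch (the role's actual stack is Azure Data Factory and Databricks, neither of which is reproduced here; the rules and sample rows are invented), a minimal row-level quality check might look like:

```python
from dataclasses import dataclass

@dataclass
class QualityReport:
    total: int
    passed: int
    failures: dict  # rule name -> count of violating rows

def validate(rows, rules):
    """Run named predicate rules over rows; a row failing any rule is rejected."""
    failures = {name: 0 for name in rules}
    passed = 0
    for row in rows:
        ok = True
        for name, rule in rules.items():
            if not rule(row):
                failures[name] += 1
                ok = False
        passed += ok
    return QualityReport(total=len(rows), passed=passed, failures=failures)

# Hypothetical batch with one null and one out-of-range value.
batch = [
    {"id": 1, "value": 42.0},
    {"id": 2, "value": None},
    {"id": 3, "value": -7.0},
]
report = validate(batch, {
    "value_not_null": lambda r: r["value"] is not None,
    "value_non_negative": lambda r: r["value"] is None or r["value"] >= 0,
})
print(report)  # 1 of 3 rows clean; failures pinpoint which rule each bad row broke
```

Frameworks such as Delta Live Tables express the same idea declaratively as "expectations", but the underlying logic is per-row predicates with counted failures, as above.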
31/03/2026
Full time
Data Scientist
Spectrum IT Recruitment
Looking for an experienced Data Scientist to design, build, and optimise machine learning models and advanced analytics solutions that support institutional priorities across a large, complex network. The role blends hands-on data science with strategic impact, using AWS technologies to deliver predictive insights that drive proactive interventions and data-driven decision-making. This is a hybrid role with the expectation of working 2 days per week in the London office.

Skills and experience required:
  • Bachelor's degree in Data Science, Statistics, Computer Science, Mathematics, or similar
  • Experience delivering predictive analytics or machine learning solutions
  • Strong skills in Python, SQL, and ML libraries (e.g. scikit-learn, XGBoost, PyTorch, TensorFlow)
  • Hands-on experience with AWS ML services (SageMaker, Lambda, Redshift)
  • Ability to clearly communicate insights to non-technical stakeholders
  • Strong analytical thinking, collaboration skills, and a results-driven mindset

Role responsibilities:
  • Build, tune, and maintain predictive and ML models using AWS SageMaker
  • Analyse large datasets and perform feature engineering to improve model performance
  • Run experiments, test hypotheses, and optimise models for accuracy and value
  • Monitor model performance and manage retraining over time
  • Collaborate with Data Engineers, BI Developers, and Analysts to integrate outputs into dashboards and reports
  • Partner with academic, operational, and IT stakeholders to translate insights into action
  • Document models and support knowledge sharing and scalability
  • Contribute to the expansion of predictive analytics into advanced ML/AI use cases

Spectrum IT Recruitment (South) Limited is acting as an Employment Agency in relation to this vacancy.
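The "monitor model performance and manage retraining over time" responsibility typically means comparing live accuracy against the level recorded at deployment and flagging drift. A deliberately simplified, stack-agnostic sketch (the role's environment is SageMaker, which has its own Model Monitor service; the threshold, window, and readings below are invented):

```python
def needs_retraining(baseline_acc, recent_accs, tolerance=0.05, window=3):
    """Flag retraining when the rolling mean of recent accuracy drops more
    than `tolerance` below the accuracy recorded at deployment time."""
    if len(recent_accs) < window:
        return False  # not enough evidence yet
    rolling = sum(recent_accs[-window:]) / window
    return rolling < baseline_acc - tolerance

# Hypothetical weekly accuracy readings for a deployed model (baseline 0.91)
history = [0.90, 0.89, 0.88, 0.84, 0.83, 0.82]
print(needs_retraining(0.91, history))  # True: recent mean 0.83 vs 0.91 - 0.05
```

A windowed mean rather than the latest single reading is the key design choice: it stops one noisy evaluation batch from triggering an expensive retrain.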
31/03/2026
Full time
Data Platform Architect - Permanent - Hybrid
Tenth Revolution Group City, London
Data Platform Architect - Permanent - Hybrid

Role Overview
We are seeking a highly skilled Data & AI Platform Architect to design and deliver cloud-native data, analytics, and AI platforms across the full data value chain. This role focuses on building scalable, secure, and sustainable solutions that integrate batch and real-time processing, robust data management practices, and modern analytics and AI capabilities. You will play a key role in shaping architecture, advising clients, and leading small teams or workstreams, while remaining hands-on and close to delivery. This is an ideal position for someone who combines strong technical expertise with the ability to translate complex concepts into practical, business-focused solutions.

Key Responsibilities
  • Design and architect end-to-end data and AI platforms leveraging cloud-native technologies
  • Build scalable solutions that integrate batch and real-time data processing pipelines
  • Define and implement robust data management practices across the full data life cycle
  • Incorporate modern analytics, machine learning, and generative AI capabilities into platform designs
  • Advise clients on architecture, best practices, and strategic data platform decisions
  • Lead small teams or workstreams, ensuring high-quality delivery and technical excellence
  • Collaborate with cross-functional stakeholders including engineers, data scientists, and business teams
  • Ensure solutions are secure, compliant, and aligned with industry standards and regulations
  • Promote adoption of CI/CD and modern DevOps practices for data and AI platforms

Required Skills & Experience
  • Proven experience (5+ years) in data & analytics and AI solution design and delivery
  • Hands-on experience with cloud-centric architecture and modern data platform technologies
  • Strong understanding of core data management concepts across the full data life cycle (ingestion, storage, processing, governance, and consumption)
  • Experience leading solution implementation for small teams or workstreams
  • Familiarity with CI/CD pipelines and modern approaches to infrastructure and application delivery
  • Solid understanding of machine learning, generative AI, and large language models, and their application in enterprise solutions
  • Ability to clearly articulate complex technical concepts to both technical and non-technical audiences

Desirable Experience
  • Experience working in public sector or regulated industries (e.g. government, healthcare, financial services)
  • Familiarity with data governance, security, and compliance frameworks
  • Experience designing platforms with sustainability and cost-efficiency in mind

To apply for this role please submit your CV or contact Dillon Blackburn (see below). Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK, offering more opportunities across the country than any other recruitment agency. We're the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.
31/03/2026
Full time
Data Analyst FTC
Big Red Recruitment Midlands Limited Binley Woods, Warwickshire
Turn data into decisions that directly impact revenue, pricing, and performance. We are hiring a Data Analyst for a 6 month Fixed Term Contract to play a key role in transforming how a major UK contracts division uses data across sales, margin, and commercial performance.

About the client:
You will be joining a large, multi-brand UK organisation operating within a complex distribution environment. With multiple business units and evolving systems, the organisation is investing heavily in improving how data is accessed, trusted, and used. This is a business in transition, moving towards a more modern, insight-driven approach to decision making.

Project overview:
This role sits at the centre of a transformation focused on improving visibility of margin, pricing, and sales performance. You will help redefine how analytics supports the business, working closely with stakeholders to shape reporting priorities and deliver insights that drive action. The work will also contribute to building a scalable analytics capability that can be rolled out more widely across the organisation.

What you will be doing:
You will act as the link between business stakeholders and data teams, translating complex commercial requirements into clear data and reporting solutions. A core part of the role will involve analysing sales, pricing, and margin performance, identifying key drivers, and turning these into actionable insights. You will develop a strong understanding of rebate-driven margin structures and ensure that changes to pricing and margin logic are accurately reflected in reporting. You will also support the transition from legacy tools into modern data models, ensuring historical and current reporting remains consistent and reliable. Alongside analysis, you will design and build scalable dashboards and reports, enabling business users to access clear, consistent insights without relying on manual processes. You will work closely with Data Engineers to define reusable datasets and ensure data models support flexible, self-serve analytics.

Tech environment:
You will work with tools such as Excel, Power BI, and modern data platforms, with exposure to enterprise data environments and evolving cloud-based solutions.

What we are looking for:
We are looking for someone with strong experience in data analysis, business intelligence, or commercial analytics. You should have a proven ability to work with pricing, margin, or financial data, and be confident using Excel at an advanced level alongside BI tools such as Power BI. You will need to be comfortable translating business needs into data solutions and working closely with stakeholders across the organisation. Strong communication and the ability to influence decision making through insight are key.

Nice to have:
Experience working in large-scale data environments or transformation programmes would be beneficial, as would familiarity with tools such as Phocas, Azure Synapse, or Databricks. Experience within retail, distribution, or similar sectors is also advantageous.

Why join:
This is an opportunity to play a central role in shaping how data is used within a key part of the business. Your work will directly influence commercial decisions and help set the standard for future analytics across multiple brands.

Role: Data Analyst
Duration: 6 month FTC - chance to renew or go perm
Salary: Up to £50,000
Location: Warwickshire / Hybrid
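The "rebate-driven margin structures" mentioned in this ad are worth unpacking: realised margin depends not only on invoice price and unit cost but on supplier rebates earned against volume, so the same sale can carry very different margins at different volumes. A hedged, illustrative sketch (the tier thresholds, prices, and rebate rates are invented, not the client's actual scheme):

```python
def realised_margin(price, cost, units, rebate_tiers):
    """Margin as a fraction of revenue, including volume rebates.

    rebate_tiers: list of (min_units, rebate_per_unit); the highest tier
    whose threshold the volume meets applies.
    """
    rebate = 0.0
    for min_units, per_unit in sorted(rebate_tiers):
        if units >= min_units:
            rebate = per_unit  # keep climbing to the highest qualifying tier
    revenue = price * units
    net_cost = (cost - rebate) * units
    return (revenue - net_cost) / revenue

# Hypothetical line: £10 sell price, £9 cost, rebate kicks in with volume
tiers = [(0, 0.0), (100, 0.5), (500, 1.0)]
low = realised_margin(10.0, 9.0, 50, tiers)    # no rebate earned
high = realised_margin(10.0, 9.0, 600, tiers)  # £1/unit rebate applies
print(f"{low:.0%} vs {high:.0%}")
```

This is why the ad stresses that changes to pricing and margin logic must be "accurately reflected in reporting": a dashboard that ignores rebate tiers would understate margin on high-volume contracts.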
26/03/2026
Full time
Damia Group LTD
Data Architect
Damia Group LTD Hounslow, London
Data Architect - 70-90K base (DOE) - West London (hybrid)

We are recruiting a Data Architect for one of our clients based in West London on a permanent basis. The Data Architect is responsible for leading the definition, standardisation, and governance of data architecture across platforms and products. This role balances technical leadership, data architecture, and collaboration with engineering, product, and security teams to ensure scalable, reliable, and secure systems.

Key responsibilities:
- Enforce data architectural guidelines and consistency across development teams and services
- Support established Data Governance and Data Quality frameworks, including tooling, policy enforcement, and stewardship models
- Ensure robust metadata management, lineage tracking, and data cataloguing using business glossaries and modern catalog tools
- Review and approve data architecture for major features, platforms, and technical initiatives
- Collaborate with technical leads and DevOps on system scalability, performance, and reliability
- Ensure data platforms are AI/ML-ready, with scalable infrastructure and clean, well-structured data pipelines
- Collaborate with data science and analytics teams to enable model deployment, automation, and MLOps best practices
- Promote innovation in generative AI, predictive analytics, and real-time decision support
- Align data architecture with security, compliance, and data governance requirements
- Lead the evolution of technical architecture documentation, models, and decision records
- Conduct architecture and design reviews with cross-functional teams
- Guide teams in the adoption of best practices in API design, modularity, cloud-native patterns, and event-driven systems
- Recommend data management best practices, covering data flows, architecture patterns, retention, archival, and purging strategies
- Coach and mentor engineers on data design, refactoring, and architectural reasoning

Essential skills and experience:
- Proven experience designing and scaling enterprise-grade cloud data platforms (AWS preferred)
- Deep experience with AWS, Databricks, Power Platform, and Redshift (Snowflake a plus)
- Proficiency in AWS Glue, Qlik Talend, dbt, Airflow, and modern data integration tools
- Excellent knowledge of Python, SQL, Power Query (M), and preferably Scala or PySpark
- Working knowledge of enterprise architecture frameworks (e.g. TOGAF), MLOps, and BI tools such as Power BI and QuickSight
- Experience with generative AI platforms (e.g. Amazon Bedrock, Anthropic)
- Familiarity with infrastructure as code (Terraform), CI/CD practices (Jenkins, GitHub Actions), and observability (Grafana, Kibana)
- Proficiency in scripting and automation using Bash, Groovy, or equivalent
- Ability to balance long-term architectural vision with immediate delivery constraints

The advertised salary range is dependent on experience and the required qualifications.

Damia Group Limited acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers. By applying for this permanent job, you accept our Data Protection Policy, which can be found on our website. Please note that no terminology in this advert is intended to discriminate on the grounds of a person's gender, marital status, race, religion, colour, age, disability or sexual orientation. Every candidate will be assessed only in accordance with their merits, qualifications and ability to perform the duties of the job. Damia Group is acting as an Employment Business in relation to this vacancy and in accordance with the Conduct Regulations 2003.
26/03/2026
Full time
Damia Group LTD
SAS Environment & Configuration Engineer
Damia Group LTD Telford, Shropshire
SAS Environment & Configuration Engineer - £525 per day inside IR35 - 6 months - Location: Telford with 2 days/week in office

Purpose:
The SAS Platform Environment and Configuration Engineer is responsible for the setup, configuration, and ongoing support of SAS environments across development, test, and production landscapes. This includes managing deployments, ensuring platform stability, and supporting integration with technologies such as Oracle and Microsoft 365. The role is critical to enabling secure, scalable, and performant analytics platforms, particularly in the context of SAS Viya migrations and multi-tenant deployments.

Key responsibilities:
- Design, configure, and maintain SAS environments including SAS 9.4, SAS Viya 3.5, and SAS Viya 4.
- Support the migration of legacy SAS platforms to cloud-native SAS Viya 4 environments, including namespace separation and Kubernetes orchestration.
- Implement and manage configuration objects, service data, and translation objects within SAS environments.
- Collaborate with data engineers and DevOps teams to integrate SAS with Oracle, GitLab CI/CD pipelines, and other enterprise platforms.
- Monitor and optimise platform performance, ensuring high availability and compliance with security standards.
- Provide technical support and troubleshooting for SAS users and developers.
- Automate environment provisioning and data transfers using approved EA integration patterns.
- Participate in community knowledge sharing and contribute to best practice documentation.

Essential skills and experience:
- SAS Enterprise Guide (EG) - workflow orchestration and user support.
- SAS Data Integration Studio (DI) - job development and deployment.
- SAS Studio - job development and deployment.
- SAS Viya 3.5/Viya 4 - platform configuration, migration, and containerisation (Kubernetes).
- SAS Visual Analytics (VA) and SAS Visual Investigator (VI) - dashboard and investigation tool support.
- Oracle - integration and data pipeline support.
- Configuration management - experience managing service objects, deployment artefacts, and environment baselines.
- Testing lifecycle knowledge - including test strategy design, environment comparison utilities, and automated testing integration.

Desirable attributes:
- Experience with cloud platforms (AWS, Azure) and container orchestration (Kubernetes, Docker).
- Familiarity with GitLab CI/CD, Infrastructure as Code (IaC), and automated deployment pipelines.
- Knowledge of scheduling tools such as Airflow.
- Strong documentation and communication skills.
- Ability to work collaboratively across multidisciplinary teams.

This temporary contract is inside IR35 and will require working under the direction of the client delivery manager as part of a multi-disciplinary team. The successful candidate will follow established delivery processes and working practices.

This role requires the successful candidate to undergo and be eligible for UK Security Vetting at SC level. Clearance sponsorship will be provided where required. Due to the nature of the work, candidates should meet the relevant residency requirements. If applicable, reserved post nationality restrictions will be confirmed by the client. More details relating to UK Security Clearance can be found in the GOV.UK guidance on national security vetting clearance levels. Damia is committed to inclusive recruitment and welcomes applicants from all backgrounds.

Damia Group Limited acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers. By applying for this job you accept our Data Protection Policy, which can be found on our website. Please note that no terminology in this advert is intended to discriminate on the grounds of a person's gender, marital status, race, religion, colour, age, disability or sexual orientation. Every candidate will be assessed only in accordance with their merits, qualifications and ability to perform the duties of the job. Damia Group is acting as an Employment Business in relation to this vacancy and in accordance with the Conduct Regulations 2003.
25/03/2026
Contractor
Cambridge University Press & Assessment
Principal Data Scientist
Cambridge University Press & Assessment Cambridge, UK
Job Title: Principal Data Scientist
Salary: £74,200 - £99,250
Location: Cambridge/Hybrid with 2 days per week at the office
Contract: Permanent
Hours: Full time, 35 hours per week

Are you excited by the challenge of applying data science and AI to problems that genuinely matter? At Cambridge Assessment, we are transforming how assessments are designed, delivered and marked worldwide. As a Principal Data Scientist, you will play a pivotal role at the heart of this transformation, leading our data science capability for AI-enabled assessment products used by millions of learners globally. We are Cambridge University Press & Assessment, a world-leading academic publisher and assessment organisation and a proud part of the University of Cambridge.

This is a senior, influential role where you will combine deep technical expertise with strategic leadership. You will shape our data strategy, lead and mentor a growing team, and work closely with researchers, engineers and product teams to turn complex data into insight, innovation and trusted solutions.

About the role
As Principal Data Scientist, you will lead the operational data science and analytics capability within our Assessment & Research Capabilities (ARC) function. You will be the data leader for automarking, representing ARC's data capability across Exam Technology and the wider organisation.
You will:
- Set the direction for data science and analytics supporting automarking and AI-driven assessment
- Lead and grow a small, high-impact team of data scientists and engineers
- Curate high-quality data products used across research, machine learning and product teams
- Act as a trusted partner to senior stakeholders, influencing product and research decisions with evidence and insight
- Ensure sensitive exam and candidate data is handled responsibly and ethically

Additional responsibilities and accountabilities include:
- Lead data science, data engineering and analytics activities within ARC
- Define and own the data strategy for automarking and related AI capabilities
- Design and oversee data warehouses, pipelines and integrations with the wider organisation
- Translate complex business and research needs into robust data solutions
- Provide expert input into product, research and architectural decisions, up to board level
- Build strong relationships with internal teams and external research partners
- Champion best practice in data quality, DataOps and analytics engineering

This position has been classified as a hybrid role, requiring the selected candidate to typically spend 40-60% of their time collaborating and connecting face-to-face at their dedicated location. Aside from our hybrid principles, other flexible working requests will be considered from the first day of employment, including other work arrangements should you require adjustments due to a disability or long-term health condition.

About You
To be successful in this role, you will bring:
- Extensive experience in data science, analytics or analytics engineering in a complex environment
- Advanced SQL skills, including writing, analysing and optimising large analytical queries
- Strong experience with a data science programming language such as Python, R or Julia
- Hands-on experience with data transformation tools such as dbt, Dataform or SQLMesh
- Experience using BI and visualisation tools such as Metabase, Looker, Tableau or Power BI
- A strong understanding of data warehousing principles (e.g. Kimball methodology)
- Experience designing data models that enable self-service analytics
- Proven ability to translate business or research questions into data-driven insights
- Experience communicating complex technical concepts to non-technical and senior audiences
- Leadership experience, including mentoring and guiding other data professionals

If you meet the above minimum requirements, we encourage you to apply. Your application will be even stronger if you can also demonstrate the following desirable criteria:
- Machine learning or AI product experience
- Exposure to automarking, assessment, or high-stakes data environments
- Skills in experimentation and statistical analysis (A/B testing, forecasting)
- Familiarity with DataOps (CI/CD, testing, orchestration, observability)

For a detailed job description, please refer to the link at the bottom of the advert on our careers site. We are a Disability Confident (DC) employer that is committed to equality and inclusion, ensuring our recruitment process is accessible to all. The DC scheme's Offer of an Interview commitment applies to applicants who opt in, disclose a disability or a long-term health condition, and best meet the minimum criteria for the role.
In instances where interviewing all qualifying candidates is not practicable, we prioritise those who best meet the minimum criteria, as we would for applicants who do not have a disability or long-term health condition. Cambridge University Press & Assessment is an approved UK employer for the sponsorship of eligible roles and applicants under the Skilled Worker visa route. Please refer to the gov.uk website for guidance to understand your own eligibility based on the role you are applying for.

Rewards and benefits
We will support you to be at your best in work and to live well outside of it. In addition to competitive salaries, we offer a world-class, flexible rewards package, featuring family-friendly and planet-friendly benefits including:
- 28 days annual leave plus bank holidays
- Private medical and Permanent Health Insurance
- Discretionary annual bonus
- Group personal pension scheme
- Life assurance up to 4 x annual salary
- Green travel schemes

Ready to pursue your potential? Apply now. We aim to support candidates by making our interview process clear and transparent. The closing date for all applications will be 13 March 2026. We will review applications on an ongoing basis, and shortlisted candidates can expect interviews to take place shortly after the closing date.

As part of the application process, you can expect:
- At application stage: four technical questions to answer when submitting your CV.
- Stage 1: 30-minute screening call with the hiring manager.
- Stage 2: 60-minute session including questions about key skills as well as a code review or whiteboard exercise.
- Stage 3: 90-minute system design exercise, with an assignment provided at least three days before the interview; the designs are explained and discussed during the interview itself.
- Stage 4: 45-minute leadership and cultural interview.

If you require any reasonable adjustments during the recruitment process due to a disability or a long-term health condition, there will be an opportunity for you to inform us via the online application form. We will do our best to accommodate your needs. Please note that successful applicants will be subject to satisfactory background checks, including DBS, due to working in a regulated industry.

We are committed to an equitable recruitment process. As such, applications must be submitted via our official online application procedure. Please refrain from sending your CV directly to our recruiters. If you experience technical difficulties or require additional support with submitting your online application, contact the Recruiter.

Why join us
Joining us is your opportunity to pursue potential. You will belong to a collaborative team that is exploring new and better ways to serve students, teachers and researchers across the globe, for the benefit of individuals, society and the world. Sharing our mission will inspire your own growth, development and progress, in an environment which embraces difference, change and aspiration. Cambridge University Press & Assessment is committed to being a place where anyone can enjoy a successful career, where it is safe to speak up, and where we learn continuously to improve together. We welcome applications from all candidates, regardless of demographic characteristics (age, disability, educational attainment, ethnicity, gender, marital status, neurodiversity, religion, sex, gender identity and sexual identity), or cultural or social class/background. We believe better outcomes come through diversity of thought, background and approach, and we actively seek to employ people from a wide range of different communities.
02/03/2026
Full time
Job Title: Principal Data Scientist Salary: £74,200 - £99,250 Location: Cambridge/Hybrid with 2 day per week at the office Contract: Permanent  Hours: Full time 35 hours per week Are you excited by the challenge of applying data science and AI to problems that genuinely matter? At Cambridge Assessment, we are transforming how assessments are designed, delivered and marked worldwide. As a Principal Data Scientist , you will play a pivotal role at the heart of this transformation – leading our data science capability for AI-enabled assessment products used by millions of learners globally. We are Cambridge University Press & Assessment, a world-leading academic publisher and assessment organisation and a proud part of the University of Cambridge.  This is a senior, influential role where you will combine deep technical expertise with strategic leadership. You will shape our data strategy, lead and mentor a growing team, and work closely with researchers, engineers and product teams to turn complex data into insight, innovation and trusted solutions. About the role    As Principal Data Scientist, you will lead the operational data science and analytics capability within our Assessment & Research Capabilities (ARC) function. You will be the data leader for automarking, representing ARC's data capability across Exam Technology and the wider organisation. 
You will: Set the direction for data science and analytics supporting automarking and AI-driven assessment Lead and grow a small, high-impact team of data scientists and engineers Curate high-quality data products used across research, machine learning and product teams Act as a trusted partner to senior stakeholders, influencing product and research decisions with evidence and insight Ensure sensitive exam and candidate data is handled responsibly and ethically Additional responsibilities and accountabilities include:   Lead data science, data engineering and analytics activities within ARC Define and own the data strategy for automarking and related AI capabilities Design and oversee data warehouses, pipelines and integrations with the wider organisation Translate complex business and research needs into robust data solutions Provide expert input into product, research and architectural decisions, up to board level Build strong relationships with internal teams and external research partners Champion best practice in data quality, DataOps and analytics engineering This position has been classified as a hybrid role, requiring the selected candidate to typically spend 40-60% of their time collaborating and connecting face-to-face at their dedicated location. Aside from our hybrid principles, other flexible working requests will be considered from the first day of employment, including other work arrangements should you require adjustments due to a disability or long-term health condition.  
About You To be successful in this role, you will bring: • Extensive experience in data science, analytics or analytics engineering in a complex environment • Advanced SQL skills, including writing, analysing and optimising large analytical queries • Strong experience with a data science programming language such as Python, R or Julia • Hands-on experience with data transformation tools such as dbt, Dataform or SQLMesh • Experience using BI and visualisation tools such as Metabase, Looker, Tableau or Power BI • A strong understanding of data warehousing principles (e.g. Kimball methodology) • Experience designing data models that enable self-service analytics • Proven ability to translate business or research questions into data-driven insights • Experience communicating complex technical concepts to non-technical and senior audiences • Leadership experience, including mentoring and guiding other data professionals If you meet the above minimum requirements, we encourage you to apply. Your application will be even stronger if you can also demonstrate the following desirable criteria: • Machine learning or AI product experience • Exposure to automarking, assessment, or high-stakes data environments • Skills in experimentation and statistical analysis (A/B testing, forecasting) • Familiarity with DataOps (CI/CD, testing, orchestration, observability) For a detailed job description, please refer to the link at the bottom of the advert on our careers site. We are a Disability Confident (DC) employer that is committed to equality and inclusion, ensuring our recruitment process is accessible to all. The DC scheme's Offer of an Interview commitment applies to applicants who opt in, disclose a disability or a long-term health condition, and best meet the minimum criteria for the role. 
In instances where interviewing all qualifying candidates is not practicable, we prioritise those who best meet the minimum criteria, as we would for applicants who do not have a disability or long-term health condition. Cambridge University Press & Assessment is an approved UK employer for the sponsorship of eligible roles and applicants under the Skilled Worker visa route. Please refer to the gov.uk website for guidance to understand your own eligibility based on the role you are applying for. Rewards and benefits We will support you to be at your best in work and to live well outside of it. In addition to competitive salaries, we offer a world-class, flexible rewards package, featuring family-friendly and planet-friendly benefits including: • 28 days annual leave plus bank holidays • Private medical and Permanent Health Insurance • Discretionary annual bonus • Group personal pension scheme • Life assurance up to 4 x annual salary • Green travel schemes Ready to pursue your potential? Apply now. We aim to support candidates by making our interview process clear and transparent. The closing date for all applications will be 13 March 2026. We will review applications on an ongoing basis, and shortlisted candidates can expect interviews to take place shortly after it closes. As part of the application process, you can expect: • At application stage: four technical questions to answer when submitting your CV. • Stage 1: a 30-minute screening call with the hiring manager. • Stage 2: a 60-minute session with questions about key skills, plus a code review or whiteboard exercise. • Stage 3: a 90-minute system design exercise, with an assignment provided at least three days before the interview; the designs are explained and discussed during the interview. • Stage 4: a 45-minute leadership and cultural interview. 
If you require any reasonable adjustments during the recruitment process due to a disability or a long-term health condition, there will be an opportunity for you to inform us via the online application form. We will do our best to accommodate your needs. Please note that successful applicants will be subject to satisfactory background checks, including DBS, due to working in a regulated industry. We are committed to an equitable recruitment process. As such, applications must be submitted via our official online application procedure. Please refrain from sending your CV directly to our recruiters. If you experience technical difficulties or require additional support with submitting your online application, contact the Recruiter. Why join us Joining us is your opportunity to pursue potential. You will belong to a collaborative team that is exploring new and better ways to serve students, teachers and researchers across the globe – for the benefit of individuals, society and the world. Sharing our mission will inspire your own growth, development and progress, in an environment which embraces difference, change and aspiration. Cambridge University Press & Assessment is committed to being a place where anyone can enjoy a successful career, where it is safe to speak up, and where we learn continuously to improve together. We welcome applications from all candidates, regardless of demographic characteristics (age, disability, educational attainment, ethnicity, gender, marital status, neurodiversity, religion, sex, gender identity and sexual identity), cultural background, or social class. We believe better outcomes come through diversity of thought, background and approach, and we actively seek to employ people from a wide range of different communities.
Stott and May
Data Architect
Stott and May
Role Title: Data Architect Location: London - Hybrid (2 days in office) Day Rate: Market rate (Inside IR35) Contract Duration: Minimum 6 months Role Overview We are seeking a talented Data Architect to join our data-driven transformation journey. This is an exciting opportunity to work with cutting-edge technologies such as Data Vault, Snowflake, dbt, Airflow, and AWS/Azure, designing and delivering innovative data solutions that empower analytics and drive business value. As a Data Architect, you will collaborate closely with Product Owners, engineers, and analytics teams to design, build, and maintain robust data models and architecture. This role is ideal for a professional passionate about creating order within complex data structures and delivering impactful, scalable data solutions. Key Responsibilities: • Design and develop enterprise data architecture and advanced data models that support business requirements. • Triage new requirements, assess impact, and provide estimates for updates to data models. • Develop and enhance physical data models while ensuring alignment with business and technical standards. • Act as custodian of data models, defining and enforcing modelling principles within delivery teams. • Ensure solutions meet business needs while aligning with the overall end-state architecture. • Document, communicate, and centrally manage data models to promote consistency and reuse. • Collaborate with the broader architecture community to share expertise and promote innovation. • Stay up to date with emerging technologies, including machine learning, AI/ML, and modern algorithmic techniques at scale, applying best practices to data design. • Translate business requirements into actionable, scalable, and effective data solutions. Essential Skills & Experience: • Strong experience in data architecture, modelling, and data warehousing. • Expertise with various modelling methodologies, including 3NF, Dimensional, and Data Vault. • Proven ability to work with large, complex datasets and deliver actionable insights. • Familiarity with cloud platforms (AWS/Azure) and experience with cloud data solutions/migrations. • Strong communication and influencing skills to engage with stakeholders and cross-functional teams. • Critical thinking and problem-solving skills, with the ability to translate business requirements into technical solutions. • Collaborative mindset with strong interpersonal skills. Desirable Skills: • Exposure to AI/ML and Generative AI technologies. • Experience with data visualisation tools such as Power BI or MicroStrategy.
08/10/2025
Contractor
RecruitmentRevolution.com
VP Engineering - Head of Software Development. AI Martech SaaS
RecruitmentRevolution.com City, Manchester
Welcome to ASK BOSCO, thanks for stopping by. Let's pause for a second. Before we talk perks, equity, or growth stats, let's flip the script. This isn't about us. Not yet. This is about you: • What's driving your search right now, what's prompting you to take the next big step in your career? • Are you looking for a role where you can lead a high-performing team, shape how technology is applied, and make a direct impact on growth and customer success? • Do you want to work in a fast-paced, idea-driven environment where your voice matters, but delivery, stability, and scalability always come first? Hold onto those thoughts. Now let us introduce you to something special, a chance to join us as VP of Engineering at ASK BOSCO as we build the foundations for our next phase of hyper-growth. By the end, if it doesn't feel like the right fit, no worries. But if you feel that spark, the same one we've got, this could be the start of something extraordinary. The Role at a Glance: VP of Engineering (Python) Hybrid Leeds HQ, 2 Days per Week £120,000 + Equity Potential Plus Benefits: 4-day week, 23 days annual leave + bank holidays, health insurance, retail & leisure perks, electric car scheme Values & Culture: Outstanding Company to Work For 2024 Company: AI-powered marketing analytics platform Pedigree: Visionary Founder. Backed by renowned tech entrepreneurs, including the co-founder of SkyScanner. Profits with Purpose: 10% of profits donated to 1moreChild orphanage annually Markets: Marketing & eCommerce Agencies, eCommerce retailers Your Expertise: You're a proven engineering leader with deep experience in building, scaling, and managing cross-functional tech teams: data engineers, data scientists, data analysts, and developers. You've led teams through transformational growth, delivering stable, scalable SaaS products that retain and delight customers. 
You understand what it takes to grow a business from early-stage traction to a large-scale, enterprise-grade platform, and you're ready to lay the technical foundations for our next phase of hyper-growth. You also bring strong experience managing infrastructure partners, confidently navigating and shaping these critical relationships. This isn't a back-seat role. You'll be hands-on where it counts, partnering with the Chief Product Officer to drive delivery, scalability, and long-term success. You'll guide senior stakeholders with clear, data-driven insights, helping keep the roadmap on track in a fast-moving, idea-driven environment. What You'll Be Driving: • Drive technical execution: Deliver and enhance our architecture, infrastructure, and product roadmap to ensure scalable growth and high-performing solutions. • Lead the team: Mentor and grow data engineers, scientists, analysts, and developers to deliver a high-performing, collaborative culture. • Ensure delivery & stability: Make sure features are delivered on time, scalable, and reliable, supporting long-term customer retention. • Manage key relationships: Own the infrastructure partner relationship to guarantee uptime, performance, and security. • Partner with the Chief Product Officer: Align on roadmap priorities, balancing strategic vision with day-to-day execution. • Drive enterprise-grade excellence: Embed best-in-class engineering practices (agile, CI/CD, DevOps, security-first design) across teams. • Champion scalability: Build products and systems that can grow globally and support tens of thousands of customers. What You'll Bring: • Experience as VP of Engineering, CTO, or senior technical leader in a high-growth or Series A-stage company. • Strong Python development skills. • Proven ability to scale cross-functional teams: data engineering, data science, analytics, and development. • Hands-on experience delivering stable, scalable SaaS products with measurable customer impact. 
• Experience managing infrastructure partners and external technical relationships. • Confident and polite stakeholder management, able to challenge senior leaders while keeping delivery on track. • Track record of leading transformational growth, scaling teams and products successfully. • Excellent communication - bridging tech, product, and commercial discussions seamlessly. • Customer-first mindset - building products that retain and delight users. This is a role for someone who's been there, done it, and is ready to do it again with more ownership, autonomy, and impact than ever before in a fast-paced, high-growth environment. Inclusive Culture At ASK BOSCO, everybody is invited with open arms. We embrace diversity in all forms: race, gender, age, sexual orientation, disability, and beyond. Our mission is stronger when everyone can bring their authentic selves to work. Can you see yourself building the tech engine that drives our next phase of hyper-growth? If high-impact, high-growth, and hands-on leadership excites you, let's make it happen. Apply now and let's talk. P.S. Did we have you at the four-day week? Application notice We take your privacy seriously. As you might expect, you may be contacted by email, text or telephone. Your data is processed by our talent partner RR (Recruitment Revolution) on the basis of their legitimate interests in fulfilling the recruitment process. Please refer to their Data Privacy Policy & Notice on their website for further details.
04/10/2025
Full time
RecruitmentRevolution.com
VP Engineering - Head of Software Development. AI Martech SaaS
RecruitmentRevolution.com City, Leeds
Welcome to ASK BOSCO, thanks for stopping by. Let's pause for a second. Before we talk perks, equity, or growth stats, let's flip the script. This isn't about us. Not yet. This is about you: • What's driving your search right now, what's prompting you to take the next big step in your career? • Are you looking for a role where you can lead a high-performing team, shape how technology is applied, and make a direct impact on growth and customer success? • Do you want to work in a fast-paced, idea-driven environment where your voice matters, but delivery, stability, and scalability always come first? Hold onto those thoughts. Now let us introduce you to something special, a chance to join us as VP of Engineering at ASK BOSCO as we build the foundations for our next phase of hyper-growth. By the end, if it doesn't feel like the right fit, no worries. But if you feel that spark, the same one we've got, this could be the start of something extraordinary. The Role at a Glance: VP of Engineering (Python) Hybrid Leeds HQ, 2 Days per Week £120,000 + Equity Potential Plus Benefits: 4-day week, 23 days annual leave + bank holidays, health insurance, retail & leisure perks, electric car scheme Values & Culture: Outstanding Company to Work For 2024 Company: AI-powered marketing analytics platform Pedigree: Visionary Founder. Backed by renowned tech entrepreneurs, including the co-founder of SkyScanner. Profits with Purpose: 10% of profits donated to 1moreChild orphanage annually Markets: Marketing & eCommerce Agencies, eCommerce retailers Your Expertise: You're a proven engineering leader with deep experience in building, scaling, and managing cross-functional tech teams: data engineers, data scientists, data analysts, and developers. You've led teams through transformational growth, delivering stable, scalable SaaS products that retain and delight customers. 
You understand what it takes to grow a business from early-stage traction to a large-scale, enterprise-grade platform, and you're ready to lay the technical foundations for our next phase of hyper-growth. You also bring strong experience managing infrastructure partners, confidently navigating and shaping these critical relationships. This isn't a back-seat role. You'll be hands-on where it counts, partnering with the Chief Product Officer to drive delivery, scalability, and long-term success. You'll guide senior stakeholders with clear, data-driven insights, helping keep the roadmap on track in a fast-moving, idea-driven environment. What You'll Be Driving: • Drive technical execution: Deliver and enhance our architecture, infrastructure, and product roadmap to ensure scalable growth and high-performing solutions. • Lead the team: Mentor and grow data engineers, scientists, analysts, and developers to deliver a high-performing, collaborative culture. • Ensure delivery & stability: Make sure features are delivered on time, scalable, and reliable, supporting long-term customer retention. • Manage key relationships: Own the infrastructure partner relationship to guarantee uptime, performance, and security. • Partner with the Chief Product Officer: Align on roadmap priorities, balancing strategic vision with day-to-day execution. • Drive enterprise-grade excellence: Embed best-in-class engineering practices (agile, CI/CD, DevOps, security-first design) across teams. • Champion scalability: Build products and systems that can grow globally and support tens of thousands of customers. What You'll Bring: • Experience as VP of Engineering, CTO, or senior technical leader in a high-growth or Series A-stage company. • Strong Python development skills. • Proven ability to scale cross-functional teams: data engineering, data science, analytics, and development. • Hands-on experience delivering stable, scalable SaaS products with measurable customer impact. 
• Experience managing infrastructure partners and external technical relationships. • Confident and polite stakeholder management, able to challenge senior leaders while keeping delivery on track. • Track record of leading transformational growth, scaling teams and products successfully. • Excellent communication - bridging tech, product, and commercial discussions seamlessly. • Customer-first mindset - building products that retain and delight users. This is a role for someone who's been there, done it, and is ready to do it again with more ownership, autonomy, and impact than ever before in a fast-paced, high-growth environment. Inclusive Culture At ASK BOSCO, everybody is invited with open arms. We embrace diversity in all forms: race, gender, age, sexual orientation, disability, and beyond. Our mission is stronger when everyone can bring their authentic selves to work. Can you see yourself building the tech engine that drives our next phase of hyper-growth? If high-impact, high-growth, and hands-on leadership excites you, let's make it happen. Apply now and let's talk. P.S. Did we have you at the four-day week? Application notice We take your privacy seriously. As you might expect, you may be contacted by email, text or telephone. Your data is processed by our talent partner RR (Recruitment Revolution) on the basis of their legitimate interests in fulfilling the recruitment process. Please refer to their Data Privacy Policy & Notice on their website for further details.
03/10/2025
Full time
United Utilities
Graduate Data Scientist
United Utilities Warrington, Cheshire
Who are United Utilities? United Utilities is responsible for water and wastewater services in the North West of England. We will undertake the largest investment in water and wastewater services in the North West in 100 years - that's more than £13 billion worth of investment. Now, more than ever, we need inspiring future talent to help make the North West stronger, greener, and healthier. Whatever area of our business your interests lie in, our 3-year graduate programme will give you first-class training and support, together with an in-depth understanding of your chosen business area - so that you can develop the skills you need to progress your career as you make a real contribution to the communities we serve. You might enter our scheme as a graduate, but one thing's for sure: before long you'll be heading into your next role as a technical, operational or people leader. Working in the North West means there's a lot of ground to cover. You could have the opportunity to work in the heart of the Lake District, or you could sample a taste of city life. As well as our main office in Warrington, we also recruit for a variety of roles based across the North West; we have 575 wastewater treatment works and 96 water treatment works in areas from Crewe in Cheshire to Carlisle in Cumbria. Whichever scheme you join, you'll be at the heart of bringing innovation and positive change to the water industry. Ready to flow into your future with us? The Role Join United Utilities' 3-year Graduate Programme in our Data & Analytics team and kickstart your career! This role will give you the opportunity to learn skills and deliver data products using leading business intelligence and advanced analytics technologies including Power BI, Python, R and SQL.
You'll be encouraged and supported to develop important skills for a Data Scientist, such as connecting to different data sources, manipulating different data types into useful formats, analysing and visualising data, and applying machine learning, artificial intelligence and statistical methods to create insight from data. Driving value from our data is key to delivering our services and improving our performance. This is your chance to make a real impact, improve efficiency, and help shape a sustainable future. Start living your future today!
What will I be doing?
• Work with business teams to identify requirements and understand where analytic insight can add value
• Develop analytic insight based on functional designs and user requirements
• Develop a technical understanding and knowledge of key infrastructure and analytics tools
• Develop analytical models and visualisations to support strategic and operational decision making
• Communicate findings via data visualisation to a range of technical and non-technical audiences
• Implement data models, data quality business rules and front-end user experiences
• Ensure compliance with relevant corporate policies regarding data security and processing
• Build statistical models from the data and follow best coding practices
• Adhere to the ethics framework for data analytics
• Adopt a culture of continuous improvement
• Work in an open, transparent and collaborative manner, sharing good practice and seeking to continuously improve the quality of outputs
• Develop services that are automated, reliable and secure
• Work closely with data architects, data engineers, data managers, data owners and data developers
• Recommend and implement ways to improve data efficiency, performance and reliability
What do I need to be successful? We require our graduates to have a minimum 2:1 degree in Data Science, Mathematics, or a related field. You will also need:
• Strong technical skills in programming, statistical analysis, and data visualisation
• Critical thinking and proactive problem-solving abilities
• Excellent written and interpersonal skills to communicate effectively with technical and non-technical audiences
• Collaboration skills to work across different teams and achieve goals
• Flexibility to work in various teams and travel if needed
• An ability to embrace challenges and be resourceful
Additional Information Our recruitment process requires you to complete: an online application form, online tests, an online MS Teams interview, and then an assessment centre. Please note that you must be available to attend an in-person recruitment stage during the period 17th November - 19th December. We are an equal opportunity employer committed to creating a diverse environment. All qualified applicants will be considered without regard to race, ethnicity, religion, gender, sexual orientation, disability, or age. If you require any reasonable adjustments throughout your recruitment journey, please let us know. If you are offered a job with us, a number of pre-employment checks need to be carried out before your appointment can be confirmed. Any offer of employment with United Utilities will be subject to a satisfactory checking report from the Disclosure and Barring Service/Disclosure Scotland. Application deadline: 13th October 2025 - please apply early as deadlines are subject to change. Based on current immigration guidelines, this role is not eligible for visa sponsorship.
03/10/2025
Full time
RecruitmentRevolution.com
VP of Software Engineering - 4 Day Week. AI Powered Martech SaaS
RecruitmentRevolution.com Manchester, Lancashire
Welcome to ASK BOSCO, thanks for stopping by. Let's pause for a second. Before we talk perks, equity, or growth stats, let's flip the script. This isn't about us. Not yet. This is about you: • What's driving your search right now - what's prompting you to take the next big step in your career? • Are you looking for a role where you can lead a high-performing team, shape how technology is applied, and make a direct impact on growth and customer success? • Do you want to work in a fast-paced, idea-driven environment where your voice matters, but delivery, stability, and scalability always come first? Hold onto those thoughts. Now let us introduce you to something special - a chance to join us as VP of Engineering at ASK BOSCO as we build the foundations for our next phase of hyper-growth. By the end, if it doesn't feel like the right fit, no worries. But if you feel that spark - the same one we've got - this could be the start of something extraordinary. The Role at a Glance: VP of Engineering. Hybrid - Leeds HQ, 2 days per week. £120,000 plus equity potential. Plus benefits: 4-day week, 23 days annual leave plus bank holidays, health insurance, retail & leisure perks, electric car scheme. Values & Culture: Outstanding Company to Work For 2024. Company: AI-powered marketing analytics platform. Pedigree: Visionary founder. Backed by renowned tech entrepreneurs, including a co-founder of SkyScanner. Profits with Purpose: 10% of profits donated to the 1moreChild orphanage annually. Markets: Marketing & eCommerce agencies, eCommerce retailers. Your Expertise: You're a proven engineering leader with deep experience in building, scaling, and managing cross-functional tech teams: data engineers, data scientists, data analysts, and developers. You've led teams through transformational growth, delivering stable, scalable SaaS products that retain and delight customers.
You understand what it takes to grow a business from early-stage traction to a large-scale, enterprise-grade platform, and you're ready to lay the technical foundations for our next phase of hyper-growth. You also bring strong experience managing infrastructure partners, confidently navigating and shaping these critical relationships. This isn't a back-seat role. You'll be hands-on where it counts, partnering with the Chief Product Officer to drive delivery, scalability, and long-term success. You'll guide senior stakeholders with clear, data-driven insights, helping keep the roadmap on track in a fast-moving, idea-driven environment. What You'll Be Driving: • Drive technical execution: Deliver and enhance our architecture, infrastructure, and product roadmap to ensure scalable growth and high-performing solutions. • Lead the team: Mentor and grow data engineers, scientists, analysts, and developers to deliver a high-performing, collaborative culture. • Ensure delivery & stability: Make sure features are delivered on time, scalable, and reliable, supporting long-term customer retention. • Manage key relationships: Own the infrastructure partner relationship to guarantee uptime, performance, and security. • Partner with the Chief Product Officer: Align on roadmap priorities, balancing strategic vision with day-to-day execution. • Drive enterprise-grade excellence: Embed best-in-class engineering practices - agile, CI/CD, DevOps, security-first design - across teams. • Champion scalability: Build products and systems that can grow globally and support tens of thousands of customers. What You'll Bring: • Experience as VP of Engineering, CTO, or senior technical leader in a high-growth or Series A-stage company. • Proven ability to scale cross-functional teams: data engineering, data science, analytics, and development. • Hands-on experience delivering stable, scalable SaaS products with measurable customer impact.
• Experience managing infrastructure partners and external technical relationships. • Confident and polite stakeholder management, able to challenge senior leaders while keeping delivery on track. • Track record of leading transformational growth, scaling teams and products successfully. • Excellent communication - bridging tech, product, and commercial discussions seamlessly. • Customer-first mindset - building products that retain and delight users. This is a role for someone who's been there, done it, and is ready to do it again with more ownership, autonomy, and impact than ever before in a fast-paced, high-growth environment. Inclusive Culture At ASK BOSCO, everybody is invited with open arms. We embrace diversity in all forms - race, gender, age, sexual orientation, disability, and beyond. Our mission is stronger when everyone can bring their authentic selves to work. Can you see yourself building the tech engine that drives our next phase of hyper-growth? If high-impact, high-growth, and hands-on leadership excites you, let's make it happen. Apply now and let's talk. P.S. Did we have you at the four-day week? Application notice We take your privacy seriously. As you might expect, you may be contacted by email, text or telephone. Your data is processed by our talent partner RR (Recruitment Revolution) on the basis of their legitimate interests in fulfilling the recruitment process. Please refer to their Data Privacy Policy & Notice on their website for further details.
03/10/2025
Full time
Greencore
Senior Data Engineer
Greencore Worksop, Nottinghamshire
Why Greencore? We're a leading manufacturer of convenience food in the UK and our purpose is to make everyday taste better! We're a vibrant, fast-paced leading food manufacturer, employing 13,300 colleagues across 16 manufacturing units and 17 distribution depots across the UK. We supply all the UK's food retailers with everything from sandwiches, soups and sushi to cooking sauces, pickles and ready meals, and in FY24 we generated revenues of £1.8bn. Our vast direct-to-store (DTS) distribution network, comprising 17 depots nationwide, enables us to make over 10,500 daily deliveries of our own chilled and frozen produce and that of third parties. Why is this exciting for your career as a Senior Data Engineer? The MBE Programme presents a huge opportunity for colleagues across the technology function to play a central role in the design, shape, delivery and execution of an enterprise-wide digital transformation programme. The complexity of the initiative, within a FTSE 250 business, will allow for large-scale problem solving, group-wide impact assessment and supporting the delivery of an enablement project to future-proof the business. Why did we embark on Making Business Easier? Over time, processes have become increasingly complex, increasing both the risk and cost they pose whilst restricting our agility. At the same time, our customers and the market expect more from us than ever before. Making Business Easier forms a fundamental foundation for our commercial and operational excellence agendas, whilst helping us manage our cost base effectively in the future. The MBE Programme will streamline and simplify core processes, provide easier access to quality business data and will invest in the right technology to enable these processes. What you'll be doing: As a Senior Data Engineer, you will play a key role in shaping and delivering enterprise-wide data solutions that translate complex business requirements into scalable, high-performance data platforms.
In this role, you will help define and guide the structure of data systems, focusing on seamless integration, accessibility, and governance, while optimising data flows to support both analytics and operational needs. Collaborating closely with business stakeholders, data engineers, and analysts, you will ensure that data platforms are robust, efficient, and adaptable to evolving business priorities. You will also support the usage, alignment, and consistency of data models; therefore, you will have a wide-ranging role across many business projects and deliverables.
• Shape and implement data solutions that align with business objectives and leverage both cloud and on-premise technologies
• Translate complex business needs into scalable, high-performing data solutions
• Support the development and application of best practices in data governance, security, and system design
• Collaborate closely with business stakeholders, product teams, and engineers to design and deliver effective, integrated data solutions
• Optimise data flows and pipelines to enable a wide range of analytical and operational use cases
• Promote data consistency across transactional and analytical systems through well-designed integration approaches
• Contribute to the design and ongoing improvement of data platforms - including data lakes, data warehouses, and other distributed storage environments - focused on efficiency, scalability, and ease of maintenance
• Mentor and support junior engineers and analysts in applying best practices in data engineering and solution design
What you'll need:
• 5+ years of Data Engineering experience, with expertise in Azure data services and/or Microsoft Fabric
• Strong expertise in designing scalable data platforms and managing cloud-based data ecosystems
• Proven track record in data integration, ETL processes, and optimising large-scale data systems
• Expertise in cloud-based data platforms (AWS, Azure, Google Cloud) and distributed storage solutions
• Proficiency in Python, PySpark, SQL, NoSQL, and data processing frameworks (Spark, Databricks)
• Expertise in ETL/ELT design and orchestration in Azure, as well as pipeline performance tuning & optimisation
• Competence in integrating relational, NoSQL, and streaming data sources
• Management of CI/CD pipelines & Git-based workflows
• Good knowledge of data governance, privacy regulations, and security best practices
• Experience with modern data architectures, including data lakes, data mesh, and event-driven data processing
• Strong problem-solving and analytical skills to translate complex business needs into scalable data solutions
• Excellent communication and stakeholder management to align business and technical goals
• High attention to detail and commitment to data quality, security, and governance
• Ability to mentor and guide teams, fostering a culture of best practices in data architecture
• Power BI and DAX for data visualisation (desirable)
• Knowledge of Azure Machine Learning and AI services (desirable)
• Experience with streaming platforms like Event Hub or Kafka
• Familiarity with cloud cost optimisation techniques (desirable)
What you'll get:
• Competitive salary and job-related benefits
• 25 days holiday allowance plus bank holidays
• Car allowance
• Annual target bonus
• Pension up to 8% matched
• PMI cover: individual
• Life insurance up to 4x salary
• Company share save scheme
• Greencore Qualifications
• Exclusive Greencore employee discount platform
• Access to a full Wellbeing Centre platform
03/10/2025
Full time
Greencore
Senior Data Engineer
Greencore Scofton, Nottinghamshire
Why Greencore? We're a leading manufacturer of convenience food in the UK, and our purpose is to make everyday taste better! We're a vibrant, fast-paced food manufacturer employing 13,300 colleagues across 16 manufacturing units and 17 distribution depots across the UK. We supply all the UK's food retailers with everything from sandwiches, soups and sushi to cooking sauces, pickles and ready meals, and in FY24 we generated revenues of £1.8bn. Our vast direct-to-store (DTS) distribution network, comprising 17 depots nationwide, enables us to make over 10,500 daily deliveries of our own chilled and frozen produce and that of third parties.

Why is this exciting for your career as a Senior Data Engineer? The MBE Programme presents a huge opportunity for colleagues across the technology function to play a central role in the design, shape, delivery and execution of an enterprise-wide digital transformation programme. The complexity of the initiative, within a FTSE 250 business, will allow for large-scale problem solving, group-wide impact assessment and supporting the delivery of an enablement project to future-proof the business.

Why did we embark on Making Business Easier? Over time our processes have become increasingly complex, increasing both the risk and cost they pose while restricting our agility. At the same time, our customers and the market expect more from us than ever before. Making Business Easier forms a fundamental foundation for our commercial and operational excellence agendas, while helping us manage our cost base effectively in the future. The MBE Programme will streamline and simplify core processes, provide easier access to quality business data, and invest in the right technology to enable these processes.

What you'll be doing: As a Senior Data Engineer, you will play a key role in shaping and delivering enterprise-wide data solutions that translate complex business requirements into scalable, high-performance data platforms.
In this role, you will help define and guide the structure of data systems, focusing on seamless integration, accessibility, and governance, while optimising data flows to support both analytics and operational needs. Collaborating closely with business stakeholders, data engineers, and analysts, you will ensure that data platforms are robust, efficient, and adaptable to evolving business priorities. You will also support the usage, alignment, and consistency of data models, and will therefore have a wide-ranging role across many business projects and deliverables.

- Shape and implement data solutions that align with business objectives and leverage both cloud and on-premise technologies
- Translate complex business needs into scalable, high-performing data solutions
- Support the development and application of best practices in data governance, security, and system design
- Collaborate closely with business stakeholders, product teams, and engineers to design and deliver effective, integrated data solutions
- Optimise data flows and pipelines to enable a wide range of analytical and operational use cases
- Promote data consistency across transactional and analytical systems through well-designed integration approaches
- Contribute to the design and ongoing improvement of data platforms - including data lakes, data warehouses, and other distributed storage environments - focused on efficiency, scalability, and ease of maintenance
- Mentor and support junior engineers and analysts in applying best practices in data engineering and solution design

What you'll need:
- 5+ years of Data Engineering experience, with expertise in Azure data services and/or Microsoft Fabric
- Strong expertise in designing scalable data platforms and managing cloud-based data ecosystems
- Proven track record in data integration, ETL processes, and optimising large-scale data systems
- Expertise in cloud-based data platforms (AWS, Azure, Google Cloud) and distributed storage solutions
- Proficiency in Python, PySpark, SQL, NoSQL, and data processing frameworks (Spark, Databricks)
- Expertise in ETL/ELT design and orchestration in Azure, as well as pipeline performance tuning and optimisation
- Competence in integrating relational, NoSQL, and streaming data sources
- Management of CI/CD pipelines and Git-based workflows
- Good knowledge of data governance, privacy regulations, and security best practices
- Experience with modern data architectures, including data lakes, data mesh, and event-driven data processing
- Strong problem-solving and analytical skills to translate complex business needs into scalable data solutions
- Excellent communication and stakeholder management to align business and technical goals
- High attention to detail and commitment to data quality, security, and governance
- Ability to mentor and guide teams, fostering a culture of best practices in data architecture
- Power BI and DAX for data visualisation (desirable)
- Knowledge of Azure Machine Learning and AI services (desirable)
- Experience with streaming platforms like Event Hub or Kafka (desirable)
- Familiarity with cloud cost optimisation techniques (desirable)

What you'll get:
- Competitive salary and job-related benefits
- 25 days holiday allowance plus bank holidays
- Car allowance
- Annual target bonus
- Pension matched up to 8%
- PMI cover
- Individual life insurance up to 4x salary
- Company share save scheme
- Greencore Qualifications
- Exclusive Greencore employee discount platform
- Access to a full Wellbeing Centre platform
02/10/2025
Full time
Tenth Revolution Group
EXL - Data Solutions Lead Pre Sales
Tenth Revolution Group City, London
Data Solutions Lead - £95,000 - Hybrid

About the Role: We are seeking a dynamic and client-focused Pre-Sales Data Solutions Architect to join our high-performing team. In this role, you will lead the design and positioning of modern data platform solutions for enterprise clients. Your primary focus will be on pre-sales solutioning, including RFP/RFI response leadership, client workshops, and translating complex data architectures into compelling business value propositions.

Key Responsibilities

Client-Facing Solution Leadership
- Lead the creation of tailored, data-driven solutions in response to RFPs, RFIs, and client needs.
- Engage with senior stakeholders to understand business objectives and align them with scalable, modern data platform architectures.
- Craft solution narratives that clearly articulate business value and technical feasibility.

Cross-Functional Collaboration
- Collaborate with sales, data architects, engineers, and delivery teams to shape commercially viable and technically sound solutions.
- Conduct discovery workshops and requirements-gathering sessions with clients to inform solution design.
- Act as a bridge between business and technical teams, ensuring a shared understanding of goals and outcomes.

Proposal Management & Industry Insight
- Own and manage end-to-end proposal processes, including content creation, reviews, pricing inputs, and executive summaries.
- Maintain a repository of reusable proposal assets, templates, and case studies.
- Stay current on trends in cloud platforms (Azure, AWS, GCP), data governance, analytics, and emerging technologies to inform solution strategy.

Required Qualifications
- Proven experience in a pre-sales or consulting role focused on data platforms, cloud architecture, or analytics solutions.
- Deep knowledge of modern cloud data platforms (Azure, AWS, GCP), including data lakes, warehousing, governance, and pipelines.
- Demonstrated ability to lead RFP/RFI/RFS responses and engage effectively with senior business and IT stakeholders.
- Strong communication skills with the ability to translate complex technical concepts into clear business benefits.
- Experience with discovery workshops, solution design sessions, and proposal presentations.

Preferred Qualifications
- Prior experience in a consulting, systems integrator, or cloud services environment.
- Certifications in Azure, AWS, or GCP (e.g., Azure Data Engineer, AWS Data Analytics, GCP Professional Data Engineer).
- Familiarity with data governance tools and practices (e.g., Collibra, Informatica, Microsoft Purview).
- Industry-specific knowledge in sectors like Financial Services, Healthcare, or Retail is a plus.

To apply for this role please submit your CV or contact Dillon Blackburn on (phone number removed) or at (url removed). Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK, offering more opportunities across the country than any other recruitment agency. We're the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.
02/10/2025
Full time
Client Server Ltd.
Senior Data Engineer Python AWS
Client Server Ltd.
Senior Data Engineer (Python AWS Data Lake) London / WFH to £80k

Are you a data technologist with fluent Python coding skills who enjoys working with and continually learning new technologies? You could be progressing your career in a senior, hands-on role at a "Tech for Good" company that is enabling life-changing education to be accessed by millions of students across the globe (currently 22 countries and growing).

As a Data Engineer you'll help to design, build and operate the data pipelines and data services that power learning experiences. Collaborating with software engineers, data scientists and product managers, you will grow your expertise while contributing to systems that need to process 25,000-35,000 messages per second to support the learning experiences as well as reporting and analytics requirements across the business as it scales to tens of millions of students. Typically you'll be developing and maintaining reliable data pipelines for both real-time streaming and batch workloads, assisting with the evolution of the Lakehouse, and collaborating with analytics, product and machine learning teams to make data accessible, trustworthy and reusable. You'll be encouraged to contribute ideas and investigate new and emerging technologies.

Location / WFH: You'll join the team in London four days a week with flexibility to work from home once a week.

About you:
- You have experience as a Data Engineer, working on scalable systems
- You have Python coding skills
- You have experience with at least one streaming technology, Kafka preferred
- You have strong SQL skills (e.g. PostgreSQL, MySQL)
- You have strong AWS knowledge, ideally including S3 and Lambda
- You have a good understanding of data modelling and batch / streaming ETL / ELT patterns
- You're collaborative and pragmatic with great communication skills

What's in it for you: As a Senior Data Engineer you will receive a competitive package:
- Salary to £80k
- 25 days holiday
- Private medical insurance
- Training and certifications
- Perks such as restaurant and retail discounts

Apply now to find out more about this Senior Data Engineer opportunity. At Client Server we believe in a diverse workplace that allows people to play to their strengths and continually learn. We're an equal opportunities employer whose people come from all walks of life and will never discriminate based on race, colour, religion, sex, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status. The clients we work with share our values.
01/10/2025
Full time
fortice
SC Cleared SAS/Operational Engineer
fortice Telford, Shropshire
We are heading up a recruitment drive for a global consultancy that requires an SC Cleared Operational Engineer to join them on a major government project based in Telford 2 days per week.

Operational Engineer (Minerva SA Reg Risking)
Max Supplier: £600
Clearance required: SC is required - must hold SC (preferably HMRC) but we can transfer
Duration: 6 months
Location: Telford - if local to Telford, the requirement will be 2 days per week on site; if not local, the expectation will be to attend once/twice a month.
IR35 Status: Capgemini Mandated PAYE only

Job description: The Preventative Risking (PR) team within RIS is responsible for managing the risking and compliance referral processes for Self-Assessment (SA) registrations. Currently, the system identifies approximately 200,000 fraudulent registrations out of 1 million, resulting in an estimated £51 million to £219 million in lost SA repayment claims for HMRC (based on 2021/2022 figures). To address this, a proof of concept (POC) was developed using: SAS Enterprise Guide for table creation; SAS Studio V and SAS RTENG to build the SA registration network; and SAS Viya 3.5 tools for risk assessment of new SA registrations. The POC leveraged data from 20 different sources, most of which were already housed in the Minerva Oracle database. Previously, some data had to be transferred manually; however, the automated file transfers described in this Solution Design Document (SDD) will now move that data to the SAS platform using approved Enterprise Architecture (EA) integration patterns, with the initial phase targeted for delivery in April 2026. This role will form part of a new scrum team within the Minerva Platform to develop and deliver the Ingestion and Risking within the SAS Platform, including IDP.

Key Responsibilities:
- Design, development, and deployment of data integration and transformation solutions using Pentaho, Denodo, Talend, and SAS.
- Architect and implement scalable data pipelines and services that support Business Intelligence and analytics platforms.
- Collaborate with cross-functional teams to gather requirements, define technical specifications, and deliver robust data solutions.
- Champion Agile and Scrum methodologies, ensuring timely delivery of sprints and continuous improvement.
- Drive DevOps practices for CI/CD, automated testing, and deployment of data services.
- Mentor and guide junior engineers, fostering a culture of technical excellence and innovation.
- Ensure data quality, governance, and security standards are upheld across all solutions.
- Troubleshoot and resolve complex data issues and performance bottlenecks.

Key Skills:
- SAS 9.4 (DI), SAS Viya 3.x (SAS Studio, VA, VI)
- Platform LSF, Jira, platform support
- Git
- Strong expertise in ETL tools: Pentaho, Talend
- Experience with data virtualization using Denodo
- Proficiency in SAS for data analytics and reporting
- Oracle (good to have)
- Solid understanding of Agile and Scrum frameworks
- Hands-on experience with DevOps tools and practices (e.g. Jenkins, Git, Docker, Kubernetes)
- Strong SQL and data modelling skills
- Excellent problem-solving, communication, and leadership abilities

Key Qualifications:
- Proven track record of delivering data projects and leading teams.
- Certifications in Agile/Scrum, DevOps, or relevant data technologies are a plus.
04/09/2025
Contractor


© 2008-2026 IT Job Board