  • Home
  • Find IT Jobs
  • Register CV
  • Career Advice
  • Contact us
  • Employers
    • Register as Employer
    • Pricing Plans
  • Recruiting? Post a job
Sorry, that job is no longer available. Here are some results that may be similar to the job you were looking for.

81 jobs found

Current Search
databricks architect
VIQU IT
Data Architect
VIQU IT City, Birmingham
The Role: Data Architect
Location: Birmingham (once a week on site)
Salary: Up to £85,000 + bonus + generous pension

VIQU have partnered with a leading organisation undergoing a large-scale brownfield transformation, replacing an existing data platform with a modern, scalable solution, and are hiring a Data Architect to lead the design of this platform. This role would suit a pragmatic Data Architect who has previously designed and built a greenfield or brownfield platform with a high level of autonomy. The organisation is currently relatively immature from a data architecture perspective, so the ideal candidate must be comfortable operating with ambiguity, shaping direction and implementing data governance frameworks. Experience within a highly regulated industry would be advantageous.

Key Responsibilities of the Data Architect:
• Own and shape the enterprise data architecture vision, strategy and maturity roadmap.
• Design the end-to-end data platform, including data models, data lakes, warehouses and integrations.
• Define data standards and establish a robust data governance framework, developing data quality and security standards in line with regulatory requirements.
• Ensure readiness for AI/ML integration.
• Evaluate emerging tools, cloud platforms and analytics technologies.
• Work with external partners to establish a single version of the truth across the industry and enable secure data sharing.

Requirements:
• Proven experience as a Data Architect, having previously shaped or matured an organisation's data landscape.
• Experience within a regulated industry (e.g. utilities, finance, defence). Energy/utilities experience (gas, water, power, electricity) would be highly desirable.
• Broad knowledge of modern data platforms (AWS, Azure, GCP, Snowflake, Databricks).
• Experience working within SAP environments/modules (e.g. S/4HANA, RISE) would be advantageous.
• Strong data modelling expertise.
• Comfortable working through ambiguity and able to communicate architectural decisions at a senior level.

Apply now to speak with VIQU IT in confidence, or reach out to Jack McManus via the (url removed). Do you know someone great? We'll thank you with up to £1,000 if your referral is successful (terms apply). For more exciting roles and opportunities like this, please follow us on IT Recruitment.
27/04/2026
Full time
Synechron
Azure AI Engineer
Synechron Glasgow, Lanarkshire
About Synechron: Synechron is a leading digital transformation consulting firm dedicated to delivering innovative technology solutions within banking, financial services, and insurance. We thrive on engineering excellence, collaboration, and a passion for cutting-edge technologies.

Job Location: Glasgow (Hybrid - 3 days in the office)

We are seeking a highly experienced Azure AI Search (Azure Cognitive Search) Developer to work on the deployment and optimisation of Azure AI Search solutions within a leading global bank. The successful candidate will serve as the primary conduit to understand, implement, and optimise the platform's capabilities across the firm. This role involves collaborating with clients and tech leads to set up and maintain Azure AI Search, managing document ingestion, metadata management, and tuning to ensure high-quality, accurate, and consistent results.

Primary Responsibilities and Objectives:
• Act as the go-to expert for Azure AI Search (Azure Cognitive Search), safeguarding best practices in implementation, ingestion, and optimisation
• Partner with the Azure AI Search team to understand the platform's capabilities, roadmaps, and governance
• Lead the ingestion of documents into Azure AI Search, ensuring proper metadata tagging, data quality, and relevance
• Set up Retrieval-Augmented Generation (RAG) architecture to deliver accurate responses against a document corpus of approximately 500 documents
• Optimise and tune Azure AI Search indexes for best performance, relevance, and accuracy
• Identify and implement key features of Azure AI services needed for successful deployment
• Work with clients and technical teams to define use cases, document workflows, and solution architecture
• Contribute to establishing robust MLOps and CI/CD pipelines on Azure to streamline deployment and maintenance
• Maintain compliance with relevant data governance standards, including GDPR, especially concerning PII data
• Proactively highlight potential risks, suggest improvements, and demonstrate thought leadership in AI solutions
• Collaborate with multidisciplinary teams including Java developers, Azure AI Search specialists, and co-pilot engineers

Experience:
• Extensive experience with Azure AI and Machine Learning services, especially Azure Cognitive Search
• Deep understanding of cloud architecture, deployment, and management on Azure
• Strong programming skills in Python and API integration; JavaScript (nice to have)
• Proven experience in data engineering tasks, model deployment, and MLOps best practices
• Experience designing, setting up, and managing RAG architectures and search solutions
• Ability to operate at a strategic level, identifying critical features and risks in Azure AI implementations
• Familiarity with GDPR and data governance best practices, especially concerning sensitive information
• Excellent collaboration, communication, and stakeholder management skills

Key Azure AI and Machine Learning Services:
• Azure Cognitive Services
• Azure Machine Learning
• Azure Search (Azure Cognitive Search)
• Azure Data Factory/Databricks (potentially for data engineering tasks)
• Azure Blob Storage and Data Lake (for document storage and ingestion)
• Azure Logic Apps and Functions (for automation workflows)
• MLOps tools within Azure for deployment, monitoring, and management

Diversity Statement: Synechron are proud to be an equal opportunity employer. Our Diversity, Equity, and Inclusion (DEI) initiative 'Same Difference' is committed to fostering an inclusive culture - promoting equality, diversity and an environment that is respectful to all. We encourage applicants from across diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, or disabilities to apply. We offer flexible workplace arrangements, mentoring, internal mobility, and learning and development programmes to support our global workforce. Empowerment and collaboration are at the core of how we operate.
All employment decisions at Synechron are based on business needs, job requirements and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.
27/04/2026
Lynx Recruitment Ltd
Azure Architect/Consultant
Lynx Recruitment Ltd Manchester, Lancashire
Senior Azure Consultant/Architect
Location: London or Manchester (Hybrid - 1-2 days per week onsite)
Salary: Up to £90,000

We're recruiting a Senior Azure Consultant/Architect to join a growing consulting team delivering enterprise-scale Cloud, Data, and AI solutions. This role will focus on designing and leading modern Azure-based architectures while working closely with clients to translate business needs into scalable, secure platforms. You will take ownership of Azure solution architecture, including landing zones, identity, networking, governance, and DevOps approaches, while also supporting the design of modern data platforms and AI-enabled solutions using technologies such as Microsoft Fabric, Azure Databricks, and Azure AI services. This is a client-facing role requiring strong architecture leadership, the ability to run workshops and shape solutions, and experience guiding delivery teams across complex cloud transformation programmes.

Key Requirements:
• Strong experience designing and delivering Azure-based solutions in a consulting or enterprise environment
• Architecture expertise across cloud platforms, data solutions, and modern Azure services
• Experience producing and owning architecture designs and leading technical decisions
• Knowledge of modern data platforms and AI-enabled architectures is highly desirable
• Degree in Computer Science or a related field (2:1 or above preferred)
• Ability to obtain or hold UK Security Clearance (SC)

This is a great opportunity to work on large-scale cloud and data transformation programmes while influencing technical direction and delivery.
27/04/2026
Full time
Tenth Revolution Group
Data Architect
Tenth Revolution Group Stratford-upon-avon, Warwickshire
Contract Data Architect - Senior SME (Modern Data Platform)
Contract Length: 12-18 months (strong potential to extend or move permanent)
IR35 Status: Inside IR35
Day Rate: £600-£700
Location: UK-based, primarily remote. Ad hoc travel to Stratford-upon-Avon and London for workshops, and occasional travel to Bucharest; the main office is in Stratford.

Overview
We are seeking a Senior Data Architect to act as a subject matter expert for the design and build of a modern, enterprise-scale data platform. This is a high-impact contract role with full ownership of the data domain, supporting a multi-year roadmap to transform an existing platform into a cloud-native, federated data architecture on Azure. The organisation is in the process of lifting and modernising a large-scale data landscape into Azure, with a strong likelihood of winning a major government contract, making this a long-term and strategically critical engagement. This role requires a hands-on architect - someone who can define strategy and architecture, but also work closely with delivery teams to ensure successful execution.

Key Responsibilities:
• Own and shape the end-to-end data architecture and data strategy, treating the data platform as a product
• Design and build a modern Azure-based data platform supporting large-scale data processing
• Define and implement modern federated data architecture principles
• Lead the adoption of Medallion Architecture (Bronze/Silver/Gold)
• Design and oversee robust data ingestion and data pipeline frameworks, including ingestion of poor-quality and complex data
• Provide architectural leadership for data lakes and lakehouse patterns, analytics and reporting platforms, and Master Data Management (MDM)
• Act as a hands-on technical authority, supporting and mentoring engineering teams
• Collaborate with business and technical stakeholders across the UK and near-shore teams in Bucharest
• Ensure platforms are scalable, secure, and fit for future government and enterprise needs

Technical Environment - you will work extensively with:
• Azure Data Platform
• Databricks (Lakehouse architecture)
• Cosmos DB
• Azure Data Factory
• Talend (ETL)
• Power BI & Qlik
• Large-scale data processing and analytics
• Master Data Management (MDM) concepts and tooling

Required Experience:
• Proven experience as a Senior/Lead Data Architect on complex data transformation programmes
• Strong background designing and delivering modern cloud data platforms on Azure
• Hands-on experience with Databricks, Azure data services, and modern ETL pipelines
• Deep understanding of data architecture patterns, including lakehouse and federated models
• Experience working with imperfect and complex data sources
• Comfortable operating as the single point of ownership for a data domain
• Ability to translate strategy into detailed, actionable architecture
• Strong stakeholder engagement skills

Desirable:
• Experience working in regulated or government-aligned environments
• Prior exposure to long-term platform rebuild programmes
• Mentoring or leadership of multi-disciplinary data teams

Why Apply?
• Long-term 12-18 month contract with strong extension or permanent potential
• Opportunity to own and architect a modern data platform end-to-end
• Significant influence on a multi-year data transformation roadmap
• Flexible day rate
• High-profile programme with future government exposure
27/04/2026
Contractor
Data Engineer
Breedon Group plc Bakewell, Derbyshire
BREEDON GROUP PLC is a leading construction materials group operating from over 400 sites across the UK, Ireland and US. We are growing! At Breedon we have an opportunity to join our Data & Analytics Team with an exciting plan and long-term vision. We are seeking an experienced Data Engineer, with skills in Microsoft Azure and SQL, to contribute to the design, build and deployment of our Azure data platform in line with our enterprise architecture.

Key Responsibilities
As we set out on the journey of moving towards a data-led organisation, we have identified the need for an individual to transform our data management capabilities, implementing and maintaining a new Azure data platform which will enable our ambitions to become a data-driven business. We are looking for an experienced individual who can support the design and implementation of the data platform strategy and architecture that aligns with business objectives, and build robust data pipelines. This will include the provision of a platform for analysts, data scientists and engineers, providing them with the data and an environment from which they can fulfil their roles. By investing in our data platform, we will transform the way we work, making us more effective, efficient and profitable for the future. We are looking to exceed the requirements set out by our stakeholders.

Skills, Knowledge & Expertise

Experience and knowledge:
• Proven experience in data engineering and cloud data platform development, with a strong focus on Microsoft Azure
• Demonstrated track record of designing, building, and maintaining end-to-end data pipelines using Azure-native technologies, including Azure Data Factory, Azure Data Lake Storage, Azure Synapse Analytics and/or Databricks
• Strong foundation in data modelling and data warehousing concepts (star/snowflake schemas, SCD2, dimensions/facts)
• Proficient in SQL and Python, with hands-on experience using PySpark, Spark SQL, and/or Pandas
• Experience in technical documentation of migration processes, data mappings, data quality checks, and testing outcomes
• Ability to work closely with business stakeholders, translating business and technical requirements into effective data transformation and modelling solutions
• Exposure to DevOps and CI/CD practices including source control, automated deployments and environment promotion (desirable)
• Working knowledge of APIs (REST/SOAP) and common integration patterns (desirable)
• Working knowledge of other Azure integration tools such as Azure Functions, Azure Logic Apps, API Management, Service Bus and Event Grid (desirable)
• Experience working with semi-structured data formats such as JSON and XML (desirable)

Skills:
• Tenacious and curious nature that enables uncovering data availability and data quality constraints early in the process
• Problem solving - the ability to identify creative solutions to overcome problems
• Ability to impart knowledge and offer options to colleagues across the group
• Working with multiple data sources at one time, delivering solutions that enable insights into complex data sets

Personal Attributes:
• Ability to work to tight deadlines
• Ability to think and act purposefully and methodically
• A partnership approach to working with a variety of stakeholders
• Ability to keep it simple and to make it happen
• Thirst for continuous improvement
• Strong communication and engagement - ability to communicate calmly under pressure
• Positive and open outlook

Job Benefits:
• 25 days holiday plus bank holidays
• Contributory Pension Scheme
• Free on-site Parking
• Holiday Buy Scheme
• Volunteer Scheme
• Share Save Scheme
• Life Assurance
• Enhanced Maternity, Adoption & Paternity Scheme
• Health & Wellbeing Initiatives
• Discount Scheme
27/04/2026
Full time
Ageas Insurance Limited
Senior Data Quality Analyst
Ageas Insurance Limited Eastleigh, Hampshire
Job Title : Senior Data Quality Analyst Target Start Date: Q2 2026 Contract Type: Permanent, Full Time Salary Range: £65,000 - £70,000 Location: Eastleigh, Hybrid (1x week) Closing Date for applications: 7th May Senior Data Quality Analyst: We are currently looking for a Senior Data Quality Analyst. You will work alongside Data Scientists, Engineers, Architects and Analysts to support the design, build and maintenance of cutting-edge data and AI services, ensuring strong data quality practices are embedded and monitored from the outset. Working closely with our governance leads and collaborating with risk, compliance and privacy teams, you'll help establish enterprise standards and drive trusted, high-quality data that powers analytics and AI innovation. Main Responsibilities as Senior Data Quality Analyst: Provide data quality advice and guidance across the business, promoting best practice and pragmatic solutions Design and implement data quality processes, controls and monitoring across our data platforms and enterprise systems Develop data profiling, reporting and monitoring solutions using SQL and Python Collaborate with data owners, stewards and the wider data community to improve trust and quality in critical datasets Curate and maintain key data artefacts such as data catalogues, dictionaries, lineage and asset registers Champion the value of data quality through governance forums, stakeholder engagement and guidance materials Support delivery of the strategic data quality roadmap and key governance outcomes Work with architects and AI teams to ensure high-quality, well-governed data supports scalable data products and GenAI services Skills and experience you need as Senior Data Quality Analyst: Strong experience implementing data quality processes and governance frameworks within complex data environments Hands-on coding capability in SQL, with experience using Python for data manipulation, profiling or automation Experience working with modern cloud data 
platforms, particularly Databricks Experience profiling datasets and defining data quality rules, controls and monitoring approaches Experience working with data governance frameworks and collaborating with data owners, stewards and governance teams Familiarity with data governance and data management tooling such as Unity Catalog, Collibra or similar Strong stakeholder engagement skills with the ability to influence across technical and non-technical teams Interest in AI and emerging technologies, and an understanding of how strong data management enables advanced analytics and GenAI Qualifications: DAMA CDMP (Certified Data Management Professional) or equivalent. Recognised Data Quality Specialist certification or training. Desirable: Experience in the insurance or financial services sector. Exposure to data migration or transformation programmes. At Ageas we offer a wide range of benefits to support you and your family inside and outside of work, which helped us achieve Top Employer status in the UK. Here are some of the benefits you can enjoy at Ageas: Flexible Working- Smart gives employees flexibility around location (as long as it's within the UK) and, for many of our roles, flexibility within the working day to manage other commitments, such as school drop-offs. We offer all our vacancies on a part-time/job-share basis, and a minimum of 35 days' holiday (inc. bank holidays) with the option to buy and sell days. Supporting your Health- Dental Insurance, Health Cash Plan, Health Screening, Will Writing, Voluntary Critical Illness, Mental Health First Aiders, Well-Being Activities - Mindfulness. Supporting your Wealth- Annual Bonus Schemes, Annual Salary Reviews, Competitive Pension, Employee Savings, Employee Loans. Supporting you at Work- Well-being activities, mindfulness sessions, Sports and Social Club events and more. 
Supporting you and your Family- Maternity/pregnant parent/primary adopter entitlement of 16 weeks at full pay and paternity/non-pregnant parent/co-adopter at 8 weeks' full pay. Benefits for Them- Partner Life Assurance and Critical Illness cover. Get some Tech- Deals on various gadgets including Wearables, Tablets and Laptops. Getting around- Car Salary Exchange, Cycle Scheme, Vehicle Breakdown Cover. Supporting you back to work- Return to work programme after maternity leave. About Ageas: We are one of the largest car and home insurers in the UK. Our people help Ageas to be a thriving, creative and innovative place to work. We show this in the service we provide to over four million customers. As an inclusive employer, we encourage anyone to apply. We're a signatory of the Race at Work Charter and Women in Finance Charter, a member of iCAN and GAIN. As a Disability Confident Leader, we are committed to ensuring our recruitment processes are fully inclusive. That means if you are applying for a job with us, you will have fair access to support and adjustments throughout your recruitment process.
24/04/2026
Full time
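As a rough illustration of the data profiling and rule-based monitoring work the Senior Data Quality Analyst listing describes, a minimal sketch in plain Python might look like the following. The dataset, field names and completeness threshold are invented for illustration and are not Ageas specifics.

```python
# Illustrative sketch of column-level data-quality profiling and a simple
# completeness rule. All data and thresholds here are hypothetical.

def profile_column(rows, column):
    """Return basic data-quality metrics for one column of a dataset."""
    values = [row.get(column) for row in rows]
    total = len(values)
    nulls = sum(1 for v in values if v in (None, ""))
    distinct = len({v for v in values if v not in (None, "")})
    return {
        "column": column,
        "completeness": round(1 - nulls / total, 3) if total else None,
        "distinct_count": distinct,
    }

def check_rule(metrics, min_completeness=0.95):
    """Flag columns whose completeness falls below an agreed threshold."""
    return metrics["completeness"] is not None and metrics["completeness"] >= min_completeness

# Hypothetical sample records with one missing postcode.
policies = [
    {"policy_id": "P1", "postcode": "SO50 9PE"},
    {"policy_id": "P2", "postcode": None},
    {"policy_id": "P3", "postcode": "SO53 1AA"},
]

m = profile_column(policies, "postcode")
print(m["completeness"], check_rule(m))  # 0.667 False
```

In practice, checks of this shape would typically run inside Databricks against governed tables, with the results surfaced through the monitoring and reporting solutions the role owns.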
Taylor Hopkinson Limited
Data Engineer
Taylor Hopkinson Limited City, London
Data Engineer for a major offshore wind project in the United Kingdom Responsibilities Design and implement scalable ingestion pipelines from multiple source systems, including internal business data sources. Ensure reliable, automated, and monitored data flows into the Bronze layer of the Medallion architecture. Work within the client's existing security framework to establish compliant connectivity to operational data sources. Build and maintain Silver and Gold layer transformations in Databricks using Python and SQL. Onboard datasets into Unity Catalog, ensuring proper governance, lineage, and discoverability. Platform Collaboration & Delivery Support the ML/Data Scientist in preparing clean, structured datasets for anomaly detection and asset performance modelling. Contribute to technical documentation and ensure pipelines are maintainable and transferable. Stay current on Databricks and Azure platform developments relevant to the stack. Support the Digital & AI Strategy Manager in assessing the feasibility of new data source integrations as the roadmap evolves. Experience Master's degree in Computer Science, Data Engineering, Software Engineering, or a related technical field. Professional certifications in Azure and Databricks preferred. Training or background in energy systems, renewable energy, offshore wind or BESS technologies is a strong plus. 4-7 years of hands-on data engineering experience in a cloud environment. Demonstrated experience delivering production pipelines on Databricks and Azure (ADLS Gen2, ADF or equivalent). Proven ability to implement Medallion architecture or equivalent layered data modelling patterns. Experience with REST API ingestion and integration of business systems (ERP, finance tools). Experience in a contractor or project-based delivery model preferred. Exposure to OT/SCADA environments or energy sector data. Exposure to MLOps workflows or collaboration with data science teams.
24/04/2026
Contractor
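The Bronze/Silver/Gold (Medallion) layering this role centres on can be sketched in miniature. The example below uses plain Python dicts in place of Spark DataFrames, and the turbine fields are hypothetical; on the project itself these transformations would be written in PySpark on Databricks.

```python
# Toy Medallion-style layering: Bronze holds raw feed messages as received;
# Silver is validated, deduplicated and typed; Gold is a reporting aggregate.

def to_silver(bronze_records):
    """Silver layer: validated, deduplicated, typed records."""
    seen, silver = set(), []
    for rec in bronze_records:
        key = rec.get("turbine_id")
        if key is None or key in seen:
            continue  # drop malformed rows and duplicate feed messages
        seen.add(key)
        silver.append({"turbine_id": key, "power_kw": float(rec["power_kw"])})
    return silver

def to_gold(silver_records):
    """Gold layer: business-level aggregate ready for reporting."""
    total = sum(r["power_kw"] for r in silver_records)
    return {"turbine_count": len(silver_records), "total_power_kw": total}

# Hypothetical raw Bronze records, including a duplicate and a malformed row.
bronze = [
    {"turbine_id": "T01", "power_kw": "3200"},
    {"turbine_id": "T01", "power_kw": "3200"},  # duplicate message
    {"turbine_id": "T02", "power_kw": "2950.5"},
    {"power_kw": "0"},                           # malformed: no turbine ID
]
gold = to_gold(to_silver(bronze))
print(gold)  # {'turbine_count': 2, 'total_power_kw': 6150.5}
```

The design point the layering captures: Bronze is kept raw and replayable, so cleaning rules in the Silver step can change later without re-ingesting from source systems.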
Fruition Group
Solution Architect - AI & Automation
Fruition Group
Job Title: Solutions Architect (AI and Automation) Location: Remote (occasional travel to London required) Salary: £75,000 - £85,000 (Depending on location) Would you like to work for an organisation that plays a pivotal role in safeguarding and improving the quality of health and social care services across England? My client plays a vital role in improving outcomes across health and social care by using data, technology, and insight to drive meaningful change. With a strong focus on innovation and digital transformation, they are investing in modern cloud platforms, artificial intelligence, and advanced analytics to become a truly intelligence-led regulator. Solutions Architect Responsibilities Design scalable, secure, and ethical AI and data solutions using Microsoft Azure technologies. Translate business requirements into end-to-end solution architecture across AI, machine learning, and data platforms. Maintain architectural blueprints ensuring interoperability, resilience, governance, and regulatory compliance. Lead architecture design reviews and provide technical leadership to delivery teams using Agile, DevOps, and CI/CD practices. Mentor technical teams and promote responsible AI practices, including transparency, fairness, and security. Solutions Architect Requirements Extensive experience delivering solutions on the Microsoft Azure Data Platform (Synapse Analytics, Data Lake, Databricks, Azure ML, Power BI). Expertise in modern data engineering approaches such as ETL/ELT pipelines, streaming, APIs, and event-driven architecture. Experience with cloud-native architectures, microservices, and SaaS integrations. Knowledge of data governance, information security, and AI ethics within regulated industries. Excellent stakeholder engagement and leadership skills. We are an equal opportunities employer and welcome applications from all suitably qualified persons regardless of their race, sex, disability, religion/belief, sexual orientation or age.
24/04/2026
Full time
Oscar Technology
Senior Data Engineer / Data Architect
Oscar Technology Warrington, Cheshire
Role: Senior Data Engineer / Data Architect Salary: Up to £85,000, plus £5,000 annual bonus Technology - Azure Platform, Databricks, Power BI Location: Warrington Working Pattern: Hybrid - 2 days a week in the office. The Role: This is a great new role for either a Senior Data Engineer wanting to make a step towards architecture, or a hands-on data architect who is still very much a doer. You will be the Data Engineer / Architect for the business, so this is a design-and-execute role: you will need to be happy to roll your sleeves up and do some technical work alongside ownership of the architecture piece. The system is large, the company is a global business and their systems are already in good shape; this is not like some roles I have seen where "everything is in a mess and we need someone to fix it" - everything here is in place for the successful person to deliver. In addition to the MS technologies, they also have Salesforce and SAP in the business, and a large amount of data is moved in and out of those systems, so experience in this area would definitely be an advantage. This role is high profile; we are looking for someone to work with SLT, board members and international stakeholders. You will need to be confident and comfortable engaging in conversations and high-level decision making. Please note - this is not a remote position; it is hybrid in the office, 2 days a week, but there is quite a lot of flexibility to that. Responsibilities: Own and evolve the enterprise data architecture - defining data models, integration patterns, and standards that scale with the business. Design secure, resilient data solutions across cloud and on-premises environments, ensuring they are fit for purpose today and adaptable for tomorrow. Act as the bridge between business stakeholders, analytics teams, and engineering - translating commercial requirements into robust, well-reasoned architectural designs. 
Set the standard for data governance, data quality, metadata management, and master data management - and hold the organisation to it. Ensure all data practices meet security, privacy, and regulatory obligations, proactively identifying and mitigating compliance risk. Provide architectural leadership and assurance across data programmes, guiding teams to make the right design decisions at every stage. Assess, recommend, and champion the right data technologies, tools, and platforms - balancing innovation with pragmatism. Lead and support data migration, modernisation, and transformation initiatives, bringing structure and clarity to complex change. Produce clear, consistent documentation of data architectures, models, and design decisions that serve as a lasting reference for the organisation. Identify opportunities to commercialise data insights through automation and process efficiency - turning data into measurable business value. Align data architecture with global systems requirements and regulatory evolution, ensuring enabling technology delivers maximum business impact. Requirements: Azure Environment Strong, well-rounded data engineering skillset. Apply Now! If you have a range of experience in Data Engineering and you are looking to progress with an organisation that has a fantastic approach to work in a thriving and ambitious environment, then look no further - this is the role for you! Please note: this role does not offer sponsorship. Referrals: If this role isn't right for you, do you know someone that might be interested? You could earn £500 of retail vouchers if you refer a successful candidate to Oscar. Email: to recommend someone for this role. Interviews for this role will be held imminently. To be considered, please send your CV to me now to avoid disappointment. 
Oscar Associates (UK) Limited is acting as an Employment Agency in relation to this vacancy. To understand more about what we do with your data, please review our privacy policy in the privacy section of the Oscar website.
24/04/2026
Full time
IntaPeople
Senior Data Engineer
IntaPeople
IntaPeople are hiring for a mid-senior level Data Engineer to join a growing digital engineering team working on modern technology platforms. You'll work alongside an established BI & Data team to play a role in an active phase of platform modernisation. The successful candidate will join a small, collaborative team of data engineers and analysts delivering work across the full data lifecycle, from extraction and transformation through to data modelling and reporting. This role sits at the heart of a growing data engineering capability in Cardiff. You'll be actively involved in delivering high-quality data solutions, while acting as a trusted reference point for best practice across the team. From shaping and delivering ETL workflows to collaborating directly with stakeholders, your work will help ensure data platforms evolve in line with the organisation's expanding requirements. You will be working primarily within the Microsoft Azure ecosystem, including Azure SQL Server, Azure Data Factory, and Azure DevOps. Required Skills Strong Python experience for data engineering Strong experience working with SQL Server Strong experience working with Azure Data Factory and Azure DevOps Hands-on experience with data lake platforms, Azure Synapse, Databricks, or equivalent experience/skills Experience with tooling such as CI/CD pipelines and version control. 
The ability to lead feature specifications and work closely with key business stakeholders Adopting an AI-first approach to development and being inquisitive about its benefits and features Key Responsibilities Reporting into the Director of Data Engineering and working closely with other Data Engineers and Analysts within the business You will be responsible for the design, build, and maintenance of Python-based ETL pipelines Owning and leading the SQL development Data lake development within Azure Synapse, working to the organisation's architecture standards Meeting with business stakeholders to define requirements and translate them into solution designs Keeping stakeholders informed on the status of data initiatives Producing technical documentation: solution designs, data dictionaries, and engineering runbooks Reviewing and guiding the work of less experienced members of the team Contributing to solution design discussions and architecture decisions Role overview Senior Data Engineer Starting salary of £55,000 - £60,000 Annual bonus scheme between 10%-20% 25 days holiday allowance (which increases with service) Central Cardiff office location True flexible working hours Hybrid working setup - expectations are typically 2-3 days per week Private medical care Company-wide trips Group Life Assurance, Income Protection & Critical Illness cover Matched pension contribution Cycle to work scheme If you're an experienced Data Engineer looking to make an impact in a modern, forward-thinking team, this is a great opportunity. Please note we do not have the ability to provide sponsorship; candidates must have the right to work without restriction within the UK. Interested? Click apply now with your CV or call (phone number removed) for a chat!
24/04/2026
Full time
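The Python-based ETL pipelines this role is responsible for follow the familiar extract/transform/load shape; a toy, self-contained sketch is below. The CSV content, column names and validation rules are invented for illustration; in the role itself, extraction would come from sources such as Azure SQL Server via Azure Data Factory rather than an inline string.

```python
# Toy extract-transform-load flow: parse raw CSV, standardise and validate
# records, then "load" into an in-memory stand-in for a warehouse table.
import csv
import io

def extract(raw_csv):
    """Extract: parse raw CSV text into a list of dict records."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows):
    """Transform: standardise names, cast amounts, drop rows that fail."""
    out = []
    for r in rows:
        try:
            out.append({
                "customer": r["customer"].strip().title(),
                "amount": round(float(r["amount"]), 2),
            })
        except (KeyError, ValueError):
            continue  # invalid row: skip (real pipelines would quarantine it)
    return out

def load(rows, target):
    """Load: append validated rows to the target; returns rows loaded."""
    target.extend(rows)  # stand-in for a warehouse write
    return len(rows)

warehouse = []
raw = "customer,amount\n alice ,10.5\nBOB,not-a-number\ncarol,7\n"
loaded = load(transform(extract(raw)), warehouse)
print(loaded, warehouse[0])  # 2 {'customer': 'Alice', 'amount': 10.5}
```

Keeping extract, transform and load as separate functions mirrors how such pipelines are usually documented and tested: each stage can be exercised in isolation, which also makes the runbooks and reviews the role mentions much easier.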
Tenth Revolution Group
Senior Data Engineer - Insurance
Tenth Revolution Group
Senior Data Engineer - Insurance & London Market - Up to £95k Location: London (Hybrid) Type: Permanent Overview A London Market insurance organisation is seeking a Senior Data Engineer to design, build, and lead modern data engineering solutions across underwriting, pricing, claims, and reinsurance domains. This is a hands-on senior role combining deep insurance domain knowledge with cloud-native data engineering, delivered within complex and regulated environments. Key Responsibilities Design and build scalable cloud-based data platforms using Medallion Architecture (Bronze / Silver / Gold) Engineer pipelines using Python and PySpark on platforms such as Databricks Ingest and integrate data from multiple sources including policy, claims, broker, and third-party systems Contribute to CI/CD pipelines and modern DevOps delivery approaches Collaborate closely with data architects, modellers, analysts and business stakeholders Required Experience Strong experience as a Senior Data Engineer within Insurance, ideally the London Market Proven hands-on expertise in: Python & PySpark Databricks Cloud platforms (Azure, AWS, or GCP) Solid understanding of Medallion Architecture and analytics-focused data modelling Strong domain knowledge across: Lloyd's Syndicates Delegated Authority Reinsurance and Ceded Reinsurance Underwriting, pricing, and claims Confident communicator, comfortable working with senior technical and business stakeholders Why Apply? Senior, hands-on role working on complex London Market data challenges Opportunity to shape and scale modern cloud data platforms Long-term transformation programmes with strong stakeholder visibility
23/04/2026
Full time
TXP Technology x People
Power BI Developer
TXP Technology x People
Job Description Role: Power BI Developer Employment: Permanent Location: Hybrid (Birmingham) - combination of office, client site and home working. We are TXP. We help businesses and organisations move forward, at pace and at scale. We believe in the transformative power of combining technology and people. By providing consulting expertise, development services and resourcing, we work closely with organisations to solve their most complex business problems. Role Overview TXP's Data & AI Practice is growing, and we're looking for a Power BI Developer to join us on a permanent basis. You'll work across a diverse portfolio of client engagements, delivering high-quality Power BI reporting solutions built on Microsoft Fabric, Direct Lake semantic models, and modern data architectures. This role is ideal for someone who is passionate about analytics delivery, enjoys working directly with stakeholders, and wants to play a part in shaping standards, reusable assets, and best practices within a specialist consultancy environment. 
Responsibilities Design, build, test and deploy Power BI dashboards and reports aligned to clearly defined acceptance criteria Engage directly with client stakeholders across commercial, finance and operational teams to gather requirements and define KPIs Translate business needs into production-grade analytics solutions using Microsoft Fabric and Power BI Manage Power BI environments including workspaces, publishing, Apps, and Row Level Security Apply best practice in visual design, performance optimisation, and semantic data modelling Support light data engineering where required, including Fabric Pipelines and Gen2 Dataflows Integrate data from a range of sources including SQL Server, Oracle, ERP/CRM systems, Excel and CSV Produce training materials, run demos and support end users who are new to Power BI Contribute to internal Power BI standards, reusable assets, and data catalogues Stay current with the Microsoft Fabric roadmap and proactively apply new features where they add value Skills and Experience Essential Strong hands-on experience with Power BI (report design, data modelling, DAX) Solid SQL skills with the ability to understand and reverse-engineer complex views Experience running requirements and KPI definition sessions with non-technical stakeholders A methodical, detail-oriented approach with strong unit testing discipline Desirable Experience with Microsoft Fabric, particularly Direct Lake semantic models Star schema and semantic data modelling expertise Advanced DAX development and optimisation Experience with Fabric Pipelines, Gen2 Dataflows, and enterprise data ingestion Power BI administration, deployment pipelines, and version control Multi-layout reporting (desktop, mobile, tabular) What We Offer A permanent role within a specialist practice, not a large system integrator. Exposure to varied client environments across financial services, insurance, healthcare, and retail. 
A clear Microsoft Fabric and Databricks technology track with access to training and certification support. A collaborative, senior-led team environment with direct access to practice leadership. Competitive salary, flexible hybrid working, and a role that grows with the practice. Benefits: 25 days annual leave (plus bank holidays). An additional day of paid leave for your birthday (or Christmas Eve). 4% matched employer contributed pension (salary sacrifice). Life assurance (3x). Access to an Employee Assistance Programme. Private medical insurance through our partner Aviva. Cycle to work scheme. Corporate eye-care vouchers. Access to an independent financial advisor. 2 x social value days per year to give back to local communities. Grow with us: Work on exciting new projects; if you want to avoid getting stuck with the mundane, you're in the right place. We work in many sectors with fantastic clients, so you'll always be working on something exciting and challenging. Career growth - we've got you! We recognise that you might have a career path planned out and might need some support to help you move forward. We're here to support you and make the most of your time with us, through challenging work, opportunities to grow, and learning and development opportunities. Be part of the TXP growth journey; we are a high-growth, fast-paced environment. We currently have 200+ employees and work with clients across the UK. Joining TXP means you'll be part of that.
22/04/2026
Full time
Gleeson Recruitment Group
Data Engineer
Gleeson Recruitment Group Leicester, Leicestershire
Data Engineer (SQL / Python) Onsite 3 times per week (Leicester or Nottingham office) £35K - £40K DOE Our client is looking to appoint a Data Engineer to join their expanding data team. This is an excellent opportunity for someone with solid foundational experience who is eager to develop their skills and grow within a modern, cloud-based data environment. Working alongside senior engineers and analysts, the successful candidate will contribute to the design and development of scalable data solutions while gaining exposure to cutting-edge technologies. The role will involve: Supporting the development and maintenance of cloud-based data pipelines Assisting in the design and optimisation of data models and architectures Working with analytics teams to ensure high-quality, reliable data outputs Contributing to best practices in data governance and engineering standards The successful candidate will ideally have: Experience with at least one cloud platform (Azure, AWS, or Snowflake) Exposure to Databricks or similar modern data processing tools Working knowledge of SQL and some experience with Python An understanding of data warehousing concepts A strong desire to learn, develop, and progress within a data engineering career Importantly, given the level of this role, our client is open to candidates who may not tick every box. They are keen to speak with individuals who demonstrate strong potential, a solid grasp of core concepts, and a genuine enthusiasm to build their technical capability within a supportive team environment. This role offers genuine progression, hands-on learning from experienced engineers, and the chance to be part of a collaborative and forward-thinking data team. Please apply asap if interested - GleeIT - Data Engineer At Gleeson Recruitment Group, we embrace inclusivity and welcome applicants of all backgrounds, experiences, and abilities. 
We are proud to be a disability confident employer. By applying you will be registered as a candidate with Gleeson Recruitment Limited. Our Privacy Policy is available on our website and explains how we will use your data.
22/04/2026
Full time
TELSTRA Associates
Integration Engineer Kent/Remote
TELSTRA Associates
Integration Engineer Kent/Remote, excellent salary We are looking for a strong Integration Engineer to work for our exciting client in Kent; the Integration Engineer will have as many of the skills below as possible. Integration Engineer Overall Purpose of the Role Develop and maintain integration solutions for the Common Good Platform (CGP), ensuring seamless data flow between applications, APIs, and the data platform. Build and operate GraphQL APIs and workflow orchestration using Apollo and Temporal IO. Collaborate with the Technology Lead to deliver platform capabilities aligned with business objectives. Ensure integration solutions comply with data governance, security standards, and GDPR requirements. Implement and maintain CI/CD pipelines, automated testing, and deployment processes. Key Accountabilities Design and implement GraphQL resolvers, schema definitions, and federated subgraphs. Build durable workflows using Temporal IO for loan lifecycle and business processes. Develop and maintain automated test suites (unit, integration, end-to-end). Configure and maintain CI/CD pipelines using GitHub Actions. Integrate with the Databricks data platform via Unity Catalog and Delta Lake. Troubleshoot and resolve integration issues across distributed systems. Document integration patterns, APIs, and architectural decisions. Drive and collaborate in the continued adoption of AI-enhanced development practices and the SDLC.
21/04/2026
Full time
Nexere Consulting Limited
Lead Data Platform Engineer - Databricks - IaC - Terraform - Azure Data Factory - Data Lakehouse
Nexere Consulting Limited
Lead Data Platform Engineer - Databricks - IaC - Terraform - Azure Data Factory - Data Lakehouse The Data Platform Engineer designs, develops, automates, and maintains secure, scalable, and compliant data platforms that enable the firm to efficiently manage, analyse, and utilise data. The role ensures that data solutions are robust and reliable while meeting regulatory obligations and safeguarding client confidentiality. Key Responsibilities Design and architect scalable, secure, and compliant data platforms and solutions, producing technical documentation and securing approvals through governance bodies such as Architecture Review Boards. Build and deliver robust data solutions using Databricks, PySpark, Spark SQL, Azure Data Factory, and Azure services. Develop APIs and write efficient Python, PySpark, and SQL code to support data integration, processing, and automation. Implement and manage CI/CD pipelines and automated deployments using Azure DevOps to enable reliable releases across environments. Develop and maintain infrastructure-as-code (e.g., Terraform, ARM) to provision and manage cloud resources, including ADF pipelines, Databricks assets, and Unity Catalog components. Monitor, troubleshoot, and optimise data platform performance, reliability, and costs, identifying bottlenecks and recommending improvements. Create dashboards and observability tools to report on platform performance, usage, incidents, and operational KPIs. Knowledge, Skills & Experience Degree in Computer Science, Data Engineering, or a related field. Proven experience designing and building cloud-based data platforms, ideally within Azure. Strong hands-on expertise with Databricks, PySpark, Spark SQL, and Azure Data Factory. Solid understanding of Data Lakehouse architecture and modern data platform design. Proficiency in Python for data engineering, automation, and data processing. Experience developing and integrating REST APIs for data services. 
Strong DevOps experience, including CI/CD, automated testing, and release management for data platforms. Experience with Infrastructure as Code tools such as Terraform or ARM templates. Knowledge of data modelling, ETL/ELT pipelines, and data warehousing concepts. Familiarity with monitoring, logging, and alerting tools (e.g., Azure Monitor). Desirable Experience with additional Azure services (e.g., Fabric, Azure Functions, Logic Apps). Knowledge of cloud cost optimisation for data platforms. Understanding of data governance and regulatory compliance (e.g., GDPR). Experience working in regulated or professional services environments.
21/04/2026
Full time
IO Associates
Technical Architect - Data Platforms
IO Associates
Description We are seeking an experienced Technical Architect to support the design and evolution of large-scale, cloud-based data platforms, working across our portfolio of clients. The Technical Architect will play a key role in shaping solution design patterns, ensuring alignment with established standards, and supporting strategic transitions and migrations across, to, and from AWS & Azure. Key Responsibilities Define and evolve technical architecture patterns for data ingestion, processing, and access. Design scalable, resilient, and cost-efficient data solutions within a Hub and Spoke model. Support the design of new data ingestion pipelines (batch and real-time). Ensure alignment with organisational architectural standards and governance frameworks. Contribute to target architecture roadmaps. Provide architectural guidance across: Data ingestion (Kafka, APIs, SFTP) Data processing (PySpark, EMR, Glue) Storage (S3 and data lake patterns) Collaborate with DevOps, Data Engineers, and Testers to ensure cohesive delivery. Promote engineering best practices, including CI/CD, infrastructure as code, and observability. Ensure robust handling of schema evolution and upstream data changes. Support onboarding of new data sources and services into the platform. Ensure solutions meet requirements for: Data quality and consistency Performance and scalability Security and compliance Work within defined data modelling ownership boundaries where applicable. Support cloud strategy evolution. Avoid platform lock-in and ensure portable, future-proof designs. Contribute to technical decision-making for future platform direction. Work in blended, cross-functional teams. Provide technical leadership and mentoring to delivery teams. Ensure effective knowledge transfer and capability uplift. Required Skills & Experience Strong experience designing modern cloud-based data platforms. 
Hands-on architectural experience with: AWS (essential): S3, EMR, Glue Kafka/event streaming architectures Python & PySpark-based data processing Experience designing data ingestion pipelines (batch and real-time). Proficiency in Infrastructure as Code (Terraform). Experience with GitHub-based workflows and CI/CD pipelines. Experience with data lake and lakehouse architectures. Strong understanding of: Data ingestion patterns Data transformation and curation layers Data access and productisation Ability to design for large-scale datasets. Experience supporting cloud migrations. Knowledge and experience with Azure, Microsoft Fabric & Databricks would be beneficial. Familiarity with event-driven and streaming-first architectures at scale. Strong stakeholder engagement and cross-team collaboration skills. Ability to operate effectively within existing governance and standards. Pragmatic decision-making balancing delivery pace and technical quality. Clear communicator able to translate complex architecture into actionable guidance. Experience working in large, complex enterprise environments. This role will require the ability to obtain and hold UK SC Clearance.
20/04/2026
Full time
Gleeson Recruitment Group
Data Engineer
Gleeson Recruitment Group Leicester, Leicestershire
Data Engineer (SQL / Python) Onsite 3 times per week (Leicester or Nottingham office) £35K - £40K DOE Our client is looking to appoint a Data Engineer to join their expanding data team. This is an excellent opportunity for someone with solid foundational experience who is eager to develop their skills and grow within a modern, cloud-based data environment. Working alongside senior engineers and analysts, the successful candidate will contribute to the design and development of scalable data solutions while gaining exposure to cutting-edge technologies. The role will involve: Supporting the development and maintenance of cloud-based data pipelines Assisting in the design and optimisation of data models and architectures Working with analytics teams to ensure high-quality, reliable data outputs Contributing to best practices in data governance and engineering standards The successful candidate will ideally have: Experience with at least one cloud platform (Azure, AWS, or Snowflake) Exposure to Databricks or similar modern data processing tools Working knowledge of SQL and some experience with Python An understanding of data warehousing concepts A strong desire to learn, develop, and progress within a data engineering career Importantly, given the level of this role, our client is open to candidates who may not tick every box. They are keen to speak with individuals who demonstrate strong potential, a solid grasp of core concepts, and a genuine enthusiasm to build their technical capability within a supportive team environment. This role offers genuine progression, hands-on learning from experienced engineers, and the chance to be part of a collaborative and forward-thinking data team. Please apply asap if interested - GleeIT - Data Engineer At Gleeson Recruitment Group, we embrace inclusivity and welcome applicants of all backgrounds, experiences, and abilities. We are proud to be a disability confident employer. 
By applying you will be registered as a candidate with Gleeson Recruitment Limited. Our Privacy Policy is available on our website and explains how we will use your data.
17/04/2026
Full time
Data Engineer (SQL / Python) Onsite 3 times per week (Leicester or Nottingham office) 35K - 40K DOE Our client is looking to appoint a Data Engineer to join their expanding data team. This is an excellent opportunity for someone with solid foundational experience who is eager to develop their skills and grow within a modern, cloud-based data environment. Working alongside senior engineers and analysts, the successful candidate will contribute to the design and development of scalable data solutions while gaining exposure to cutting-edge technologies. The role will involve: Supporting the development and maintenance of cloud-based data pipelines Assisting in the design and optimisation of data models and architectures Working with analytics teams to ensure high-quality, reliable data outputs Contributing to best practices in data governance and engineering standards The successful candidate will ideally have: Experience with at least one cloud platform (Azure, AWS, or Snowflake) Exposure to Databricks or similar modern data processing tools Working knowledge of SQL and some experience with Python An understanding of data warehousing concepts A strong desire to learn, develop, and progress within a data engineering career Importantly, given the level of this role, our client is open to candidates who may not tick every box. They are keen to speak with individuals who demonstrate strong potential, a solid grasp of core concepts, and a genuine enthusiasm to build their technical capability within a supportive team environment. This role offers genuine progression, hands-on learning from experienced engineers, and the chance to be part of a collaborative and forward-thinking data team. Please apply asap if interested - GleeIT - Data Engineer At Gleeson Recruitment Group, we embrace inclusivity and welcome applicants of all backgrounds, experiences, and abilities. We are proud to be a disability confident employer. 
MFK Recruitment
Senior Product Engineer, Full Stack - Energy
MFK Recruitment
Senior Product Engineer, Full Stack - Energy
70,000 to 95,000 | Permanent | London, hybrid

The opportunity
Our client is hiring a Senior Product Engineer, Full Stack, to help build the next generation of customer-facing software in the energy market. Operating in a complex, high-value sector, the business is combining technology, data, and product thinking to modernise an area of the energy industry that has historically been underserved by great software. We have already placed 3 people into the business and are pleased to be supporting them again on this important hire.

This is a genuinely exciting opportunity for a frontend-strong engineer who wants real ownership, autonomy, and the chance to build products from scratch. They are looking for someone with strong React and TypeScript capability, alongside solid backend exposure across Python, APIs, and databases. As a Senior Product Engineer, Full Stack, you will play a key role in shaping what gets built, how it is built, and how it evolves as the business scales. For the right person, this is a chance to join a growing company at the right stage and make a visible impact.

The role is based in Mayfair, London, with a hybrid setup of 2 to 3 days per week in the office.

The role
This Senior Product Engineer, Full Stack role has a clear frontend lean. They need someone who can build high-quality frontend products from scratch, while also working confidently across backend services, integrations, and data-driven workflows. The frontend is a central part of the brief, but they want someone who understands the wider system and can contribute beyond the UI.

The product sits in a data-rich environment, with workflows across billing, metering, reporting, consumption, and asset performance. They need an engineer who can take complexity and turn it into clean, reliable software that customers genuinely value. This is a high-ownership role in a growing team, well suited to someone who enjoys pace, autonomy, and being trusted to deliver.

What you will be doing
  • Building customer-facing products from scratch using React, TypeScript, and modern frontend tooling
  • Owning frontend quality, user experience, and engineering standards
  • Contributing across the stack, including backend integrations, APIs, product logic, and Python-based services
  • Working with SQL, Databricks, and related data systems
  • Turning complex operational and commercial workflows into clear product experiences
  • Collaborating closely with product, engineering, data, and leadership
  • Helping shape technical standards and architecture as the team grows

What they are looking for
  • Strong React and TypeScript experience
  • A frontend-strong engineer who can build products from scratch
  • Solid backend exposure, ideally with Python
  • Experience with databases and data platforms such as SQL and Databricks
  • Experience building customer-facing software in a commercial environment
  • Strong product instincts and a practical, hands-on approach
  • Comfortable with ownership, pace, and autonomy

Nice to have
  • Energy, trading, or fintech experience
  • Startup or scale-up background
  • Data visualisation experience
  • Interest in AI tools and modern engineering practices

Why join
  • High ownership from day one
  • The chance to build frontend products from scratch
  • Real influence over product and technical direction
  • Broad exposure across frontend, backend, data, and product
  • Strong long-term growth potential as the business scales

Summary
This is a strong opportunity for a Senior Product Engineer, Full Stack who wants to build meaningful products in the energy market, with real autonomy and scope to grow. If you are strong in React and TypeScript, but also comfortable across backend systems, Python, APIs, and data platforms, this role offers the chance to make a genuine impact in a growing business.
15/04/2026
Full time
Akkodis
Lead Cloud Architect (must be eligible for SC clearance)
Akkodis Stevenage, Hertfordshire
Lead Cloud Architect (must be eligible for SC clearance)
80,000 - 120,000 dependent on experience, plus benefits
Full Time / Permanent
Hybrid - 3 days a week in Stevenage

The Company
Akkodis is a global leader in engineering, technology, and R&D, harnessing the power of connected data to drive digital transformation and innovation for a smarter, more sustainable future. As part of the Adecco Group, Akkodis employs over 50,000 engineers and digital specialists across 30 countries in North America, EMEA, and APAC. Our teams bring extensive cross-sector knowledge in critical technology areas such as mobility, software services, robotics, simulations, cybersecurity, AI, and data analytics, enabling clients to tackle complex challenges in today's rapidly evolving markets.

The Role
The Lead Cloud Architect will act as the senior technical authority within the IT & Digital Practice, responsible for defining, governing, and assuring end-to-end solution delivery across complex digital transformation programmes. This role provides architectural leadership across cloud, data, software integration, and application domains, ensuring solutions are secure, scalable, cost-efficient, and aligned with enterprise and customer strategies. You will work closely with Programme Managers, Solution Architects, and senior customer stakeholders to shape solution direction, make key technical decisions, and establish technical standards across multiple disciplines.

This is a hybrid role, with the successful candidate required to be in the Stevenage head office 3 days a week on average. You must either hold or be eligible for SC clearance.

Responsibilities
  • Own the overall technical vision and end-to-end solution architecture for one or more major programmes.
  • Define and maintain architecture blueprints, patterns, and principles aligned with enterprise and cloud standards.
  • Evaluate emerging technologies and recommend adoption strategies that support business and digital goals.
  • Ensure all technical solutions align with business outcomes, regulatory obligations, and security requirements.
  • Serve as the final escalation point for architectural and design decisions across delivery workstreams.
  • Approve key design artefacts, including high- and low-level designs, integration models, and security architectures.
  • Oversee technical risk management and mitigate architectural issues early in the lifecycle.
  • Ensure consistent use of DevOps, CI/CD, and Infrastructure-as-Code practices across AWS and Azure environments.
  • Maintain high-quality solution documentation and assurance artefacts.
  • Support programme planning by translating architecture into actionable delivery plans.
  • Provide hands-on technical leadership and guidance to multidisciplinary delivery teams.
  • Engage with senior stakeholders to communicate solution direction, risks, and recommendations.
  • Define and maintain solution design and architecture governance standards.
  • Champion secure, well-architected, and cost-optimised cloud solutions through formal design reviews.
  • Mentor and develop Solution Architects, Data Architects, and Technical Leads across the practice.

Skills and Experience
  • Extensive experience in solution architecture, with at least 3 years in a lead or principal capacity.
  • Proven experience designing and governing complex, multi-cloud or hybrid solutions.
  • Deep technical expertise in cloud, data, integration, and security architecture.
  • Strong understanding of enterprise data platforms (Databricks, S3, Redshift), integration patterns (API Gateway, AppFlow, Logic Apps), and cloud-native services.
  • Demonstrable leadership in delivering large-scale transformation or digital programmes.
  • Proficiency in DevOps tooling (Terraform, GitHub, CodePipeline, Azure DevOps) and CI/CD practices.
  • Exceptional communication, stakeholder engagement, and technical decision-making capabilities.
  • Must be an AWS Certified Solutions Architect; additional Azure Solutions Architect and TOGAF certifications are also preferred.
  • Must already hold or be eligible for SC clearance.

Please apply via the link or contact (url removed) for more information.

Modis International Ltd acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers in the UK. Modis Europe Ltd provides a variety of international solutions that connect clients to the best talent in the world. For all positions based in Switzerland, Modis Europe Ltd works with its licensed Swiss partner Accurity GmbH to ensure that candidate applications are handled in accordance with Swiss law. Both Modis International Ltd and Modis Europe Ltd are Equal Opportunities Employers. By applying for this role, your details will be submitted to Modis International Ltd and/or Modis Europe Ltd. Our Candidate Privacy Information Statement, which explains how we will use your information, is available on the Modis website.
15/04/2026
Full time
Hays Technology
Data Engineer
Hays Technology Rogerstone, Gwent
Your new company
Monmouthshire Building Society is an independent mutual with over 160 years of history serving communities across South Wales and the South West of England. We offer mortgages and savings built around our members' needs, combining personal service, local knowledge and responsible lending. As a mutual, we reinvest profits back into the Society to support our members, our people and the communities we serve.

Your new role
The Data Engineer will design, build, support and operate data pipelines and core data platform technologies within the organisation's Azure-based analytics and lakehouse environment. The role is hands-on and implementation-focused, working within established architectural patterns to deliver reliable, secure and well-governed data for analytics, Power BI reporting and regulated finance use cases. The role also includes responsibility for supporting and maintaining underlying data platform services, including Azure Databricks and Azure Synapse.

Key Accountabilities
  • Delivery of reliable, scalable and well-governed data pipelines that meet business and regulatory requirements
  • Operational stability and performance of core data platform services, including Azure Databricks and Azure Synapse
  • Availability of trusted, high-quality Gold-layer datasets for analytics and Power BI reporting
  • Adherence to data security, access control and governance standards within a regulated environment
  • Effective resolution of data pipeline and platform incidents with clear root-cause understanding
  • Maintenance of accurate documentation, lineage and operational runbooks to support ongoing platform support

Key Responsibilities
  • Develop and maintain data pipelines using Azure Databricks (Python, PySpark and SQL)
  • Implement ingestion and transformation across Bronze, Silver and Gold layers using Delta Lake
  • Support and operate Azure Databricks and Azure Synapse platforms, including performance monitoring and configuration
  • Build Gold-layer datasets optimised for Power BI and analytics consumption
  • Support production workloads, resolve incidents and maintain operational documentation
  • Apply security, governance and access control standards suitable for a regulated environment
  • Contribute to engineering standards, CI/CD pipelines and code reviews

What you'll need to succeed
Essential
  • Proven experience as a Data Engineer
  • Strong hands-on experience with Azure Databricks, Python, PySpark and SQL
  • Experience supporting Azure data platforms in production
  • Understanding of medallion architecture and Delta Lake
Desirable
  • Experience with Azure Synapse Analytics
  • Experience with Azure Data Factory
  • Experience in financial services or regulated environments
  • Dimensional modelling / star schema knowledge
  • Databricks Certification

What you'll get in return
  • Competitive salary
  • Great career progression
  • Hybrid working policy - 2 days a week in the office

Benefits Package
  • 25 days paid holiday plus bank holidays
  • Option to purchase extra annual leave (up to 5 days)
  • Additional day's annual leave on your birthday
  • Contributory pension scheme - enrolment after 3rd month; 3% employee contributions, 10% employer contributions
  • 4x life assurance
  • Private healthcare scheme (after one year's service)
  • Employee Assistance Programme
  • Corporate uniform for branch staff
  • Health and wellbeing benefits, including flu jabs and eye tests
  • Staff socials
  • 35-hour working week

What you need to do now
If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV.

Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers. By applying for this job you accept the T&Cs, Privacy Policy and Disclaimers, which can be found at (url removed).
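The Bronze/Silver/Gold (medallion) flow named in the responsibilities above can be sketched conceptually as follows. In the role itself this would be PySpark and Delta Lake tables on Azure Databricks; here plain Python dicts stand in purely for illustration, and the record fields are hypothetical:

```python
# Conceptual medallion-architecture sketch (illustrative only; not the
# Society's actual schema or pipeline code).

bronze = [  # Bronze: raw records as landed, possibly with nulls and duplicates
    {"account": "A1", "balance": "100.50", "ts": "2026-01-01"},
    {"account": "A1", "balance": "100.50", "ts": "2026-01-01"},  # duplicate row
    {"account": "A2", "balance": None,     "ts": "2026-01-02"},  # bad row
    {"account": "A2", "balance": "250.00", "ts": "2026-01-03"},
]

def to_silver(rows):
    """Silver: validate, type-cast and de-duplicate the raw feed."""
    seen, out = set(), []
    for r in rows:
        key = (r["account"], r["ts"])
        if r["balance"] is None or key in seen:
            continue  # a real pipeline would quarantine these for review
        seen.add(key)
        out.append({**r, "balance": float(r["balance"])})
    return out

def to_gold(rows):
    """Gold: latest balance per account, shaped for Power BI consumption."""
    latest = {}
    for r in sorted(rows, key=lambda r: r["ts"]):
        latest[r["account"]] = r["balance"]
    return latest

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'A1': 100.5, 'A2': 250.0}
```

The design point the posting is testing for is the separation of concerns: Bronze preserves the raw feed for auditability (important in a regulated environment), Silver enforces quality rules once, and Gold serves curated, query-ready datasets to reporting tools.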
15/04/2026
Full time


© 2008-2026 IT Job Board