Sorry, that job is no longer available. Here are some results that may be similar to the job you were looking for.

21 jobs found

Current search: microsoft fabric data platform engineer
Data Engineer (Fabric)
Hays Specialist Recruitment Limited Sheffield, Yorkshire
Data Engineer (Fabric)
Sheffield City Centre
Up to £50,000

Your new role
Working closely with stakeholders across the UK, US and European businesses, you will help ensure data is accessible, trusted, and used effectively to support operational and strategic decision-making. This is a hands-on role combining data engineering with analytical delivery, reporting to the Global IT Manager, with a clear opportunity over time to take on greater technical ownership and help shape the direction of the data platform as it continues to mature.

Responsibilities
• Champion data innovation within a forward-thinking manufacturing environment, integrating insights from energy systems, operational technology, ERP, and CRM platforms to drive operational improvement and support smarter financial and strategic decision-making
• Design, build, and continuously improve scalable end-to-end data pipelines within Microsoft Fabric, implementing and maintaining medallion architecture standards across ingestion, transformation, and presentation layers
• Develop and optimise trusted data models and semantic layers that enable high-quality reporting, self-service analytics, and advanced business insight
• Collaborate closely with stakeholders across the organisation to translate business requirements into practical data solutions and support analytics projects from discovery through to adoption and business use
• Shape and evolve platform standards, governance, and security practices while maintaining high levels of data quality, reliability, and performance, continuously identifying opportunities to improve tools, processes, and ways of working, and actively developing your own technical and professional capability

Experience needed
• Hands-on experience with Microsoft Fabric, including Lakehouse, Dataflows, Notebooks, Pipelines, and workspace management, with the ability to design and support scalable data solutions across ingestion, transformation, and presentation layers
• Strong SQL skills, including developing, optimising, and troubleshooting queries to support data transformation and analytical models
• Understanding of medallion architecture and modern data engineering best practices, including data pipeline design, version control, testing approaches, and performance optimisation techniques
• Confident in building and maintaining robust data models and semantic layers that support high-quality reporting, self-service analytics, and advanced insight using tools such as Power BI or similar analytics platforms
• Proficient in preparing and validating real-world datasets, including cleansing, handling missing or duplicate records, standardising inputs, and performing exploratory analysis using Excel
• Experience working with structured business data from enterprise systems such as ERP and CRM platforms, with an understanding of data relationships, master data concepts, and business process integration
• Strong, clear communicator able to translate complex technical topics into simple, meaningful insights and narratives for a range of audiences
• Strong stakeholder engagement skills, building trusted relationships across business and IT teams
• Proactive and self-driven, with the ability to take ownership of work and see it through to completion
• Structured thinker with a logical approach to problem-solving and analysis
• Open to feedback and continuous improvement, with a growth mindset
• Desirable certifications: Microsoft Power BI Data Analyst, Microsoft Fabric Analytics Engineer Associate, Microsoft Fabric Data Engineer Associate

Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers. By applying for this job you accept the T&Cs, Privacy Policy and Disclaimers, which can be found at hays.co.uk
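The medallion pattern this role centres on (raw "bronze" ingestion, cleansed "silver", presentation-ready "gold") can be sketched in miniature. The sketch below uses plain Python dictionaries rather than Fabric Lakehouse tables so it stays self-contained; every field name is invented for illustration, not taken from the role.

```python
# Toy medallion flow: bronze (raw) -> silver (cleansed) -> gold (aggregated).
# In Fabric this would run over Lakehouse tables in a Notebook; plain Python
# stands in here so the layering is visible. All field names are illustrative.

def to_silver(bronze_rows):
    """Cleanse: drop duplicates and rows missing a key, standardise inputs."""
    seen, silver = set(), []
    for row in bronze_rows:
        key = row.get("order_id")
        if key is None or key in seen:
            continue  # handle missing or duplicate records
        seen.add(key)
        silver.append({
            "order_id": key,
            "region": (row.get("region") or "UNKNOWN").strip().upper(),
            "amount": float(row.get("amount", 0)),
        })
    return silver

def to_gold(silver_rows):
    """Present: aggregate cleansed rows into a reporting-ready summary."""
    totals = {}
    for row in silver_rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

bronze = [
    {"order_id": 1, "region": " uk ", "amount": "120.50"},
    {"order_id": 1, "region": "UK", "amount": "120.50"},   # duplicate key
    {"order_id": 2, "region": None, "amount": "80"},       # missing region
    {"order_id": None, "region": "US", "amount": "999"},   # missing key
]
print(to_gold(to_silver(bronze)))  # {'UK': 120.5, 'UNKNOWN': 80.0}
```

The point of the layering is that each stage only trusts the guarantees the previous one established: gold can aggregate without re-checking for duplicates because silver already enforced them.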
21/04/2026
Full time
Tenth Revolution Group
Data Architect
Data Architect - Data Modelling, Architecture & Governance - London (Hybrid) - Up to £75,000

This is an opportunity to join an organisation that is investing in its data and digital capabilities, with a clear focus on building strong, well-designed data foundations. You'll be part of a collaborative environment where people care about good design, governance and making improvements that last. If you enjoy tackling complex data challenges, influencing how things are done and working in a setting where your input is valued, this role should be of interest.

The Role
This is an architecture-focused role, centred on how data is structured, governed and used across the organisation. It is not a hands-on data engineering position and it is not aimed at someone looking to step into data architecture for the first time.

You will work with:
• Colleagues across Enterprise Architecture, BI, Data Governance and Technology Services
• Microsoft D365 and a modern medallion-style data architecture
• Data Architecture Assessments to support project and programme delivery
• The organisation's Data Architecture Framework, helping maintain consistency and good governance
• Translating business requirements into clear, scalable data models and architecture
• Creating and presenting architecture artefacts that make sense to both technical and non-technical audiences

The organisation is working towards a centralised data platform using Microsoft Fabric, with a strong emphasis on putting the right architecture and models in place before focusing on technology delivery.

What We're Looking For
We're looking for a Data Architect with a solid grounding in data modelling and architecture principles. You should be comfortable designing data structures that support the business over the long term and confident in your architectural judgement.

Key Experience
• Experience in a Data Architecture or architecture-led role within a complex environment
• Strong understanding of conceptual, logical and physical data modelling
• Knowledge of data modelling methodologies such as 3NF, Kimball and OBT
• A background in data modelling and/or data governance, with experience designing scalable data architectures
• Comfortable working technology-agnostic, providing architectural oversight rather than hands-on engineering
• Able to explain architectural concepts clearly to non-technical stakeholders, with a curious and thoughtful approach

This role focuses on architecture, design and governance, not hands-on data engineering.

Benefits
• 30 days annual leave
• Enhanced pension scheme - employer contributions start at 8% and can rise to 14%
• Hybrid working (2 days per week in the London office, with some flexibility)
• Enhanced maternity and paternity leave
• 24-hour Employee Assistance Programme
• Cycle to Work scheme
• Season ticket loans

Interested?
If you're a Data Architect who enjoys designing strong data foundations and influencing how data is used across an organisation, this is well worth a conversation. Apply now!
21/04/2026
Full time
IO Associates
Technical Architect - Data Platforms
Description
We are seeking an experienced Technical Architect to support the design and evolution of large-scale, cloud-based data platforms, working across our portfolio of clients. The Technical Architect will play a key role in shaping solution design patterns, ensuring alignment with established standards, and supporting strategic transitions and migrations to, from, and between AWS and Azure.

Key Responsibilities
• Define and evolve technical architecture patterns for data ingestion, processing, and access.
• Design scalable, resilient, and cost-efficient data solutions within a Hub and Spoke model.
• Support the design of new data ingestion pipelines (batch and real-time).
• Ensure alignment with organisational architectural standards and governance frameworks.
• Contribute to target architecture roadmaps.
• Provide architectural guidance across data ingestion (Kafka, APIs, SFTP), data processing (PySpark, EMR, Glue), and storage (S3 and data lake patterns).
• Collaborate with DevOps, Data Engineers, and Testers to ensure cohesive delivery.
• Promote engineering best practices, including CI/CD, infrastructure as code, and observability.
• Ensure robust handling of schema evolution and upstream data changes.
• Support onboarding of new data sources and services into the platform.
• Ensure solutions meet requirements for data quality and consistency, performance and scalability, and security and compliance.
• Work within defined data modelling ownership boundaries where applicable.
• Support cloud strategy evolution, avoiding platform lock-in and ensuring portable, future-proof designs.
• Contribute to technical decision-making for future platform direction.
• Work in blended, cross-functional teams, providing technical leadership and mentoring to delivery teams.
• Ensure effective knowledge transfer and capability uplift.

Required Skills & Experience
• Strong experience designing modern cloud-based data platforms.
• Hands-on architectural experience with AWS (essential): S3, EMR, Glue; Kafka/event streaming architectures; Python and PySpark-based data processing.
• Experience designing data ingestion pipelines (batch and real-time).
• Proficiency in Infrastructure as Code (Terraform).
• Experience with GitHub-based workflows and CI/CD pipelines.
• Experience with data lake and lakehouse architectures.
• Strong understanding of data ingestion patterns, data transformation and curation layers, and data access and productisation.
• Ability to design for large-scale datasets.
• Experience supporting cloud migrations.
• Knowledge and experience with Azure, Microsoft Fabric, and Databricks would be beneficial.
• Familiarity with event-driven and streaming-first architectures at scale.
• Strong stakeholder engagement and cross-team collaboration skills.
• Ability to operate effectively within existing governance and standards.
• Pragmatic decision-making balancing delivery pace and technical quality.
• Clear communicator able to translate complex architecture into actionable guidance.
• Experience working in large, complex enterprise environments.

This role will require the ability to obtain and hold UK SC clearance.
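One responsibility this role names, robust handling of schema evolution and upstream data changes, is commonly implemented as a reconciliation step at ingestion: conform each incoming record to an expected schema, default missing fields, and set aside unexpected ones for review rather than dropping them silently. A minimal sketch, with an invented schema and field names (nothing here comes from the role itself):

```python
# Minimal schema-reconciliation step for an ingestion pipeline. Incoming
# records are coerced to an expected schema: missing fields get defaults,
# unexpected fields are quarantined for review instead of silently dropped.
# The schema and all field names are illustrative assumptions.

EXPECTED_SCHEMA = {"event_id": None, "source": "unknown", "payload_kb": 0}

def reconcile(record, schema=EXPECTED_SCHEMA):
    """Return (conformed_record, unexpected_fields) for one upstream record."""
    conformed = {field: record.get(field, default)
                 for field, default in schema.items()}
    unexpected = {k: v for k, v in record.items() if k not in schema}
    return conformed, unexpected

conformed, extra = reconcile({"event_id": "e1", "new_upstream_field": 42})
print(conformed)  # {'event_id': 'e1', 'source': 'unknown', 'payload_kb': 0}
print(extra)      # {'new_upstream_field': 42}
```

The same shape scales up: in PySpark the conform step becomes a select with defaults over a declared schema, and the quarantined fields land in a side table so an upstream schema change surfaces as data to inspect rather than a broken pipeline.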
20/04/2026
Full time
Senior Data Engineer - Microsoft Fabric
Roc Search Europe Limited City, Leeds
Senior Data Engineer - Microsoft Fabric

We're looking for an experienced Senior Data Engineer to join a growing team building a modern Microsoft Fabric data platform. This is a hands-on role designing and delivering scalable data pipelines, Lakehouse solutions, and analytics models within the Azure ecosystem.

What You'll Do:
• Build and maintain ETL/ELT pipelines and data models in Fabric (Data Factory, Notebooks, Spark)
• Write high-performance Spark SQL, T-SQL, and Python/PySpark
• Manage ingestion, transformation, and loading from multiple sources
• Translate stakeholder requirements into scalable technical solutions
• Mentor team members and establish engineering standards, security, and governance
• Leverage AI-assisted development tools like GitHub Copilot, ChatGPT, and Fabric Copilot

Essential Experience:
• Microsoft Fabric & Azure data ecosystem
• Lakehouse architectures & Data Factory
• Python, PySpark, Spark SQL
• Proven hands-on delivery in this stack

What's on Offer:
• Salary: £70,000
• Excellent benefits & annual leave package
• Strong progression & development opportunities
• Opportunity to work on a modern, AI-enabled data platform
• Real ownership and influence in a growing, forward-thinking data team

If you're an experienced Data Engineer with solid Microsoft Fabric and Azure experience, we'd love to hear from you!
17/04/2026
Full time
Head of Data and BI
CoreCom Consulting City, York
Head of Data & BI / York (3x per week) / £90k-£95k

We're hiring a Head of Data & BI to take full ownership of an established but evolving data function within a growing, commercially driven business. This is a hands-on leadership role, not a pure strategy position. You'll operate as a player-manager, leading a small team while remaining close to the tech, delivery, and stakeholders. The environment is already strong, with a well-regarded Power BI estate, but now needs someone who can bring structure, prioritisation, and commercial impact.

What do we need from you?
• Strong Power BI expertise (core to the environment)
• Solid experience with Azure, SQL, and ideally Python
• Proven ability to lead small teams while staying hands-on
• Experience working closely with senior stakeholders
• Ability to prioritise workload and align data to business needs

Role overview
You'll take ownership of a split data platform, including a large Power BI estate and a partially outsourced data warehouse. This role exists to bring:
• Clear ownership and direction
• Better stakeholder engagement and prioritisation
• Stronger focus on delivery and business outcomes
You'll also act as a Data Product Owner, ensuring the team is solving the right problems, not just responding to requests.

Key focus areas
• Own and evolve the end-to-end BI & data platform
• Lead a small team while remaining hands-on in delivery
• Drive data-led decision making across the business
• Improve stakeholder engagement and demand management
• Ensure reliable reporting, including business-critical financial data
• Support and shape a future Microsoft Fabric migration
• Balance BAU support with ongoing platform improvements

Why join?
• Full autonomy to shape the data function
• Broad role across BI, engineering, and product ownership
• Opportunity to drive real commercial value from data
• Work on high-impact, business-critical use cases
• A platform with strong foundations but huge scope to improve

If you're a hands-on data leader who enjoys owning problems, delivering solutions, and working closely with the business, apply now with a CV to Dominic Brown.
16/04/2026
Full time
Data Engineering Lead
CoreCom Consulting City, York
Data Engineering Lead / York (hybrid) / £75k-£90k

We're hiring a Data Engineering Lead to take ownership of a growing Azure-based data platform, working as part of a small, high-impact team. This is a hands-on technical leadership role focused on building, improving, and stabilising data pipelines and architecture, while also supporting a broader BI environment.

What do we need from you?
• Strong experience with Azure Data Factory and the Microsoft data stack
• Solid SQL and Python skills
• Experience building and maintaining data pipelines and integrations
• Ability to work in a hands-on, delivery-focused role
• Comfortable operating in a small team with high ownership
• Understanding of modern data platforms (Fabric exposure desirable)

Role overview
You'll take ownership of the data engineering layer, responsible for ingesting and transforming data from multiple sources into a large-scale reporting environment. The platform currently combines:
• Azure Data Factory pipelines
• A large Power BI estate
• A legacy warehouse (partially outsourced)
There is a clear roadmap toward modernisation and migration (Fabric), and you'll play a key role in shaping that journey.

Key focus areas
• Build and maintain robust data pipelines
• Manage data ingestion from multiple sources
• Support and optimise data flows feeding Power BI
• Work closely with BI and business teams to deliver data solutions
• Improve platform stability, performance, and scalability
• Contribute to future platform modernisation and migration
• Help reduce reliance on legacy and outsourced components

Why join?
• High ownership in a small, agile team
• Opportunity to work across engineering and platform design
• Involvement in a major platform transformation (Fabric)
• Real impact on business-critical data systems
• A role where you'll build, fix, and improve, not just maintain

If you're a hands-on Lead Data Engineer who enjoys building scalable platforms and taking ownership of delivery, apply now with a CV to Dominic Brown.
16/04/2026
Full time
Microsoft Fabric BI Developer
Standard 8 Recruitment Ltd Guildford, Surrey
Microsoft Fabric BI Developer Contract Guildford, Surrey. Full-time, inside IR35. Standard 8 is working with a regulated, engineering-led business looking for someone to take full ownership of its data platform and reporting. This is a genuinely hands-on role, sitting across engineering, modelling and reporting, focused on building something robust rather than just surface-level dashboards. You'll own Microsoft Fabric end-to-end across Warehouse and Lakehouse, build and manage pipelines using Azure Data Factory and Fabric, and ensure clean, reliable data flows from core systems including SAP. Alongside this, you'll set standards around storage, performance and governance, while running the reporting environment properly across workspaces, deployments and capacity. On the reporting side, you'll design strong Power BI semantic models, deliver dashboards and reports people actually use, and create structured golden datasets to support self-service without chaos. You'll also handle security properly, including RLS and OLS, ensuring everything meets regulatory standards such as GDPR, PCI and ITAR. They're looking for someone with solid hands-on experience in Microsoft Fabric, strong data pipeline delivery, and deep Power BI knowledge across modelling, DAX and performance. Experience working with ERP or financial data is important, given the complexity and scrutiny involved. This suits a proper data engineer who understands reporting, is comfortable owning a platform and pushing back when needed, and brings structure and standards into an environment that needs it. SAP Datasphere, PowerShell or R experience would be useful but not essential. A degree is optional, Power BI certification is a bonus, and SC clearance eligibility is required.
15/04/2026
Contractor
Applause IT Recruitment Ltd
Data Insight Analyst
Applause IT Recruitment Ltd City, Birmingham
Role: Data Insights Analyst Location: UK Salary: Competitive Job Type: Full-time, Permanent We are recruiting for a Data Insights Analyst to join a growing MI, Data and Analytics function within a national organisation operating across care, education, and support services. This is an excellent opportunity for a Data Insights Analyst with strong Power BI and SQL skills to work across a wide range of operational and central datasets, helping drive better reporting, insight, and decision-making across the business. The successful Data Insights Analyst will play a key role in dashboard development, data engineering, analysis, and stakeholder collaboration, supporting teams across operations, HR, Finance, IT, and other central functions. The Role As a Data Insights Analyst, you will be responsible for developing and optimising Power BI dashboards, datasets, semantic models, and DAX, while also supporting ETL workflows and wider data platform activity. You will work with sensitive and regulated datasets, ensuring data quality, reliability, and strong governance standards. 
Key Responsibilities Build and improve Power BI dashboards, datasets, semantic models, and DAX Ensure reporting solutions are user-focused, well-governed, and high performing Support ETL workflows across SQL Server, Fabric, and cloud-based tools Contribute to modern data platform pipelines, including Databricks, Snowflake, or similar technologies Analyse data across multiple service areas including operational, HR, and Finance Maintain data quality, lineage, and reliability across reporting environments Work closely with stakeholders to translate business requirements into practical analytical solutions Required Skills and Experience Strong Power BI development experience Strong SQL / SQL Server skills Knowledge of Microsoft Fabric and modern data platforms Experience working with sensitive or regulated datasets Ability to work across multiple data domains and business functions Strong communication skills with the ability to engage non-technical stakeholders Proactive, analytical, and comfortable managing multiple priorities Desirable Experience TimeXtender experience Exposure to Databricks, Snowflake, or similar platforms Experience with APIs, automation, or Python Knowledge of data governance, master data management, and data quality frameworks Experience in healthcare, care, education, residential childcare, or supported living environments What's on Offer The chance to support meaningful, data-driven work within a national organisation Exposure to a modern data environment including Power BI, Fabric, and enterprise ETL tools Opportunities for professional development across analytics, data engineering, and cloud technologies A varied role with strong stakeholder engagement and real business impact If you are an experienced Data Insights Analyst looking for your next opportunity, click apply now.
15/04/2026
Full time
SF Partners
Data Engineer
SF Partners City, Manchester
Data Engineer (Microsoft Fabric) Manchester / Birmingham / Nottingham (Hybrid - 2 days onsite) £50,000 Permanent - Full-time About the Opportunity SF Partners are partnering with a leading UK law firm to hire a Data Engineer with Microsoft Fabric experience to support a major data transformation programme. The business is moving away from SAP BW and building a modern data platform on Microsoft Fabric, and this role will be central to that journey. This is an opportunity to work hands-on with Fabric in a real enterprise environment - helping shape the platform, pipelines, and reporting layer from the ground up. The Role You'll play a key role in designing and delivering data solutions within Microsoft Fabric, working closely with both technical teams and business stakeholders. This role is ideal for someone who already has exposure to Fabric (or strong Azure experience with hands-on Fabric work) and wants to deepen their expertise in a growing, high-impact environment. Key Responsibilities Design and build data pipelines within Microsoft Fabric (Data Pipelines, Dataflows, OneLake, Lakehouse) Support the migration from SAP BW into a Fabric-based architecture Develop and optimise ETL/ELT processes using Fabric and/or Azure Data Factory Build and maintain scalable data models to support reporting and analytics Contribute to the development of a gold-standard reporting layer (Power BI semantic models) Work with stakeholders across Finance, HR, and Commercial teams Ensure strong data governance, security, and compliance (GDPR) Document processes and support best practice across the data team What We're Looking For Experience working with Microsoft Fabric (essential) Strong SQL skills and experience working with large datasets Experience building data pipelines and working with lakehouse architectures Knowledge of tools such as Azure Data Factory and/or Fabric Data Pipelines Experience with Python or PySpark for data transformation Understanding of data modelling 
(dimensional modelling, medallion architecture) Strong communication skills and ability to work with non-technical stakeholders Desirable: Experience migrating from legacy systems (e.g. SAP BW) Exposure to SAP data (HANA, BW, SuccessFactors, etc.) Experience with Power BI and semantic models Why Apply? Work on a high-profile Microsoft Fabric implementation Play a key role in a large-scale data migration project Gain deeper expertise in one of the fastest-growing data platforms Join a collaborative and forward-thinking data team Hybrid working and strong benefits package Apply Now If you're a Data Engineer with Microsoft Fabric experience looking to take ownership of a major transformation project, we'd love to hear from you. Apply now or get in touch with SF Partners for more information.
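The dimensional modelling this role asks for boils down to joining fact tables to conformed dimensions and aggregating for the reporting layer. As a minimal sketch of that idea in plain Python (in practice this would be SQL or PySpark over Fabric Lakehouse tables, and all table and column names here are hypothetical):

```python
# Illustrative star-schema aggregation: a small fact table keyed to a client
# dimension, rolled up the way a gold-layer reporting model would be.
from collections import defaultdict

dim_client = {1: "Finance", 2: "Commercial"}   # client_id -> department
fact_billing = [                               # (client_id, hours_billed)
    (1, 10.0),
    (2, 7.5),
    (1, 2.5),
]

totals = defaultdict(float)
for client_id, hours in fact_billing:
    # join each fact row to its dimension attribute, then aggregate
    totals[dim_client[client_id]] += hours

print(dict(totals))  # {'Finance': 12.5, 'Commercial': 7.5}
```

The same shape scales to real dimensional models: facts stay narrow and numeric, dimensions carry the descriptive attributes that Power BI semantic models slice by.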
14/04/2026
Full time
itecopeople
Data Engineer (Microsoft Fabric)
itecopeople
Data Engineer (Microsoft Fabric) - Permanent Location: Hybrid - Milton Keynes (majority remote) Type: Full-time, Permanent Start: ASAP Salary: 55,000 - 60,000 per annum The Opportunity. We're working with a growing organisation investing in its data platform, now looking to hire a Data Engineer with Microsoft Fabric experience to help accelerate their journey. Microsoft Fabric is already in place, but the environment is still evolving and not yet fully optimised. There is now pressure to accelerate delivery and unlock value from data. This is not a traditional role with fixed specifications. You will work in a partially built environment, helping shape how data is structured, delivered, and used across the business. Responsibilities Design and build end-to-end data solutions within Microsoft Fabric Develop and optimise data pipelines using PySpark and SQL Work across bronze, silver, and gold layers (medallion architecture) Perform initial data analysis and help define requirements Collaborate with analysts, clients, and stakeholders Improve and automate existing processes Ensure data is secure, accurate, and scalable Requirements Technical: Strong experience with PySpark / Spark and SQL Hands-on experience with Microsoft Fabric Experience building data pipelines and models Azure Data Factory experience Understanding of modern data architectures Personal: Proactive and self-starting Comfortable working in evolving environments Able to think beyond requirements and suggest solutions Interest in both engineering and business use of data Why Join Opportunity to shape a developing Fabric platform. Work with large historic datasets. Influence how data is used across the organisation. Majority remote working To progress matters, please email your CV to Laura (url removed) Services advertised are those of an Employment Agency.
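The bronze/silver/gold layering this role works across follows a consistent pattern: raw ingested data, cleaned and typed data, then aggregated data products. A minimal sketch of that flow in plain Python (in Fabric this would be PySpark over Lakehouse tables; the records and field names are illustrative assumptions):

```python
# Medallion-style flow in miniature: bronze holds rows exactly as ingested,
# silver drops malformed rows and casts types, gold aggregates for reporting.

bronze = [  # raw ingested rows, everything still a string
    {"site": "MK", "reading": "12.5"},
    {"site": "MK", "reading": "bad"},   # malformed row from the source system
    {"site": "LDN", "reading": "3.0"},
]

def to_silver(rows):
    """Clean and type the raw layer, skipping rows that fail casting."""
    out = []
    for r in rows:
        try:
            out.append({"site": r["site"], "reading": float(r["reading"])})
        except ValueError:
            continue  # a real pipeline would quarantine these for review
    return out

def to_gold(rows):
    """Aggregate the cleaned layer into a per-site summary."""
    totals = {}
    for r in rows:
        totals[r["site"]] = totals.get(r["site"], 0.0) + r["reading"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'MK': 12.5, 'LDN': 3.0}
```

The value of the layering is that each stage is independently inspectable: bad source data is caught at the bronze-to-silver boundary rather than surfacing as wrong numbers in a report.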
14/04/2026
Full time
Reed Technology
Data Engineer - Celonis
Reed Technology City, Birmingham
Data Engineer - Celonis Location: United Kingdom (Fully remote - must live and be able to work in the UK) Job Type: Full-time Eligibility: Must have full right to work in the UK (no sponsorship available) I am very excited to offer an opportunity for a skilled Data Engineer with expertise in Celonis to join a well-established, international professional services organisation. This company is renowned for delivering data, analytics, and automation solutions to large, complex businesses, particularly in finance, operations, supply chain, and customer functions. As part of their UK delivery team, you will play a crucial role in supporting major transformation programmes through practical, hands-on data engineering. Day-to-day of the role: Design, develop, and maintain scalable data pipelines to feed Celonis from various source systems. Build, integrate, and optimise SQL Server-based data warehouse pipelines. Support migration initiatives towards a modern Microsoft Fabric architecture. Develop and maintain Celonis data models using Celonis Data Integration. Translate business and reporting requirements into technical data specifications. Write and optimise SQL and Celonis PQL queries to support dashboards and analysis. Develop Python scripts for data cleansing, transformation, and automation. Implement data quality checks and reconciliation processes across systems. Provide technical support to analysts and business users working with Celonis. Ensure adherence to data governance, security, and access control standards. Required Skills & Qualifications: Proven commercial hands-on experience with Celonis in a delivery environment. Strong experience with Celonis Data Integration and analysis development. Solid SQL skills (data transformation, optimisation, and performance tuning). Proficiency in Python for data engineering and automation tasks. Good understanding of ETL / ELT processes and data modelling. Experience working within Agile delivery teams. 
Strong analytical skills and meticulous attention to detail. Ability to engage effectively with both technical and non-technical stakeholders. Desirable Skills: Celonis Data Engineer certification. Experience with SQL Server and Microsoft Fabric. Experience integrating data from enterprise platforms (ERP, CRM, finance systems, etc.). Experience in a consulting or client-facing delivery role. Familiarity with cloud data platforms (Azure, AWS, or GCP). Benefits: Opportunity to work in a dynamic, international environment. Competitive salary + bonus and benefits package. Professional development and career advancement opportunities. Flexible working arrangements (remote / hybrid). To apply for this Celonis Data Engineer position, please apply with an updated CV
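The cleansing and reconciliation duties in this role follow a standard shape: normalise the source records, then verify that what landed in the target matches on row count and a control total. A hedged sketch of that pattern in plain Python (field names and records are hypothetical, and a real pipeline would reconcile against the warehouse rather than an in-memory list):

```python
# Data cleansing plus a reconciliation check between source and loaded data.

source = [
    {"invoice": " INV-001 ", "amount": "100.00"},  # note stray whitespace
    {"invoice": "INV-002", "amount": "250.50"},
]

def cleanse(rows):
    """Trim identifiers and cast amounts to rounded floats."""
    return [{"invoice": r["invoice"].strip(),
             "amount": round(float(r["amount"]), 2)} for r in rows]

loaded = cleanse(source)  # stand-in for rows actually landed in the target

def reconcile(src, tgt):
    """Compare row counts and a financial control total across systems."""
    return {
        "row_count": len(src) == len(tgt),
        "control_total": round(sum(float(r["amount"]) for r in src), 2)
                         == round(sum(r["amount"] for r in tgt), 2),
    }

print(reconcile(source, loaded))  # {'row_count': True, 'control_total': True}
```

Control-total checks like this catch silent truncation or casting errors that row counts alone would miss, which matters when the data feeds finance dashboards.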
10/04/2026
Full time
Hays Specialist Recruitment Limited
Data Architect - Bristol - Hybrid Opportunity
Hays Specialist Recruitment Limited Bristol, Somerset
Your new company They are a specialist insurance and risk solutions provider, supporting clients with tailored coverage and expert advice across a range of sectors. The business is known for its client-focused approach, strong market relationships and commitment to delivering practical, dependable solutions. With a collaborative culture and a focus on professional development, they offer a supportive environment where people are trusted, valued and encouraged to grow their careers within a forward-thinking organisation. Your new role As a Data Architect, you'll play a key role in shaping how data is designed, managed and used across the business. You'll set the architectural direction for our data estate - from the point data first lands on the platform, through the Bronze, Silver and Gold layers of our Medallion Architecture, and all the way to analytics, AI and self-service reporting. Working within the Microsoft Azure and Databricks ecosystem, you'll help build a data platform that's scalable, flexible and built to last. Your work will directly support high-impact use cases, including advanced analytics, pricing models, AI/ML solutions and regulatory reporting - ensuring teams across the business can trust and use data with confidence. Data Architecture & Modelling Define and own the architectural principles, standards and policies governing SBG's data estate from the landing zone through to the Gold layer. Design and govern the Medallion Architecture (Bronze / Silver / Gold), ensuring every layer is built for analytics, AI/ML and self-service consumption. Own data modelling standards - conceptual, logical and physical - and ensure models are fit for both regulatory reporting and AI-driven insight. Define Unity Catalog structure, metadata standards and data lineage governance across the estate. 
Data Ingestion & Processing Define ingestion standards and data contracts for data arriving from the landing zone into the Bronze layer, working in partnership with the Development and Application Management team. Design and optimise ETL/ELT pipeline frameworks using Databricks, Delta Lake and Azure Data Factory. Ensure Silver and Gold layer data products are fit for purpose for analytics, pricing, AI and ML model consumption. Optimise data pipelines for efficiency, cost-effectiveness and high performance, leveraging Databricks for big data processing and machine learning. Governance & Standards Act as the architectural authority for the data estate - reviewing designs, enforcing standards and preventing platform fragmentation as SBG scales. Ensure all data architecture decisions align with regulatory requirements - FCA, GDPR, Solvency II, IFRS 17 and BCBS 239. Define and maintain data architecture policies and guidelines ensuring long-term scalability and sustainability. Analytics & AI Enablement Design the Gold layer to ensure data products are structured, documented and accessible for self-service analytics and AI/ML model consumption. Collaborate with ML Ops and Data Science teams to define data product standards and feature engineering patterns. Evaluate and lead adoption of emerging Azure and Databricks capabilities - including Microsoft Fabric, OneLake and DirectLake - where they advance the data architecture. Drive innovation by evaluating and implementing emerging cloud-based data technologies to enhance SBG's competitive advantage. What you'll need: Strong stakeholder management across business, IT and compliance teams. Excellent communication, collaboration and influencing skills at all levels of an organisation. Experience leading data architecture and engineering teams in an enterprise environment. Ability to define and implement a data strategy aligned with business objectives. 
Proven track record of delivering enterprise-scale data solutions with a focus on performance, security and scalability. Experience in regulated financial services, ensuring compliance with industry standards. Deep expertise in data modelling - conceptual, logical and physical. Data warehousing and data lake architecture for high-performance analytics. ETL/ELT pipeline development and optimisation to support large-scale data processing. Data integration across structured and unstructured sources, ensuring high availability. Metadata management and governance to maintain data quality and lineage. Experience defining data contracts and ingestion standards between source delivery teams and the data estate. Deep expertise in Microsoft Azure cloud services - ADF, ADLS, Synapse, Purview. Databricks - Delta Lake architecture, optimisation and advanced data processing. Apache Spark for large-scale distributed computing and performance tuning. Microsoft Fabric - OneLake and DirectLake integration. Azure Synapse Analytics for enterprise-scale data warehousing. Infrastructure-as-Code (Terraform or Azure Bicep) to automate cloud deployments. CI/CD pipelines with Azure DevOps or GitHub Actions for automated deployment of data pipelines. MLOps best practices - MLflow, Databricks Model Serving, Feature Store. Knowledge of IFRS 17, BCBS 239, UK Data Protection Act and Solvency II compliance. Experience with pricing models, claims processing and fraud detection in the insurance sector. Strong problem-solving skills and ability to translate business needs into technical solutions. Ability to document and present complex data architectures to technical and non-technical stakeholders. What you'll get in return Hybrid working - 2 days in the office and 3 days working from home 25 days annual leave, rising to 27 days over 2 years' service and 30 days after 5 years' service. Plus bank holidays! 
Discretionary annual bonus Pension scheme - 5% employee, 6% employer & many more What you need to do now If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV. If this job isn't quite right for you, but you are looking for a new position, please contact us for a confidential discussion about your career. Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and employment business for the supply of temporary workers. By applying for this job you accept the T&C's, Privacy Policy and Disclaimers which can be found at hays.co.uk
08/04/2026
Full time
Your new company They are a specialist insurance and risk solutions provider, supporting clients with tailored coverage and expert advice across a range of sectors. The business is known for its client-focused approach, strong market relationships and commitment to delivering practical, dependable solutions.With a collaborative culture and a focus on professional development, they offer a supportive environment where people are trusted, valued and encouraged to grow their careers within a forward-thinking organisation. Your new role As a Data Architect, you'll play a key role in shaping how data is designed, managed and used across the business. You'll set the architectural direction for our data estate - from the point data first lands on the platform, through the Bronze, Silver and Gold layers of our Medallion Architecture, and all the way to analytics, AI and self-service reporting.Working within the Microsoft Azure and Databricks ecosystem, you'll help build a data platform that's scalable, flexible and built to last. Your work will directly support high-impact use cases, including advanced analytics, pricing models, AI/ML solutions and regulatory reporting - ensuring teams across the business can trust and use data with confidence. Data Architecture & Modelling Define and own the architectural principles, standards and policies governing SBG's data estate from the landing zone through to the Gold layer.Design and govern the Medallion Architecture (Bronze / Silver / Gold), ensuring every layer is built for analytics, AI/ML and self-service consumption.Own data modelling standards - conceptual, logical and physical - and ensure models are fit for both regulatory reporting and AI-driven insight.Define Unity Catalogue structure, metadata standards and data lineage governance across the estate. 
Data Ingestion & Processing Define ingestion standards and data contracts for data arriving from the landing zone into the Bronze layer, working in partnership with the Development and Application Management team. Design and optimise ETL/ELT pipeline frameworks using Databricks, Delta Lake and Azure Data Factory. Ensure Silver and Gold layer data products are fit for purpose for analytics, pricing, AI and ML model consumption. Optimise data pipelines for efficiency, cost-effectiveness and high performance, leveraging Databricks for big data processing and machine learning. Governance & Standards Act as the architectural authority for the data estate - reviewing designs, enforcing standards and preventing platform fragmentation as SBG scales. Ensure all data architecture decisions align with regulatory requirements - FCA, GDPR, Solvency II, IFRS 17 and BCBS 239. Define and maintain data architecture policies and guidelines ensuring long-term scalability and sustainability. Analytics & AI Enablement Design the Gold layer to ensure data products are structured, documented and accessible for self-service analytics and AI/ML model consumption. Collaborate with ML Ops and Data Science teams to define data product standards and feature engineering patterns. Evaluate and lead adoption of emerging Azure and Databricks capabilities - including Microsoft Fabric, OneLake and DirectLake - where they advance the data architecture. Drive innovation by evaluating and implementing emerging cloud-based data technologies to enhance SBG's competitive advantage. What you'll need: Strong stakeholder management across business, IT and compliance teams. Excellent communication, collaboration and influencing skills at all levels of an organisation. Experience leading data architecture and engineering teams in an enterprise environment. Ability to define and implement a data strategy aligned with business objectives. 
Proven track record of delivering enterprise-scale data solutions with a focus on performance, security and scalability. Experience in regulated financial services, ensuring compliance with industry standards. Deep expertise in data modelling - conceptual, logical and physical. Data warehousing and data lake architecture for high-performance analytics. ETL/ELT pipeline development and optimisation to support large-scale data processing. Data integration across structured and unstructured sources, ensuring high availability. Metadata management and governance to maintain data quality and lineage. Experience defining data contracts and ingestion standards between source delivery teams and the data estate. Deep expertise in Microsoft Azure cloud services - ADF, ADLS, Synapse, Purview. Databricks - Delta Lake architecture, optimisation and advanced data processing. Apache Spark for large-scale distributed computing and performance tuning. Microsoft Fabric - OneLake and DirectLake integration. Azure Synapse Analytics for enterprise-scale data warehousing. Infrastructure-as-Code (Terraform or Azure Bicep) to automate cloud deployments. CI/CD pipelines with Azure DevOps or GitHub Actions for automated deployment of data pipelines. MLOps best practices - MLflow, Databricks Model Serving, Feature Store. Knowledge of IFRS 17, BCBS 239, UK Data Protection Act and Solvency II compliance. Experience with pricing models, claims processing and fraud detection in the insurance sector. Strong problem-solving skills and ability to translate business needs into technical solutions. Ability to document and present complex data architectures to technical and non-technical stakeholders. What you'll get in return Hybrid working - 2 days in the office and 3 days working from home. 25 days annual leave, rising to 27 days after 2 years' service and 30 days after 5 years' service. Plus bank holidays! 
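The Medallion Architecture described in the Data Architect role moves data through Bronze (raw), Silver (cleaned) and Gold (consumption-ready) layers. As a rough, illustrative sketch only - in plain Python rather than the Databricks/PySpark stack the role actually uses, and with hypothetical field names (policy_id, premium, region) - a Bronze-to-Silver step typically deduplicates, drops incomplete records and standardises types:

```python
# Illustrative sketch of a Bronze -> Silver cleaning step in plain Python,
# standing in for what would normally run as a Databricks/PySpark job.
# The record fields (policy_id, premium, region) are hypothetical examples.

def bronze_to_silver(bronze_rows):
    """Deduplicate, drop incomplete records, and standardise types/casing."""
    seen = set()
    silver = []
    for row in bronze_rows:
        key = row.get("policy_id")
        if key is None or row.get("premium") is None:
            continue  # would be quarantined in a real pipeline; dropped here
        if key in seen:
            continue  # keep the first occurrence only
        seen.add(key)
        silver.append({
            "policy_id": key,
            "premium": float(row["premium"]),
            "region": str(row.get("region", "unknown")).strip().lower(),
        })
    return silver

bronze = [
    {"policy_id": "P1", "premium": "100.5", "region": " North "},
    {"policy_id": "P1", "premium": "100.5", "region": " North "},  # duplicate
    {"policy_id": "P2", "premium": None, "region": "South"},       # incomplete
    {"policy_id": "P3", "premium": 80, "region": "South"},
]
print(bronze_to_silver(bronze))
```

The same shape scales up naturally: in Databricks the dedupe and type casts become DataFrame operations writing to a Silver Delta table.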
Integral Recruitment Ltd
Data Engineer
Integral Recruitment Ltd Epsom, Surrey
Data Engineer £50,000 - £60,000 Hybrid (1 day per week onsite) Epsom, Surrey, KT17 Benefits: Bonus, generous pension, private health insurance, professional study financial support and lots more. Are you a data problem-solver who enjoys getting under the hood of complex datasets and turning messy data into meaningful insight? We're partnering with a prestigious financial services organisation looking for a Data Engineer to join their growing data team, reporting directly to the Head of Development & Data. This is a brilliant opportunity to play a key role in shaping their data landscape as they move towards building a modern data warehouse. The Data Engineer Opportunity This role is all about making data work better. You'll take ownership of improving and standardising existing data, building robust pipelines, and delivering high-quality reporting solutions that genuinely impact the business. It's a hands-on role with plenty of variety, from writing SQL to working closely with stakeholders to bring data to life. What You'll Be Doing Designing and developing bespoke MI reporting solutions Writing and optimising SQL scripts and stored procedures Building and validating data pipelines to support reporting and deployments Working with tools like Power BI and Microsoft Fabric Cleaning, transforming, and standardising existing datasets Troubleshooting and improving legacy scripts and processes Collaborating with stakeholders to understand reporting needs and translate them into technical solutions Supporting the journey towards a new data warehouse environment What We're Looking For Experience with SQL Server, Power BI, Power Automate and Microsoft Fabric in complex data environments Proven ability to build data pipelines and transformations Experience working with structured and unstructured data (e.g. 
Excel, blob storage, SQL) Solid understanding of data modelling, schema design, and permissions Experience across the full SDLC, including deployments and pipelines Confident communicator able to work closely with both technical and non-technical stakeholders A proactive, organised mindset with strong problem-solving skills In addition, experience in Financial Services, Agile environments, and mentoring or supporting junior team members is a bonus. Why Apply? Be part of a team modernising their data platform Work closely with senior leadership and influence data strategy Hybrid working from day one (generally 1 day in the office per week) A role that blends technical delivery with real business impact Extremely well-established, stable organisation offering a generous benefits package If you enjoy untangling complex data, improving systems, and building smarter reporting, we'd love to hear from you.
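The "cleaning, transforming, and standardising existing datasets" work this role describes is often a single well-placed SQL statement. A small, illustrative sketch using Python's built-in sqlite3 in place of SQL Server - the table and column names (customers, county) are hypothetical:

```python
import sqlite3

# Illustrative only: sqlite3 stands in for SQL Server, and the table/column
# names (customers, county) are hypothetical examples of messy source data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, county TEXT)")
conn.executemany(
    "INSERT INTO customers (county) VALUES (?)",
    [(" surrey ",), ("SURREY",), ("Surrey",), (None,)],
)

# One pass to trim whitespace and normalise casing; NULLs get a sentinel.
conn.execute(
    "UPDATE customers SET county = COALESCE(UPPER(TRIM(county)), 'UNKNOWN')"
)
rows = [r[0] for r in conn.execute("SELECT county FROM customers ORDER BY id")]
print(rows)  # every value now in one canonical form
```

In a real warehouse this kind of standardisation would usually live in a stored procedure or pipeline step rather than an ad-hoc update.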
31/03/2026
Full time
Greencore
Senior Data Engineer
Greencore Worksop, Nottinghamshire
Why Greencore? We're a leading manufacturer of convenience food in the UK and our purpose is to make everyday taste better! We're a vibrant, fast-paced leading food manufacturer, employing 13,300 colleagues across 16 manufacturing units and 17 distribution depots across the UK. We supply all the UK's food retailers with everything from sandwiches, soups and sushi to cooking sauces, pickles and ready meals, and in FY24 we generated revenues of £1.8bn. Our vast direct-to-store (DTS) distribution network, comprising 17 depots nationwide, enables us to make over 10,500 daily deliveries of our own chilled and frozen produce and that of third parties. Why is this exciting for your career as a Senior Data Engineer? The MBE Programme presents a huge opportunity for colleagues across the technology function to play a central role in the design, shape, delivery and execution of an enterprise-wide digital transformation programme. The complexity of the initiative, within a FTSE 250 business, will allow for large-scale problem solving, group-wide impact assessment and supporting the delivery of an enablement project to future-proof the business. Why did we embark on Making Business Easier? Over time, processes have become increasingly complex, increasing both the risk and cost they pose whilst restricting our agility. At the same time, our customers and the market expect more from us than ever before. Making Business Easier forms a fundamental foundation for our commercial and operational excellence agendas, whilst supporting managing our cost base effectively in the future. The MBE Programme will streamline and simplify core processes, provide easier access to quality business data and will invest in the right technology to enable these processes. What you'll be doing: As a Senior Data Engineer, you will play a key role in shaping and delivering enterprise-wide data solutions that translate complex business requirements into scalable, high-performance data platforms. 
In this role, you will help define and guide the structure of data systems, focusing on seamless integration, accessibility, and governance, while optimising data flows to support both analytics and operational needs. Collaborating closely with business stakeholders, data engineers, and analysts, you will ensure that data platforms are robust, efficient, and adaptable to evolving business priorities. You will also support the usage, alignment, and consistency of data models, and will therefore have a wide-ranging role across many business projects and deliverables. Shape and implement data solutions that align with business objectives and leverage both cloud and on-premise technologies Translate complex business needs into scalable, high-performing data solutions Support the development and application of best practices in data governance, security, and system design Collaborate closely with business stakeholders, product teams, and engineers to design and deliver effective, integrated data solutions Optimise data flows and pipelines to enable a wide range of analytical and operational use cases Promote data consistency across transactional and analytical systems through well-designed integration approaches Contribute to the design and ongoing improvement of data platforms - including data lakes, data warehouses, and other distributed storage environments - focused on efficiency, scalability, and ease of maintenance Mentor and support junior engineers and analysts in applying best practices in data engineering and solution design What you'll need: 5+ years of Data Engineering experience, with expertise in Azure data services and/or Microsoft Fabric Strong expertise in designing scalable data platforms and managing cloud-based data ecosystems Proven track record in data integration, ETL processes, and optimising large-scale data systems Expertise in cloud-based data platforms (AWS, Azure, Google Cloud) and distributed storage solutions Proficiency in Python, PySpark, 
SQL, NoSQL, and data processing frameworks (Spark, Databricks) Expertise in ETL/ELT design and orchestration in Azure, as well as pipeline performance tuning & optimisation Competent in integrating relational, NoSQL, and streaming data sources Management of CI/CD pipelines & Git-based workflows Good knowledge of data governance, privacy regulations, and security best practices Experience with modern data architectures, including data lakes, data mesh, and event-driven data processing Strong problem-solving and analytical skills to translate complex business needs into scalable data solutions Excellent communication and stakeholder management to align business and technical goals High attention to detail and commitment to data quality, security, and governance Ability to mentor and guide teams, fostering a culture of best practices in data architecture Power BI and DAX for data visualisation (desirable) Knowledge of Azure Machine Learning and AI services (desirable) Experience with streaming platforms like Event Hub or Kafka Familiarity with cloud cost optimisation techniques (desirable) What you'll get: Competitive salary and job-related benefits 25 days holiday allowance plus bank holidays Car Allowance Annual Target Bonus Pension up to 8% matched PMI Cover: Individual Life insurance up to 4x salary Company share save scheme Greencore Qualifications Exclusive Greencore employee discount platform Access to a full Wellbeing Centre platform
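The streaming experience this role asks for (Event Hub, Kafka, event-driven processing) often comes down to windowed aggregation. A minimal illustration in plain Python - standing in for a real stream processor, with an arbitrary 60-second window and made-up event data:

```python
from collections import defaultdict

# Illustrative sketch: a tumbling-window aggregation in plain Python, standing
# in for what an Event Hub/Kafka consumer would do inside a stream processor.
# Events are (timestamp_seconds, value) pairs; the 60s window is arbitrary.

def tumbling_window_sums(events, window_seconds=60):
    """Group events into fixed, non-overlapping time windows and sum values."""
    windows = defaultdict(float)
    for ts, value in events:
        window_start = (ts // window_seconds) * window_seconds
        windows[window_start] += value
    return dict(sorted(windows.items()))

events = [(5, 1.0), (30, 2.0), (65, 4.0), (130, 8.0)]
print(tumbling_window_sums(events))
```

Real platforms add the hard parts this sketch omits: late-arriving events, watermarks, and checkpointed state across restarts.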
03/10/2025
Full time
VIQU IT Recruitment
Data Engineer
VIQU IT Recruitment Cardiff, South Glamorgan
Data Engineer - 3 Month Contract - Hybrid - Cardiff VIQU have partnered with an NHS client who are seeking a Data Engineer to support an ongoing project. The successful Data Engineer will support the modernisation of the data estate, migrating legacy SQL Server warehouses into Azure. You will play a key role in shaping the new cloud data platform. Responsibilities: Build and optimise ETL/ELT pipelines with Azure Data Factory, Synapse, and SQL Database. Lead the migration of on-premises SQL Server/SSIS workloads into Azure. Design data lakes, marts, and models to support analytics and reporting. Integrate external sources including Google Cloud Platform. Drive data governance with tools like Azure Purview. Collaborate with architects, BI teams, and stakeholders to ensure seamless data access. Apply DevOps (ADO, GitHub) and Agile practices for delivery. Key skills & experience: Strong experience with SQL Server, T-SQL, and SSIS. Hands-on with Azure SQL, Data Factory, Synapse, and Data Lake. Track record in data warehouse development and cloud migrations. Proficient with Azure DevOps/GitHub. Strong problem-solving and documentation skills. Python for automation (desirable) Knowledge of Microsoft Fabric & Azure Purview (desirable) Exposure to GCP (desirable) NHS experience (desirable) Contract Overview Role: Data Engineer Duration: 3-month contract IR35: Inside IR35 Rate: £400 - £420 per day Location: Hybrid - Cardiff Apply now to speak with VIQU IT in confidence. Or reach out to Suzie Stone via the VIQU IT website. Do you know someone great? We'll thank you with up to £1,000 if your referral is successful (terms apply). For more exciting roles and opportunities like this, please follow us on IT Recruitment.
03/10/2025
Full time
Greencore
Senior Data Engineer
Greencore Scofton, Nottinghamshire
Why Greencore? We're a leading manufacturer of convenience food in the UK and our purpose is to make everyday taste better! We're a vibrant, fast-paced leading food manufacturer, employing 13,300 colleagues across 16 manufacturing units and 17 distribution depots across the UK. We supply all the UK's food retailers with everything from sandwiches, soups and sushi to cooking sauces, pickles and ready meals, and in FY24 we generated revenues of £1.8bn. Our vast direct-to-store (DTS) distribution network, comprising 17 depots nationwide, enables us to make over 10,500 daily deliveries of our own chilled and frozen produce and that of third parties. Why is this exciting for your career as a Senior Data Engineer? The MBE Programme presents a huge opportunity for colleagues across the technology function to play a central role in the design, shape, delivery and execution of an enterprise-wide digital transformation programme. The complexity of the initiative, within a FTSE 250 business, will allow for large-scale problem solving, group-wide impact assessment and supporting the delivery of an enablement project to future-proof the business. Why did we embark on Making Business Easier? Over time, processes have become increasingly complex, increasing both the risk and cost they pose whilst restricting our agility. At the same time, our customers and the market expect more from us than ever before. Making Business Easier forms a fundamental foundation for our commercial and operational excellence agendas, whilst supporting managing our cost base effectively in the future. The MBE Programme will streamline and simplify core processes, provide easier access to quality business data and will invest in the right technology to enable these processes. What you'll be doing: As a Senior Data Engineer, you will play a key role in shaping and delivering enterprise-wide data solutions that translate complex business requirements into scalable, high-performance data platforms. 
In this role, you will help define and guide the structure of data systems, focusing on seamless integration, accessibility, and governance, while optimising data flows to support both analytics and operational needs. Collaborating closely with business stakeholders, data engineers, and analysts, you will ensure that data platforms are robust, efficient, and adaptable to evolving business priorities. You will also support the usage, alignment, and consistency of data models, and will therefore have a wide-ranging role across many business projects and deliverables. Shape and implement data solutions that align with business objectives and leverage both cloud and on-premise technologies Translate complex business needs into scalable, high-performing data solutions Support the development and application of best practices in data governance, security, and system design Collaborate closely with business stakeholders, product teams, and engineers to design and deliver effective, integrated data solutions Optimise data flows and pipelines to enable a wide range of analytical and operational use cases Promote data consistency across transactional and analytical systems through well-designed integration approaches Contribute to the design and ongoing improvement of data platforms - including data lakes, data warehouses, and other distributed storage environments - focused on efficiency, scalability, and ease of maintenance Mentor and support junior engineers and analysts in applying best practices in data engineering and solution design What you'll need: 5+ years of Data Engineering experience, with expertise in Azure data services and/or Microsoft Fabric Strong expertise in designing scalable data platforms and managing cloud-based data ecosystems Proven track record in data integration, ETL processes, and optimising large-scale data systems Expertise in cloud-based data platforms (AWS, Azure, Google Cloud) and distributed storage solutions Proficiency in Python, PySpark, 
SQL, NoSQL, and data processing frameworks (Spark, Databricks) Expertise in ETL/ELT design and orchestration in Azure, as well as pipeline performance tuning & optimisation Competent in integrating relational, NoSQL, and streaming data sources Management of CI/CD pipelines & Git-based workflows Good knowledge of data governance, privacy regulations, and security best practices Experience with modern data architectures, including data lakes, data mesh, and event-driven data processing Strong problem-solving and analytical skills to translate complex business needs into scalable data solutions Excellent communication and stakeholder management to align business and technical goals High attention to detail and commitment to data quality, security, and governance Ability to mentor and guide teams, fostering a culture of best practices in data architecture Power BI and DAX for data visualisation (desirable) Knowledge of Azure Machine Learning and AI services (desirable) Experience with streaming platforms like Event Hub or Kafka Familiarity with cloud cost optimisation techniques (desirable) What you'll get: Competitive salary and job-related benefits 25 days holiday allowance plus bank holidays Car Allowance Annual Target Bonus Pension up to 8% matched PMI Cover: Individual Life insurance up to 4x salary Company share save scheme Greencore Qualifications Exclusive Greencore employee discount platform Access to a full Wellbeing Centre platform
02/10/2025
Full time
Why Greencore? We're a leading manufacturer of convenience food in the UK and our purpose is to make everyday taste better! We're a vibrant, fast-paced leading food manufacturer. Employing 13,300 colleagues across 16 manufacturing units and 17 distribution depots across the UK. We supply all the UK's food retailers with everything from Sandwiches, soups and sushi to cooking sauces, pickles and ready meals, and in FY24, we generated revenues of 1.8bn. Our vast direct-to-store (DTS) distribution network, comprising of 17 depots nationwide, enables us to make over 10,500 daily deliveries of our own chilled and frozen produce and that of third parties. Why is this exciting for your career as a Senior Data Engineer? The MBE Programme presents a huge opportunity for colleagues across the technology function to play a central role in the design, shape, delivery and execution of an enterprise wide digital transformation programme. The complexity of the initiative, within a FTSE 250 business, will allow for large-scale problem solving, group wide impact assessment and supporting the delivery of an enablement project to future proof the business. Why we embarked on Making Business Easier? Over time processes have become increasingly complex, increasing both the risk and cost they pose, whilst restricting our agility. At the same time, our customers and the market expect more from us than ever before. Making Business Easier forms a fundamental foundation for our commercial and operational excellence agendas, whilst supporting managing our cost base effectively in the future. The MBE Programme will streamline and simplify core processes, provide easier access to quality business data and will invest in the right technology to enable these processes. What you'll be doing: As a Senior Data Engineer, you will play a key role in shaping and delivering enterprise-wide data solutions that translate complex business requirements into scalable, high-performance data platforms. 
In this role, you will help define and guide the structure of data systems, focusing on seamless integration, accessibility, and governance, while optimising data flows to support both analytics and operational needs. Collaborating closely with business stakeholders, data engineers, and analysts, you will ensure that data platforms are robust, efficient, and adaptable to evolving business priorities. You will also support the usage, alignment, and consistency of data models, and will therefore have a wide-ranging role across many business projects and deliverables. Shape and implement data solutions that align with business objectives and leverage both cloud and on-premises technologies Translate complex business needs into scalable, high-performing data solutions Support the development and application of best practices in data governance, security, and system design Collaborate closely with business stakeholders, product teams, and engineers to design and deliver effective, integrated data solutions Optimise data flows and pipelines to enable a wide range of analytical and operational use cases Promote data consistency across transactional and analytical systems through well-designed integration approaches Contribute to the design and ongoing improvement of data platforms - including data lakes, data warehouses, and other distributed storage environments - focused on efficiency, scalability, and ease of maintenance Mentor and support junior engineers and analysts in applying best practices in data engineering and solution design What you'll need: 5+ years of Data Engineering experience, with expertise in Azure data services and/or Microsoft Fabric Strong expertise in designing scalable data platforms and managing cloud-based data ecosystems Proven track record in data integration, ETL processes, and optimising large-scale data systems Expertise in cloud-based data platforms (AWS, Azure, Google Cloud) and distributed storage solutions Proficiency in Python, PySpark, 
SQL, NoSQL, and data processing frameworks (Spark, Databricks) Expertise in ETL/ELT design and orchestration in Azure, as well as pipeline performance tuning & optimisation Competent in integrating relational, NoSQL, and streaming data sources Management of CI/CD pipelines & Git-based workflows Good knowledge of data governance, privacy regulations, and security best practices Experience with modern data architectures, including data lakes, data mesh, and event-driven data processing Strong problem-solving and analytical skills to translate complex business needs into scalable data solutions Excellent communication and stakeholder management to align business and technical goals High attention to detail and commitment to data quality, security, and governance Ability to mentor and guide teams, fostering a culture of best practices in data architecture Power BI and DAX for data visualisation (desirable) Knowledge of Azure Machine Learning and AI services (desirable) Experience with streaming platforms like Event Hub or Kafka Familiarity with cloud cost optimisation techniques (desirable) What you'll get: Competitive salary and job-related benefits 25 days holiday allowance plus bank holidays Car Allowance Annual Target Bonus Pension up to 8% matched PMI Cover: Individual Life insurance up to 4x salary Company share save scheme Greencore Qualifications Exclusive Greencore employee discount platform Access to a full Wellbeing Centre platform
Tenth Revolution Group
EXL - Data Solutions Lead Pre Sales
Tenth Revolution Group City, London
Data Solutions Lead - £95,000 - Hybrid About the Role We are seeking a dynamic and client-focused Pre-Sales Data Solutions Architect to join our high-performing team. In this role, you will lead the design and positioning of modern data platform solutions for enterprise clients. Your primary focus will be on pre-sales solutioning, including RFP/RFI response leadership, client workshops, and translating complex data architectures into compelling business value propositions. Key Responsibilities Client-Facing Solution Leadership Lead the creation of tailored, data-driven solutions in response to RFPs, RFIs, and client needs. Engage with senior stakeholders to understand business objectives and align them with scalable, modern data platform architectures. Craft solution narratives that clearly articulate business value and technical feasibility. Cross-Functional Collaboration Collaborate with sales, data architects, engineers, and delivery teams to shape commercially viable and technically sound solutions. Conduct discovery workshops and requirements-gathering sessions with clients to inform solution design. Act as a bridge between business and technical teams, ensuring a shared understanding of goals and outcomes. Proposal Management & Industry Insight Own and manage end-to-end proposal processes including content creation, reviews, pricing inputs, and executive summaries. Maintain a repository of reusable proposal assets, templates, and case studies. Stay current on trends in cloud platforms (Azure, AWS, GCP), data governance, analytics, and emerging technologies to inform solution strategy. Required Qualifications Proven experience in a pre-sales or consulting role focused on data platforms, cloud architecture, or analytics solutions. Deep knowledge of modern cloud data platforms (Azure, AWS, GCP), including data lakes, warehousing, governance, and pipelines. 
Demonstrated ability to lead RFP/RFI/RFS responses and engage effectively with senior business and IT stakeholders. Strong communication skills with the ability to translate complex technical concepts into clear business benefits. Experience with discovery workshops, solution design sessions, and proposal presentations. Preferred Qualifications Prior experience in a consulting, systems integrator, or cloud services environment. Certifications in Azure, AWS, or GCP (e.g., Azure Data Engineer, AWS Data Analytics, GCP Professional Data Engineer). Familiarity with data governance tools and practices (e.g., Collibra, Informatica, Microsoft Purview). Industry-specific knowledge in sectors like Financial Services, Healthcare, or Retail is a plus. To apply for this role, please submit your CV or contact Dillon Blackburn on (phone number removed) or at (url removed). Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK, offering more opportunities across the country than any other recruitment agency. We're the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.
02/10/2025
Full time
Five Guys
Data Engineer
Five Guys
Burgers & Fries and Incredible Careers! We're the burger restaurant with the uncomplicated formula: burgers and fries cooked to perfection, with no frozen ingredients. And we've stuck to the same 'perfect and serve' philosophy since our family business began in 1986. We are seeking a highly capable and driven Data Engineer to join our dynamic data team. This role is pivotal in leading and initiating the migration of our data infrastructure from AWS to Microsoft Azure, shaping the future state of our data platform. You'll work at the heart of the business, enabling data integration, transformation, and reporting across a range of complex data sources. This is a strategic, hands-on role that requires someone who is not afraid to explore new technologies, drive change, and build future-proof data engineering solutions. You'll collaborate with cross-functional teams, manage data integrations, and optimize the performance of databases and cloud environments in a high-growth, fast-paced setting. YOUR RESPONSIBILITIES: Cloud Migration Leadership Play a key role in the execution of our data migration from AWS to Azure, working closely with the Solution Architect and Head of Data to shape the technical approach and deliver scalable, cloud-native solutions. Evaluate existing data infrastructure and propose Azure-native solutions aligned to business needs. Define and lead the migration roadmap, collaborating with internal stakeholders and third parties. Advocate for innovation and drive adoption of Azure services, standards, and tools across the team. Apply and champion Azure Medallion architecture principles in future-state design. Data Ingestion & Integration Design and implement ingestion pipelines into Azure (Data Lake, Synapse, Blob Storage, Azure SQL). Ingest structured, semi-structured, and unstructured data from APIs, FTP/SFTP, cloud platforms, and file-based sources. Manage data integrations with suppliers, including file transfers and API-based solutions. 
Use Power Automate (Flow) to automate tasks like copying files from SharePoint or emails to SFTP. Provide third-line support for code issues and existing workflows, and lead resolution of data failures and integration problems. Collaborate with stakeholders to define integration requirements and implement robust solutions. Data Transformation & ETL Build, optimize, and maintain ETL/ELT pipelines using Azure Data Factory and other cloud-native tools. Develop reusable, scalable transformation logic to enable analytics and reporting needs. Apply data validation, cleansing, and lineage tracking throughout the pipeline lifecycle. Support both existing and future ETL workloads with strong SQL and scripting capabilities. Data Platform & Server Management Revamp and support existing database designs to enable a scalable migration to Azure. Introduce and enforce standards and governance around data structures and code. Optimize T-SQL code and SQL Server performance before deploying into production. Manage consistent data platforms across regions with a focus on reusability and modularity. Monitor and control database file growth, indexing strategies, and performance tuning. Develop scalable solutions based on both current and emerging business requirements. Data Cloud Operations & Resilience Lead initiatives in data cloud management, including usage monitoring and optimization. Configure and test disaster recovery processes, ensuring business continuity through backup and restore exercises. Archive or delete legacy data in line with governance and retention requirements. Delivery & Documentation Conduct data mapping exercises, analysing various sources and formats to design fit-for-purpose models. Create and present complete logical data models to both technical teams and business stakeholders. Participate in scoping future platforms and assessing legacy systems for migration to Azure. 
Perform functional and systems analysis, producing supporting documentation (e.g., flowcharts, data diagrams). Take the lead in discovery workshops, weekly stand-ups, and cross-functional solution delivery. YOUR TECHNICAL SKILLS & EXPERIENCE Essential 6-10 years' experience working with SQL Server (including T-SQL development and performance tuning). Proven experience designing ETL/ELT pipelines, ideally in Azure Data Factory. Hands-on experience ingesting and transforming data in cloud environments (preferably Azure). Familiarity with Azure Medallion architecture or similar layered data designs. Integration experience with APIs, SFTP, cloud databases, and file systems. Comfortable working with Power Automate, SharePoint, and Office 365 integrations. Basic knowledge of C# or PowerShell for custom data integration tasks. Experience troubleshooting production systems and optimizing SQL workloads. Strong understanding of database administration, performance, and scalability. Cloud certification (e.g., Azure Practitioner or equivalent) is highly desirable. Desirable Experience working with notebooks and Python in Microsoft Fabric for data exploration, transformation, and pipeline development. Experience with AWS services and data environments, especially helpful for transition planning. Exposure to CI/CD, DevOps pipelines, and Git for data engineering workflows. Familiarity with data governance, quality management, and security practices. YOUR BEHAVIOURS & ATTRIBUTES Technically curious and excited by the opportunity to learn and adopt new tools. Naturally proactive and comfortable driving change in existing environments. Calm and solution-oriented in high-pressure or production-critical scenarios. Collaborative and open communicator who brings people on the journey. Able to travel occasionally within Europe to support cross-territory platforms and teams. LOCATION This role is based in our West London office, with some remote working.
02/10/2025
Full time
Enterprise Data Architect
IT Jobs City of London, London
We're looking for an experienced Enterprise Data Architect to lead the design and evolution of our client's data architecture across their key business domains. You'll work on major transformation programmes, shaping the data strategy and enabling unified, scalable, and secure solutions. We are looking for candidates who have very strong experience across Retail, Hospitality, Hotels or customer-focused service industries, working in an end-client environment, not candidates from a Consultancy background. PLEASE NOTE: The client requires candidates to be on site in London 3 days per week - this is not negotiable. Please do not apply if you cannot be in London 3 days per week. Key Responsibilities Design conceptual, logical, and physical data models for customer, product, HR, and finance data. Collaborate across business and tech teams to deliver cohesive data architecture. Lead architecture reviews and guide cloud-based data engineering initiatives. Champion governance, quality frameworks, and GenAI integration. Thought Leadership: Provide architectural leadership across key business areas, advocating for best practices in data strategy and governance. Lead the definition and execution of enterprise data strategy in partnership with other architects and data leaders. Define and drive improvements in data governance, ingestion optimisation, and exploration processes. Design and implement real-time data quality frameworks, including proactive anomaly detection and alerting mechanisms. Tech Stack & Tools ETL: Microsoft Fabric, Databricks Analytics: Power BI, SAP Analytics Cloud, Azure Synapse Governance: Azure Purview AI Tools: GenAI platforms What We're Looking For Strong experience in enterprise data architecture and transformation programmes. Expertise in distributed data domains, migrations (e.g. SAP Marketing Cloud to Emarsys), and GDPR compliance. Familiarity with SAP, Oracle, Workday, Microsoft Dynamics, and E-Commerce platforms. 
Strategic thinker with experience creating data strategies and capability-aligned roadmaps. If you're ready to drive enterprise-level data transformation and make a tangible impact, we'd love to hear from you.
01/06/2025
