Sorry, that job is no longer available. Here are some results that may be similar to the job you were looking for.

103 jobs found

Current Search: data analytics gen ai engineers x 2
Crimson
Senior Data Engineer - Azure Data - Burton-on-Trent - Hybrid
Crimson Burton-on-Trent, Staffordshire
Senior Data Engineer - Azure Data - Burton-on-Trent - Permanent - Hybrid. Salary: £60,000 - £67,000 per annum. This role requires 1 day per week in Burton-on-Trent, with hybrid working arrangements.
Our client is seeking a highly skilled Senior Data Engineer to join their dynamic IT team, based in Burton-on-Trent. The Senior Data Engineer will support the Strategic Data Manager in establishing and managing an efficient Business Intelligence technical service, assisting in the advancement of the client's cloud-based data platforms and providing options for timely processing and cost-efficient solutions. A strong background in Azure data pipeline development is key for this position.
Key Skills & Responsibilities:
• Build and manage pipelines using Azure Data Factory, Databricks, CI/CD, and Terraform.
• Optimise ETL processes for performance and cost-efficiency.
• Design scalable data models aligned with business needs.
• Deliver Azure data solutions for efficient data storage and retrieval.
• Ensure compliance with data protection laws (e.g. GDPR); implement encryption and access controls.
• Work with cross-functional teams and mentor junior engineers.
• Manage and tune Azure SQL Database instances.
• Proactively monitor pipelines and infrastructure for performance and reliability.
• Maintain technical documentation and lead knowledge-sharing initiatives.
• Deploy advanced analytics and machine learning solutions using Azure.
• Stay current with Azure technologies and identify areas for enhancement.
• Databricks (Unity Catalog, DLT), Data Factory, Synapse, Data Lake, Stream Analytics, Event Hubs.
• Strong knowledge of Python, Scala, C#, .NET.
• Experience with advanced SQL, T-SQL, and relational databases.
• Azure DevOps, Terraform, Bicep, ARM templates.
• Distributed computing and cloud-native design patterns.
• Data modelling, metadata management, data quality, data as a product.
• Strong communication, empathy, determination, and openness to innovation.
• Strong Microsoft Office 365 experience.
Interested? Please submit your updated CV to Lewis Rushton at Crimson for immediate consideration. Not interested? Do you know someone who might be a perfect fit for this role? Refer a friend and earn £250 worth of vouchers! Crimson is acting as an employment agency regarding this vacancy.
06/12/2025
Full time
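As an illustrative aside on the Azure Databricks and Data Lake stack named in the Crimson advert above, the following is a minimal PySpark sketch of one pipeline step such a role typically covers: reading raw CSV files from a lake path and writing them out as a Delta table. The storage path, table name, and cleansing steps are hypothetical placeholders, not details taken from the vacancy.

```python
from pyspark.sql import SparkSession, functions as F

# On Databricks a SparkSession already exists as `spark`; getOrCreate() reuses it.
spark = SparkSession.builder.appName("raw-to-delta-sketch").getOrCreate()

# Hypothetical ADLS Gen2 landing path and target table name (placeholders only).
raw_path = "abfss://landing@examplelake.dfs.core.windows.net/sales/2025/"
target_table = "analytics.bronze_sales"

# Read the raw CSV files, treating the first row as a header.
raw_df = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv(raw_path)
)

# Light cleansing: standardise column names and stamp the load time.
clean_df = (
    raw_df
    .toDF(*[c.strip().lower().replace(" ", "_") for c in raw_df.columns])
    .withColumn("_loaded_at", F.current_timestamp())
)

# Append to a Delta table so downstream Databricks / Unity Catalog consumers can query it.
(
    clean_df.write
    .format("delta")
    .mode("append")
    .saveAsTable(target_table)
)
```

In practice a step like this would usually be triggered by Azure Data Factory or a Databricks job and deployed through the CI/CD and Terraform tooling the advert mentions.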
Hampshire County Council
Data Architect
Hampshire County Council Winchester, Hampshire
Joining our Data and Systems Team as a skilled Data Architect who is passionate about data architecture and digital transformation, you'll support the SEN Digital Optimisation Programme, which aims to transform how services for children and young people with Special Educational Needs (SEN) are delivered. In this pivotal role, you'll design, implement, and maintain robust data architecture to drive strategic objectives and service optimisation. You'll collaborate with colleagues across Data and Systems, SEN Service, and IT to ensure data is structured, governed, and accessible for analytics, operational efficiency, and informed decision-making.
What you'll do:
• Data Architecture Design: design, implement and maintain enterprise data models, data flow diagrams, and architecture blueprints aligned with SEN business needs.
• Governance and Standards: define and enforce data standards, policies, and governance frameworks to ensure data quality, security, and compliance.
• Integration and Interoperability: develop data integration strategies across systems and platforms for seamless data exchange and consistency.
• Collaboration: work with data engineers, analysts, developers, and stakeholders to translate requirements into scalable data solutions; lead technical solution development and manage a technical analyst focused on SQL and reporting alignment.
• Documentation and Communication: maintain clear and comprehensive documentation and communicate complex concepts clearly to technical and non-technical audiences.
What we're looking for:
• Degree and/or formal industry-recognised qualifications, or equivalent experience.
• Proven experience in data architecture, data modelling, and database design.
• Strong understanding of data warehousing, ETL processes, and cloud data platforms (e.g. Azure).
• Proficiency in SQL and data modelling tools, with familiarity with data governance frameworks (e.g. DAMA).
• Excellent problem-solving, communication, and stakeholder management skills.
• Ability to manage and prioritise workloads flexibly.
• Aptitude for troubleshooting and exploring new and existing data tools.
Why join us:
• Play a key role in a high-profile digital transformation programme.
• Work in a collaborative environment that values innovation, continuous improvement, and customer focus.
• Opportunities for professional development and to contribute to service improvement.
• Access to Health Assured's comprehensive Employee Assistance Programme to support your physical and mental wellbeing, including 24/7 telephone support, a suite of online resources, and legal and financial advice.
• A competitive benefits package that includes generous annual leave entitlement, occupational sick pay, and access to the Local Government Pension Scheme.
To learn more about this role, please review our Candidate Pack available on our website. Please click on the Apply button for details. Applicants can expect to hear from us within two weeks of the advertised closing date.
Please note: We are unable to offer sponsorship for this role, so it is essential that you already have the right to work in the UK before applying.
Other roles you may have experience of may include: Enterprise Data Architect, Solutions Data Architect, Information Architect, Data Integration Architect, Data Platform Architect, Analytics Architect, Digital Data Architect.
06/12/2025
Seasonal
CGI
Data Architect
CGI
Data Architect
Position Description
At CGI, we're helping to transform the future of healthcare through the power of data. As a Data Architect, you'll play a pivotal role in designing, building, and optimising data platforms that underpin critical national services. Working at the heart of our Healthcare team, you'll use your expertise in AWS and Databricks to deliver high-impact solutions that improve outcomes, enhance decision-making, and drive innovation across the sector. You'll collaborate with experts who share your passion for problem-solving, ownership, and technical excellence, empowered to shape the data foundations of tomorrow.
CGI was recognised in the Sunday Times Best Places to Work List 2025 and has been named a UK 'Best Employer' by the Financial Times. We offer a competitive salary, excellent pension, private healthcare, plus a share scheme (3.5% + 3.5% matching) which makes you a CGI Partner, not just an employee. We are committed to inclusivity, building a genuinely diverse community of tech talent and inspiring everyone to pursue careers in our sector, including our Armed Forces, and are proud to hold a Gold Award in recognition of our support of the Armed Forces Corporate Covenant. Join us and you'll be part of an open, friendly community of experts. We'll train and support you in taking your career wherever you want it to go.
Due to the secure nature of the programme, you will need to hold UK Security Clearance or be eligible to go through this clearance. This is a hybrid position based in Leeds.
Your future duties and responsibilities
You'll take ownership of complex data challenges and partner with architects and data engineers to shape technical direction. Working within CGI's supportive environment, you'll be encouraged to explore new technologies, share knowledge, and contribute to a culture of excellence and innovation. Key responsibilities include:
• Architectural Vision & Modelling: define and govern the conceptual, logical and physical data models, ensuring consistency and best practice.
• Platform Strategy & Design: design highly scalable and secure cloud-native data platforms on AWS, identifying the most appropriate services within the client's constraints.
• Data Governance & Compliance: define and enforce data governance policies, security frameworks, and compliance standards across the data lifecycle.
• Strategic Consultation & Alignment: partner with business leaders and technical teams to translate high-level strategic goals and business requirements into a clear, implementable roadmap.
• Data Flow: define the standards and architecture for data ingestion, transformation and consumption, ensuring reliable, high-quality data movement across systems.
• Technology Roadmap & Innovation: research, evaluate and recommend emerging technologies and patterns, such as Data Mesh and Data Lakehouse.
Required qualifications to be successful in this role
To excel in this role, you'll bring extensive architectural leadership, strategic vision and deep domain expertise in designing and governing enterprise-level, cloud-based data solutions, ideally within regulated or complex environments such as healthcare. You'll be confident in translating business strategy into technical data roadmaps and providing architectural guidance and oversight to engineering teams.
You must have:
• Architectural expertise in designing and governing large-scale data platforms with at least one of Databricks, Palantir, or a similar unified analytics platform.
You should have:
• Proven experience as a Data Architect or Principal Data Engineer defining the architecture for large, complex, high-volume or highly sensitive datasets.
• Deep expertise in data modelling, database design, and advanced architectural patterns (e.g. Data Mesh, Data Lakehouse).
• Strong knowledge of the AWS cloud data service ecosystem, including S3, Redshift, Glue, Lake Formation, and IAM, and how they integrate into a cohesive solution.
• Extensive experience in defining and implementing enterprise data governance, security, and compliance frameworks.
• Excellent communication, documentation, and consulting skills, capable of presenting complex architectural designs and trade-offs to technical teams and executive stakeholders.
• Experience in the healthcare sector or knowledge of NHS data standards (advantageous).
Together, as owners, let's turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you'll reach your full potential because you are invited to be an owner from day 1 as we work together to bring our Dream to life. That's why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company's strategy and direction. Your work creates value. You'll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You'll shape your career by joining a company built to grow and last. You'll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team, one of the largest IT and business consulting services firms in the world.
06/12/2025
Full time
E.ON
Senior Data Analyst
E.ON Nottingham, Nottinghamshire
We are looking for a Senior Data Analyst to join our Credit Risk Portfolio Management team - a key function responsible for understanding and forecasting the financial health of the customer base. The role requires the use of advanced analytics to provide deep insights into credit risk and performance to inform our risk management strategy. This role is ideal for a proven senior analyst with strong technical and mathematical skills combined with the commercial acumen needed to translate insights into action.
Here's a taste of what you'll be doing:
• Partnering with Finance to develop and maintain debt forecasts to track performance vs. plan, forecast risk and deep dive into the drivers of variations to plan.
• Developing scenario models to enable data-based decisioning within operational and strategic planning cycles.
• Staying up to date with emerging analytical techniques and technologies, partnering with Data Science to deliver advanced segmentation and behavioural analysis.
• Delivering and consulting on the advanced analytics required to support project delivery across the wider Credit Management function, e.g. simulation models, data pipeline mock-ups, statistically robust testing.
• Translating analytical outputs into clear recommendations for business stakeholders, influencing decisions across debt prevention and collections strategy.
• Liaising with Data Engineers to drive enhancements to data quality, availability and usability.
Are we the perfect match?
• Proven experience in a data analytics or credit risk role, ideally within utilities, financial services or another regulated industry.
• Strong coding skills in SQL, Python and PySpark for data extraction, transformation, modelling and forecasting.
• Solid understanding of forecasting techniques, scenario modelling, and regression-based analytics.
• Strong commercial acumen, with the ability to translate complex analytical findings into clear narratives with direct links to business value.
• Excellent stakeholder engagement and communication skills, with confidence working across operational, strategic and technical teams.
• A degree (or equivalent experience) in a quantitative discipline such as statistics, mathematics, economics or data science.
It would be great if you had:
• Experience working with Databricks.
• Understanding of macroeconomic and market drivers affecting customer affordability and credit risk.
• Prior experience working across both residential and commercial consumer bases.
• Experience working with credit bureau data.
• A general understanding of accounting principles.
Here's what else you need to know:
• This role may close earlier due to high applications.
• Competitive salary.
• Location: Nottingham E.ON Next office, Trinity House, 2 Burton St, Nottingham NG1 4BX, with travel to our other sites when required.
• Excellent parental leave allowance.
• Award-Winning Workplace: we're proud to be named a Sunday Times Best Place to Work 2025 and the Best Place to Work for 16-34-year-olds.
• Outstanding Benefits: enjoy 26 days of annual leave plus bank holidays, a generous pension, life cover, bonus opportunities and access to 20 flexible benefits with tax/NI savings.
• Flexible & Family-Friendly: our industry-leading hybrid and family-friendly policies earned us double recognition at the Personnel Today Awards 2024. We're open to discussing how flexibility can work for you.
• Inclusive & Diverse: we're the only energy company in the Inclusive Top 50 UK Employers. We're also proud winners of Best Employer for Women and Human Company of the Year, recognising our inclusive, people-first culture.
• Support at Every Stage of Life: we're Fertility Friendly and Menopause Friendly accredited, with inclusive support for everyone.
• Accessible & Supportive: do you consider yourself as having a disability? As a Disability Confident Employer, we guarantee interviews for disabled applicants who meet the minimum criteria for the role and will make any adjustments needed during the process.
• Invested in Your Growth: from inclusive talent networks to top-tier development programmes, we'll support your growth every step of the way.
For all successful candidates: due to the nature of this role your employment will be subject to a basic DBS (Disclosure and Barring Service) check being carried out by ourselves via a 3rd party service provider.
05/12/2025
Full time
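To make the "regression-based analytics" and debt-forecasting duties in the E.ON advert above a little more concrete, here is a small, self-contained Python sketch that fits a linear trend to an entirely synthetic monthly debt series and projects it six months forward against a naive plan. All figures are invented for illustration and have no connection to E.ON data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic monthly aggregate debt balances (illustrative values only).
months = np.arange(24).reshape(-1, 1)          # months 0..23 as the single feature
debt = 10_000 + 150 * months.ravel() + np.random.default_rng(0).normal(0, 300, 24)

# Fit a simple linear trend as a baseline forecast model.
model = LinearRegression().fit(months, debt)

# Project the next six months and compare against a flat "plan" assumption.
future = np.arange(24, 30).reshape(-1, 1)
forecast = model.predict(future)
plan = np.full(6, debt[-6:].mean())            # naive plan: recent average holds

for m, f, p in zip(future.ravel(), forecast, plan):
    print(f"month {m:2d}: forecast £{f:,.0f} vs plan £{p:,.0f} (variance £{f - p:,.0f})")
```

A production forecast would draw on far richer features and scenario inputs; this only shows the shape of a regression-based baseline.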
Cost Engineer
Mactech Energy Group Gloucester, Gloucestershire
Cost Engineer 1635MG
Hinkley Point C, Somerset
PAYE £321.01 or Umbrella £449.32
Principal Accountabilities:
• Produce and maintain accurate cost and forecast data in alignment with the Cost Breakdown Structure or Work Breakdown Structure (CBS/WBS).
• Accurately maintain all cost and forecast data for their area of responsibility within the cost and forecast software system, e.g. EcoSys, SAP.
• Co-ordinate and produce cost and forecasting reports to a defined reporting cycle, including commentary on key time-related drivers and performance issues.
• Assist in the Trend & Change process, including analysis at project level, generating performance indicators and providing feedback on areas of risk and opportunities.
• Deliver clear and concise insights to support robust decision-making, utilising commercial and technical information to influence decisions.
• Provide analytical support for Senior Cost Engineers through robust analysis and interpretation of technical, financial, and performance data to facilitate prioritisation and any necessary actions.
• Collaborate effectively with colleagues within and across organisational boundaries to achieve mutually successful outcomes, in keeping with the HPC project values of "Humility, Positivity, Respect, Solidarity and Clarity" and culture of "Trust, Transparency and Teamwork".
• Apply fundamental project and business project controls principles and interface with wider management processes.
• Help promote and embed a culture of good governance, risk awareness and compliance across the organisation. Support the education of project controls reporting functionality and provide feedback and recommendations for improvement.
• Ensure all project documents, in particular Sensitive Nuclear Information, are correctly protectively marked and protected in accordance with EDF policies and procedures.
Knowledge & Skills:
• Understanding of project controls methodologies and techniques.
• High attention to detail, ensuring accuracy of outputs and validity of quality data.
• Analytical, critical thinking and problem-solving skills.
• Ability to communicate complex issues and concepts (unique insights) in simple ways, to both technical and non-technical audiences.
• Strong organisational and time management skills.
• High standard of interpersonal skills.
• Collaboration with colleagues within and across organisational boundaries to achieve mutually successful outcomes.
Qualifications & Experience:
• Minimum of HND or equivalent qualification in project management, project controls, engineering, or another related field.
• Experience in a project management environment.
• Demonstrable experience of working in a project controls discipline.
• Experience of working on a major construction project is desirable but not necessary.
• Experience of working in the nuclear industry is desirable but not necessary.
Tools and Software: the jobholder will be expected to have an ability to use the following (or similar equivalent) software tools: Microsoft Office software (Excel, Word, PowerPoint, Access); Teamcenter (document control); EcoSys (cost, changes, earned value management); Power BI (business intelligence analytics); SAP (financial).
05/12/2025
Full time
IPS Group
Technical Architect
IPS Group
About the Organisation
Our client is a specialist insurer and reinsurer operating globally, with a focus on complex commercial and industrial risks. The business combines deep sector expertise with advanced use of data, analytics, and modern technology to deliver informed, efficient, and fair underwriting decisions. The company is in a period of significant growth and is committed to maintaining a collaborative, inclusive, and high-performing culture. Employees are encouraged to ask questions, explore new ideas, work openly with others, and operate with clarity and simplicity. The environment is dynamic, supportive, and designed to enable long-term career progression.
Role Summary
Our client is seeking an experienced Technical Architect with strong expertise in integration architecture, systems design best practice, and API design. The role will take technical ownership of the organisation's enterprise technology landscape. This is a pivotal position within the company's scaling and modernisation programme. The successful candidate will act as the bridge between internally developed underwriting platforms, the data lakehouse and analytics estate, and a range of commercial off-the-shelf (COTS) applications used across operations, finance, and support functions. You will be responsible for defining standards, blueprints, and governance for how core systems interact, ensuring security, performance, scalability, and data quality across the entire ecosystem.
Key Responsibilities
Integration Architecture:
• Define the enterprise-wide integration strategy, including the governance of integration patterns (such as synchronous APIs, event-driven architecture, asynchronous messaging, and batch ETL).
• Lead the design and documentation of internal and external APIs, ensuring best practice in RESTful design, versioning, and OpenAPI standards.
• Work with Engineering and Data teams to standardise the technical approach for messaging, data extraction, transformation, and load (ETL/ELT) processes across underwriting, finance, accounting, and data warehouse systems.
• Design secure, scalable, and auditable architectures for Generative AI, Agentic AI, and large language model (LLM) technologies.
• Collaborate with data scientists, engineers, and developers to integrate and deploy AI models from development through to production.
• Define security standards for all integration points, including OAuth, API key management, encryption, and network segmentation within an Azure environment.
Platform Architecture:
• Provide technical input into new platform selections and project implementations, ensuring alignment with architectural principles.
• Work closely with Engineering leadership on the design and evolution of the .NET and Angular technology stack, focusing on modularity, microservices, performance, and technical debt management.
• Collaborate with the Data Architecture function to ensure alignment between the enterprise architecture and the organisation's data platform, including optimisation of the Databricks environment.
• Establish and govern non-functional requirements (NFRs) covering performance, scalability, resilience, and disaster recovery.
Technical Design & Governance:
• Translate business strategy into an executable architecture roadmap.
• Conduct architectural and code reviews to ensure alignment with agreed standards.
• Present technical recommendations, solution designs, and trade-off considerations to both technical and non-technical stakeholders.
• Serve as the final escalation point for complex technical design challenges and technology selections.
• Mentor engineers across software, infrastructure, and data disciplines in architectural best practice and emerging technologies.
Essential Experience & Skills
• At least 10 years in senior technical roles, including 5+ years as an architect specialising in software development, integration, and technical architecture.
• Advanced knowledge of API implementation, message queues (e.g., Azure Service Bus, Kafka), and event-driven architecture.
• Deep expertise in RESTful API design, API gateways (ideally Azure API Management), and the OpenAPI specification.
• Strong understanding of DevOps practices and hands-on experience with CI/CD tooling such as Azure DevOps or GitLab CI.
• Extensive knowledge of the Microsoft Azure ecosystem, covering IaaS and PaaS, with particular experience in Azure Databricks, Data Factory, and ADLS Gen2.
• Excellent communication and presentation skills, with the ability to influence diverse technical teams.
Desirable Experience
• Background in the insurance, reinsurance, or wider financial services industry.
• Strong architectural understanding of the .NET/C# ecosystem and modern front-end frameworks (ideally Angular).
• Experience with data architecture concepts including relational, NoSQL, and data lake technologies.
• Knowledge of cloud-based AI architectures, including LLMs, Generative AI, and Agentic AI.
• Experience integrating proprietary systems with COTS finance or ERP platforms, particularly within general ledger, financial close, or regulatory reporting domains.
04/12/2025
Full time
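As a hedged illustration of the versioned RESTful API and OpenAPI practices the Technical Architect advert above calls for, the sketch below uses FastAPI, which generates an OpenAPI document automatically, to expose a version-prefixed read endpoint. The resource model, URL scheme, and in-memory store are assumptions made purely for the example, not the client's actual design.

```python
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

# API metadata feeds the generated OpenAPI document (served at /openapi.json).
app = FastAPI(title="Underwriting Platform API (sketch)", version="1.0.0")

class Policy(BaseModel):
    """Hypothetical policy resource used only for this example."""
    policy_id: str
    insured_name: str
    premium_gbp: float

# In-memory stand-in for a real backing service or database.
_POLICIES = {
    "POL-001": Policy(policy_id="POL-001", insured_name="Example Ltd", premium_gbp=12500.0),
}

@app.get("/v1/policies/{policy_id}", response_model=Policy)
def get_policy(policy_id: str) -> Policy:
    """Version-prefixed read endpoint; unknown IDs return a 404 per REST conventions."""
    policy = _POLICIES.get(policy_id)
    if policy is None:
        raise HTTPException(status_code=404, detail="Policy not found")
    return policy
```

Served with `uvicorn module_name:app`, the application exposes the endpoint alongside its generated OpenAPI schema, which is the kind of contract-first artefact an integration architect would govern.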
Randstad Technologies
Data Architect Insurance Domain
Randstad Technologies
Job Title: Data Architect (Insurance Domain)
Location: London, UK (hybrid role)
Experience level: 15 years, with at least 5 years in the Azure ecosystem.
Role: We are seeking a seasoned Data Architect to lead the design and implementation of scalable data solutions for a strategic insurance project. The ideal candidate will have deep expertise in Azure cloud services, Azure Data Factory and Databricks, with a strong understanding of data modelling, data integration and analytics in the insurance domain.
Key Responsibilities:
• Architect and design end-to-end data solutions on Azure for insurance-related data workflows.
• Lead data ingestion, transformation and orchestration using ADF and Databricks.
• Collaborate with business stakeholders to understand data requirements and translate them into technical solutions.
• Ensure data quality, governance and security compliance across the data lifecycle.
• Optimise performance and cost efficiency of data pipelines and storage.
• Provide technical leadership and mentoring to data engineers and developers.
Mandatory Skillset:
• Azure Cloud Services: strong experience with Azure Data Lake, Azure Synapse, Azure SQL and Azure Storage.
• Azure Data Factory (ADF): expertise in building and managing complex data pipelines.
• Databricks: hands-on experience with Spark-based data processing, notebooks and ML workflows.
• Data Modelling: proficiency in conceptual, logical and physical data modelling.
• SQL/Python: advanced skills for data manipulation and transformation.
• Insurance Domain Knowledge: understanding of insurance data structures (claims, policy, underwriting) and regulatory requirements.
Preferred Skillset:
• Power BI: experience in building dashboards and visualisations.
• Data Governance Tools: familiarity with tools like Purview or Collibra.
• Machine Learning: exposure to ML model deployment and monitoring in Databricks.
• CI/CD: knowledge of DevOps practices for data pipelines.
• Certifications: Azure Data Engineer or Azure Solutions Architect certifications.
Mandatory Skills: Python for Data, Java, Python, Scala, Snowflake, Azure Blob, Azure Data Factory, Azure Functions, Azure SQL, Azure Synapse Analytics, Azure Data Lake, ANSI SQL, Databricks, HDInsight.
If you're excited about this role then we would like to hear from you! Please apply with a copy of your CV or send it to Prasanna com and let's start the conversation!
Randstad Technologies Ltd is a leading specialist recruitment business for the IT & Engineering industries. Please note that due to a high level of applications, we can only respond to applicants whose skills and qualifications are suitable for this position. No terminology in this advert is intended to discriminate against any of the protected characteristics that fall under the Equality Act 2010. For the purposes of the Conduct Regulations 2003, when advertising permanent vacancies we are acting as an Employment Agency, and when advertising temporary/contract vacancies we are acting as an Employment Business.
04/12/2025
Full time
Pontoon
Data Pipeline Engineer On prem to AWS migration
Pontoon Bristol, Somerset
Data Pipeline Engineer (On-prem to AWS migration). Utilities. Predominantly remote: occasional travel to Bristol. 6 months+. £700 - £750 per day. In short: We require 3 Data Engineers to join a large on-prem to AWS migration programme, replicating data pipelines in readiness for a lift and shift to the cloud. Essential: AWS and Redshift. In full: Role purpose: Reporting to the Lead Data Engineer, the Data Engineer is responsible for designing and maintaining scalable data pipelines, ensuring data availability, quality, and performance to support analytics and operational decision-making. They contribute to the data engineering roadmap through research, technical vision, business alignment, and prioritisation. This role is also expected to be a subject matter expert in data engineering, fostering collaboration, continuous improvement, and adaptability within the organisation. They act as a mentor and enabler of best practices, advocating for automation, modularity, and an agile approach to data engineering, and for a culture of agility, encouraging all to embrace Agile values and practices. The role also has accountability to deputise for their line manager (whenever necessary) and is expected to support the product owner community while also driving a positive culture (primarily through role modelling) across the Technology department and wider business. Key accountabilities: Design, develop, and maintain scalable, secure, and efficient data pipelines, ensuring data is accessible, high-quality, and optimised for analytical and operational use. Contribute to the data engineering roadmap, ensuring solutions align with business priorities, technical strategy, and long-term sustainability. Optimise data workflows and infrastructure, leveraging automation and best practices to improve performance, cost efficiency, and scalability. Collaborate with Data Science, Insight, and Governance teams to support data-driven decision-making, ensuring seamless integration of data across the organisation. Implement and uphold data governance, security, and compliance standards, ensuring adherence to regulatory and organisational best practices. Identify and mitigate risks, issues, and dependencies, ensuring continuous improvement in data engineering processes and system reliability, in line with the company risk framework. Ensure quality is maintained throughout the data engineering lifecycle, delivering robust, scalable, and cost-effective solutions on time and within budget. Monitor and drive the data pipeline lifecycle, from design and implementation through to optimisation and post-deployment performance analysis. Support operational resilience, ensuring data solutions are maintainable, supportable, and aligned with architectural principles. Engage with third-party vendors where required, ensuring external contributions align with technical requirements, project timelines, and quality expectations. Continuously assess emerging technologies and industry trends, identifying opportunities to enhance data engineering capabilities. Document and maintain clear technical processes, facilitating knowledge sharing and operational continuity within the data engineering function. Knowledge and experience required: Excellent communication skills, with the ability to convey technical concepts to both technical and non-technical audiences. Experience working with large-scale data processing and distributed computing frameworks.
Knowledge of cloud platforms such as AWS, Azure, or GCP, with hands-on experience in cloud-based data services. Proficiency in SQL and Python for data manipulation and transformation. Experience with modern data engineering tools, including Apache Spark, Kafka, and Airflow. Strong understanding of data modelling, schema design, and data warehousing concepts. Familiarity with data governance, privacy, and compliance frameworks (e.g., GDPR, ISO27001). Hands-on experience with version control systems (e.g., Git) and infrastructure as code (e.g., Terraform, CloudFormation). Understanding of Agile methodologies and DevOps practices for data engineering. Please be advised that if you haven't heard from us within 48 hours, then unfortunately your application has not been successful on this occasion; we may, however, keep your details on file for any suitable future vacancies and contact you accordingly. Pontoon is an employment consultancy and operates as an equal opportunities employer. We use generative AI tools to support our candidate screening process. This helps us ensure a fair, consistent, and efficient experience for all applicants. Rest assured, all final decisions are made by our hiring team, and your application will be reviewed with care and attention.
04/12/2025
Contractor
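As a rough illustration of the pipeline replication this migration role involves, the sketch below stages files from S3 into Redshift with a COPY statement issued from Python via psycopg2. The cluster endpoint, bucket, IAM role and table names are placeholders, not details from the advert.

```python
# Illustrative only: bulk-load staged Parquet files from S3 into a Redshift table,
# then run a simple post-load row count check. All identifiers are hypothetical.
import psycopg2

COPY_SQL = """
    COPY analytics.daily_readings
    FROM 's3://example-migration-bucket/daily_readings/2025-12-04/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-copy-role'
    FORMAT AS PARQUET;
"""

conn = psycopg2.connect(
    host="example-cluster.abc123.eu-west-2.redshift.amazonaws.com",
    port=5439, dbname="analytics", user="etl_user", password="..."
)
try:
    with conn, conn.cursor() as cur:        # 'with conn' commits on success
        cur.execute(COPY_SQL)               # bulk load the staged files
        cur.execute("SELECT COUNT(*) FROM analytics.daily_readings;")
        print("rows loaded:", cur.fetchone()[0])
finally:
    conn.close()
```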
recruitment22
Power BI / Microsoft Fabric Consultant
recruitment22
Recruiting a highly skilled and motivated Power BI / Microsoft Fabric Consultant to lead the design, development, and delivery of cutting-edge analytics solutions that make a real impact. This role combines hands-on technical expertise with strategic thinking, enabling clients to unlock insights and value from their data using Microsoft's latest analytics stack. As an expert in Power BI and Microsoft Fabric, you'll architect scalable, modern data solutions that unlock real business value and guide delivery teams through implementation. With your deep knowledge of Power BI, Microsoft Fabric, and Azure data services, and your passion for innovation, you'll combine technical expertise with strategic thinking to support data-driven transformation. Here, you'll partner with stakeholders, guide delivery teams, and help clients harness the full potential of Microsoft's newest analytics ecosystem. Key Responsibilities: Lead the design and architecture of end-to-end analytics solutions using Power BI and Microsoft Fabric Translate business requirements into scalable data models, lakehouse architectures, and reporting frameworks Develop and optimize Power BI reports, dashboards, and semantic models using DAX and Power Query Implement Fabric components including Data Factory, Lakehouse, Dataflows Gen2, and Notebooks Collaborate with data engineers and business analysts to ensure seamless integration and data quality Support presales activities including solution design, effort estimation, and client presentations Establish and promote best practices in data governance, performance tuning, and DevOps for BI Required Skills & Expertise: Proven experience delivering enterprise-grade Power BI solutions Ability to implement robust data transformation and validation processes to ensure accuracy and alignment with business definitions Ability to collaborate with business analysts, data modellers, and reporting teams to ensure the curated data supports deeper insights into corporate performance Ability to optimise data pipelines for scalability, reliability, and maintainability using best practices (e.g., modular ETL design, version control, CI/CD) Strong understanding of Microsoft Fabric architecture and components Expertise in Microsoft Fabric Data Engineering and Fabric Dataflows / Azure Data Factory Experience with Azure Synapse, Data Lake, SQL, and integration pipelines Desirable: Experience with other low-code or analytics platforms Knowledge of data governance frameworks and security models Microsoft certifications (e.g., Power BI Data Analyst Associate, DP-600) Package: Full-time position to start as soon as possible. Remote working or work from the London or Edinburgh office. Encouragement to learn and increase accreditations, with an established bonus structure for exam passes in stipulated current technology areas. Significant focus on Learning + Development: training courses, seminars, and social events in London. Two paid leave days per year to volunteer for causes that resonate with Core's values (charitable efforts, environmental initiatives, and supporting the next generation in tech). Private healthcare
04/12/2025
Full time
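For context on the Fabric notebook and lakehouse work mentioned above, here is a small, purely illustrative PySpark cell that shapes a curated table for a Power BI semantic model to point at. The lakehouse path and the table and column names are assumptions for the example, not details from the advert.

```python
# Illustrative notebook cell: aggregate raw orders into a daily sales table
# persisted as Delta, which a Power BI semantic model could then consume.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical lakehouse table holding raw order events.
orders = spark.read.format("delta").load("Tables/orders_raw")

curated = (orders
           .withColumn("order_date", F.to_date("order_timestamp"))
           .groupBy("order_date", "product_id")
           .agg(F.sum("net_amount").alias("net_sales"),
                F.countDistinct("order_id").alias("order_count")))

# Persist the curated output as a Delta table for downstream reporting.
curated.write.format("delta").mode("overwrite").saveAsTable("sales_daily")
```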
Cathcart Technology
Lead Data Engineer
Cathcart Technology Edinburgh, Midlothian
I'm working with a world-class technology company in Edinburgh to help them find a Lead Data Engineer to join their team (hybrid working, but there is flex on this for the right person). This is your chance to take the technical lead on complex, large-scale data projects that power real-world products used by millions of people. The organisation has been steadily growing for a number of years and has become a market leader in its field, so it's genuinely a really exciting time to join! You'll be joining a forward-thinking team that's passionate about doing things properly, with a modern tech stack, a cloud-first approach, and a genuine commitment to engineering excellence. As Lead Data Engineer, you'll be hands-on in designing and building scalable data platforms and pipelines that enable advanced analytics, machine learning, and business-critical insights. You'll shape the technical vision, set best practices, and make key architectural decisions that define how data flows across the organisation. You won't be working in isolation either, as collaboration is at the heart of this role. You'll work closely with engineers, product managers, and data scientists to turn ideas into high-performing, production-ready systems. You'll also play a big part in mentoring others, driving standards across the team, and influencing the overall data strategy. The ideal person for this role will have a strong background in data engineering, with experience building modern data solutions using technologies like Kafka, Spark, Databricks, dbt, and Airflow. You'll know your way around cloud platforms (AWS, GCP, or Azure) and be confident coding in Python, Java, or Scala. Most importantly, you'll understand what it takes to design data systems that are scalable, reliable and built for the long haul. In return, they are offering a competitive salary (happy to discuss prior to application) and great benefits, which include uncapped holidays and multiple bonuses! Their office in central Edinburgh is only a short walk from Haymarket train station. The role is hybrid (ideally 1 or 2 days in the office); however, they can be flexible on this for the right candidate. If you're ready to step into a role where your technical leadership will have a visible impact and where you can build data systems that continue to scale, then please apply or contact Matthew MacAlpine at Cathcart Technology.
04/12/2025
Full time
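To illustrate the orchestration stack this advert names (Airflow alongside dbt, Spark and Kafka), below is a minimal Airflow DAG sketch. The DAG id, task logic and dbt project path are hypothetical and purely for illustration.

```python
# A minimal daily pipeline: a Python task standing in for event ingestion,
# followed by a dbt run. Both tasks are placeholders for real pipeline logic.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.operators.bash import BashOperator

def ingest_events(**context):
    # Placeholder for e.g. draining a Kafka topic into cloud storage.
    print("ingesting events for", context["ds"])

with DAG(
    dag_id="daily_events_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest_events", python_callable=ingest_events)
    transform = BashOperator(
        task_id="run_dbt",
        bash_command="dbt run --project-dir /opt/dbt/analytics",  # hypothetical path
    )
    ingest >> transform
```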
Morson Edge
Contract Data Analyst BigQuery - Inside IR35
Morson Edge Manchester, Lancashire
Contract Data Analyst (BigQuery) - Inside IR35. Location: Manchester - 2 days a week in the office. Contract: 3 months (Inside IR35). Day Rate: Competitive (Inside IR35). Start Date: ASAP. About the Role: I'm seeking an experienced Data Analyst with deep expertise in Google BigQuery to join my client's award-winning analytics team on a 3-month contract. You will play a key role in transforming raw data into actionable insights, building scalable data models, and supporting business stakeholders with high-quality analysis. This role sits inside IR35, and the successful contractor will be engaged via an approved umbrella company. Key Responsibilities: Develop, maintain, and optimise BigQuery datasets, SQL queries, and analytical data models. Analyse large, complex data sets to produce actionable insights for business and product teams. Build automated dashboards in Looker Studio, Tableau, or similar BI tools. Collaborate with Data Engineers to improve data quality, pipelines, and structure. Partner with stakeholders to gather requirements and translate them into analytical outputs. Create robust documentation and ensure best practices across data governance and security. Essential Skills & Experience: Proven experience as a Data Analyst or BI Analyst in mid-to-large data environments. Advanced SQL, with significant hands-on experience using Google BigQuery for analytics at scale. Strong data modelling skills (star/snowflake schema, optimisation, partitioning, clustering). Experience building dashboards using Looker Studio, Looker, Power BI, or Tableau. Proficiency working with large datasets and performance-tuning analytical queries. Strong communication skills with the ability to simplify complex data for non-technical stakeholders. Desirable: Experience with Python for data analysis/ETL. Exposure to the GCP ecosystem (Cloud Storage, Cloud Functions, Pub/Sub). Experience in an agile product or fast-paced commercial environment. InterQuest Group is acting as an employment agency for this vacancy. InterQuest Group is an equal opportunities employer and we welcome applications from all suitably qualified persons regardless of age, disability, gender, religion/belief, race, marriage, civil partnership, pregnancy, maternity, sex or sexual orientation. Please make us aware if you require any reasonable adjustments throughout the recruitment process.
04/12/2025
Contractor
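As an example of the BigQuery analysis work this contract centres on, here is a short sketch using the google-cloud-bigquery client from Python. The project, dataset and table names are invented, and a partitioned source table is assumed for the example.

```python
# Illustrative only: run an aggregation over a (hypothetical) partitioned events
# table and pull the result into a pandas DataFrame for further analysis.
from google.cloud import bigquery

client = bigquery.Client(project="example-analytics-project")

SQL = """
    SELECT DATE(event_timestamp) AS event_date,
           channel,
           COUNT(DISTINCT user_id) AS active_users
    FROM `example-analytics-project.marts.events`
    WHERE event_timestamp >= TIMESTAMP('2025-11-01')  -- filter on the partition column
    GROUP BY event_date, channel
    ORDER BY event_date
"""

df = client.query(SQL).to_dataframe()  # requires pandas (and db-dtypes) installed
print(df.head())
```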
Senior AI Data Scientist
Halliburton Abingdon, Oxfordshire
We are looking for the right people - people who want to innovate, achieve, grow and lead. We attract and retain the best talent by investing in our employees and empowering them to develop themselves and their careers. Experience the challenges, rewards and opportunity of working for one of the world's largest providers of products and services to the global energy industry. About the Role We are seeking a highly skilled and motivated Senior AI Data Scientist to join our Subsurface team at our Abingdon office in Oxfordshire. This is a unique opportunity to apply advanced data science techniques to geological and geospatial challenges, helping us unlock insights from complex subsurface data. Key Responsibilities Collaborate with geoscientists and engineers to understand requirements and design effective solutions Develop robust Python pipelines for data manipulation Implement secure coding practices and manage version control using Git Work with cloud platforms (AWS and Azure) to scale data workflows and manage infrastructure Optimize database performance and spatial queries using PostgreSQL/PostGIS Champion Python best practices across the team and support the development of junior team members Required Qualifications Honors degree (2:1 or above) in data science/AI or related field. Minimum of 10 years related work experience. Desirable Qualifications Postgraduate qualification in AI or related field Essential Skills Proficiency in Python, with a strong adherence to Python best practices Experience using Git for version control and collaboration Knowledge of secure coding principles Expertise in geospatial libraries such as GeoPandas, Shapely, and GDAL Advanced knowledge of PostgreSQL/PostGIS for spatial data management Experience with AWS and Azure platforms, including AI services (e.g., AWS SageMaker, Azure ML) Proven experience developing or deploying AI models across domains such as natural language processing, computer vision, or predictive analytics Familiarity with machine learning frameworks (e.g., TensorFlow, PyTorch, Scikit-learn) and data science tools (e.g., Jupyter, Pandas, NumPy) Ability to design, train, and evaluate supervised and unsupervised learning algorithms Strong teamwork and interpersonal skills, with a collaborative and agile mindset Proven ability to work within agile development environments Self-motivated, detail-oriented, and capable of managing multiple tasks Desirable Skills Knowledge of geological or subsurface data domains Experience with containerization tools such as Docker and Kubernetes Familiarity with CI/CD pipelines for automated deployment Understanding of data governance and compliance in scientific environments Experience with database virtualisation, including DecisionSpace integration server Experience with data analysis applications from the Neftex Predictions portfolio Halliburton is an Equal Opportunity Employer.
Employment decisions are made without regard to race, color, religion, disability, genetic information, pregnancy, citizenship, marital status, sex/gender, sexual preference/ orientation, gender identity, age, veteran status, national origin, or any other status protected by law or regulation. Location 97 Jubilee Avenue, Milton Park, Abingdon, Oxfordshire, OX14 4RW, United Kingdom Job Details Requisition Number: 204382 Experience Level: Entry-Level Job Family: Engineering/Science/Technology Product Service Line: division Full Time / Part Time: Full Time Additional Locations for this position: Compensation Information Compensation is competitive and commensurate with experience.
03/12/2025
Full time
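To illustrate the geospatial stack this advert lists (GeoPandas, Shapely, PostgreSQL/PostGIS), here is a small sketch that pulls a spatial query into a GeoDataFrame and derives a simple attribute. The connection string, table and column names are placeholders rather than anything from the posting.

```python
# Illustrative only: query wells within a bounding box from PostGIS, then compute
# distance to a reference point in metres via a projected CRS (British National Grid).
import geopandas as gpd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://user:password@localhost:5432/subsurface")

sql = """
    SELECT well_id, formation, geom
    FROM wells
    WHERE ST_Within(geom, ST_MakeEnvelope(-2.0, 51.0, 0.0, 52.5, 4326))
"""

wells = gpd.read_postgis(sql, engine, geom_col="geom")  # GeoDataFrame with geometry
print(wells.crs, len(wells))

ref = gpd.GeoSeries(gpd.points_from_xy([-1.0], [51.7]), crs=4326).to_crs(27700).iloc[0]
wells["dist_to_ref_m"] = wells.to_crs(27700).distance(ref)
```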
DWP Digital
Senior Data Test Engineer
DWP Digital Blackpool, Lancashire
Pay up to £78,205, plus 28.97% employer pension contributions, hybrid working, flexible hours, and great work-life balance. Own the testing strategy for DWP's data transformation. You'll lead test engineering for the migration of our core analytics platform, define the testing strategy, and mentor engineers to integrate testing into agile delivery, playing a key role in ensuring quality and performance in one of the UK's largest digital modernisation programmes. DWP. Digital with Purpose. We are looking for a Senior Data Test Engineer to join our community of tech experts in DWP Digital. We're using fresh ideas and leading-edge tech to build and maintain digital solutions that will be used by nearly every person in the UK, every day and at key moments in their lives. DWP is the UK's largest government department. We help people into work, and make payments worth over £195bn a year to support and empower millions of people. The scale of what we do is extraordinary, and our purpose is unique. We'd love you to join us. What skills, knowledge and experience will you need? Analytical Thinking & Problem Solving - You can analyse complex requirements to identify gaps and risks, think creatively, and ask probing questions to critically evaluate findings. You lead investigations to synthesise conflicting information into clear, actionable recommendations. Non-Functional Testing Expertise - You have extensive hands-on experience in designing and executing a wide range of non-functional tests. You communicate results effectively to technical and non-technical stakeholders, make informed decisions on testing approaches and priorities, and coach others while raising and prioritising defects based on severity. Agile & Lean Delivery - You apply Agile and Lean principles to optimise delivery and testing processes, encourage experimentation and adaptability, support teams in visualising outcomes and prioritising work, and ensure delivery meets agreed MVP standards. You and your role We're looking for a Senior Data Test Engineer to join our Data Enablement team, an integral part of DWP's digital modernisation programme. This team owns data-enabling products and is driving transformation to deliver managed data through new, innovative solutions. You'll work on the Strategic Modern Analytical Tooling programme, helping migrate and modernise our core analytical platform from SAS 9.4 to SAS Viya. In this role, you'll define the overall testing strategy, lead test engineering teams, and oversee all testing activities. You'll ensure best practices are embedded, drive improvements, and share knowledge across teams. You'll also set up test infrastructure, select the right tools, and monitor performance to reduce technical debt. Collaboration is key: you'll work closely with other teams in the software development lifecycle to align goals and integrate testing into agile delivery. Beyond delivery, you'll champion continuous improvement by running reviews, mentoring team members, and leading community discussions on the latest trends and tools. You'll support professional development, contribute to recruitment, and help shape standards for testing practices. Regular travel to other digital hubs will be required, and this will be discussed further if you're successful. Details. Wages. Perks. Location: You'll join us in one of our brilliant digital hubs in Birmingham, Blackpool, Leeds, Manchester, Newcastle and Sheffield, whichever is most convenient for you.
Hybrid Working: We work a hybrid model - you'll spend some time working at home and some time collaborating face to face in a hub. Pay: We offer competitive pay of up to £78,205. Pension: You'll get a brilliant civil service pension with employer contributions worth 28.97%, worth up to £X per year. Holidays: A generous leave package starting at 26 days rising to 31 days over time. You can also take up to 3 extra days off a month on flexi-time. You'll also get all the usual public holidays. We have a broad benefits package built around your work-life balance which includes: Flexible working including flexible hours and flex-friendly policies Time off for volunteering and charitable giving Bring your authentic self to work with 'I Can Be Me in DWP' Discounts and savings on shopping, fun days out and more Interest-free loans to buy a bike or a season ticket, so it's even easier for you to get to work and start making a difference Sports and social activities Professional development, coaching, mentoring and career progression opportunities. And we have an award-winning environment and culture: DWP have been recognised as 2024 Diversity Employer of the Year at the Computing Women in Tech Excellence awards Diverse and Inclusive Leadership at Digital Leaders Awards. Commended as Best Place to Work in Digital category in the Computing Digital Technology Leaders awards 2025 Recognised as one of the Best Public Sector Employers at 2025 Women In Digital Awards Process: We know your time is valuable so our application and selection process is just two stages: Apply: complete your application on Civil Service Jobs. There'll be full instructions when you click through. Interview: a single-stage interview online. CLICK APPLY for more information and to start your application.
02/12/2025
Full time
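As a flavour of the data test engineering described above, the snippet below shows a simple, tool-agnostic reconciliation check between a legacy extract and a migrated one. The file and column names are invented for the example, and it does not interact with SAS 9.4 or SAS Viya directly.

```python
# Illustrative migration test: compare row counts and per-column totals between
# a legacy extract and the migrated output; fail loudly if anything differs.
import pandas as pd

legacy = pd.read_csv("legacy_extract.csv")      # hypothetical legacy-platform export
migrated = pd.read_csv("migrated_extract.csv")  # hypothetical new-platform export

def reconcile(a: pd.DataFrame, b: pd.DataFrame, numeric_cols: list[str]) -> list[str]:
    """Return human-readable differences; an empty list means the check passed."""
    issues = []
    if len(a) != len(b):
        issues.append(f"row counts differ: {len(a)} vs {len(b)}")
    for col in numeric_cols:
        if abs(a[col].sum() - b[col].sum()) > 1e-6:
            issues.append(f"column '{col}' totals differ: {a[col].sum()} vs {b[col].sum()}")
    return issues

problems = reconcile(legacy, migrated, numeric_cols=["payment_amount", "claim_count"])
assert not problems, "reconciliation failed: " + "; ".join(problems)
print("reconciliation passed")
```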
Adecco
Global Design Business Analyst
Adecco
Global Design Business Analyst Duration: 6 Months (Possibility for extension) Location: London/Hybrid (2 - 3 days per week on site) Rate: A highly competitive Umbrella Day Rate is available for suitable candidates Role Overview The Global Design Business Analyst will play an essential role on a major global compliance programme, capturing clear requirements for global design initiatives, supporting product owners and delivery teams, and helping to produce high quality documentation that will be used as the basis for global rollout. BCBS239 is a set of principles for risk data aggregation and risk reporting, which has become the de facto standard for high quality data management and reporting. As part of our Data Management Global Coordination Team (GCT), EMEA are leading a programme of work to secure and maintain compliance with BCBS239 globally, and this role will form part of a core Global team that will coordinate delivery of global initiatives, while also helping regional teams (in EMEA, Americas, APAC, East Asia and our Tokyo HQ) to deliver against their compliance obligations. We are looking for someone who has strong data management, data governance, and data & analytics subject matter knowledge. You will have well-developed business analysis skills, as the role will require an ability to break down problems and regulatory concepts into understandable, implementable designs and solutions. You should have experience of working in financial services, ideally on large risk, finance or regulatory programmes. You will be required to develop a good understanding of the Bank's business processes, interface with our Global stakeholders across our regions, and be sensitive to different cultural approaches. Key Responsibilities: Supporting and contributing to meetings, workshops, and discovery sessions with business owners, SMEs, and stakeholders at all levels of the organisation to elicit, clarify, translate, and document business requirements (functional and non-functional) Define problem statements, scope, objectives, and success criteria aligned to initiative strategy and outcomes Working closely with stakeholders to develop designs and processes through to closure Produce clear, version controlled, high quality documentation Preparation of design governance papers to support review and approval through programme and BAU governance committees Collaborating with the wider data family, including analysts, engineers, and product managers Build productive cross-functional relationships with a network of business stakeholders, technical delivery teams and external suppliers Skills & Experience: Experience of working in the financial services industry Experience working in a data team and collaborating cross-functionally to identify, scope and develop solutions Familiarity across a range of data & analytics disciplines (e.g. data governance & management, data quality, business intelligence, data engineering, ad hoc analytics) Experience of the business analysis process end to end; information gathering through to design approval and benefits realisation Experience as a Business Analyst, Data Analyst, Junior Project Manager or similar role.
Understanding of project management principles and approaches like waterfall and agile Effective written and verbal communication skills: (i) comfortable presenting to, and facilitating group sessions of, business users at all levels, (ii) ability to describe technical solutions to non-technical colleagues Documentation excellence; ability to produce high quality, version controlled artefacts (e.g. BRD, user stories, TOM) Stakeholder management skills and the ability to navigate ambiguity and align diverse stakeholders Ability to plan and manage own work to meet challenging deadlines with minimal supervision Proficiency in standard tooling: JIRA, Confluence, Visio, MS Office Suite (Excel, Word, PPT, MSP) Knowledge of BCBS239 principles and requirements Experience of working across multiple jurisdictions and cultures Use of industry standards and frameworks (e.g. DAMA, DCAM) Data management tooling experience or proficiency (metadata/lineage, DQ platforms, reporting solutions) SQL skills to support requirements and data analysis, ideally including both raw and aggregated data with the ability to review transformation logic Candidates will need to show evidence of the above in their CV in order to be considered. If you feel you have the skills and experience and want to hear more about this role, 'apply now' to declare your interest in this opportunity with our client. Your application will be reviewed by our dedicated team. We will respond to all successful applicants ASAP; however, please be advised that we may contact you again should we need further applicants or if other opportunities arise relevant to your skillset. Pontoon is an employment consultancy. We put expertise, energy, and enthusiasm into improving everyone's chance of being part of the workplace. We respect and appreciate people of all ethnicities, generations, religious beliefs, sexual orientations, gender identities, and more. We do this by showcasing their talents, skills, and unique experience in an inclusive environment that helps them thrive. As part of our standard hiring process to manage risk, please note background screening checks will be conducted on all hires before commencing employment. We use generative AI tools to support our candidate screening process. This helps us ensure a fair, consistent, and efficient experience for all applicants. Rest assured, all final decisions are made by our hiring team, and your application will be reviewed with care and attention.
28/11/2025
Contractor
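Purely as an illustration of the raw-versus-aggregated analysis the SQL requirement above alludes to, here is a small pandas sketch that recomputes aggregated exposures from position-level records and flags differences. The file, table and column names are assumptions, not anything from the advert.

```python
# Illustrative consistency check: recompute aggregated exposures from raw
# position-level data and compare against the reported aggregates.
import pandas as pd

raw = pd.read_csv("risk_positions_raw.csv")           # hypothetical position-level records
reported = pd.read_csv("risk_report_aggregated.csv")  # hypothetical published report figures

recomputed = (raw.groupby(["business_line", "risk_type"], as_index=False)["exposure"]
                  .sum()
                  .rename(columns={"exposure": "exposure_recomputed"}))

check = reported.merge(recomputed, on=["business_line", "risk_type"], how="outer")
check["diff"] = (check["exposure_reported"] - check["exposure_recomputed"]).abs()

# Treat missing matches as breaks too, then report anything above a small tolerance.
breaks = check[check["diff"].fillna(float("inf")) > 0.01]
print(f"{len(breaks)} aggregation breaks found")
print(breaks.head())
```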
Randstad Technologies Recruitment
Data Architect (Insurance Domain)
Randstad Technologies Recruitment
Job Title: Data Architect (Insurance Domain). Location: London, UK (hybrid role). Experience level: 15 years, with at least 5 years in the Azure ecosystem. Role: We are seeking a seasoned Data Architect to lead the design and implementation of scalable data solutions for a strategic insurance project. The ideal candidate will have deep expertise in Azure cloud services, Azure Data Factory and Databricks, with a strong understanding of data modeling, data integration and analytics in the insurance domain. Key Responsibilities: Architect and design end-to-end data solutions on Azure for insurance-related data workflows. Lead data ingestion, transformation and orchestration using ADF and Databricks. Collaborate with business stakeholders to understand data requirements and translate them into technical solutions. Ensure data quality, governance and security compliance across the data lifecycle. Optimize performance and cost efficiency of data pipelines and storage. Provide technical leadership and mentoring to data engineers and developers. Mandatory Skillset: Azure Cloud Services - strong experience with Azure Data Lake, Azure Synapse, Azure SQL and Azure Storage. Azure Data Factory (ADF) - expertise in building and managing complex data pipelines. Databricks - hands-on experience with Spark-based data processing, notebooks and ML workflows. Data Modeling - proficiency in conceptual, logical and physical data modeling. SQL / Python - advanced skills for data manipulation and transformation. Insurance Domain Knowledge - understanding of insurance data structures (claims, policy, underwriting) and regulatory requirements. Preferred Skillset: Power BI - experience in building dashboards and visualizations. Data Governance Tools - familiarity with tools like Purview or Collibra. Machine Learning - exposure to ML model deployment and monitoring in Databricks. CI/CD - knowledge of DevOps practices for data pipelines. Certifications: Azure Data Engineer or Azure Solutions Architect certifications. Mandatory Skills: Python for Data, Java, Python, Scala, Snowflake, Azure Blob, Azure Data Factory, Azure Functions, Azure SQL, Azure Synapse Analytics, Azure Data Lake, ANSI-SQL, Databricks, HDInsight. If you're excited about this role then we would like to hear from you! Please apply with a copy of your CV or send it to Prasanna com and let's start the conversation! Randstad Technologies Ltd is a leading specialist recruitment business for the IT & Engineering industries. Please note that due to a high level of applications, we can only respond to applicants whose skills & qualifications are suitable for this position. No terminology in this advert is intended to discriminate against any of the protected characteristics that fall under the Equality Act 2010. For the purposes of the Conduct Regulations 2003, when advertising permanent vacancies we are acting as an Employment Agency, and when advertising temporary/contract vacancies we are acting as an Employment Business.
24/11/2025
Full time
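By way of illustration only, here is a minimal PySpark sketch of the kind of Databricks ingestion step an ADF pipeline in this role might orchestrate: reading raw insurance claims from a data lake, applying basic cleansing, and writing curated Delta output. The storage paths and column names are hypothetical placeholders, not details from the advert.

```python
# Hypothetical Databricks notebook cell: raw claims -> curated Delta table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims_ingest").getOrCreate()

# Read raw claims landed in the data lake (path is a placeholder).
claims = spark.read.parquet("abfss://raw@examplelake.dfs.core.windows.net/claims/")

# Basic cleansing: typed dates, de-duplicated claim IDs, non-negative amounts.
curated = (
    claims
    .withColumn("claim_date", F.to_date("claim_date"))
    .dropDuplicates(["claim_id"])
    .filter(F.col("claim_amount") >= 0)
)

# Write the curated layer as Delta, partitioned for downstream analytics.
(
    curated.write.format("delta")
    .mode("overwrite")
    .partitionBy("claim_date")
    .save("abfss://curated@examplelake.dfs.core.windows.net/claims/")
)
```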
DGH Recruitment Ltd
Power BI Developer
DGH Recruitment Ltd
Power BI Developer - 12 month fixed term contract. Join an agile team to design and deliver enterprise-level BI solutions that transform complex data into actionable insights. About the Role: This role focuses on developing and managing Power BI reports, dashboards, and data models, while driving best practices for data governance and self-service analytics. Responsibilities: Build and maintain Power BI reports, dashboards, and datasets using DAX, Power Query (M), KQL, and SQL. Develop reusable data models and implement CI/CD pipelines in Azure DevOps. Manage Power BI Service administration, including security and refresh schedules. Collaborate with engineers, analysts, and architects to design scalable data solutions. Apply coding best practices, troubleshoot issues, and communicate technical concepts clearly. Stay current with BI technologies and share knowledge across teams. Required Skills: Strong Power BI development experience (reports, dashboards, data models). Advanced knowledge of DAX, Power Query (M), KQL, and Power BI Service. Expertise in SQL and relational database design. Familiarity with Azure DevOps for CI/CD and version control. Understanding of data warehousing and dimensional modelling. Effective communication and agile team collaboration. Preferred Skills: Experience with Azure Data Services (SQL, Data Factory, Synapse). Knowledge of reporting governance, security, and performance optimization. Integration with Microsoft tools (SharePoint, Power Apps, Teams). Exposure to REST APIs and MDM tools. In accordance with the Employment Agencies and Employment Businesses Regulations 2003, this position is advertised based upon DGH Recruitment Limited having first sought approval of its client to find candidates for this position. DGH Recruitment Limited acts as both an Employment Agency and Employment Business.
24/11/2025
Contractor
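Purely as an illustrative sketch of the Power BI Service automation this role touches on (refresh scheduling and CI/CD hooks), the snippet below queues a dataset refresh through the Power BI REST API. The workspace ID, dataset ID and access token are placeholders, and in practice the token would be obtained through Azure AD authentication rather than hard-coded.

```python
# Hypothetical example: queue a Power BI dataset refresh from a script
# (e.g. as a step in an Azure DevOps pipeline). IDs and token are placeholders.
import requests

WORKSPACE_ID = "<workspace-guid>"   # placeholder
DATASET_ID = "<dataset-guid>"       # placeholder
ACCESS_TOKEN = "<azure-ad-token>"   # placeholder - obtain via Azure AD in practice

url = (
    "https://api.powerbi.com/v1.0/myorg/"
    f"groups/{WORKSPACE_ID}/datasets/{DATASET_ID}/refreshes"
)

response = requests.post(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
response.raise_for_status()  # a 202 response means the refresh has been queued
print("Refresh request accepted:", response.status_code)
```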
DGH Recruitment Ltd
Power BI Developer
DGH Recruitment Ltd City, Manchester
Power BI Developer - 12 month fixed term contract. Join an agile team to design and deliver enterprise-level BI solutions that transform complex data into actionable insights. About the Role: This role focuses on developing and managing Power BI reports, dashboards, and data models, while driving best practices for data governance and self-service analytics. Responsibilities: Build and maintain Power BI reports, dashboards, and datasets using DAX, Power Query (M), KQL, and SQL. Develop reusable data models and implement CI/CD pipelines in Azure DevOps. Manage Power BI Service administration, including security and refresh schedules. Collaborate with engineers, analysts, and architects to design scalable data solutions. Apply coding best practices, troubleshoot issues, and communicate technical concepts clearly. Stay current with BI technologies and share knowledge across teams. Required Skills: Strong Power BI development experience (reports, dashboards, data models). Advanced knowledge of DAX, Power Query (M), KQL, and Power BI Service. Expertise in SQL and relational database design. Familiarity with Azure DevOps for CI/CD and version control. Understanding of data warehousing and dimensional modelling. Effective communication and agile team collaboration. Preferred Skills: Experience with Azure Data Services (SQL, Data Factory, Synapse). Knowledge of reporting governance, security, and performance optimization. Integration with Microsoft tools (SharePoint, Power Apps, Teams). Exposure to REST APIs and MDM tools. In accordance with the Employment Agencies and Employment Businesses Regulations 2003, this position is advertised based upon DGH Recruitment Limited having first sought approval of its client to find candidates for this position. DGH Recruitment Limited acts as both an Employment Agency and Employment Business.
24/11/2025
Contractor
Cathcart Technology
Lead Data Engineer
Cathcart Technology Edinburgh, Midlothian
I'm working with a world-class technology company in Edinburgh to help them find a Lead Data Engineer to join their team (hybrid working, but there is flex on this for the right person). This is your chance to take the technical lead on complex, large-scale data projects that power real-world products used by millions of people. The organisation has been steadily growing for a number of years and has become a market leader in its field, so it's genuinely a really exciting time to join! You'll be joining a forward-thinking team that's passionate about doing things properly, with a modern tech stack, a cloud-first approach, and a genuine commitment to engineering excellence. As Lead Data Engineer, you'll be hands-on in designing and building scalable data platforms and pipelines that enable advanced analytics, machine learning, and business-critical insights. You'll shape the technical vision, set best practices, and make key architectural decisions that define how data flows across the organisation. You won't be working in isolation either, as collaboration is at the heart of this role. You'll work closely with engineers, product managers, and data scientists to turn ideas into high-performing, production-ready systems. You'll also play a big part in mentoring others, driving standards across the team, and influencing the overall data strategy. The ideal person for this role will have a strong background in data engineering, with experience building modern data solutions using technologies like Kafka, Spark, Databricks, dbt, and Airflow. You'll know your way around cloud platforms (AWS, GCP, or Azure) and be confident coding in Python, Java, or Scala. Most importantly, you'll understand what it takes to design data systems that are scalable, reliable and built for the long haul. In return, they are offering a competitive salary (happy to discuss prior to application) and great benefits, which include uncapped holidays and multiple bonuses! Their office in central Edinburgh is only a short walk from Haymarket train station. The role is hybrid (ideally 1 or 2 days in the office); however, they can be flexible on this for the right candidate. If you're ready to step into a role where your technical leadership will have a visible impact and where you can build data systems that continue to scale, then please apply or contact Matthew MacAlpine at Cathcart Technology.
18/11/2025
Full time
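As an illustrative sketch only of the streaming pipeline work described in this role, the snippet below reads events from a Kafka topic with PySpark Structured Streaming and appends them to a Delta table. The broker address, topic name, and paths are placeholders rather than details from the advert.

```python
# Hypothetical example: Kafka -> Delta streaming pipeline with PySpark.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events_stream").getOrCreate()

# Read the raw event stream (broker and topic are placeholders).
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
    # Kafka delivers key/value as binary; cast them to strings for downstream use.
    .select(
        F.col("key").cast("string").alias("event_key"),
        F.col("value").cast("string").alias("payload"),
        F.col("timestamp"),
    )
)

# Append to a Delta table, with checkpointing so the stream can recover safely.
query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/events")  # placeholder path
    .outputMode("append")
    .start("/tmp/delta/events")                               # placeholder path
)
query.awaitTermination()
```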
Think Specialist Recruitment
Principal Product Owner / Product Specialist
Think Specialist Recruitment City, London
Would you like to join an innovative company nominated by Forbes as one of the top 500 companies in the world to work for? We are looking for a Principal Product Owner / Product Specialist to join a global medical technology leader that is reimagining digital solutions. This organisation is developing a connected, data-driven ecosystem that applies AI, augmented reality, computer vision, and live video collaboration to transform workflows in operating theatres and procedure rooms around the world. As part of this continued growth, the business is now seeking a Principal Product Owner to play a central role in the evolution of its digital surgery ecosystem - a suite of products combining artificial intelligence, data analytics, and intuitive design to improve how procedures are captured, reviewed, and shared globally. This is a 1-year temporary position, looking to start ASAP. To be considered for a position, you must be available to begin work within the next 6 weeks. Working Hours: Monday - Friday, 08:30 - 17:00. Hybrid working - Tuesdays & Wednesdays in the Central London office, within walking distance of Old Street and Angel stations. £35 - £45 per hour (£68,000 - £87,000 per annum). You do not need a medical background for this position, but previous Product Owner / Product Specialist experience is essential. About the Position: As a Product Owner, you'll be the bridge between user needs, business objectives, and technical delivery. You'll work within an agile product team to define and prioritise features, ensuring each release delivers value to clinicians and aligns with product vision and strategy. Main duties include: Defining, refining, and prioritising the product backlog for the digital surgery platform. Translating user and stakeholder requirements into clear, testable user stories with acceptance criteria. Supporting product discovery, guiding research and usability testing alongside Product Managers and UX teams. Acting as a key link between Engineering, Product, and Design, ensuring alignment across disciplines. Collaborating with engineering to deliver high-quality, compliant software and hardware solutions. Leading Agile ceremonies - including sprint planning, backlog refinement, reviews, and retrospectives. Applying Behaviour-Driven Development (BDD) principles to ensure quality and user-centred design. Monitoring performance post-release and identifying opportunities for continuous improvement. Supporting compliance with medical device standards and documentation requirements. Partnering with Tech Comms to ensure clear, accurate release notes and user documentation. This is a highly collaborative role that combines strategic thinking with hands-on execution and an understanding of clinical workflows. About You: You're a confident, detail-oriented Product Owner who's passionate about creating technology that makes a real difference in healthcare and who enjoys bridging the gaps between engineers and stakeholders to ensure the team builds the right product in the right way. Essential skills and experience: 1-3 years' experience in product ownership, software development, or UX within an Agile environment. Proven ability to manage and deliver Agile digital product development projects end-to-end. Understanding of healthcare software systems or medical devices, ideally within surgical or interventional settings. Working knowledge of regulatory frameworks (e.g. ISO 13485). Strong communication and collaboration skills - confident engaging with engineers, clinicians, and stakeholders. Excellent attention to detail and organisational ability. Flexible, proactive, and comfortable balancing independent work with teamwork. Desirable: Experience working with clinicians or surgeons. Familiarity with Test-Driven and Behaviour-Driven Development environments. Exposure to Linux subsystems or medical software integration. Willingness to travel occasionally for meetings, workshops, or customer engagement. Looking for the next step in your career? Think Specialist Recruitment. Think Specialist Recruitment is an independent support staff recruitment agency based in Hemel Hempstead and working across the Herts, Beds and Bucks area. We specialise in permanent, temporary and contract recruitment with areas of expertise including administration, customer service/call centre, PA/secretarial, human resources, accountancy and finance, sales admin/sales support, marketing and IT Helpdesk/IT support.
13/11/2025
Seasonal
