Sorry, that job is no longer available. Here are some results that may be similar to the job you were looking for.

64 jobs found

Current search: senior data engineer databricks
Senior Data Business Analyst / Senior Data Modeller
Canada Life
Location: London, Potters Bar or Bristol (hybrid working options available)

The Senior BA / Senior Data Modeller will take end-to-end ownership of the design, development, and maintenance of data models that underpin our enterprise data platform, with a particular focus on Databricks and the Nova platform. This role is critical in ensuring that our data assets are structured, governed, and optimised to support business intelligence, analytics, and reporting requirements. We are seeking an experienced professional who is comfortable working closely with data engineers, architects, and business stakeholders to translate business requirements into robust, scalable, and well-documented data models. You will be responsible for both logical and physical data modelling and will act as the subject matter expert and owner for all data models within your domain.

What you'll do
• Take full ownership of the data modelling lifecycle, from requirements gathering through to deployment and ongoing maintenance.
• Design, develop, and maintain logical and physical data models, primarily using Databricks and the Nova platform.
• Ensure data models are aligned with business requirements, data governance standards, and best practices.
• Collaborate with data engineers, architects, and business stakeholders to understand data requirements and translate them into effective data structures.
• Document data models, definitions, and metadata to ensure clarity and consistency across the organisation.
• Review and optimise existing data models for performance, scalability, and maintainability.
• Lead model validation and quality assurance activities, ensuring models are accurate and fit for purpose.
• Provide guidance and support to project teams and business users on data modelling best practices.
• Act as the primary point of contact and subject matter expert for all matters relating to data models within your area of responsibility.
• Champion data model governance and ensure models are maintained in line with organisational standards and policies.
• Support the integration of new data sources and the evolution of the enterprise data platform.

Knowledge / Experience / Skills

Essential
• Extensive experience in data modelling, including both logical and physical modelling, ideally within Databricks and/or similar cloud-based platforms.
• Demonstrable experience of owning and maintaining enterprise data models, including responsibility for model governance and documentation.
• Strong understanding of data warehousing, lakehouse architectures, and modern data platforms.
• Ability to translate complex business requirements into scalable and efficient data models.
• Proficient in data modelling tools and techniques, with hands-on experience in Databricks and the Nova platform (or similar).
• Excellent communication skills, with the ability to work collaboratively across technical and business teams.
• Experience in data governance, metadata management, and data quality assurance.
• Strong analytical and problem-solving skills.

Desirable
• Experience working in regulated industries such as insurance or financial services.
• Familiarity with reporting requirements such as IFRS and Solvency II.
• Experience with data visualisation tools and supporting analytics/reporting teams.
• Knowledge of automation tools and scripting for data model management.
• Experience working in Agile product teams and using tools such as JIRA/Confluence.
06/12/2025
Full time
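The advert above is prose only, but as a rough, hypothetical sketch of what physical data modelling on Databricks can look like, here is a minimal example of publishing a documented dimension table as Delta. The catalog, schema, table and column names are invented for illustration and are not specific to Canada Life or the Nova platform; the three-level name assumes a Unity Catalog workspace.

# Minimal sketch: publishing a hypothetical "policy" dimension from a logical model
# as a governed, self-documenting Delta table. Names are illustrative only.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Column comments keep the physical model aligned with the documented logical model,
# which supports the governance and metadata duties the role describes.
spark.sql("""
    CREATE TABLE IF NOT EXISTS analytics.core.dim_policy (
        policy_key      BIGINT    COMMENT 'Surrogate key generated in the warehouse',
        policy_number   STRING    COMMENT 'Business key from the source admin system',
        product_code    STRING    COMMENT 'Foreign key to dim_product',
        inception_date  DATE      COMMENT 'Date cover started',
        status          STRING    COMMENT 'Lifecycle status, e.g. ACTIVE or LAPSED',
        effective_from  TIMESTAMP COMMENT 'SCD2 validity start',
        effective_to    TIMESTAMP COMMENT 'SCD2 validity end; NULL while current'
    )
    USING DELTA
    COMMENT 'Type-2 policy dimension; owned by the data modelling team'
""")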
Senior Data Engineer - Azure Data - Burton-on-Trent - Hybrid
Crimson Burton-on-Trent, Staffordshire
Senior Data Engineer - Azure Data - Burton-on-Trent - Permanent - Hybrid
Salary: £60,000 - £67,000 per annum

This role requires 1 day per week in Burton-on-Trent, with hybrid working arrangements. Our client is seeking a highly skilled Senior Data Engineer to join their dynamic IT team, based in Burton-on-Trent. The Senior Data Engineer will come on board to support the Strategic Data Manager in establishing and managing an efficient Business Intelligence technical service, assisting in the advancement of our cloud-based data platforms and providing options for timely processing and cost-efficient solutions. A strong background in Azure data pipeline development is key for this position.

Key Skills & Responsibilities:
• Build and manage pipelines using Azure Data Factory, Databricks, CI/CD, and Terraform.
• Optimisation of ETL processes for performance and cost-efficiency.
• Design scalable data models aligned with business needs.
• Azure data solutions for efficient data storage and retrieval.
• Ensure compliance with data protection laws (e.g. GDPR); implement encryption and access controls.
• Work with cross-functional teams and mentor junior engineers.
• Manage and tune Azure SQL Database instances.
• Proactively monitor pipelines and infrastructure for performance and reliability.
• Maintain technical documentation and lead knowledge-sharing initiatives.
• Deploy advanced analytics and machine learning solutions using Azure.
• Stay current with Azure technologies and identify areas for enhancement.
• Databricks (Unity Catalog, DLT), Data Factory, Synapse, Data Lake, Stream Analytics, Event Hubs.
• Strong knowledge of Python, Scala, C#, .NET.
• Experience with advanced SQL, T-SQL, relational databases.
• Azure DevOps, Terraform, Bicep, ARM templates.
• Distributed computing, cloud-native design patterns.
• Data modelling, metadata management, data quality, data as a product.
• Strong communication, empathy, determination, openness to innovation.
• Strong Microsoft Office 365 experience.

Interested? Please submit your updated CV to Lewis Rushton at Crimson for immediate consideration. Not interested? Do you know someone who might be a perfect fit for this role? Refer a friend and earn £250 worth of vouchers!

Crimson is acting as an employment agency regarding this vacancy.
06/12/2025
Full time
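For readers unfamiliar with the toolset named in the advert above, here is a hedged sketch of a Databricks notebook step that an Azure Data Factory pipeline might orchestrate. All paths, parameter values and table names are placeholders rather than the client's actual estate, and it assumes the target Delta table already exists and is partitioned by ingest_date.

# Hypothetical Databricks notebook step orchestrated by an Azure Data Factory pipeline.
# Paths, dates and table names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# In Databricks these values would normally arrive as notebook parameters
# (e.g. via dbutils.widgets); they are hard-coded here so the sketch stands alone.
run_date = "2025-12-06"
source_path = "abfss://raw@examplelake.dfs.core.windows.net/sales/"
target_table = "lakehouse.curated.sales_daily"   # assumed to exist, partitioned by ingest_date

df = (
    spark.read.format("parquet").load(source_path)
    .where(F.col("ingest_date") == run_date)                      # one slice per run
    .withColumn("net_amount", F.col("gross_amount") - F.col("tax_amount"))
)

# Overwrite only this run's partition: re-runs stay idempotent and the job avoids
# rewriting the whole table, which keeps compute costs down.
(
    df.write.format("delta")
    .mode("overwrite")
    .option("replaceWhere", f"ingest_date = '{run_date}'")
    .saveAsTable(target_table)
)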
Senior Azure Data Engineer
Young's Employment Services Ltd
Senior Azure Data Engineer
Hybrid - work from home and West London
Circa £70,000 - £80,000 + range of benefits

A well-known and prestigious business is looking to add a Senior Azure Data Engineer to their data team. This is an exciting opportunity for a Data Engineer who is not just technical, but also enjoys directly engaging and collaborating with stakeholders from across business functions such as finance, operations, planning, manufacturing, retail and e-commerce. Having nearly completed the process of migrating data from their existing on-prem databases to an Azure cloud-based platform, the Senior Data Engineer will play a key role in helping make best use of the data by gathering and agreeing requirements with the business to build data solutions that align accordingly. Working with diverse data sets from multiple systems and overseeing their integration and optimisation will require the development, management and optimisation of data pipelines using tools in the Azure cloud. Our client has expanded rapidly and been transformed in recent years; they're an iconic business with a special work environment that has fostered a strong and positive culture amongst the whole workforce. This is a hybrid role where the postholder can work from home 2 or 3 days per week; the other days will be based onsite in West London, just a few minutes' walk from a Central Line tube station.

The key responsibilities for the post include:
• Develop, construct, test and maintain data architectures within large-scale data processing systems.
• Develop and manage data pipelines using Azure Data Factory, Delta Lake and Spark.
• Utilise Azure cloud architecture knowledge to design and implement scalable data solutions.
• Utilise Spark, SQL, Python, R, and other data frameworks to manipulate data and gain a thorough understanding of the dataset's characteristics.
• Interact with API systems to query and retrieve data for analysis.
• Collaborate with business users and stakeholders to gather and agree requirements.

To be considered for the post you'll need at least 5 years' experience, ideally with 1 or 2 years at a senior/lead level. You'll need to be goal-driven and able to take ownership of work tasks without the need for constant supervision. You'll be engaging with multiple business areas, so the ability to communicate effectively to understand requirements and build trusted relationships is a must. It's likely you'll have most, if not all, of the following:
• Experience as a Senior Data Engineer or similar.
• Strong knowledge of Azure cloud architecture and Azure Databricks, DevOps and CI/CD.
• Experience with PySpark, Python, SQL and other data engineering development tools.
• Experience with metadata-driven pipelines and SQL serverless data warehouses.
• Knowledge of querying API systems.
• Experience building and optimising ETL pipelines using Databricks.
• Strong problem-solving skills and attention to detail.
• Understanding of data governance and data quality principles.
• A degree in computer science, engineering, or equivalent experience.

Salary will be dependent on experience and is likely to be in the region of £70,000 - £80,000, although the client may consider higher for an outstanding candidate. Our client can also provide a vibrant, rewarding, and diverse work environment that supports career development. Candidates must be authorised to work in the UK and must not require sponsorship either now or in the future. For further information, please send your CV to Wayne Young at Young's Employment Services Ltd.
Young's Employment Services acts in the capacity of both an Employment Agent and Employment Business.
05/12/2025
Full time
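As a purely illustrative sketch of the API-to-lakehouse flow the advert above mentions (querying an API and landing the results in Delta Lake with Spark), the snippet below uses a made-up endpoint and table name; the real systems and schemas are not described in the advert.

# Illustrative only: paging through a hypothetical REST API and appending the
# results to a Delta table.
import requests
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

API_URL = "https://api.example.com/v1/orders"   # placeholder endpoint

def fetch_orders(page_size=500):
    """Collect all pages from the API until an empty page is returned."""
    records, page = [], 1
    while True:
        resp = requests.get(API_URL, params={"page": page, "page_size": page_size}, timeout=30)
        resp.raise_for_status()
        batch = resp.json().get("results", [])
        if not batch:
            break
        records.extend(batch)
        page += 1
    return records

orders = fetch_orders()
if orders:
    df = spark.createDataFrame(orders)          # schema inferred from the JSON records
    df.write.format("delta").mode("append").saveAsTable("staging.orders_api")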
Senior Databricks Engineer - Oxfordshire, Hybrid - £Competitive
Tenth Revolution Group Banbury, Oxfordshire
Databricks Engineer
Location: Oxfordshire (Hybrid)
Salary: £Competitive + Benefits

Are you an experienced Databricks Engineer looking for your next challenge?

The Role
This is a hands-on technical role with leadership responsibilities. You'll design and deliver scalable data solutions, work closely with data leaders on architecture and strategy, and mentor a small team of Data Engineers to ensure best practices.

Key Responsibilities
• Build and maintain scalable data pipelines and ETL processes using Databricks
• Collaborate on data architecture and translate designs into build plans
• Deliver large-scale data workflows and optimise for performance
• Implement data quality and validation processes

What We're Looking For
• Strong experience with Databricks
• Proficiency in Python, Spark, and SQL
• Experience with cloud platforms
• Knowledge of pipeline tools
• Excellent problem-solving and leadership skills

If you're passionate about data engineering and want to make an impact, apply today!
05/12/2025
Full time
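The "data quality and validation processes" mentioned above could take many forms; as one hedged example, here is a minimal set of PySpark checks against a hypothetical curated table. Table and column names are invented for illustration.

# Lightweight PySpark data-quality checks against a hypothetical curated table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.table("curated.customer")            # invented table name

checks = {
    "no duplicate customer ids": df.groupBy("customer_id").count().where("count > 1").count() == 0,
    "email never null": df.where(F.col("email").isNull()).count() == 0,
    "signup_date not in future": df.where(F.col("signup_date") > F.current_date()).count() == 0,
}

failed = [name for name, passed in checks.items() if not passed]
if failed:
    # In a real pipeline this might log to a quality table or fail the job/task instead.
    raise ValueError(f"Data quality checks failed: {failed}")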
BI Manager (Azure Data Lake, Fabric, DW)
TXP Technology x People
Role: BI Manager
Salary: £70,000 - £80,000 PA plus bonus and benefits
Location: Central Birmingham, West Midlands (hybrid working - 2 days per week onsite)

We are currently working with a leading Midlands-based services provider who require a technically strong Senior BI Manager with a good understanding of Azure data and data engineering tools. Working as a key member of a newly formed Data Engineering team, the successful candidate will lead the design, development, and ongoing enhancement of the client's data and reporting infrastructure. You will be the strategic owner of the Azure Data Platform, overseeing services such as Azure Data Lake, Data Warehouse, Data Factory, Databricks, and Power BI. The technical focus is all Microsoft, primarily Azure, so any Fabric experience would be very beneficial.

Our client is looking for someone who is going to lead the function and has previous experience doing this: someone who really understands data and what it can be used for, who can challenge the business on what they need from the data, and who can challenge the teams to produce the most effective data outputs for the business need, so that the function can improve and become first class. You will need to be able to drive the direction of how data works for the organisation and the overall Data/BI strategy, design solutions that fit, and demonstrate what value data can bring to the company if it is used effectively. A technical background is essential to be able to understand and bridge the gap between the Data Team and the business environment so that the two collaborate effectively and are challenged both ways: someone who can understand and appreciate both the technical side and the business strategy side. Our client offers a good, supportive environment which is going through a major transformation driven by technology.

Skills & experience required:
• Experience leading a BI function
• Expertise in Azure BI architecture and cloud services
• Hands-on experience with Azure Fabric, SQL warehousing, data lakes, Databricks
• Track record in MI/BI product development using Agile and Waterfall methods
• Experience managing cross-functional teams and sprint activities
• Experience in leading a BI team and a business through the development and transition to a Data Lake / Factory / Warehouse
• Technical BI development/architect background

Benefits:
• Achievable bonus scheme
• 4% pension
• Life insurance at 3x salary
• 25 days annual leave plus statutory, with 1 extra day every year for the first 3 years
• Blue Light Card
• Medicash - includes discounted gym memberships etc.

If your profile demonstrates strong and recent experience in the above areas, please submit your application ASAP to Jackie Dean at TXP for consideration. TXP takes great pride in representing socially responsible clients who not only prioritise diversity and inclusion but also actively combat social inequality. Together, we have the power to make a profound impact on fostering a more equitable and inclusive society. By working with us, you become part of a movement dedicated to promoting a diverse and inclusive workforce.
05/12/2025
Full time
Senior Cloud Infrastructure Engineer
Synextra Warrington, Cheshire
About Us: We're not your typical MSP. As a second-generation Managed Service Provider (MSP) born from the expertise of industry veterans, we're crafting a new narrative in the UK tech scene. Our team, akin to the Navy SEALs of the tech world, consists of elite operators who've excelled in larger MSP environments but yearned for something more. At Synextra, we've spent 10+ transformative years building a company that's small by design but colossal in impact, guided by a technical leader whose vision is always client-first. We are currently looking for a skilled and enthusiastic Senior Cloud Infrastructure Engineer to join our dynamic technical team. You can expect to work with the latest technology and leading brands in a team where everyone has a passion for technology. This will be a hands-on, fast-paced role where there will be endless opportunities to develop new skills and get involved in projects across the business.

About you:
• A highly ambitious, energetic and hands-on Azure cloud-native engineer, who works as well in a team as you do alone.
• Thrives in a fast-paced, loosely structured start-up environment where extra effort and a high quality of work won't go unrecognised.
• Eager to work fast, iterate quickly and with a drive for excellence in everything you do.
• Excited to be able to constantly work with and learn new technologies.

About the role:
• At least 90% of BAU will operate across the full Azure suite (see essential skills for key areas of interest). The remaining BAU will cover a multitude of enterprise-level MSP and ISP technologies (such as Windows Server, Active Directory, M365 etc.).
• No two days will be the same; different technologies, challenges and opportunities will always be present. Occasional project-based work.
• Help lead the support provided by the Cloud Infrastructure team as one of a few key senior team members.
• Mentor the junior team members to continue their technical development and provide feedback to help shape their growth and progression.
• Out-of-hours work is involved (on-call, overtime).
• Working from our modern office workplace on-site in Warrington, 8.5h/day, Mon-Fri.

What you gain:
• A rare chance to help an expertise-driven MSP continue to grow and provide excellence under the technical vision of the founding team.
• Exposure to a wide range of core and cutting-edge cloud-native technologies across dozens of SMEs.
• Mentorship for your own development from the Lead Engineers within the Cloud Infrastructure, Cloud Platform and Cloud Consultancy teams.
• Collaboration with an ambitious team of industry experts who are always trying to push the company, and each other, to deliver the best solutions possible.

Essential skills required:
• Expert with Azure IaaS services including but not limited to: Virtual Machines, Azure networking (VNETs, VWAN, Azure Firewall), relevant backup and DR services (Azure Backup, Azure Site Recovery, Veeam).
• Experienced with Azure PaaS services such as App Services, Azure Front Door, Azure Data Factory, Azure SQL and more.
• Exposure to IaC (Terraform, Bicep) would be advantageous.
• Exposure to Azure's development and data services: Azure DevOps, Azure APIM, Azure Fabric, Azure Databricks, etc.
• Expert understanding of Active Directory & Windows Server.
• Strong understanding of core cybersecurity principles including least-privilege access, zero-trust networking and defence in depth.
• Solid understanding of virtual desktop infrastructure and direct experience working with Azure Virtual Desktop and FSLogix.
• Experience in a customer-facing technical role dealing with both technical and non-technical stakeholders.

Perks & Benefits:
• Great office location, free parking, plenty of on-campus amenities.
• Stylish breakout spaces, pool table, games console, premium coffee, fruit, snacks and soft drinks.
• Comprehensive benefits including private healthcare, dental cover, and more.
• A Garmin smartwatch to prioritise health and fitness.
• Perkbox employee benefits platform.
• A culture that celebrates success, encourages innovation, and supports your professional and personal growth.

Are you the game changer we are looking for? If this sounds like the challenge you've been waiting for, we're excited to hear from you. Reach out to us and schedule your chat with our team.
05/12/2025
Full time
Senior Data Analyst
E.ON Nottingham, Nottinghamshire
We are looking for a Senior Data Analyst to join our Credit Risk Portfolio Management team - a key function responsible for understanding and forecasting the financial health of the customer base. The role requires the use of advanced analytics to provide deep insights into credit risk and performance to inform our risk management strategy. This role is ideal for a proven senior analyst with strong technical and mathematical skills combined with the commercial acumen needed to translate insights into action.

Here's a taste of what you'll be doing
• Partnering with Finance to develop and maintain debt forecasts to track performance vs. plan, forecast risk and deep dive into the drivers of variations to plan
• Developing scenario models to enable data-based decisioning within operational and strategic planning cycles
• Staying up to date with emerging analytical techniques and technologies, partnering with Data Science to deliver advanced segmentation and behavioural analysis
• Delivering and consulting on the advanced analytics required to support project delivery across the wider Credit Management function, e.g. simulation models, data pipeline mock-ups, statistically robust testing
• Translating analytical outputs into clear recommendations for business stakeholders, influencing decisions across debt prevention and collections strategy
• Liaising with Data Engineers to drive enhancements to data quality, availability and usability

Are we the perfect match?
• Proven experience in a data analytics or credit risk role, ideally within utilities, financial services or another regulated industry
• Strong coding skills in SQL, Python and PySpark for data extraction, transformation, modelling and forecasting
• Solid understanding of forecasting techniques, scenario modelling, and regression-based analytics
• Strong commercial acumen, with the ability to translate complex analytical findings into clear narratives with direct links to business value
• Excellent stakeholder engagement and communication skills, with confidence working across operational, strategic and technical teams
• A degree (or equivalent experience) in a quantitative discipline such as statistics, mathematics, economics or data science

It would be great if you had
• Experience working with Databricks
• Understanding of macroeconomic and market drivers affecting customer affordability and credit risk
• Prior experience working across both residential and commercial consumer bases
• Experience working with credit bureau data
• A general understanding of accounting principles

Here's what else you need to know
• The role may close earlier due to a high volume of applications.
• Competitive salary.
• Location: Nottingham E.ON Next office, Trinity House, 2 Burton St, Nottingham NG1 4BX, with travel to our other sites when required.
• Excellent parental leave allowance.
• Award-winning workplace: we're proud to be named a Sunday Times Best Place to Work 2025 and the Best Place to Work for 16-34-year-olds.
• Outstanding benefits: enjoy 26 days of annual leave plus bank holidays, a generous pension, life cover, bonus opportunities and access to 20 flexible benefits with tax/NI savings.
• Flexible & family-friendly: our industry-leading hybrid and family-friendly policies earned us double recognition at the Personnel Today Awards 2024. We're open to discussing how flexibility can work for you.
• Inclusive & diverse: we're the only energy company in the Inclusive Top 50 UK Employers. We're also proud winners of Best Employer for Women and Human Company of the Year, recognising our inclusive, people-first culture.
• Support at every stage of life: we're Fertility Friendly and Menopause Friendly accredited, with inclusive support for everyone.
• Accessible & supportive: do you consider yourself as having a disability? As a Disability Confident Employer, we guarantee interviews for disabled applicants who meet the minimum criteria for the role and will make any adjustments needed during the process.
• Invested in your growth: from inclusive talent networks to top-tier development programmes, we'll support your growth every step of the way.

For all successful candidates: due to the nature of this role, your employment will be subject to a basic DBS (Disclosure and Barring Service) check being carried out by ourselves via a third-party service provider.
05/12/2025
Full time
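As a rough sketch of the kind of PySpark portfolio aggregation that might feed a debt forecast like the one described above, the snippet below uses invented table, segment and measure names; it is not E.ON's data model.

# Invented tables and columns: a monthly arrears roll-up joined to plan figures to
# surface variance-to-plan, the sort of view a debt forecast commentary might use.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

accounts = spark.table("credit.account_balances")    # hypothetical snapshot table

monthly_debt = (
    accounts
    .withColumn("month", F.date_trunc("month", F.col("snapshot_date")))
    .groupBy("month", "customer_segment")
    .agg(
        F.sum("arrears_balance").alias("total_arrears"),
        F.avg("days_past_due").alias("avg_days_past_due"),
        F.countDistinct("account_id").alias("accounts_in_arrears"),
    )
)

plan = spark.table("finance.debt_plan")              # hypothetical plan table
variance = (
    monthly_debt.join(plan, ["month", "customer_segment"], "left")
    .withColumn("variance_to_plan", F.col("total_arrears") - F.col("planned_arrears"))
)
variance.orderBy("month", "customer_segment").show()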
Data Engineer
Brookson Warrington, Cheshire
Following the acquisition of Brookson by People2.0, the responsibilities of the Data and Analytics team have significantly expanded. We are now seeking a skilled Data Engineer to join our dynamic team. As a Data Engineer within the Data and Analytics team, you will play a pivotal role in designing, maintaining, and optimising the data architecture for People2.0 Group. This architecture ensures the seamless flow of data from source systems to the Data Warehouse, providing a robust foundation of aggregated analytical base tables and operational sources. Our data architecture is critical in enabling People2.0 Group to become a data-driven organisation. Your responsibilities will include maintaining existing data pipelines, developing new branches within the architecture, and collaborating closely with stakeholders across the business. You will engage with stakeholders from various regions, including EMEA, the US, and APAC, ensuring that the architecture aligns with regional and global requirements. Reporting directly to the Senior Data Engineer within the Data and Analytics team, you will work alongside internal and external stakeholders, depending on project requirements. Your ability to communicate effectively and adapt to different regions and business needs will be key to your success. This role offers an exciting opportunity to be part of a team that directly contributes to the data-driven evolution of People2.0 Group. Our Warrington office (WA1) is easily accessible by car and a 10-minute walk from the nearest train station. We offer hybrid working, with a minimum requirement of 2 days in the office and the flexibility to work from home the rest of the week.

What will you be doing as Data Engineer:
• ETL maintenance: collaborating with the Senior Data Engineer to ensure that the ETL process between source systems and the Data Warehouse remains fully operational, with minimal downtime and blockages.
• Implementing new systems: working with key stakeholders to integrate new systems and acquisitions into the People2.0 data architecture. This includes mapping fields to existing systems, evaluating master data management (MDM) opportunities, and building core data sources for end-user consumption.
• Building data marts: working with analytical operations and business analysts to develop key data sources for business end users, including analytical base tables for MI and operational metrics for data integrity.
• Developing native databases: working with the business and/or the Analytics team to build bespoke and native applications that require database structures. This includes designing and optimising the database and promoting it through the development process.
• Improving data literacy: working with the Senior Data Engineer to improve data literacy on the end-point architecture, improving engagement, business buy-in and understanding of the data, and thus promoting a self-serve analytical culture.

What are the qualities that can help you thrive as a Data Engineer?

Essential experience and qualifications:
• Strong SQL skills - tables, stored procedures, performance tuning etc.
• Experience with Azure ETL tools - Azure Data Factory / Synapse.
• Strong experience in data movement methodologies and standards, ELT & ETL.
• A self-motivated, enthusiastic problem solver, with the ability to work under pressure and prioritise workload in order to meet deadlines.
• Educated to degree level (BSc) in, for example, Computer Science, Mathematics, Engineering or another STEM subject.
• A strong team player with empathy, humility and dedication to joint success and shared development.

Desirable experience and qualifications:
• Experience with Databricks or Delta Lake architecture.
• Experience building architecture and data warehousing within the Microsoft stack.
• Experience with development source control (e.g. Bitbucket, GitHub).
• Experience with low-code analytical tools (e.g. Alteryx).
• Experience with the Power Platform stack (Power BI, Power Automate).

In return for joining us as a Data Engineer:
• Salary of £34,000 - £38,000, depending on experience
• 23 days annual leave, plus bank holidays
• Your birthday off
• 2 Press Pause Days (an opportunity to step back, breathe, and focus on your wellness, whatever that may look like)
• Hybrid working
• 5% company pension contribution after 3 months
• Access to free financial advice, including mortgages and savings
• Cycle2Work scheme
• Perkbox employee discounts

Next steps: if you are interested in being considered for this opportunity, please apply with your CV highlighting your relevant skills in relation to the above criteria. Regardless of the outcome of your application, all candidates will be contacted, and we aim to do this within 3 working days.
05/12/2025
Full time
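The ETL maintenance duties above typically revolve around incremental, watermark-driven loads between source systems and the warehouse. The sketch below shows that pattern in plain Python with pyodbc; connection strings, schemas, tables and columns are placeholders, and in practice this logic would more likely live in Azure Data Factory or a stored procedure.

# Watermark-driven incremental load sketch. All names and DSNs are placeholders.
import pyodbc

SOURCE_DSN = "DRIVER={ODBC Driver 18 for SQL Server};SERVER=source.example.com;DATABASE=crm;Trusted_Connection=yes"
TARGET_DSN = "DRIVER={ODBC Driver 18 for SQL Server};SERVER=dwh.example.com;DATABASE=staging;Trusted_Connection=yes"

with pyodbc.connect(SOURCE_DSN) as src, pyodbc.connect(TARGET_DSN) as tgt:
    scur, tcur = src.cursor(), tgt.cursor()

    # 1. Read the high-water mark recorded after the previous load.
    tcur.execute("SELECT last_loaded_at FROM etl.watermark WHERE source_table = ?", "crm.contacts")
    last_loaded_at = tcur.fetchone()[0]

    # 2. Pull only rows changed since then from the source system.
    scur.execute(
        "SELECT contact_id, email, modified_at FROM dbo.contacts WHERE modified_at > ?",
        last_loaded_at,
    )
    rows = scur.fetchall()

    # 3. Land the delta in staging and advance the watermark.
    tcur.executemany(
        "INSERT INTO stg.contacts (contact_id, email, modified_at) VALUES (?, ?, ?)",
        [tuple(r) for r in rows],
    )
    new_watermark = max((r.modified_at for r in rows), default=last_loaded_at)
    tcur.execute(
        "UPDATE etl.watermark SET last_loaded_at = ? WHERE source_table = ?",
        new_watermark, "crm.contacts",
    )
    tgt.commit()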
BI Manager (Azure Data Lake, Fabric, DW)
TXP City, Birmingham
Role: BI Manager
Salary: £70,000 - £80,000 PA plus bonus and benefits
Location: Central Birmingham, West Midlands (hybrid working - 2 days per week onsite)

We are currently working with a leading Midlands-based services provider who require a technically strong Senior BI Manager with a good understanding of Azure data and data engineering tools. Working as a key member of a newly formed Data Engineering team, the successful candidate will lead the design, development, and ongoing enhancement of the client's data and reporting infrastructure. You will be the strategic owner of the Azure Data Platform, overseeing services such as Azure Data Lake, Data Warehouse, Data Factory, Databricks, and Power BI. The technical focus is all Microsoft, primarily Azure, so any Fabric experience would be very beneficial.

Our client is looking for someone who is going to lead the function and has previous experience doing this: someone who really understands data and what it can be used for, who can challenge the business on what they need from the data, and who can challenge the teams to produce the most effective data outputs for the business need, so that the function can improve and become first class. You will need to be able to drive the direction of how data works for the organisation and the overall Data/BI strategy, design solutions that fit, and demonstrate what value data can bring to the company if it is used effectively. A technical background is essential to be able to understand and bridge the gap between the Data Team and the business environment so that the two collaborate effectively and are challenged both ways: someone who can understand and appreciate both the technical side and the business strategy side. Our client offers a good, supportive environment which is going through a major transformation driven by technology.

Skills & experience required:
• Experience leading a BI function
• Expertise in Azure BI architecture and cloud services
• Hands-on experience with Azure Fabric, SQL warehousing, data lakes, Databricks
• Track record in MI/BI product development using Agile and Waterfall methods
• Experience managing cross-functional teams and sprint activities
• Experience in leading a BI team and a business through the development and transition to a Data Lake / Factory / Warehouse
• Technical BI development/architect background

Benefits:
• Achievable bonus scheme
• 4% pension
• Life insurance at 3x salary
• 25 days annual leave plus statutory, with 1 extra day every year for the first 3 years
• Blue Light Card
• Medicash - includes discounted gym memberships etc.

If your profile demonstrates strong and recent experience in the above areas, please submit your application ASAP to Jackie Dean at TXP for consideration. TXP takes great pride in representing socially responsible clients who not only prioritise diversity and inclusion but also actively combat social inequality. Together, we have the power to make a profound impact on fostering a more equitable and inclusive society. By working with us, you become part of a movement dedicated to promoting a diverse and inclusive workforce.
05/12/2025
Full time
Tenth Revolution Group
Senior Databricks Engineer - Oxfordshire Hybrid - £Competitive
Tenth Revolution Group Hook Norton, Oxfordshire
Databricks Engineer Location: Oxfordshire (Hybrid) Salary: Competitive + Benefits Are you an experienced Databricks Engineer looking for your next challenge? The Role This is a hands-on technical role with leadership responsibilities. You'll design and deliver scalable data solutions, work closely with data leaders on architecture and strategy, and mentor a small team of Data Engineers to ensure best practices. Key Responsibilities Build and maintain scalable data pipelines and ETL processes using Databricks Collaborate on data architecture and translate designs into build plans Deliver large-scale data workflows and optimise for performance Implement data quality and validation processes What We're Looking For Strong experience with Databricks Proficiency in Python, Spark, and SQL Experience with cloud platforms Knowledge of pipeline tools Excellent problem-solving and leadership skills If you're passionate about data engineering and want to make an impact, apply today!
05/12/2025
Full time
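The Databricks Engineer listing above centres on building ETL pipelines with data quality and validation checks. As a rough illustration of what that day-to-day work can look like, here is a minimal PySpark sketch that cleans a raw orders dataset and applies a simple validation gate before writing a Delta table. All table and column names are hypothetical, and the snippet assumes it runs in a Databricks notebook where `spark` is already provided.

```python
# Minimal PySpark ETL sketch for a Databricks notebook (hypothetical table and column names).
from pyspark.sql import functions as F

# Read a raw (bronze) Delta table -- assumes the table already exists in the workspace.
raw_orders = spark.read.table("bronze.orders")

# Basic cleaning: trim identifiers, cast amounts, and drop exact duplicates.
clean_orders = (
    raw_orders
    .withColumn("order_id", F.trim(F.col("order_id")))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .dropDuplicates(["order_id"])
)

# Simple data-quality gate: reject the load if any critical field is missing.
bad_rows = clean_orders.filter(
    F.col("order_id").isNull() | F.col("amount").isNull()
).count()
if bad_rows > 0:
    raise ValueError(f"Data quality check failed: {bad_rows} rows with null order_id or amount")

# Write the validated (silver) output as a Delta table.
clean_orders.write.format("delta").mode("overwrite").saveAsTable("silver.orders")
```

In a production pipeline the quality rules would normally be configurable and logged rather than hard-coded, but the shape of the work is the same.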
Senior Azure Data Engineer
Youngs Employment Services
Senior Azure Data Engineer Hybrid - Work From Home and West London Circa £70,000 - £80,000 + Range of benefits A well-known and prestigious business is looking to add a Senior Azure Data Engineer to their data team. This is an exciting opportunity for a Data Engineer who's not just technical, but also enjoys directly engaging and collaborating with stakeholders from across business functions such as finance, operations, planning, manufacturing, retail, e-commerce etc. Having nearly completed the process of migrating data from their existing on-prem databases to an Azure Cloud-based platform, the Senior Data Engineer will play a key role in helping make best use of the data by gathering and agreeing requirements with the business to build data solutions that align accordingly. Working with diverse data sets from multiple systems and overseeing their integration and optimisation will require the development, management and optimisation of data pipelines using tools in the Azure Cloud. Our client has expanded rapidly and been transformed in recent years; they're an iconic business with a special work environment that has fostered a strong and positive culture amongst the whole workforce. This is a hybrid role where the postholder can work from home 2 or 3 days per week; the other days will be based onsite in West London, just a few minutes' walk from a Central Line tube station. The key responsibilities for the post include: Develop, construct, test and maintain data architectures within large scale data processing systems. Develop and manage data pipelines using Azure Data Factory, Delta Lake and Spark. Utilise Azure Cloud architecture knowledge to design and implement scalable data solutions. Utilise Spark, SQL, Python, R, and other data frameworks to manipulate data and gain a thorough understanding of the dataset's characteristics. Interact with API systems to query and retrieve data for analysis. Collaborate with business users / stakeholders to gather and agree requirements. To be considered for the post you'll need at least 5 years' experience, ideally with 1 or 2 years at a senior / lead level. You'll need to be goal driven and able to take ownership of work tasks without the need for constant supervision. You'll be engaging with multiple business areas, so the ability to communicate effectively to understand requirements and build trusted relationships is a must. It's likely you'll have most, if not all, of the following: Experience as a Senior Data Engineer or similar Strong knowledge of Azure Cloud architecture and Azure Databricks, DevOps and CI/CD. Experience with PySpark, Python, SQL and other data engineering development tools. Experience with metadata-driven pipelines and SQL serverless data warehouses. Knowledge of querying API systems. Experience building and optimising ETL pipelines using Databricks. Strong problem-solving skills and attention to detail. Understanding of data governance and data quality principles. A degree in computer science, engineering, or equivalent experience. Salary will be dependent on experience and is likely to be in the region of £70,000 - £80,000, although the client may consider higher for an outstanding candidate. Our client can also provide a vibrant, rewarding, and diverse work environment that supports career development. Candidates must be authorised to work in the UK and not require sponsorship either now or in the future. For further information, please send your CV to Wayne Young at Young's Employment Services Ltd.
Young's Employment Services acts in the capacity of both an Employment Agent and Employment Business.
05/12/2025
Full time
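Among the responsibilities in the Senior Azure Data Engineer listing above is interacting with API systems to retrieve data and managing pipelines built on Spark and Delta Lake. The sketch below is one hedged interpretation of that task: pulling records from a hypothetical REST endpoint with `requests` and landing them as a Delta table via PySpark. The URL, fields, and table name are illustrative, and `spark` is assumed to be supplied by the runtime (for example a Databricks notebook).

```python
# Illustrative API-to-Delta ingestion; endpoint, fields, and table names are hypothetical.
import requests
from pyspark.sql import Row

# Fetch a page of records from a (hypothetical) REST API.
response = requests.get("https://api.example.com/v1/sales", timeout=30)
response.raise_for_status()
records = response.json()  # assumed to be a list of JSON objects

# Convert to a Spark DataFrame; in practice you would define an explicit schema.
rows = [Row(**record) for record in records]
sales_df = spark.createDataFrame(rows)

# Append the new batch to a Delta table for downstream analysis and reporting.
(
    sales_df.write
    .format("delta")
    .mode("append")
    .saveAsTable("raw.sales_api")
)
```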
Harnham - Data & Analytics Recruitment
MLOPs Engineer
Harnham - Data & Analytics Recruitment
MLOps Engineer Outside IR35 - £500-£600 per day Ideally, 1 day per week/fortnight in the office, with flexibility for remote work for the right candidate. A market-leading global e-commerce client is urgently seeking a Senior MLOps Lead to establish and drive operational excellence within their largest, most established data function (60+ engineers). This is a mission-critical role focused on scaling their core on-site advertising platform from daily batch processing to real-time capability. This role suits a hands-on MLOps expert who is capable of implementing new standards, automating deployment lifecycles, and mentoring a large engineering team on best practices. What you'll be doing: MLOps Strategy & Implementation: Design and deploy end-to-end MLOps processes, focusing heavily on governance, reproducibility, and automation. Real-Time Pipeline Build: Architect and implement solutions to transition high-volume model serving (10M+ customers, 1.2M+ product variants) to real-time performance. MLflow & Databricks Mastery: Lead the optimal integration and use of MLflow for model registry, experiment tracking, and deployment within the Databricks platform. DevOps for ML: Build and automate robust CI/CD pipelines using Git to ensure stable, reliable, and frequent model releases. Performance Engineering: Profile and optimise large-scale Spark/Python codebases for production efficiency, focusing on minimising latency and cost. Knowledge Transfer: Act as the technical lead to embed MLOps standards into the core Data Engineering team. Key Skills: Must Have: MLOps: Proven experience designing and implementing end-to-end MLOps processes in a production environment. Cloud ML Stack: Expert proficiency with Databricks and MLflow. Big Data/Coding: Expert Apache Spark and Python engineering experience on large datasets. Core Engineering: Strong experience with Git for version control and building CI/CD / release pipelines. Data Fundamentals: Excellent SQL skills. Nice-to-Have/Desirable Skills DevOps/CICD (Pipeline experience) GCP (Familiarity with Google Cloud Platform) Data Science (Good understanding of maths/model fundamentals for optimisation) Familiarity with low-latency data stores (e.g., Cosmos DB). If you have the capability to bring MLOps maturity to a traditional Engineering team using the MLflow/Databricks/Spark stack, please email: with your CV and contract details.
05/12/2025
Contractor
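The MLOps role above leans heavily on MLflow for experiment tracking, the model registry, and deployment inside Databricks. As a rough, hedged sketch of those mechanics, the snippet below trains a toy scikit-learn model, logs parameters and metrics to an MLflow run, and registers the resulting model. The experiment path and registry name are invented for illustration and are not the client's actual setup.

```python
# Hedged MLflow sketch: experiment tracking plus model registration (names are illustrative).
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

mlflow.set_experiment("/Shared/ads-ranking-demo")  # hypothetical experiment path

X, y = make_classification(n_samples=1_000, n_features=10, random_state=42)

with mlflow.start_run() as run:
    model = LogisticRegression(C=0.5, max_iter=200).fit(X, y)

    # Track the configuration and a simple training metric.
    mlflow.log_param("C", 0.5)
    mlflow.log_metric("train_accuracy", accuracy_score(y, model.predict(X)))

    # Log the model artifact, then register it so it can be promoted through stages.
    mlflow.sklearn.log_model(model, artifact_path="model")
    mlflow.register_model(
        model_uri=f"runs:/{run.info.run_id}/model",
        name="ads_ranking_model",  # hypothetical registry name
    )
```

Moving from batch to real-time serving then typically means loading the registered model behind a low-latency endpoint rather than scoring it in a nightly job.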
Tenth Revolution Group
Data Engineer
Tenth Revolution Group
Data Engineer - AI & Data Integration Up to £75,000 + Bonus I have partnered with a trailblazing technology company that is redefining how industrial enterprises operate, leveraging deep data integration, generative AI and bespoke enterprise workflows to drive transformation at scale. As part of their continued growth, they are seeking a Data Engineer to lead high-impact client delivery and embed their platform into global operations. This is a strategic role where you will collaborate directly with senior stakeholders to tackle complex challenges and deliver tangible business outcomes through data and AI integration. You will be joining a mission-driven team of world-class technologists and business leaders working at the forefront of innovation and real-world impact. In this role, you will: Deliver complex data projects using PySpark and modern data tools Build scalable generative AI workflows using modern infrastructure Collaborate cross-functionally to ensure seamless delivery and adoption Drive innovation and continuous improvement across client engagements To be successful in this role, you will have: Proven experience in data engineering or integration Strong proficiency in Python and PySpark Exposure to generative AI platforms or a passion for building AI-powered solutions Ability to lead client delivery in dynamic, fast-paced environments Familiarity with tools like Airflow, Databricks or dbt is a plus What's on offer: Salary up to £75,000 Bonus and equity options Hybrid working: 3 days in a vibrant central London office, 2 days remote This is just a brief overview of the role. For the full details, simply apply with your CV and I'll be in touch to discuss further.
05/12/2025
Full time
Harnham - Data & Analytics Recruitment
Head of Data Engineering Azure
Harnham - Data & Analytics Recruitment
Head of Data Engineering Hybrid - Kent (3 Days per Week) Up to £130,000 + 30% Bonus + Benefits Are you an experienced data leader ready to take ownership of a large-scale data transformation? We're working with a leading UK financial services group that's on a major journey to modernise its data estate - and they're looking for a Head of Data Engineering to lead the migration from on-premise to a modern, Azure-based cloud platform. This is a high-impact leadership role, ideal for someone who combines strategic vision with hands-on technical credibility and thrives on building, scaling, and delivering enterprise-grade data solutions. Why this role? Lead the migration of legacy on-premise systems to Azure, driving one of the organisation's most strategic technology programmes. Shape and execute the data engineering roadmap, delivering scalable, secure, and compliant data solutions. Build and mentor a high-performing engineering team, embedding modern cloud-first practices. Hybrid flexibility: 3 days per week in the Kent office, with the rest remote. Excellent package: up to £130,000 + 30% bonus and comprehensive benefits. What you'll be doing: Leading the end-to-end migration from on-prem to Azure, defining architecture, governance, and delivery best practices. Designing, developing, and optimising data pipelines, integration frameworks, and ETL processes in Azure. Building and maintaining scalable solutions using Azure Data Lake, Synapse, Data Factory, and Databricks. Establishing robust data governance, quality, and lineage frameworks across all environments. Collaborating closely with data architecture, analytics, and IT to ensure a seamless transition and platform stability. Managing delivery across multiple workstreams, ensuring projects are on time, within budget, and high quality. Developing team capability - coaching engineers and fostering a culture of innovation and ownership. Presenting updates and strategic insights to senior leadership and executive stakeholders. What we're looking for: Proven track record leading data engineering teams through large-scale transformations. Strong, hands-on understanding of Azure data services - e.g. Synapse, Data Factory, Data Lake, Databricks, Purview. Direct experience migrating from on-premise environments to Azure, ideally within a regulated or financial services context. Deep technical knowledge of data architecture, ETL/ELT, and data modelling. Excellent coding skills in SQL and Python, with experience implementing DevOps for data and CI/CD. Strong leadership and stakeholder engagement - able to translate technical progress into business outcomes. Strategic thinker with a delivery mindset and the ability to influence at senior levels. Nice-to-haves: Experience with Snowflake or hybrid multi-cloud data solutions. Familiarity with banking or financial data frameworks such as BCBS239 or IRB. Background in agile delivery and infrastructure automation using Terraform or similar. Package & Benefits: Salary up to £130,000 + 30% annual bonus. Hybrid working - 3 days per week in Kent office. Private medical insurance, life assurance, and pension scheme.
05/12/2025
Full time
Big Red Recruitment
Senior Data Engineer
Big Red Recruitment Atherstone, Warwickshire
Are you a skilled Data Engineer looking to step into the world of architecture? An exciting opportunity has opened for an experienced Data Engineer to join our national Data & Analytics function at a time of significant technical modernisation. The team is about to embark on a greenfield project to build a futureproof data warehousing platform using Azure Data Factory and Databricks, with a clear focus on scalability, quality, and best practice. Working within a specialist Data Platform & Engineering team, you'll join as the expert within Azure Databricks. You'll help to upskill our team's knowledge of Databricks, get involved in data modelling and ETL pipeline development, integrations, and performance tuning. This is an ever-evolving project as the business becomes more and more data-driven, and your knowledge of Databricks will be pivotal in shaping the architecture. This is an ideal role for a Senior Data Engineer ready to step into an architectural role, or an established architect keen to take ownership of a highly visible and strategic data platform project. What you'll be doing: Leading the design and implementation of a new Databricks-based data warehousing solution Designing and developing data models, ETL pipelines, and data integration processes Large scale data processing using PySpark Monitoring, tuning, and optimising data platforms for reliability and performance Upskilling the wider team in Databricks best practices, including modern architecture patterns Location: Atherstone (Hybrid - 3 days office / 2 days remote) Salary: £60,000 to £80,000 depending on experience If you're passionate about shaping modern data platforms and have the technical expertise to make a measurable impact, we'd love to hear from you. Apply now as we have interview slots available! Please note: Visa sponsorship is not provided for this role and we are only considering applications from candidates who have permanent residency in the UK. We are an equal opportunity recruitment company. This means we welcome applications from all suitably qualified people regardless of race, sex, disability, religion, sexual orientation or age. We are particularly invested in Neurodiversity inclusion and offer reasonable adjustments in the interview process. Reasonable adjustments are changes that we can make in the interview process if your disability puts you at a disadvantage compared with others who are not disabled. If you would benefit from a reasonable adjustment in your interview process, please call or email one of our recruiters.
05/12/2025
Full time
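The warehousing role above involves ETL pipeline development and data modelling on a Databricks platform, where incremental loads are commonly handled with Delta Lake merges. Below is a small, hedged PySpark sketch of such an upsert: new or changed customer rows from a staging table are merged into a dimension table. Table and column names are hypothetical, and `spark` is assumed to come from the Databricks runtime.

```python
# Hedged Delta Lake upsert sketch (hypothetical tables); assumes a Databricks runtime.
from delta.tables import DeltaTable

# Incremental batch of changed customers landed by an upstream pipeline.
updates = spark.read.table("staging.customer_updates")

# Target dimension table maintained in the warehouse layer.
dim_customer = DeltaTable.forName(spark, "gold.dim_customer")

# Merge: update existing customers, insert new ones.
(
    dim_customer.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```

A real dimension load would usually add change-tracking columns (for example effective dates for slowly changing dimensions), but the merge pattern is the core building block.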
Crimson
Senior Data Engineer - Azure Data - Burton-on-Trent - Hybrid
Crimson
Senior Data Engineer - Azure Data - Burton-on-Trent - Permanent - Hybrid Salary - £60,000 - £67,000 per annum This role requires 1 day / week in Burton-on-Trent, with hybrid working arrangements. Our client is seeking a highly skilled Senior Data Engineer to join their dynamic IT team, based in Burton-on-Trent. The Senior Data Engineer will come on board to support the Strategic Data Manager in establishing and managing an efficient Business Intelligence technical service, assisting in the advancement of the client's cloud-based data platforms and providing options for timely processing and cost-efficient solutions. A strong background in Azure Data Pipeline development is key for this position. Key Skills & Responsibilities: Build and manage pipelines using Azure Data Factory, Databricks, CI/CD, and Terraform. Optimisation of ETL processes for performance and cost-efficiency. Design scalable data models aligned with business needs. Azure data solutions for efficient data storage and retrieval. Ensure compliance with data protection laws (e.g., GDPR); implement encryption and access controls. Work with cross-functional teams and mentor junior engineers. Manage and tune Azure SQL Database instances. Proactively monitor pipelines and infrastructure for performance and reliability. Maintain technical documentation and lead knowledge-sharing initiatives. Deploy advanced analytics and machine learning solutions using Azure. Stay current with Azure technologies and identify areas for enhancement. Databricks (Unity Catalog, DLT), Data Factory, Synapse, Data Lake, Stream Analytics, Event Hubs. Strong knowledge of Python, Scala, C#, .NET. Experience with advanced SQL, T-SQL, relational databases. Azure DevOps, Terraform, Bicep, ARM templates. Distributed computing, cloud-native design patterns. Data modelling, metadata management, data quality, data as a product. Strong communication, empathy, determination, openness to innovation. Strong Microsoft Office 365 experience Interested? Please submit your updated CV to Lewis Rushton at Crimson for immediate consideration. Not interested? Do you know someone who might be a perfect fit for this role? Refer a friend and earn £250 worth of vouchers! Crimson is acting as an employment agency regarding this vacancy.
05/12/2025
Full time
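The skills list above calls out Databricks Delta Live Tables (DLT) alongside Unity Catalog and proactive pipeline monitoring. As a rough sketch of how a DLT pipeline expresses data-quality rules declaratively, the snippet below defines a cleaned table with an expectation that drops invalid rows. The dataset names, landing path, and quality rule are all illustrative, and the code only runs inside a Databricks DLT pipeline, where the `dlt` module and `spark` are available.

```python
# Illustrative Delta Live Tables definitions; only runs inside a Databricks DLT pipeline.
import dlt
from pyspark.sql import functions as F


@dlt.table(comment="Raw meter readings ingested from the landing zone (hypothetical source).")
def raw_meter_readings():
    return spark.read.format("json").load("/mnt/landing/meter_readings/")


@dlt.table(comment="Validated readings with obviously bad rows dropped.")
@dlt.expect_or_drop("valid_reading", "reading_kwh IS NOT NULL AND reading_kwh >= 0")
def clean_meter_readings():
    return (
        dlt.read("raw_meter_readings")
        .withColumn("ingested_at", F.current_timestamp())
    )
```

DLT records how many rows each expectation drops, which is one way the monitoring responsibility in the listing tends to be met in practice.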
IPS Group
Technical Architect
IPS Group
About the Organisation Our client is a specialist insurer and reinsurer operating globally, with a focus on complex commercial and industrial risks. The business combines deep sector expertise with advanced use of data, analytics, and modern technology to deliver informed, efficient, and fair underwriting decisions. The company is in a period of significant growth and is committed to maintaining a collaborative, inclusive, and high-performing culture. Employees are encouraged to ask questions, explore new ideas, work openly with others, and operate with clarity and simplicity. The environment is dynamic, supportive, and designed to enable long-term career progression. Role Summary Our client is seeking an experienced Technical Architect with strong expertise in integration architecture, systems design best practice, and API design. The role will take technical ownership of the organisation's enterprise technology landscape. This is a pivotal position within the company's scaling and modernisation programme. The successful candidate will act as the bridge between internally developed underwriting platforms, the data lakehouse and analytics estate, and a range of commercial off-the-shelf (COTS) applications used across operations, finance, and support functions. You will be responsible for defining standards, blueprints, and governance for how core systems interact, ensuring security, performance, scalability, and data quality across the entire ecosystem. Key Responsibilities Integration Architecture Define the enterprise-wide integration strategy, including the governance of integration patterns (such as synchronous APIs, event-driven architecture, asynchronous messaging, and batch ETL). Lead the design and documentation of internal and external APIs, ensuring best practice in RESTful design, versioning, and OpenAPI standards. Work with Engineering and Data teams to standardise the technical approach for messaging, data extraction, transformation, and load (ETL/ELT) processes across underwriting, finance, accounting, and data warehouse systems. Design secure, scalable, and auditable architectures for Generative AI, Agentic AI, and large language model (LLM) technologies. Collaborate with data scientists, engineers, and developers to integrate and deploy AI models from development through to production. Define security standards for all integration points, including OAuth, API key management, encryption, and network segmentation within an Azure environment. Platform Architecture Provide technical input into new platform selections and project implementations, ensuring alignment with architectural principles. Work closely with Engineering leadership on the design and evolution of the .NET and Angular technology stack, focusing on modularity, microservices, performance, and technical debt management. Collaborate with the Data Architecture function to ensure alignment between the enterprise architecture and the organisation's data platform, including optimisation of the Databricks environment. Establish and govern non-functional requirements (NFRs) covering performance, scalability, resilience, and disaster recovery. Technical Design & Governance Translate business strategy into an executable architecture roadmap. Conduct architectural and code reviews to ensure alignment with agreed standards. Present technical recommendations, solution designs, and trade-off considerations to both technical and non-technical stakeholders.
Serve as the final escalation point for complex technical design challenges and technology selections. Mentor engineers across software, infrastructure, and data disciplines in architectural best practice and emerging technologies. Essential Experience & Skills At least 10 years in senior technical roles, including 5+ years as an architect specialising in software development, integration, and technical architecture. Advanced knowledge of API implementation, message queues (e.g., Azure Service Bus, Kafka), and event-driven architecture. Deep expertise in RESTful API design, API gateways (ideally Azure API Management), and the OpenAPI specification. Strong understanding of DevOps practices and hands-on experience with CI/CD tooling such as Azure DevOps or GitLab CI. Extensive knowledge of the Microsoft Azure ecosystem, covering IaaS and PaaS, with particular experience in Azure Databricks, Data Factory, and ADLS Gen2. Excellent communication and presentation skills, with the ability to influence diverse technical teams. Desirable Experience Background in the insurance, reinsurance, or wider financial services industry. Strong architectural understanding of the .NET/C# ecosystem and modern front-end frameworks (ideally Angular). Experience with data architecture concepts including relational, NoSQL, and data lake technologies. Knowledge of cloud-based AI architectures, including LLMs, Generative AI, and Agentic AI. Experience integrating proprietary systems with COTS finance or ERP platforms, particularly within general ledger, financial close, or regulatory reporting domains.
04/12/2025
Full time
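A core theme of the Technical Architect role above is RESTful API design with explicit versioning and OpenAPI documentation. Purely as an illustration of those principles, and in Python for consistency with the other sketches in this document (the organisation described runs a .NET/Angular stack, so this is emphatically not their implementation), the snippet below defines a small versioned endpoint whose OpenAPI specification FastAPI generates automatically. The resource model and route are hypothetical.

```python
# Hedged illustration of versioned REST + OpenAPI; not the organisation's .NET implementation.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="Underwriting Integration API", version="1.0.0")  # hypothetical service


class Policy(BaseModel):
    policy_id: str
    insured_name: str
    premium: float


# In-memory stand-in for a real policy store, purely for the sketch.
_POLICIES = {"POL-001": Policy(policy_id="POL-001", insured_name="Example Ltd", premium=12500.0)}


@app.get("/api/v1/policies/{policy_id}", response_model=Policy)
def get_policy(policy_id: str) -> Policy:
    """Versioned read endpoint; the generated OpenAPI schema is served at /docs."""
    policy = _POLICIES.get(policy_id)
    if policy is None:
        raise HTTPException(status_code=404, detail="Policy not found")
    return policy
```

The point of the sketch is the contract-first shape - a versioned path, a typed resource model, and machine-readable documentation - rather than the particular framework.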
Carrington Recruitment Solutions Ltd
Technology Product Owner, AI, Microsoft Stack, Databricks, Remote
Carrington Recruitment Solutions Ltd
Technology Product Owner, AI Data Analytics, Microsoft Stack, Azure, Data Engineering, ML, Azure, Mainly Remote Technology Product Owner required to join a global Professional Services business based in Central London. However, this is practically a remote role, with travel required on occasion (to London, Europe and the States). We need someone who has come from a Development background with a hardened skillset in Microsoft Stack Technologies (C# .NET Core) who has then transitioned into Product Ownership. We need someone highly analytical who can understand large Data Sets and Databricks and is able to bring Proof of Concepts to the table and help with the execution. The platform primarily serves two key personas: Data and Intelligence Delivery specialists, who manage data ingestion, transformation, and orchestration processes, and Assurance professionals, who use the analysers to enhance audit quality and client service (this can be taught - the mentality is development and analytical mindset first, audit-specific knowledge second, which you can learn). This being said, we need DATA-HEAVY Product Owners who have managed complex, global products. Read on for more details. Experience required: Development FIRST mentality. You must have been a Developer before moving into Product Ownership Technical proficiency: Familiarity with Azure services (e.g., Data Lake, Synapse, Fabric) and Databricks for data engineering, analytics, performance optimisation, and governance. Experience with implementing and optimising scalable cloud infrastructure is highly valued. Backlog management: Demonstrated expertise in maintaining and prioritising product backlogs, writing detailed user stories, and collaborating with development teams to deliver sprint goals. Agile product ownership: Experience in SAFe or similar agile frameworks, including daily scrum leadership and sprint planning. Cross-team collaboration: Effective working across engineering, analytics, and business teams to ensure seamless execution. KPI management: Ability to track, analyse, and interpret KPIs to guide product improvements and communicate results to stakeholders. Technical acumen: Solid understanding of modern data platforms, including experience with medallion architecture, AI/ML applications, and cloud-native infrastructures. Communication skills: Excellent communication skills for conveying technical concepts to various audiences, including engineers, business partners, and senior leadership. Collaboration and flexibility: Experience working with distributed teams in dynamic, fast-paced environments. Innovation mindset: Passion for leveraging advanced analytics, AI, and cloud technologies to deliver cutting-edge solutions. This is a great opportunity and salary is dependent upon experience. Apply now for more details
04/12/2025
Full time
Tenth Revolution Group
Senior Data Engineering Consultant - £60,000 - Hybrid
Tenth Revolution Group
Senior Data Engineering Consultant - £60,000 - Hybrid Key Responsibilities Lead, mentor, and develop a team of Technical Consultants. Manage resource planning, scheduling, and overall delivery workflows. Collaborate with Pre-sales, Commercial, and Project Management teams to scope and deliver projects. Contribute to technical delivery, designing scalable data solutions in Azure/Microsoft environments. Support cloud migrations, data lake builds, and ETL/ELT pipeline development. Ensure delivery follows best practices and internal standards. Skills & Experience Strong leadership and relationship-building skills. Experience guiding or managing technical teams. Deep hands-on experience in Data Engineering using Microsoft Fabric, Azure Databricks, Synapse, Data Factory, and/or SQL Server. Expertise in SQL and Python for ETL/ELT development. Knowledge of data lakes, medallion lakehouse architecture, and large-scale dataset management. Solid understanding of BI, data warehousing, and database optimisation. To apply for this role please submit your CV or contact Dillon Blackburn on or at . Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK offering more opportunities across the country than any other recruitment agency. We're the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.
04/12/2025
Full time
Crimson
Senior Data Engineer - Azure Databricks/ SC Clearance
Crimson
Senior Data Engineer - Azure Databricks/ SC Clearance - Contract Active SC Clearance is required for this position Hybrid working - 3 days / week on site required Up to £620 / day - Inside IR35 We are currently recruiting for a highly experienced Senior Data Engineer, required for a leading global transformation consultancy based in London. The Senior Data Engineer will be responsible for providing hands-on, technical expertise within an agile team. The ideal candidate will be well experienced in building and optimising data pipelines, implementing data transformations, and ensuring data quality and reliability. This role requires a strong understanding of data engineering principles, big data technologies, cloud computing (specifically Azure), and experience working with large datasets. Key skills and responsibilities: Design, build, and maintain scalable ETL pipelines for ingesting, transforming, and loading data from APIs, databases, and financial data sources into Azure Databricks. Optimise pipelines for performance, reliability, and cost, incorporating data quality checks. Develop complex transformations and processing logic using Spark (PySpark/Scala) for cleaning, enrichment, and aggregation, ensuring accuracy and consistency across the data lifecycle. Work extensively with Unity Catalog, Delta Lake, Spark SQL, and related services. Apply best practices for development, deployment, and workload optimisation. Program in SQL, Python, R, YAML, and JavaScript. Integrate data from relational databases, APIs, and streaming sources using best-practice patterns. Collaborate with API developers for seamless data exchange. Utilise Azure Purview for governance and quality monitoring. Implement lineage tracking, metadata management, and compliance with governance standards. Partner with data scientists, economists, and technical teams to translate requirements into solutions. Communicate effectively with technical and non-technical audiences. Participate in code reviews and knowledge-sharing sessions. Automate pipeline deployments and implement CI/CD processes in collaboration with DevOps teams. Promote best practices for automation and environment management. Interested? Please submit your updated CV to Lewis Rushton at Crimson for immediate consideration. Not interested? Do you know someone who might be a perfect fit for this role? Refer a friend and earn £250 worth of vouchers!
04/12/2025
Contractor
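The contract above emphasises working with Unity Catalog, Delta Lake, and Spark SQL to clean, enrich, and aggregate financial data. The fragment below is a hedged sketch of that pattern: querying a Unity Catalog table via its three-level name (catalog.schema.table) and writing an aggregated Delta output. All names are invented, and `spark` is assumed to be provided by the Databricks environment.

```python
# Hedged Unity Catalog / Spark aggregation sketch; catalog, schema, and table names are invented.
from pyspark.sql import functions as F

# Read a governed table through its three-level Unity Catalog name.
transactions = spark.table("finance_catalog.curated.transactions")

# Aggregate daily totals per instrument as a simple enrichment step.
daily_totals = (
    transactions
    .groupBy("instrument_id", F.to_date("booked_at").alias("booking_date"))
    .agg(
        F.sum("amount_gbp").alias("total_amount_gbp"),
        F.count("*").alias("trade_count"),
    )
)

# Persist the aggregate back into the catalog as a Delta table for reporting.
daily_totals.write.format("delta").mode("overwrite").saveAsTable(
    "finance_catalog.reporting.daily_instrument_totals"
)
```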

© 2008-2025 IT Job Board