We are seeking a Data & AI Lead for our Platinum account. Our ideal candidate is passionate about Data & AI, possesses deep technical knowledge, and is focused on delivering measurable business impact. This role offers leadership opportunities and client exposure in North America, the UK, and Europe.

You must possess:
- 15-20 years' experience in a reputable Data & AI services firm, working in the Banking & Financial Services vertical.
- Proven Revenue Generation Track Record: a consistent history of delivering and exceeding revenue targets on a YoY basis within the Banking & FS sector, including winning new logos, expanding existing accounts, and converting pipeline into closed business with measurable commercial impact.
- Exceptional Client Engagement & Relationship Management: the ability to engage C-suite executives and senior decision-makers in Banks, Building Societies, and Financial Market Infrastructures with confidence and credibility.
- Outstanding Communication & Compelling Storytelling: you are an articulate, persuasive communicator who can distill complex Data & AI concepts into compelling narratives that resonate with diverse audiences, from technical architects to CDOs.
- Deep Technical Expertise Across Data & AI: comprehensive technical knowledge spanning the entire Data & AI landscape, including cloud platforms (Azure, Databricks, Snowflake, AWS & GCP).
- Knowledge of AI/ML, Gen AI and Agentic AI capabilities: you understand not just the "what" but the "how" and "why" behind these technologies, enabling you to architect enterprise-scale solutions that address real-world Banking & FS challenges.
- Problem-Solving Ability: translate business requirements into scalable, secure, and compliant technical solutions that align with enterprise standards and regulatory frameworks.
- Matrix Organisation Leadership Across Geographies: the ability to work with delivery, pre-sales, and sales teams throughout deal pursuits.

Good to have:
- Bachelor's/Master's degree in IT, Computer Science, Engineering, Business, or Decision Sciences.
- Deep Banking & Financial Services Domain Expertise: 10+ years of progressive experience within the UK&I banking & FS sector, with demonstrable knowledge of retail banking, commercial banking, investment banking, wealth management, or insurance operations.
- Regulatory & Compliance Acumen.
- Active Participation in Banking & FS Industry Forums & Thought Leadership.
- Practice Building & Team Leadership: experience building, mentoring, and scaling high-performing consulting teams.
- Deep Understanding of the UK Banking Regulatory Landscape.
- Willingness to travel 10-20% of the time.

NOTE: 4 days/week onsite
11/12/2025
Full time
Data & AI Senior Consultants

Location: We are flexible: onsite, hybrid or fully remote, depending on what works for you and the client. UK or Netherlands based.

What you will actually be doing
This is not a role where you build clever models that never get used. Your focus is on creating measurable value for clients using data science, machine learning and GenAI, in a consulting and advisory context. You will own work from the very beginning, asking questions like "What value are we trying to create here?" and "Is this the right problem to solve?" through to "It is live, stakeholders are using it and we can see the impact in the numbers." You will work fairly independently, and you will also be someone that more junior team members look to for help and direction. A big part of the job is taking messy, ambiguous business and technical problems and turning them into clear, valuable solutions that make sense to the client. You will do this in a client-facing role. That means you will be in the room for key conversations, providing honest advice, managing expectations and helping clients make good decisions about where and how to use AI.

What your day to day might look like

Getting to the heart of the problem
- Meeting with stakeholders who may not be clear on what they really need
- Using discovery sessions, workshops and structured questioning to uncover the real business problem
- Framing success in terms of value, for example higher revenue, lower cost, reduced risk, increased efficiency or better customer experience
- Translating business goals into a clear roadmap of data and AI work that everyone can understand
- Advising clients when AI is not the right solution and suggesting simpler or more cost-effective alternatives

Consulting and advisory work
- Acting as a trusted advisor to product owners, heads of department and executives
- Helping clients prioritise use cases based on value, feasibility and risk
- Communicating trade-offs in a simple way, for example accuracy versus speed, innovation versus compliance, cost versus impact
- Preparing and delivering client presentations, proposals and updates that tell a clear story
- Supporting pre-sales activities where needed, such as scoping work, estimating effort and defining outcomes
- Managing client expectations, risks and dependencies so there are no surprises

Building things that actually work
Once the problem and value are clear, you will design and deliver production-ready ML and GenAI solutions. That includes:
- Designing and building data pipelines, batch or streaming, that support the desired outcomes
- Working with engineers and architects so your work fits cleanly into existing systems
- Making sure what you build is reliable in production and moves the needle on agreed metrics, not just offline benchmarks
- Explaining design decisions to both technical and non-technical stakeholders

GenAI work
You will work with GenAI in ways that are grounded in real use cases and business value:
- Building RAG systems that improve search, content discovery or productivity rather than existing for their own sake
- Implementing guardrails so models do not leak PII or generate harmful or off-brand content
- Defining and tracking the right metrics so you and the client can see whether a GenAI solution is useful and cost-effective
- Fine-tuning and optimising models so they perform well for the use case and budget
- Designing agentic workflows where they genuinely improve outcomes rather than add complexity
- Helping clients understand what GenAI can and cannot do in practice

Keeping it running
You will set up the foundations that protect value over time:
- Experiment tracking and model versioning so you know what works and can roll back safely
- CI/CD pipelines for ML so improvements reach users quickly and reliably
- Monitoring and alerting for models and data so you can catch issues before they damage trust or results
- Communicating operational risks and mitigations to non-technical stakeholders in plain language

Security, quality and compliance
You will help make sure:
- Data is accurate, traceable and well managed so decisions are sound
- Sensitive data is handled correctly, protecting users and the business
- Regulatory and compliance requirements are met, avoiding costly mistakes
- Clients understand the risk profile of AI solutions and the controls in place

Working with people
You will be a bridge between technical and non-technical teams, inside our organisation and on the client side. That means:
- Explaining complex ML and GenAI ideas in plain language, always tied to business outcomes
- Working closely with product managers, engineers and business stakeholders to prioritise work that matters
- Facilitating workshops, playback sessions and show-and-tells that build buy-in and understanding
- Coaching and supporting junior colleagues so the whole team can deliver more value
- Representing the company professionally in client meetings and at industry events

What we are looking for

Experience
- Around 3 to 6 years of experience shipping ML or GenAI solutions into production
- A track record of seeing projects through from discovery to delivery, with clear impact
- Experience working directly with stakeholders or clients in a consulting, advisory or product-facing role

Education
- A Bachelor's or Master's degree in a quantitative field such as Computer Science, Data Science, Statistics, Mathematics or Engineering, or equivalent experience that shows you can deliver results

Technical skills

Core skills
- Strong Python and SQL, with clean, maintainable code
- Solid understanding of ML fundamentals, for example feature engineering, model selection, handling imbalanced data, choosing and interpreting metrics
- Experience with PyTorch or TensorFlow

GenAI specific
- Hands-on experience with LLM APIs or open source models such as Llama or Mistral
- Experience building RAG systems with vector databases such as FAISS, Pinecone or Weaviate
- Ability to evaluate and improve prompts and retrieval quality using clear metrics
- Understanding of safety practices such as PII redaction and content filtering
- Exposure to agentic frameworks

Cloud and infrastructure
- Comfortable working in at least one major cloud provider: AWS, GCP or Azure
- Familiar with Docker and CI/CD pipelines
- Experience with managed ML platforms such as SageMaker, Vertex AI or Azure ML

Data engineering and MLOps
- Experience with data warehouses such as Snowflake, BigQuery or Redshift
- Workflow orchestration using tools like Airflow or Dagster
- Experience with MLOps tools such as MLflow, Weights & Biases or similar
- Awareness of data and model drift, and how to monitor and respond to it before it erodes value

Soft skills, the things that really matter
- You are comfortable in client-facing settings and can build trust quickly
- You can talk with anyone from a CEO to a new data analyst, and always bring the conversation back to business value
- You can take a vague, messy business problem and turn it into a clear technical plan that links to outcomes and metrics
- You are happy to push back and challenge assumptions respectfully when it is in the client's best interest
- You like helping other people grow and are happy to mentor junior colleagues
- You communicate clearly in writing and in person

Nice to have, not required
Do not rule yourself out if you do not have these. They are a bonus, not a checklist.
- Experience with Delta Lake, Iceberg, Spark, Databricks or Palantir
- Experience optimising LLM serving with tools such as vLLM, TGI or TensorRT-LLM
- Search and ranking experience, for example Elasticsearch or rerankers
- Background in time series forecasting, causal inference, recommender systems or optimisation
- Experience managing cloud costs and IAM so value is not lost to waste
- Ability to work in other languages where needed, for example Java, Scala, Go or Bash
- Experience with BI tools such as Looker or Tableau
- Prior consulting experience or leading client projects end to end
- Contributions to open source, conference talks or published papers that show your ability to share ideas and influence the wider community

Got a background that fits and you're up for a new challenge? Send over your latest CV, expectations and availability. Staffworx Limited is a UK based recruitment consultancy partnering with leading global brands across digital, AI, software, and business consulting. Let's talk about what you could add to the mix.
11/12/2025
Full time
Tenth Revolution Group
Northampton, Northamptonshire
Senior Databricks Engineer - Northampton (Hybrid) - Up to £80K + Benefits

Ready to lead, innovate, and make an impact? We're looking for a Senior Databricks Engineer to join a forward-thinking team and take ownership of cutting-edge data solutions. This is your chance to shape the future of data strategy in a business that has been empowering companies for decades, supporting thousands of SMEs worldwide. We value relationships, trust, and innovation, and offer a flexible, inclusive environment where you can grow, make an impact, and be part of something special.

About the Role
You'll play a key role in designing and delivering scalable data pipelines, collaborating on architecture, and mentoring a small team of Data Engineers. This is a hands-on technical leadership position where your expertise will drive innovation and performance across our data ecosystem.

What You'll Do
- Build and optimise data pipelines using Databricks
- Collaborate on data architecture and strategy
- Deliver large-scale workflows for ingestion, transformation, and validation
- Implement best practices for data quality and governance
- Lead and coach a team of Data Engineers

What We're Looking For
- Significant experience with Databricks (including Unity Catalog)
- Strong skills in Python, Spark and SQL
- Cloud expertise
- Knowledge of pipeline tools (Airflow, ADF)
- Leadership and problem-solving ability

Ready to take the next step? Apply now and join us as our Senior Databricks Engineer!
10/12/2025
Full time
Senior Data Engineer - London (2-3 days on-site) - £65,000 - £72,000 + 20% Bonus + Excellent Benefits

Our client is a leading global hospitality brand undergoing an exciting period of rapid growth and transformation. With significant investment in data and technology, they are building a world-class data platform to power decision-making across every area of the business - from supply chain and logistics to marketing, customer sales and in-store operations. We are seeking an experienced Senior Data Engineer with deep expertise in Databricks to design, build, and optimise the client's data platform. This role will be pivotal in developing scalable data pipelines, enabling advanced analytics, and driving data quality and governance across the organisation. You'll work closely with data scientists, analysts, and business stakeholders to transform raw data into trusted, actionable insights that power critical business decisions.

Required Qualifications
- 6+ years of experience in data engineering
- 3+ years of hands-on experience with Databricks
- Strong working knowledge of Azure
- Strong knowledge of data modelling, ETL/ELT design, and data lakehouse concepts

To apply for this role please email across your CV ASAP.
10/12/2025
Full time
Senior Data Engineer - Azure Data - Burton-on-Trent - Permanent - Hybrid
Salary - £60,000 - £67,000 per annum

This role requires 1 day/week in Burton-on-Trent, with hybrid working arrangements. Our client is seeking a highly skilled Senior Data Engineer to join their dynamic IT team, based in Burton-on-Trent. The Senior Data Engineer will come on board to support the Strategic Data Manager in establishing and managing an efficient Business Intelligence technical service, assisting in the advancement of cloud-based data platforms and providing options for timely processing and cost-efficient solutions. A strong background in Azure Data Pipeline development is key for this position.

Key Skills & Responsibilities:
- Build and manage pipelines using Azure Data Factory, Databricks, CI/CD, and Terraform.
- Optimise ETL processes for performance and cost-efficiency.
- Design scalable data models aligned with business needs.
- Deliver Azure data solutions for efficient data storage and retrieval.
- Ensure compliance with data protection laws (e.g., GDPR); implement encryption and access controls.
- Work with cross-functional teams and mentor junior engineers.
- Manage and tune Azure SQL Database instances.
- Proactively monitor pipelines and infrastructure for performance and reliability.
- Maintain technical documentation and lead knowledge-sharing initiatives.
- Deploy advanced analytics and machine learning solutions using Azure.
- Stay current with Azure technologies and identify areas for enhancement.
- Databricks (Unity Catalog, DLT), Data Factory, Synapse, Data Lake, Stream Analytics, Event Hubs.
- Strong knowledge of Python, Scala, C#, .NET.
- Experience with advanced SQL, T-SQL, relational databases.
- Azure DevOps, Terraform, Bicep, ARM templates.
- Distributed computing, cloud-native design patterns.
- Data modelling, metadata management, data quality, data as a product.
- Strong communication, empathy, determination, openness to innovation.
- Strong Microsoft Office 365 experience.

Interested? Please submit your updated CV to Lewis Rushton at Crimson for immediate consideration. Not interested? Do you know someone who might be a perfect fit for this role? Refer a friend and earn £250 worth of vouchers!

Crimson is acting as an employment agency regarding this vacancy.
10/12/2025
Full time
Role: BI Manager
Rate: £500.00 per day - Inside IR35
Location: Central Birmingham, West Midlands (Hybrid Working - 2 days per week onsite)
Duration: Initial 3-6 months with potential to go permanent

We are currently working with a leading services provider who require a technically strong, Midlands-based Senior BI Manager with a good understanding of Azure Data and Data Engineering tools. Working as a key member of a newly formed Data Engineering team, the successful candidate will lead the design, development, and ongoing enhancement of the client's data and reporting infrastructure. You will be the strategic owner of the Azure Data Platform, overseeing services such as Azure Data Lake, Data Warehouse, Data Factory, Databricks, and Power BI. The technical focus is all Microsoft, primarily Azure, so any Fabric experience would be very beneficial.

Our client is looking for someone who will lead the function and has previous experience doing so: someone who really understands data and what it can be used for, who can challenge the business on what they need from the data, and who can challenge the teams to produce the most effective data outputs for the business need, so that the function can improve and become first-class. You will need to drive the direction of how data works for the organisation and the overall Data/BI strategy, design solutions that fit, and demonstrate what value data can bring to the company if used effectively. A technical background is essential to understand and bridge the gap between the Data Team and the business environment, so that the two collaborate effectively and are challenged both ways. The ideal candidate can understand and appreciate both the technical side and the business strategy side.
Skills & experience required:
- Experience leading a BI function
- Expertise in Azure BI architecture and cloud services
- Hands-on experience with Microsoft Fabric, SQL warehousing, data lakes, Databricks
- Track record in MI/BI product development using Agile and Waterfall methods
- Experience managing cross-functional teams and sprint activities
- Experience leading a BI team and a business through the development of, and transition to, a Data Lake / Factory / Warehouse
- Technical BI development/architect background

If your profile demonstrates strong and recent experience in the above areas, please submit your application ASAP to Jackie Dean at TXP for consideration. TXP takes great pride in representing socially responsible clients who not only prioritise diversity and inclusion but also actively combat social inequality. Together, we have the power to make a profound impact on fostering a more equitable and inclusive society. By working with us, you become part of a movement dedicated to promoting a diverse and inclusive workforce.
09/12/2025
Contractor
Locations: Stockholm, Copenhagen V, Berlin, München, London

Who We Are
Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation: inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures, and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive.

We Are BCG X
We're a diverse team of more than 3,000 tech experts united by a drive to make a difference. Working across industries and disciplines, we combine our experience and expertise to tackle the biggest challenges faced by society today. We go beyond what was once thought possible, creating new and innovative solutions to the world's most complex problems. Leveraging BCG's global network and partnerships with leading organizations, BCG X provides a stable ecosystem for talent to build game-changing businesses, products, and services from the ground up, all while growing their career. Together, we strive to create solutions that will positively impact the lives of millions.

What You'll Do
Our BCG X teams own the full analytics value chain end to end: framing new business challenges, designing innovative algorithms, implementing and deploying scalable solutions, and enabling colleagues and clients to fully embrace AI.
Our product offerings span from fully custom builds to industry-specific, leading-edge AI software solutions. As a (Senior) AI Software Engineer you'll be part of our rapidly growing engineering team and help to build the next generation of AI solutions. You'll have the chance to partner with clients in a variety of BCG regions and industries, and on key topics like climate change, enabling them to design, build, and deploy new and innovative solutions. Additional responsibilities will include developing and delivering thought leadership in scientific communities and papers, as well as leading conferences on behalf of BCG X. We are looking for talented individuals with a passion for software development, large-scale data analytics, and transforming organizations into AI-led, innovative companies.

Successful candidates possess the following:
- 4+ years of experience in a technology consulting environment
- Apply software development practices and standards to develop robust and maintainable software
- Actively involved in every part of the software development life cycle
- Experienced at guiding non-technical teams and consultants in best practices for robust software development
- Optimize and enhance computational efficiency of algorithms and software design
- Motivated by a fast-paced, service-oriented environment and interacting directly with clients on new features for future product releases
- Enjoy collaborating in teams to share software design and solution ideas
- A natural problem-solver, intellectually curious across a breadth of industries and topics
- Master's degree or PhD in a relevant field of study; please provide all academic certificates showing the final grades (A-level, Bachelor, Master, PhD)

Additional tasks: Designing and building data & AI platforms for our clients. Such platforms provide data and (Gen)AI capabilities to a wide variety of consumers and use cases across the client organization.
These platforms are often part of large (AI) transformational journeys BCG undertakes for its clients, and often involve the following engineering disciplines:
- Cloud Engineering
- Data Engineering (not building pipelines, but designing and building the framework)
- DevOps
- MLOps/LLMOps

You will often work with the following technologies:
- Azure, AWS, GCP
- Airflow, dbt, Databricks, Snowflake, etc.
- GitHub, Azure DevOps and related developer tooling and CI/CD platforms; Terraform or other Infrastructure-as-Code
- MLflow, AzureML or similar for MLOps; LangSmith, Langfuse and similar for LLMOps

The difference from our "AI Engineer" role: do you "use/consume" these technologies, or are you the one who "provides" them to the rest of the organization?

What You'll Bring
TECHNOLOGIES: Programming Languages: Python. Experience with additional programming languages is a plus.

Additional info
BCG offers a comprehensive benefits program, including medical, dental and vision coverage, telemedicine services, life, accident and disability insurance, parental leave and family planning benefits, caregiving resources, mental health offerings, a generous retirement program, financial guidance, paid time off, and more. Boston Consulting Group is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity/expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws. BCG is an E-Verify Employer.
09/12/2025
Full time
Senior Data Engineer
Salary: Up to £70,000

I am working with a forward-thinking organisation that is modernising its data platform to support scalable analytics and business intelligence across the Group. With a strong focus on Microsoft technologies and cloud-first architecture, they are looking to bring on a Data Engineer to help design and deliver impactful data solutions using Azure. This is a hands-on role where you will work across the full data stack, collaborating with architects, analysts, and stakeholders to build a future-ready platform that drives insight and decision-making.

In this role, you will be responsible for:
- Building and managing data pipelines using Azure Data Factory and related services.
- Building and maintaining data lakes, data warehouses, and ETL/ELT processes.
- Designing scalable data solutions and models for reporting in Power BI.
- Supporting data migration from legacy systems into the new platform.
- Ensuring data models are optimised for performance and reusability.

To be successful in this role, you will have:
- Hands-on experience creating data pipelines using Azure services such as Synapse and Data Factory.
- Reporting experience with Power BI.
- A strong understanding of SQL, Python, or PySpark.
- Knowledge of the Azure data platform, including Azure Data Lake Storage, Azure SQL Data Warehouse, or Azure Databricks.

Some of the package/role details include:
- Salary up to £70,000
- Hybrid working model, twice per week in Portsmouth
- Pension scheme and private healthcare options
- Opportunities for training and development

This is just a brief overview of the role. For the full details, simply apply with your CV and I'll be in touch to discuss it further.
09/12/2025
Full time
Company Description
Telefónica Tech (part of the Telefónica Group) is a leading NextGen Tech solutions provider with a highly diversified team of over 6,000 exceptionally skilled employees across many nationalities. We serve more than 5.5m customers every day in over 175 countries, with a global ecosystem of market-leading partners. Global strategic hubs: Spain, Brazil, the UK, Germany. The Telefónica Tech UK&I hub has an end-to-end portfolio of market-leading services and develops integrated technology solutions to accelerate digital transformation through: Cloud, Data & AI, Enterprise Applications, Workplace Services and Cyber Security & Networking.

Values: Open, Trusted and Bold

Trusted Partners:
- Microsoft: Top 3 Service Providers, Azure Expert Status, FastTrack & Inner Circle Partner
- HPE: Platinum Partner - FY23 UK&I Solution Provider of the Year
- Palo Alto & CrowdStrike: part of our NextDefense Cyber Security Portfolio
- Fortinet: Elite VIP Program - one of only 2 in the UK
- AWS: Advanced Solution & Managed Service Provider Program

Job Description
Senior AI/Data Engineering Consultant

We are looking for people who will guide us in our growth, innovate, and mentor. We need you to help us break and create the rules to continue to be a place admired for our people, culture and innovation, and to help us be a place that everyone wants to work and no one wants to leave! The role will vary depending on the project but will primarily focus on the delivery of enterprise-level solutions in the Artificial Intelligence, Data Science/Machine Learning and Data Engineering arena. This is a client-facing position, so the ideal candidate must be comfortable speaking with clients, and some occasional travel is required. Our offices are in Farnham and London, and the role can be based at either location. Induction, training and company meets are done at both offices. When we can, we generally get together at either of the offices on a Wednesday or a Friday.
RESPONSIBILITIES
- Working on projects that utilise the Microsoft Azure technology stack across domains such as AI, Data Engineering, and Data Science & Machine Learning
- Satisfying the expectations and requirements of customers, both internal and external
- Supporting others in their development
- Contributing to the internal and external community

REQUIREMENTS
- Industry experience in delivering Microsoft Azure solutions, with a good grounding in all associated areas
- Proven written and spoken English
- Strategic and operational decision-making skills
- Outstanding interpersonal skills
- Ability and appetite to investigate and share new technologies
- Ability to guide, direct or influence people
- Ability to identify opportunities, issues and risks
- Willingness to learn based on feedback
- Able to help others develop
- Ideally degree educated - computer science, data analysis, AI & Machine Learning etc.
- Microsoft certified (nice to have)

TECHNICAL SKILLS
People's skills vary, and that's great because the role varies. You should be comfortable with at least 3 of the core technologies below and have an interest in at least 4 others within the core/supporting/principles. Current Microsoft/Databricks certifications are useful but not mandatory - we'll help you get those!

Core:
- Data Manipulation (SQL, Pandas, PySpark)
- Azure AI (Azure AI Foundry, AI Search, Document Intelligence, AI Services)
- Data Science & Machine Learning (Databricks, Python, scikit-learn, XGBoost, MLflow, EDA)
- Familiarity with LLMs (OpenAI, Prompt Engineering, LangChain)
- Relevant Azure Data & Computation services (ADLS, ADF, Databricks, SQL Databases)

Supporting:
- Azure ML Services
- React/CSS/JavaScript
- Azure infrastructure
- R, PowerShell
- Kubernetes/Docker
- TensorFlow/PyTorch

Principles:
- Data Modelling
- Data Science
- Data Warehouse Theory
- Data Architecture
- Master Data Management

Additional Information
At Telefónica Tech, we believe inclusion is the bridge that empowers everyone to be their authentic selves.
We celebrate and respect our differences because diversity drives innovation and makes us stronger. Be yourself with us, and feel that you belong. We welcome applicants from all backgrounds and identities regardless of age, disability, gender reassignment, marital or civil partnership status, pregnancy or maternity, race, religion or belief, sex, and sexual orientation. We are also committed to equity, accessible hiring practices, and creating an inclusive culture through many means, including TogetHer (Women's network) and our Employee Resource Groups, which include Diversity and Inclusion, Telefónica Tech Pride, Neurodiversity, ELEVATE (African and Caribbean heritage network), and Sustainability. We don't believe hiring is a tick-box exercise, so if you feel that you don't match the job description 100% but would still be a great fit for the role, please get in touch.
08/12/2025
Full time
Role: BI Manager
Salary: £70,000 - £80,000 PA plus bonus and benefits
Location: Central Birmingham, West Midlands (Hybrid Working - 2 days per week onsite)

We are currently working with a leading Midlands-based services provider who require a technically strong Senior BI Manager with a good understanding of Azure Data and Data Engineering tools. Working as a key member of a newly formed Data Engineering team, the successful candidate will lead the design, development, and ongoing enhancement of the client's data and reporting infrastructure. You will be the strategic owner of the Azure Data Platform, overseeing services such as Azure Data Lake, Data Warehouse, Data Factory, Databricks, and Power BI. The technical focus is all Microsoft, primarily Azure, so any Fabric experience would be very beneficial.

Our client is looking for someone who will lead the function and has previous experience doing so: someone who really understands data and what it can be used for, who can challenge the business on what they need from the data, and who can challenge the teams to produce the most effective data outputs for the business need, so that the function can improve and become first-class. You will need to drive the direction of how data works for the organisation and the overall Data/BI strategy, design solutions that fit, and demonstrate what value data can bring to the company if used effectively. A technical background is essential to understand and bridge the gap between the Data Team and the business environment, so that the two collaborate effectively and are challenged both ways. The ideal candidate can understand and appreciate both the technical side and the business strategy side. Our client offers a good, supportive environment which is going through a major technology-driven transformation.
Skills & experience required:
- Experience leading a BI function
- Expertise in Azure BI architecture and cloud services
- Hands-on experience with Microsoft Fabric, SQL warehousing, data lakes, Databricks
- Track record in MI/BI product development using Agile and Waterfall methods
- Experience managing cross-functional teams and sprint activities
- Experience leading a BI team and a business through the development of, and transition to, a Data Lake / Factory / Warehouse
- Technical BI development/architect background

Benefits:
- Achievable bonus scheme
- 4% pension
- Life insurance at 3x salary
- 25 days annual leave plus statutory, with 1 extra day every year for the first 3 years
- Blue Light Card
- Medicash (includes discounted gym memberships etc.)

If your profile demonstrates strong and recent experience in the above areas, please submit your application ASAP to Jackie Dean at TXP for consideration. TXP takes great pride in representing socially responsible clients who not only prioritise diversity and inclusion but also actively combat social inequality. Together, we have the power to make a profound impact on fostering a more equitable and inclusive society. By working with us, you become part of a movement dedicated to promoting a diverse and inclusive workforce.
05/12/2025
Full time
Databricks Engineer
Location: Oxfordshire (Hybrid)
Salary: Competitive + Benefits

Are you an experienced Databricks Engineer looking for your next challenge?

The Role
This is a hands-on technical role with leadership responsibilities. You'll design and deliver scalable data solutions, work closely with data leaders on architecture and strategy, and mentor a small team of Data Engineers to ensure best practices.

Key Responsibilities
- Build and maintain scalable data pipelines and ETL processes using Databricks
- Collaborate on data architecture and translate designs into build plans
- Deliver large-scale data workflows and optimise for performance
- Implement data quality and validation processes

What We're Looking For
- Strong experience with Databricks
- Proficiency in Python, Spark, and SQL
- Experience with cloud platforms
- Knowledge of pipeline tools
- Excellent problem-solving and leadership skills

If you're passionate about data engineering and want to make an impact, apply today!
05/12/2025
Full time
Senior Azure Data Engineer
Hybrid - work from home and West London
Circa £70,000 - £80,000 + range of benefits

A well-known and prestigious business is looking to add a Senior Azure Data Engineer to its data team. This is an exciting opportunity for a Data Engineer who is not just technical, but also enjoys directly engaging and collaborating with stakeholders across business functions such as finance, operations, planning, manufacturing, retail, and e-commerce. Having nearly completed the migration of data from its existing on-prem databases to an Azure cloud-based platform, the business needs the Senior Data Engineer to play a key role in making best use of that data: gathering and agreeing requirements with the business and building data solutions that align accordingly. Working with diverse data sets from multiple systems, and overseeing their integration and optimisation, will require hands-on development, management, and optimisation of data pipelines using tools in the Azure cloud. Our client has expanded rapidly and been transformed in recent years; they're an iconic business whose special work environment has fostered a strong, positive culture across the whole workforce. This is a hybrid role where the postholder can work from home 2 or 3 days per week; the other days will be based onsite in West London, just a few minutes' walk from a Central Line tube station.

The key responsibilities for the post include:
- Develop, construct, test, and maintain data architectures within large-scale data processing systems.
- Develop and manage data pipelines using Azure Data Factory, Delta Lake, and Spark.
- Utilise Azure cloud architecture knowledge to design and implement scalable data solutions.
- Utilise Spark, SQL, Python, R, and other data frameworks to manipulate data and gain a thorough understanding of each dataset's characteristics.
- Interact with API systems to query and retrieve data for analysis.
- Collaborate with business users and stakeholders to gather and agree requirements.

To be considered for the post you'll need at least 5 years' experience, ideally with 1 or 2 years at a senior/lead level. You'll need to be goal-driven and able to take ownership of work tasks without constant supervision. You'll be engaging with multiple business areas, so the ability to communicate effectively to understand requirements and build trusted relationships is a must. It's likely you'll have most, if not all, of the following:
- Experience as a Senior Data Engineer or similar
- Strong knowledge of Azure cloud architecture and Azure Databricks, DevOps, and CI/CD
- Experience with PySpark, Python, SQL, and other data engineering development tools
- Experience with metadata-driven pipelines and SQL serverless data warehouses
- Knowledge of querying API systems
- Experience building and optimising ETL pipelines using Databricks
- Strong problem-solving skills and attention to detail
- Understanding of data governance and data quality principles
- A degree in computer science, engineering, or equivalent experience

Salary will depend on experience and is likely to be in the region of £70,000 - £80,000, although the client may consider higher for an outstanding candidate. Our client can also provide a vibrant, rewarding, and diverse work environment that supports career development. Candidates must be authorised to work in the UK and must not require sponsorship either now or in the future. For further information, please send your CV to Wayne Young at Young's Employment Services Ltd. Young's Employment Services acts in the capacity of both an Employment Agent and an Employment Business.
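The "metadata-driven pipelines" requirement above refers to a common pattern in which ingestion is driven by a control table of source definitions rather than per-source code. A minimal, framework-agnostic sketch in plain Python follows; in practice this metadata would drive Azure Data Factory or Databricks jobs, and all source names, kinds, and locations here are invented for illustration.

```python
# Sketch of a metadata-driven ingestion loop (illustrative only; a real
# implementation would parameterise ADF/Databricks activities from a
# control table rather than an in-memory list).
from dataclasses import dataclass

@dataclass
class SourceConfig:
    name: str
    kind: str        # e.g. "api", "file", "sql"
    location: str    # endpoint, path, or connection string (hypothetical)

# The "metadata": one entry per source, maintained outside the pipeline code.
CONTROL_TABLE = [
    SourceConfig("sales_orders", "sql", "server=...;db=sales"),
    SourceConfig("fx_rates", "api", "https://example.com/rates"),
    SourceConfig("stock_snapshot", "file", "/landing/stock.csv"),
]

def ingest(cfg: SourceConfig) -> str:
    # Dispatch on source kind; each branch would call the relevant
    # connector (REST client, file reader, JDBC, etc.) in a real pipeline.
    handlers = {
        "api": lambda c: f"pulled {c.name} from {c.location}",
        "file": lambda c: f"loaded {c.name} from {c.location}",
        "sql": lambda c: f"extracted {c.name} via query",
    }
    return handlers[cfg.kind](cfg)

def run_pipeline(control_table):
    # Adding a new source becomes a metadata change, not a code change.
    return [ingest(cfg) for cfg in control_table]

results = run_pipeline(CONTROL_TABLE)
```

The point of the pattern, and why employers ask for it, is that onboarding source number fifty costs a row of configuration rather than another bespoke pipeline.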
05/12/2025
Full time
Deerfoot Recruitment Solutions Limited
City, London
Data Engineering Technical Lead - Global Investment Bank
London - Hybrid
Permanent - excellent package + benefits

We are working with one of the world's leading banking groups, with whom we have partnered for 15 years. We are seeking an experienced Data Architect / EDM Developer / Data Engineering Lead to join their International Technology team in London. You will be a key part of the Architecture, Middleware, Data & Enterprise Services (AMD) division, driving data engineering, integration, and automation initiatives across our client's EMEA banking and securities entities. This is a hands-on leadership role, combining technical expertise with mentoring and team leadership.

Key Responsibilities
- Architect, design, and deliver enterprise-wide EDM and data solutions.
- Lead and mentor EDM developers, ensuring high-quality, cost-effective delivery.
- Drive data innovation, automation, and best practices across EMEA.
- Translate business requirements into functional and technical designs.
- Ensure compliance with SDLC, governance, and risk policies.

Skills & Experience - Essential
- Strong SQL Server or Snowflake skills.
- Advanced knowledge of low-code/no-code data engineering / ETL tools, ideally Markit EDM (v19.2+) or similar (e.g. Informatica).
- Proven delivery experience in the Financial Services / Banking sector.
- Deep understanding of SDLC, systems integration, and data warehousing.
- Ability to gather requirements and liaise effectively with business stakeholders.

Desirable Skills
- Cloud (AWS / Azure), Python, PowerShell, APIs.
- Data pipelines, lineage, automation.
- BI tools (Power BI, Tableau, SSRS).
- Modern data architectures (lakehouse, data mesh).
- CI/CD, GitHub, Control-M, dbt/Databricks.

This is an opportunity to join a global top-5 bank with long-term stability, world-class resources, and clear career progression routes. Related titles: Enterprise Data Architect, EDM Developer, Data Engineering Lead, Data Architect, ETL Developer, Data Solutions Architect, Senior Data Engineer (Financial Services).
Apply today for full details. Deerfoot Recruitment Solutions Ltd is a leading independent tech recruitment consultancy in the UK. For every CV sent to clients, we donate £1 to the Born Free Foundation. We are a Climate Action Workforce in partnership with Ecologi. If this role isn't right for you, explore our referral reward programme, with payouts at interview and placement milestones; visit our website for details. Deerfoot Recruitment Solutions Ltd is acting as an Employment Agency in relation to this vacancy.
02/12/2025
Full time
Senior Databricks Engineer - 70,000 - Hybrid

We're looking for a hands-on Senior Databricks Engineer to lead the delivery of scalable data solutions within an Agile environment. Working closely with the Data Product Manager and Data Architect, you will shape and develop our data platform, delivering high-quality pipelines and insights that support strategic decision-making. You will also manage and coach a small team of Data Engineers, driving best practice, consistency, and governance.

Key Responsibilities
- Translate business strategy into data solutions and ensure alignment with product goals.
- Provide technical leadership, breaking initiatives into Features, Epics, and Stories and setting engineering standards.
- Collaborate with the Data Architect to design and implement data architecture and build plans.
- Build and maintain scalable data pipelines, ETL/ELT processes, and large-scale data workflows.
- Optimise data systems for performance, reliability, and scalability.
- Implement data quality processes and maintain data models, schemas, and documentation.
- Operate CI/CD practices in Azure DevOps and contribute to Agile sprint cycles.
- Troubleshoot and resolve pipeline issues promptly.
- Stay current with industry trends and recommend improvements.
- Ensure adherence to governance standards.
- Line-manage and mentor a small team of Data Engineers.

What We're Looking For
- Extensive Databricks experience, including Unity Catalog.
- Strong skills in Python, Spark, and SQL, plus experience with SQL databases.
- Terraform experience for cloud infrastructure as code.
- Experience with Azure and workflow tools (Airflow, ADF).
- Excellent problem-solving ability, communication skills, and attention to detail.
- Experience across Waterfall and Agile methodologies.
- Curious, inclusive, and committed to continuous learning.

To apply for this role, please submit your CV or contact Dillon Blackburn on (phone number removed) or at (url removed).
Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK offering more opportunities across the country than any other recruitment agency. We're the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.
02/12/2025
Full time
Senior Data Engineer
Location: Manchester
Salary: Up to 105,000 and bonus

We are seeking an experienced Data Engineer with expertise in Databricks to join a global consultancy on a major transformation project. This is a fantastic opportunity to work on cutting-edge data solutions in a collaborative, forward-thinking environment.

About the role:
- Work with a global leader in analytics and digital transformation.
- Be part of a high-impact project driving innovation in the insurance domain.
- Enjoy a senior-level role with clear progression opportunities and exposure to strategic decision-making.
- Competitive package: up to 105K base + bonus, plus other benefits.

What We're Looking For
- Proven experience as a Data Engineer.
- Strong hands-on expertise with Databricks.
- Insurance domain experience.
- Solid background in data management.
02/12/2025
Full time
About Us
Makutu designs, builds, and supports Microsoft Azure cloud data platforms. We are a Microsoft Solutions Partner (Azure Data & AI) and are busy building a talented team with the skills to deliver industry-leading data platforms for our customers.

The Role
The Data Engineer role is key to building and growing the in-house technical team at Makutu. The role will give the successful applicant the opportunity for significant career development while working with a range of large businesses for whom data is critical to success. Working as part of the team and with the customer, you'll need excellent written and verbal English language and communication skills. Big growth plans are in place to build a broader and deeper technical capability focused on the Microsoft Azure technology stack, and the Data Engineer position is a key role in the wider capability of our team. Occasional visits to our head office and customer sites will be required.

Key responsibilities:
- Identify, design, and implement working practices across data pipelines, data architectures, testing, and deployment
- Understand complex business requirements and provide solutions to business problems
- Understand modern data architecture approaches and associated cloud-focused solutions
- Define data engineering best practice and share it across the organisation
- Collaborate with the wider team on data strategy

Skills and experience:
- A relevant bachelor's degree in Computing, Mathematics, Data Science, or similar (ideal but not essential)
- A master's degree in Data Science (ideal but not essential)
- Experience building data pipelines with modern practices, including cloud-native technologies, DevOps practices, CI/CD pipelines, and agile delivery
- Experience with data modelling, data warehousing, and data lake solutions
- Ability to communicate effectively with senior stakeholders

Successful candidates will likely possess Azure certifications such as DP-600 and/or DP-700.
Applicants will also have experience with some of the following technologies: Power BI, Power Apps, Blob Storage, Synapse, Azure Data Factory (ADF), IoT Hub, SQL Server, Azure Data Lake Storage, Azure Databricks, Purview, Power Platform, and Python.
02/12/2025
Full time
Senior Data Engineer
Salary: Up to 70,000

I am working with a forward-thinking organisation that is modernising its data platform to support scalable analytics and business intelligence across the Group. With a strong focus on Microsoft technologies and cloud-first architecture, they are looking to bring on a Data Engineer to help design and deliver impactful data solutions using Azure. This is a hands-on role where you will work across the full data stack, collaborating with architects, analysts, and stakeholders to build a future-ready platform that drives insight and decision-making.

In this role, you will be responsible for:
- Building and managing data pipelines using Azure Data Factory and related services.
- Building and maintaining data lakes, data warehouses, and ETL/ELT processes.
- Designing scalable data solutions and models for reporting in Power BI.
- Supporting data migration from legacy systems into the new platform.
- Ensuring data models are optimised for performance and reusability.

To be successful in this role, you will have:
- Hands-on experience creating data pipelines using Azure services such as Synapse and Data Factory.
- Reporting experience with Power BI.
- Strong understanding of SQL, Python, or PySpark.
- Knowledge of the Azure data platform, including Azure Data Lake Storage, Azure SQL Data Warehouse, or Azure Databricks.

Some of the package/role details include:
- Salary up to 70,000
- Hybrid working model, twice per week in Portsmouth
- Pension scheme and private healthcare options
- Opportunities for training and development

This is just a brief overview of the role. For the full details, simply apply with your CV and I'll be in touch to discuss it further.
28/11/2025
Full time
Senior Data Engineering Consultant - 60,000 - Hybrid

Key Responsibilities
- Lead, mentor, and develop a team of Technical Consultants.
- Manage resource planning, scheduling, and overall delivery workflows.
- Collaborate with Pre-sales, Commercial, and Project Management teams to scope and deliver projects.
- Contribute to technical delivery, designing scalable data solutions in Azure/Microsoft environments.
- Support cloud migrations, data lake builds, and ETL/ELT pipeline development.
- Ensure delivery follows best practices and internal standards.

Skills & Experience
- Strong leadership and relationship-building skills.
- Experience guiding or managing technical teams.
- Deep hands-on experience in Data Engineering using Microsoft Fabric, Azure Databricks, Synapse, Data Factory, and/or SQL Server.
- Expertise in SQL and Python for ETL/ELT development.
- Knowledge of data lakes, medallion lakehouse architecture, and large-scale dataset management.
- Solid understanding of BI, data warehousing, and database optimisation.

To apply for this role, please submit your CV or contact Dillon Blackburn on (phone number removed) or at (url removed). Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK, offering more opportunities across the country than any other recruitment agency. We're the proud sponsor and supporter of SQLBits, the Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.
24/11/2025
Full time
Senior Data Engineer - Azure & Databricks Lakehouse
Glasgow (3/4 days onsite)
Exclusive Role with a Leading UK Consumer Business

A rapidly scaling UK consumer brand is undertaking a major data modernisation programme, moving away from legacy systems, manual Excel reporting and fragmented data sources into a fully automated Azure Enterprise Landing Zone + Databricks Lakehouse. They are building a modern data platform from the ground up using Lakeflow Declarative Pipelines, Unity Catalog, and Azure Data Factory, and this role sits right at the heart of that transformation. This is a rare opportunity to join early, influence architecture, and help define engineering standards, pipelines, curated layers and best practices that will support Operations, Finance, Sales, Logistics and Customer Care. If you want to build a best-in-class Lakehouse from scratch, this is the one.

What You'll Be Doing

Lakehouse Engineering (Azure + Databricks)
- Engineer scalable ELT pipelines using Lakeflow Declarative Pipelines, PySpark, and Spark SQL across a full Medallion Architecture (Bronze → Silver → Gold).
- Implement ingestion patterns for files, APIs, SaaS platforms (e.g. subscription billing), SQL sources, SharePoint and SFTP using ADF + metadata-driven frameworks.
- Apply Lakeflow expectations for data quality, schema validation and operational reliability.

Curated Data Layers & Modelling
- Build clean, conformed Silver/Gold models aligned to enterprise business domains (customers, subscriptions, deliveries, finance, credit, logistics, operations).
- Deliver star schemas, harmonisation logic, SCDs and business marts to power high-performance Power BI datasets.
- Apply governance, lineage and fine-grained permissions via Unity Catalog.

Orchestration & Observability
- Design and optimise orchestration using Lakeflow Workflows and Azure Data Factory.
- Implement monitoring, alerting, SLAs/SLIs, runbooks and cost-optimisation across the platform.
DevOps & Platform Engineering
- Build CI/CD pipelines in Azure DevOps for notebooks, Lakeflow pipelines, SQL models and ADF artefacts.
- Ensure secure, enterprise-grade platform operation across Dev → Prod, using private endpoints, managed identities and Key Vault.
- Contribute to platform standards, design patterns, code reviews and the future roadmap.

Collaboration & Delivery
- Work closely with BI/Analytics teams to deliver curated datasets powering dashboards across the organisation.
- Influence architecture decisions and uplift engineering maturity within a growing data function.

Tech Stack You'll Work With
- Databricks: Lakeflow Declarative Pipelines, Workflows, Unity Catalog, SQL Warehouses
- Azure: ADLS Gen2, Data Factory, Key Vault, vNets & Private Endpoints
- Languages: PySpark, Spark SQL, Python, Git
- DevOps: Azure DevOps Repos, Pipelines, CI/CD
- Analytics: Power BI, Fabric

What We're Looking For

Experience
- 5-8+ years of Data Engineering, with 2-3+ years delivering production workloads on Azure + Databricks.
- Strong PySpark/Spark SQL and distributed data processing expertise.
- Proven Medallion/Lakehouse delivery experience using Delta Lake.
- Solid dimensional modelling (Kimball) including surrogate keys, SCD types 1/2, and merge strategies.
- Operational experience: SLAs, observability, idempotent pipelines, reprocessing, backfills.

Mindset
- Strong grounding in secure Azure Landing Zone patterns.
- Comfort with Git, CI/CD, automated deployments and modern engineering standards.
- Clear communicator who can translate technical decisions into business outcomes.

Nice to Have
- Databricks Certified Data Engineer Associate
- Streaming ingestion experience (Auto Loader, structured streaming, watermarking)
- Subscription/entitlement modelling experience
- Advanced Unity Catalog security (RLS, ABAC, PII governance)
- Terraform/Bicep for IaC
- Fabric Semantic Model / Direct Lake optimisation
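The "SCD types 1/2 and merge strategies" requirement above refers to a standard warehousing pattern: a type-2 merge preserves history by closing the current dimension row and inserting a new version whenever a tracked attribute changes. On the platform described here this would be a Delta Lake MERGE inside Databricks; the sketch below shows the same logic in plain Python so it stands alone, using a hypothetical customer dimension with a tracked "segment" attribute:

```python
from datetime import date

def scd2_merge(dim_rows: list, incoming: dict, today: date) -> list:
    """Apply an SCD type-2 merge for one incoming record, keyed on customer_id.

    If the tracked attribute changed, close the current version (is_current=False,
    end_date=today) and append a new current version; otherwise leave history alone.
    """
    current = next(
        (r for r in dim_rows if r["customer_id"] == incoming["customer_id"] and r["is_current"]),
        None,
    )
    if current and current["segment"] == incoming["segment"]:
        return dim_rows  # no change - nothing to do
    if current:
        current["is_current"] = False
        current["end_date"] = today
    dim_rows.append(
        {
            "customer_id": incoming["customer_id"],
            "segment": incoming["segment"],
            "start_date": today,
            "end_date": None,
            "is_current": True,
        }
    )
    return dim_rows

dim = [{"customer_id": 7, "segment": "retail", "start_date": date(2024, 1, 1),
        "end_date": None, "is_current": True}]
dim = scd2_merge(dim, {"customer_id": 7, "segment": "premium"}, date(2025, 6, 1))
print(len(dim))  # 2 versions: the closed retail row plus the current premium row
```

In a production dimension each version would also carry a surrogate key, so fact tables can join to the version that was current at transaction time rather than to the natural key.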
17/11/2025
Full time
Principal Data Engineer - Hybrid (London/Winchester)

We're seeking a hands-on Principal Data Engineer to design and deliver enterprise-scale, cloud-native data platforms that power analytics, reporting, and real-time decision-making. This is a strategic technical leadership role where you'll shape architecture, mentor engineers, and deliver end-to-end solutions across a modern AWS/Databricks stack.

What you'll do
- Lead the design of scalable, secure data architectures on AWS.
- Build and optimise ETL/ELT pipelines for batch and streaming data.
- Deploy and manage Apache Spark jobs on Databricks and Delta Lake.
- Write production-grade Python and SQL for large-scale data transformations.
- Drive data quality, governance, and automation through CI/CD and IaC.
- Collaborate with data scientists, analysts, and business stakeholders.
- Mentor and guide data engineering teams.

What we're looking for
- Proven experience in senior/principal data engineering roles.
- Expertise in AWS, Databricks, Apache Spark, Python, and SQL.
- Strong background in cloud-native data platforms, real-time processing, and data lakes.
- Hands-on experience with tools such as Airflow, Kafka, Docker, and GitLab CI/CD.
- Excellent stakeholder engagement and leadership skills.

What's on offer
- £84,000 salary + 10% bonus
- 6% pension contribution
- Private medical & flexible benefits package
- 25 days annual leave (plus buy/sell options)
- Hybrid working - travel to London or Winchester once/twice per week

Join a company at the forefront of media, connectivity, and smart technology, where your work directly powers millions of daily connections across the UK.
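The batch and streaming pipeline work described above usually relies on incremental, idempotent loads rather than full reloads. One common approach is a high-watermark extract: remember the largest change timestamp you have processed and pull only rows beyond it. The sketch below shows the idea in plain Python with a hypothetical change log; on the stack named in the role, the same pattern would sit inside an Airflow task reading from a database or a Kafka consumer tracking offsets:

```python
def extract_incremental(source_rows: list, watermark: int):
    """Return only rows newer than the stored watermark, plus the new watermark.

    Re-running with the returned watermark yields nothing, which keeps the
    batch job idempotent and safe to retry after a failure.
    """
    fresh = [r for r in source_rows if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in fresh), default=watermark)
    return fresh, new_watermark

# Hypothetical change log keyed by an epoch-style updated_at column.
source = [
    {"id": 1, "updated_at": 100},
    {"id": 2, "updated_at": 150},
    {"id": 3, "updated_at": 200},
]

batch, wm = extract_incremental(source, watermark=120)
print([r["id"] for r in batch], wm)  # [2, 3] 200
rerun, wm = extract_incremental(source, watermark=wm)
print(rerun)  # [] - nothing new, so a retry does no duplicate work
```

The watermark itself would be persisted in durable state (a control table, Airflow XCom, or consumer-group offsets) so the pipeline resumes from the right place across runs.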
06/10/2025
Full time