IT Job Board
  • Home
  • Find IT Jobs
  • Register CV
  • Career Advice
  • Contact us
  • Employers
    • Register as Employer
    • Pricing Plans
  • Recruiting? Post a job
  • Sign in
  • Sign up
Sorry, that job is no longer available. Here are some results that may be similar to the job you were looking for.

21 jobs found

Email me jobs like this
Refine Search
Current Search
data engineer databricks london hybrid
Avanti
DV Cleared Data Architects Wanted
DV Cleared Data Architects Wanted
Multi-Cloud / Data Platforms
London / Bristol / Manchester (hybrid)
£90,000 - £100,000 + bonus + benefits
Active DV Clearance Required

I'm working with a successful multinational tech consultancy that has secured a number of long-term client programmes and is now looking to bring in several DV Cleared Data Architects to meet that demand. If you're DV cleared and tired of being locked into one stack or one client environment, this is worth a look. You'll be working across a range of secure, high-impact programmes with real ownership over how the data platforms are designed and delivered, not just implementing someone else's blueprint.

What you'll be doing
It's a genuine end-to-end architecture role. You won't just be focused on one part of the data lifecycle; you'll own the full picture, from ingestion through to consumption, across different client environments and industries.
  • Designing full data platforms end-to-end, making sure everything flows properly from ingestion through to consumption
  • Working with modern approaches like lakehouse and medallion architectures (Bronze, Silver, Gold)
  • Defining both batch and streaming pipelines depending on the use case
  • Making real decisions around storage, orchestration, governance and security, not just executing a predefined design
  • Working closely with clients and stakeholders to translate business problems into practical technical solutions, across both technical and non-technical audiences
  • Operating in secure, regulated environments where attention to detail and sound judgement matter

Tech environment
It's a multi-cloud setup, so you won't be pigeonholed, but they are happy to consider applications from DV Cleared candidates with experience of one or more of the major cloud platforms.
  • AWS, Azure and GCP, depending on the project
  • Databricks, Snowflake and Synapse
  • Spark and distributed processing frameworks for large-scale data
  • Streaming technologies such as Kafka or Kinesis
You're expected to pick the right tools for the problem, not just default to whatever you know best.

What they're looking for
  • Active DV Clearance - must already be held
  • A strong background in Data Architecture with hands-on experience
  • Solid understanding of modern data patterns - lakehouse, medallion, streaming
  • Experience designing full platforms, not just individual components
  • Comfortable working with clients and stakeholders, not just internal teams
  • A practical mindset: someone who makes sensible trade-offs rather than over-engineering

Why it's worth considering
What tends to appeal to people is the combination of variety and genuine ownership. You're working across different projects and industries, with the freedom to steer your own career based on where your strengths lie. There's a good balance of deep technical work and client-facing delivery, and you'll get real exposure to a wide range of tools, platforms and environments rather than repeating the same patterns on the same stack.

Interested?
If you hold active DV Clearance and want a role where you can design, influence and deliver real data platforms across secure programmes, it's worth a conversation. Get in touch and I'll give you the full picture.

Salary: £90,000 - £100,000 + bonus + benefits
Location: London / Bristol / Manchester / Belfast (hybrid) - 2 days a week in any of their offices.
01/04/2026
Full time
Experis
Enterprise Data Manager (Head of)
Enterprise Data Manager / Head of Data
Hybrid: 2-3 days per week in the office (London)
Permanent
Paying up to £115k + Bonus

Experis are delighted to be partnering with a well-established organisation as they continue to evolve and mature their enterprise data capability during a significant phase of data platform transformation. We are supporting them in the search for an Enterprise Data Manager to take ownership of the organisation's enterprise-wide data agenda. This is a senior, strategic role focused on data leadership, governance, stakeholder alignment and platform direction, sitting above the existing Data Engineering Manager and working across multiple business functions.

This role provides enterprise-level oversight and cohesion across a complex, federated data landscape. The organisation is midway through modernising its data platform and needs an experienced data leader who can bring clarity, alignment and strategic direction across teams. This is a highly visible role operating across technology and business leadership, treating data as a strategic asset and ensuring the organisation is positioned to scale its data and AI capabilities responsibly.

What You'll Be Doing
  • Leading and evolving the enterprise data strategy, turning high-level intent into a clear, actionable roadmap.
  • Providing enterprise leadership across the data ecosystem, bringing alignment between data engineering, architecture, governance and reporting functions.
  • Acting as the senior authority on data across the organisation, influencing senior stakeholders and shaping data-driven decision making.
  • Providing strategic oversight of the organisation's modern data platform, including Azure, Databricks, Lakehouse architecture and early exploration of Microsoft Fabric.
  • Guiding key decisions around sources of truth, data product lifecycle management and platform operating models.
  • Managing the strategic delivery partnership with Telefónica Tech, ensuring strong collaboration, knowledge transfer and service delivery.
  • Overseeing vendor-delivered workstreams and shaping the future sourcing strategy as the internal capability evolves.
  • Supporting the organisation's data governance and AI governance foundations, ensuring strong controls before AI capability is scaled further.
  • Operating across a federated data landscape, aligning multiple teams and reducing duplication while strengthening the organisation's overall data ecosystem.

Experience Required
  • Proven experience operating at Head of Data / Enterprise Data Manager / senior data leadership level.
  • Strong track record working within large, complex or regulated organisations.
  • Deep understanding of modern data platforms, data governance and enterprise data operating models.
  • Experience managing external vendors, managed services and strategic delivery partners.
  • Comfortable leading through organisational change, transformation and ambiguity.
  • Experience working with Databricks, Lakehouse architectures or Microsoft Fabric.
  • Background in regulated or data-rich industries such as finance, insurance, legal or media.

If you'd like to learn more, please contact Jacob Ferdinand at
31/03/2026
Full time
Datatech
AI Engineer - All Seniorities
AI Architect
Financial Services
London (Hybrid)

At Datatech Analytics, we're delighted to partner with a global consulting organisation expanding its AI and Data capability within Financial Services. The firm works with major banks, insurers and capital markets institutions to design and deploy enterprise AI platforms, helping organisations transform how data and AI drive decision making across the business. As demand for AI transformation programmes continues to grow, the business is looking to hire an AI Architect to help shape and deliver complex AI platforms for large financial institutions.

The role
You will design and architect enterprise AI platforms, translating complex business challenges into scalable AI and data solutions. Working closely with engineering teams and senior stakeholders, you'll help organisations move from experimentation with AI to production-ready AI systems embedded within core business platforms.

What you'll be doing
  • Defining enterprise AI architecture strategies for financial services clients.
  • Designing scalable AI platforms and ML infrastructure integrated with enterprise data systems.
  • Architecting end-to-end AI pipelines, from data ingestion through to model deployment.
  • Leading engineering teams across AI, machine learning and data engineering.
  • Engaging senior stakeholders to shape AI transformation programmes and technical strategy.

Technology environment
Typical platforms and technologies include:
  • Cloud platforms such as AWS, Azure or Google Cloud
  • Data platforms including Databricks, Snowflake or BigQuery
  • Python and modern ML frameworks
  • Generative AI, LLM integration and RAG pipelines
  • MLOps tooling and modern ML lifecycle management

The profile
  • Experience designing enterprise AI or data platforms in complex technology environments.
  • A background delivering large-scale transformation programmes, often within consulting or advisory environments.
  • Strong technical leadership combined with the ability to engage senior stakeholders across business and technology teams.
  • Experience working within Financial Services environments is highly valued.

If you'd like to learn more about the opportunity, feel free to reach out for a confidential conversation. APPLY NOW

Datatech Analytics
31/03/2026
Full time
Syntax Consultancy Ltd
Python Full Stack Developer
Python Full Stack Developer
Azure Databricks
London (Hybrid)
6 Month Contract
£550/day (Inside IR35)

Python Full Stack Developer needed with active SC Security Clearance plus Azure Cloud and Azure Databricks experience. 6 Month Contract based in Central London (Hybrid). Start ASAP in March/April 2026. Hybrid Working: 2 days/week remote (WFH) and 3 days/week working on-site from the London office. A chance to work with a leading global IT and Digital transformation business specialising in Government projects.

Key experience, responsibilities + tasks:
  • Strong Python development expertise: develop and maintain Python code for data processing, API development and integration with the Azure Databricks environment.
  • Senior experience in software development, with a focus on both front-end and back-end development.
  • Front-end development using React, Angular or Vue.js; front-end and back-end development of applications and APIs interacting with the Azure Databricks platform.
  • Experience with the Azure cloud platform and services, Azure Databricks, containerisation (Docker) and orchestration (Kubernetes), and CI/CD pipelines.

Technical Environment:
  • Full Stack, Python, RESTful APIs, Git.
  • SQL Server database experience, including working with databases (SQL and NoSQL).
  • Demonstrable experience of reverse engineering existing codebases.
  • Experience with testing frameworks (e.g. pytest, xUnit).
  • Familiarity with economic data or Financial Markets strongly desirable; Banking / Financial Services domain experience preferred.
  • Must hold active SC Security Clearance used on a project within the past 12 months.
31/03/2026
Contractor
Syntax Consultancy Ltd
Azure DevOps Engineer (SC Cleared)
Azure DevOps Engineer (SC Cleared)
London (Hybrid)
6 Month Contract
£(Apply online only)/day (Inside IR35)

DevOps Platform Engineer needed with active SC Security Clearance and strong Azure Cloud expertise, including Databricks + Data Factory. 6 Month Rolling Contract based in London. Start ASAP in March/April 2026. Hybrid Working: 2 days/week remote (WFH) + 3 days/week working from the office in London. A chance to work with a leading global IT transformation business specialising in large-scale Government projects.

  • The Azure Databricks Economic Data Platform is critical for Monetary Analysis, Forecasting + Modelling, used by data scientists, developers + economists.
  • In-depth Azure Cloud platform engineering, including: Azure Databricks, Data Factory, WebApp Service, API Management + APIOps.
  • Building / maintaining core infrastructure + tooling, Infrastructure-as-Code (IaC), DevOps practices, containerisation + orchestration.
  • Creating a self-service, scalable and reliable platform that streamlines development workflows, simplifies infrastructure management + enhances productivity.
  • CI/CD tools: GitHub Actions (preferred), Azure DevOps, Jenkins, GitLab CI.
  • IaC: Terraform, ARM templates, Bicep.
  • Containerisation + orchestration tools: Docker, Kubernetes, AKS.
  • Scripting skills: PowerShell, Bash, YAML, Python.
  • Configuration management tools: Ansible, Puppet, Chef.
  • Azure Certifications preferred: Azure DevOps Engineer Expert, Azure Administrator Associate, Azure Solutions Architect Expert.

SC Security Clearance is essential for this contract.
31/03/2026
Contractor
Syntax Consultancy Ltd
Data Architect
Data Architect
London (Hybrid)
6 Month Contract
£500/day (Outside IR35)

Data Architect needed with Retail sector and large-scale cloud data platform experience, including Databricks, Snowflake and/or BigQuery. 6 Month Rolling Contract. Hybrid Working: 2 days/week remote (WFH) + 3 days/week working from the London office. Start ASAP in March/April 2026. A chance to work with a leading global IT transformation business specialising in large-scale Government projects.

  • Hands-on data architecture experience with 2 or more major cloud data platform implementations (Databricks, Snowflake, BigQuery).
  • Retail industry experience strongly preferred, including Retail data domains: Point of Sale (POS), consumer, trade promotion, supply chain, finance.
  • Supporting the end-to-end data architecture life-cycle, including discovery, design + implementation phases.
  • Experience of large-scale data platform modernisation, consolidation + migration programmes.
  • In-depth experience in data platform architecture + data engineering roles.
  • Evaluating current platforms + designing integration and migration patterns for a future data platform.
  • Performing As-Is assessments of existing data platforms, including: environments, data flows, patterns, governance, performance + cost hotspots.
  • Designing integration patterns to enable safe + efficient cross-platform data analytics + data sharing.
  • Defining target-state technical patterns for how multiple platforms converge / coexist, including: data engineering, data storage, ML/AI, BI, security + governance.
  • Target architecture blueprint, including concrete reference architectures, pattern catalogues + technical guardrails.
  • Cross-cloud integration patterns, including: networking, identity, data movement, catalog/governance integration.
31/03/2026
Contractor
VIQU IT
Data Architect - Palantir Foundry - Contract
Morela is supporting a leading data consultancy on a nationally significant data programme, and we're looking for a hands-on Data Architect to take genuine ownership of the platform architecture: not just lead from a distance, but design and build alongside the team. You'll be responsible for ontology design, data pipeline architecture, and data modelling from scratch, while guiding a team of engineers through implementation and acting as the technical authority with key stakeholders.

What you must bring:
  • Deep, hands-on Data Architecture experience: you've designed and built complex data platforms yourself, not just directed others.
  • Proven ability to design ontologies from the ground up within large or complex environments.
  • Strong data modelling skills across conceptual, logical, and physical layers.
  • End-to-end data pipeline architecture experience, including ingestion, transformation, governance, and optimisation.
  • The confidence to operate as the technical authority, make decisions, and justify architectural choices to senior stakeholders.
  • A track record of supporting engineering teams through implementation, including practical, in-the-weeds guidance when needed.
  • Experience with Palantir Foundry (pilot or full implementation) or equivalent enterprise platforms (Snowflake, Databricks, Azure, AWS, GCP).
  • Ability to deliver under pressure in a high-visibility, mission-critical environment.
  • Clear, concise communication skills: able to translate technical decisions effectively across mixed audiences.

The client is clear: they want someone who has done this work themselves. If you've led a Palantir Foundry pilot or guided a team through a full end-to-end implementation, we'd love to hear from you. Strong Data Architects from other enterprise platforms are also encouraged to apply.

What's on offer:
  • Outside IR35 contract
  • Hybrid working
  • Nationally significant programme with real impact and visibility
  • Competitive day rate aligned to experience

Sector-specific experience is a bonus, but not essential: architectural depth and hands-on delivery capability matter far more. If this sounds like your next move, let's connect.
31/03/2026
Contractor
Morela is supporting a leading data consultancy on a nationally significant data programme, and we're looking for a hands-on Data Architect to take genuine ownership of the platform architecture, not just lead from a distance, but design and build alongside the team. You'll be responsible for ontology design, data pipeline architecture, and data modelling from scratch, while guiding a team of engineers through implementation and acting as the technical authority with key stakeholders. What you must bring: Deep, hands-on Data Architecture experience, you ve designed and built complex data platforms yourself, not just directed others. Proven ability to design ontologies from the ground up within large or complex environments. Strong data modelling skills across conceptual, logical, and physical layers. End-to-end data pipeline architecture experience, including ingestion, transformation, governance, and optimisation. The confidence to operate as the technical authority, make decisions, and justify architectural choices to senior stakeholders. A track record of supporting engineering teams through implementation, including practical, in-the-weeds guidance when needed. Experience with Palantir Foundry (pilot or full implementation) or equivalent enterprise platforms (Snowflake, Databricks, Azure, AWS, GCP). Ability to deliver under pressure in a high-visibility, mission-critical environment. Clear, concise communication skills, able to translate technical decisions effectively across mixed audiences. The client is clear, they want someone who has done this work themselves. If you've led a Palantir Foundry pilot or guided a team through a full end-to-end implementation, we'd love to hear from you. Strong Data Architects from other enterprise platforms are also encouraged to apply. 
What's on offer: an outside IR35 contract, hybrid working, a nationally significant programme with real impact and visibility, and a competitive day rate aligned to experience. Sector-specific experience is a bonus, but not essential; architectural depth and hands-on delivery capability matter far more. If this sounds like your next move, let's connect.
Joseph Harry Ltd
Head of Data SQL Azure AI Finance Financial Services London
Joseph Harry Ltd
Head of Data (Architect Architecture Data Development Engineer Engineering Manager Management Head of Agile Microsoft Azure ML AI Automation Finance Financial Services Fabric Synapse DataBricks Snowflake SQL Banking Asset Management Investment Lending Loans Mortgages) required by our financial client in London. You MUST have the following: good experience as a Data Engineering Manager/Lead Data Architect/Head of Data; management experience; experience aligning the strategy with the needs of the business; excellent design and architecture ability; MS SQL Server; Azure; AI (even if outside work); Agile; experience in a financial environment. The following are DESIRABLE, not essential: Microsoft Fabric, Synapse, Databricks or Snowflake. Role: You will inherit a team of 3, comprising two permanent staff and one contractor. The contractor is senior and the two permanent staff are more junior, making this a very hands-on role. It will be all-encompassing, involving data architecture, engineering for technical delivery, and management covering line-management of the team and alignment of the company's strategy with the roadmap for the data environment. You will also have ownership of data governance and regulatory compliance requirements. On the engineering and architecture side, you will have good experience of leading companies from on-premise virtual machines to Azure. You will be seasoned in taking data projects from inception through design, architecture and technical delivery, contributing to the engineering yourself.
In addition to taking environments to the cloud, you will also have some exposure to AI and ML and be comfortable in assessing what tools and products are most appropriate for the business' goals and evolution. On the managerial side, you will have led teams and have experience with line-management. If you have inherited teams previously, that would also be ideal. You will have worked in an FCA regulated environment and be familiar with the necessary requirements to be compliant from a data perspective. The journey you will take with this team will be to implement better monitoring, automation, migration to the cloud and then the adoption of AI and ML. As the business is c.200 people and the management team is strong and Agile, this could happen very quickly. The technology department has a hybrid working setup. You will be given the flexibility to come into the office as you wish although, in the initial months, it will probably be appropriate to go to the office 2-3 times/week. Salary: £100k - 125k + Bonus + Pension
30/03/2026
Full time
Joseph Harry Ltd
Head of Data SQL Azure AI Finance Financial Services London
Joseph Harry Ltd
Head of Data (Architect Architecture Data Development Engineer Engineering Manager Management Head of Agile Microsoft Azure ML AI Automation Finance Financial Services Fabric Synapse DataBricks Snowflake SQL Banking Asset Management Investment Lending Loans Mortgages) required by our financial client in London. You MUST have the following: good experience as a Data Engineering Manager/Lead Data Architect/Head of Data; management experience; experience aligning the strategy with the needs of the business; excellent design and architecture ability; MS SQL Server; Azure; AI (even if outside work); Agile; experience in a financial environment. The following are DESIRABLE, not essential: Microsoft Fabric, Synapse, Databricks or Snowflake. Role: You will inherit a team of 3, comprising two permanent staff and one contractor. The contractor is senior and the two permanent staff are more junior, making this a very hands-on role. It will be all-encompassing, involving data architecture, engineering for technical delivery, and management covering line-management of the team and alignment of the company's strategy with the roadmap for the data environment. You will also have ownership of data governance and regulatory compliance requirements. On the engineering and architecture side, you will have good experience of leading companies from on-premise virtual machines to Azure. You will be seasoned in taking data projects from inception through design, architecture and technical delivery, contributing to the engineering yourself.
In addition to taking environments to the cloud, you will also have some exposure to AI and ML and be comfortable in assessing what tools and products are most appropriate for the business' goals and evolution. On the managerial side, you will have led teams and have experience with line-management. If you have inherited teams previously, that would also be ideal. You will have worked in an FCA regulated environment and be familiar with the necessary requirements to be compliant from a data perspective. The journey you will take with this team will be to implement better monitoring, automation, migration to the cloud and then the adoption of AI and ML. As the business is c.200 people and the management team is strong and Agile, this could happen very quickly. The technology department has a hybrid working setup. You will be given the flexibility to come into the office as you wish although, in the initial months, it will probably be appropriate to go to the office 2-3 times/week. Salary: £125k - 140k + Bonus + Pension
30/03/2026
Full time
Joseph Harry Ltd
Data Engineering Manager Azure AI Finance Croydon London
Joseph Harry Ltd Croydon, Surrey
Data Engineering Manager (Architect Architecture Data Development Engineer Engineering Management Head of Agile Microsoft Azure ML AI Automation Finance Financial Services Fabric Synapse DataBricks Snowflake SQL) required by our financial client in Croydon, London. You MUST have the following: good experience as a Data Engineering Manager/Lead Data Architect; strong management experience, including inheriting teams and raising standards and performance; a strategy aligned with the needs of the business; excellent design and architecture ability; MS SQL Server; Azure; AI (even if outside work); Agile; experience in a financial environment. The following are DESIRABLE, not essential: Microsoft Fabric, Synapse, Databricks or Snowflake. Role: You will inherit a team of 3, comprising two permanent staff and one contractor. The contractor is senior and the two permanent staff are more junior, making this a very hands-on role. It will be all-encompassing, involving data architecture, engineering for technical delivery, and management covering line-management of the team and alignment of the company's strategy with the roadmap for the data environment. You will also have ownership of data governance and regulatory compliance requirements. On the engineering and architecture side, you will have good experience of leading companies from on-premise virtual machines to Azure. You will be seasoned in taking data projects from inception through design, architecture and technical delivery, contributing to the engineering yourself.
In addition to taking environments to the cloud, you will also have some exposure to AI and ML and be comfortable in assessing what tools and products are most appropriate for the business' goals and evolution. On the managerial side, you will have led teams and have experience with line-management. If you have inherited teams previously, that would also be ideal. You will have worked in an FCA regulated environment and be familiar with the necessary requirements to be compliant from a data perspective. The journey you will take with this team will be to implement better monitoring, automation, migration to the cloud and then the adoption of AI and ML. As the business is c.200 people and the management team is strong and Agile, this could happen very quickly. The technology department has a hybrid working setup. You will be given the flexibility to come into the office as you wish although, in the initial months, it will probably be appropriate to go to the office 2-3 times/week. Salary: £80k - 100k + Bonus + Pension
30/03/2026
Full time
Joseph Harry Ltd
Data Engineering Manager Azure AI Finance Croydon London
Joseph Harry Ltd Croydon, Surrey
Data Engineering Manager (Architect Architecture Data Development Engineer Engineering Management Head of Agile Microsoft Azure ML AI Automation Finance Financial Services Fabric Synapse DataBricks Snowflake SQL) required by our financial client in Croydon, London. You MUST have the following: good experience as a Data Engineering Manager/Lead Data Architect; strong management experience, including inheriting teams and raising standards and performance; a strategy aligned with the needs of the business; excellent design and architecture ability; MS SQL Server; Azure; AI (even if outside work); Agile; experience in a financial environment. The following are DESIRABLE, not essential: Microsoft Fabric, Synapse, Databricks or Snowflake. Role: You will inherit a team of 3, comprising two permanent staff and one contractor. The contractor is senior and the two permanent staff are more junior, making this a very hands-on role. It will be all-encompassing, involving data architecture, engineering for technical delivery, and management covering line-management of the team and alignment of the company's strategy with the roadmap for the data environment. You will also have ownership of data governance and regulatory compliance requirements. On the engineering and architecture side, you will have good experience of leading companies from on-premise virtual machines to Azure. You will be seasoned in taking data projects from inception through design, architecture and technical delivery, contributing to the engineering yourself.
In addition to taking environments to the cloud, you will also have some exposure to AI and ML and be comfortable in assessing what tools and products are most appropriate for the business' goals and evolution. On the managerial side, you will have led teams and have experience with line-management. If you have inherited teams previously, that would also be ideal. You will have worked in an FCA regulated environment and be familiar with the necessary requirements to be compliant from a data perspective. The journey you will take with this team will be to implement better monitoring, automation, migration to the cloud and then the adoption of AI and ML. As the business is c.200 people and the management team is strong and Agile, this could happen very quickly. The technology department has a hybrid working setup. You will be given the flexibility to come into the office as you wish although, in the initial months, it will probably be appropriate to go to the office 2-3 times/week. Salary: £100k - 125k + Bonus + Pension
30/03/2026
Full time
Joseph Harry Ltd
Data Engineering Manager Azure AI Finance Croydon London
Joseph Harry Ltd Croydon, Surrey
Data Engineering Manager (Architect Architecture Data Development Engineer Engineering Management Head of Agile Microsoft Azure ML AI Automation Finance Financial Services Fabric Synapse DataBricks Snowflake SQL) required by our financial client in Croydon, London. You MUST have the following: good experience as a Data Engineering Manager/Lead Data Architect; strong management experience, including inheriting teams and raising standards and performance; a strategy aligned with the needs of the business; excellent design and architecture ability; MS SQL Server; Azure; AI (even if outside work); Agile; experience in a financial environment. The following are DESIRABLE, not essential: Microsoft Fabric, Synapse, Databricks or Snowflake. Role: You will inherit a team of 3, comprising two permanent staff and one contractor. The contractor is senior and the two permanent staff are more junior, making this a very hands-on role. It will be all-encompassing, involving data architecture, engineering for technical delivery, and management covering line-management of the team and alignment of the company's strategy with the roadmap for the data environment. You will also have ownership of data governance and regulatory compliance requirements. On the engineering and architecture side, you will have good experience of leading companies from on-premise virtual machines to Azure. You will be seasoned in taking data projects from inception through design, architecture and technical delivery, contributing to the engineering yourself.
In addition to taking environments to the cloud, you will also have some exposure to AI and ML and be comfortable in assessing what tools and products are most appropriate for the business' goals and evolution. On the managerial side, you will have led teams and have experience with line-management. If you have inherited teams previously, that would also be ideal. You will have worked in an FCA regulated environment and be familiar with the necessary requirements to be compliant from a data perspective. The journey you will take with this team will be to implement better monitoring, automation, migration to the cloud and then the adoption of AI and ML. As the business is c.200 people and the management team is strong and Agile, this could happen very quickly. The technology department has a hybrid working setup. You will be given the flexibility to come into the office as you wish although, in the initial months, it will probably be appropriate to go to the office 2-3 times/week. Salary: £125k - 150k + Bonus + Pension
30/03/2026
Full time
Damia Group LTD
Data Architect
Damia Group LTD Hounslow, London
Data Architect - £70-90K base (DOE) - West London (hybrid)

We are recruiting a Data Architect for one of our clients based in West London on a permanent basis. The Data Architect leads the definition, standardisation, and governance of data architecture across platforms and products. The role balances technical leadership, hands-on data architecture, and collaboration with engineering, product, and security teams to ensure scalable, reliable, and secure systems.

Key responsibilities:
- Enforce data architecture guidelines and consistency across development teams and services
- Support established Data Governance and Data Quality frameworks, including tooling, policy enforcement, and stewardship models
- Ensure robust metadata management, lineage tracking, and data cataloguing using business glossaries and modern catalog tools
- Review and approve data architecture for major features, platforms, and technical initiatives
- Collaborate with technical leads and DevOps on system scalability, performance, and reliability
- Ensure data platforms are AI/ML-ready, with scalable infrastructure and clean, well-structured data pipelines
- Collaborate with data science and analytics teams to enable model deployment, automation, and MLOps best practices
- Promote innovation in generative AI, predictive analytics, and real-time decision support
- Align data architecture with security, compliance, and data governance requirements
- Lead the evolution of technical architecture documentation, models, and decision records
- Conduct architecture and design reviews with cross-functional teams
- Guide teams in the adoption of best practices in API design, modularity, cloud-native patterns, and event-driven systems
- Recommend data management best practices covering data flows, architecture patterns, retention, archival, and purging strategies
- Coach and mentor engineers on data design, refactoring, and architectural reasoning

Essential skills and experience:
- Proven experience designing and scaling enterprise-grade cloud data platforms (AWS preferred)
- Deep experience with AWS, Databricks, Power Platform, and Redshift (Snowflake a plus)
- Proficiency in AWS Glue, Qlik Talend, dbt, Airflow, and modern data integration tools
- Excellent knowledge of Python, SQL, and Power Query (M); Scala or PySpark preferred
- Working knowledge of enterprise architecture frameworks (e.g. TOGAF), MLOps, and BI tools such as Power BI and QuickSight
- Experience with generative AI platforms (e.g. Amazon Bedrock, Anthropic)
- Familiarity with infrastructure as code (Terraform), CI/CD practices (Jenkins, GitHub Actions), and observability (Grafana, Kibana)
- Proficiency in scripting and automation using Bash, Groovy, or equivalent
- Ability to balance long-term architectural vision with immediate delivery constraints

The advertised salary range is dependent on experience and the required qualifications. Damia Group Limited acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers. By applying for this permanent job, you accept our Data Protection Policy, which can be found on our website. Please note that no terminology in this advert is intended to discriminate on the grounds of a person's gender, marital status, race, religion, colour, age, disability or sexual orientation. Every candidate will be assessed only in accordance with their merits, qualifications and ability to perform the duties of the job. Damia Group is acting as an Employment Business in relation to this vacancy and in accordance with the Conduct Regulations 2003.
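Several of the responsibilities above (metadata management, lineage tracking, data cataloguing) amount to recording, for every dataset, a glossary definition and the upstream datasets it was derived from. A minimal, purely illustrative sketch of that idea in plain Python — all class and dataset names here are invented, not the API of any catalog tool mentioned in the advert:

```python
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    """One dataset in a toy data catalog: glossary term plus lineage edges."""
    name: str
    description: str                              # business-glossary definition
    upstream: list = field(default_factory=list)  # datasets this one is derived from

class Catalog:
    def __init__(self):
        self._entries = {}

    def register(self, entry: CatalogEntry):
        self._entries[entry.name] = entry

    def lineage(self, name: str) -> list:
        """Walk upstream edges to list every ancestor dataset."""
        seen, stack = [], list(self._entries[name].upstream)
        while stack:
            parent = stack.pop()
            if parent not in seen:
                seen.append(parent)
                stack.extend(self._entries[parent].upstream)
        return seen

catalog = Catalog()
catalog.register(CatalogEntry("orders_raw", "Orders as landed from source systems"))
catalog.register(CatalogEntry("orders_clean", "Deduplicated orders", ["orders_raw"]))
catalog.register(CatalogEntry("revenue_report", "Daily revenue", ["orders_clean"]))

print(catalog.lineage("revenue_report"))  # ['orders_clean', 'orders_raw']
```

Real catalog tools add access control, versioning, and automated lineage capture, but the core data model is close to this.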
26/03/2026
Full time
Stott and May
Principal Data Engineer
Principal Data Engineer - Hybrid (London/Winchester)

We're seeking a hands-on Principal Data Engineer to design and deliver enterprise-scale, cloud-native data platforms that power analytics, reporting, and real-time decision-making. This is a strategic technical leadership role where you'll shape architecture, mentor engineers, and deliver end-to-end solutions across a modern AWS/Databricks stack.

What you'll do
- Lead the design of scalable, secure data architectures on AWS.
- Build and optimise ETL/ELT pipelines for batch and streaming data.
- Deploy and manage Apache Spark jobs on Databricks and Delta Lake.
- Write production-grade Python and SQL for large-scale data transformations.
- Drive data quality, governance, and automation through CI/CD and IaC.
- Collaborate with data scientists, analysts, and business stakeholders.
- Mentor and guide data engineering teams.

What we're looking for
- Proven experience in senior/principal data engineering roles.
- Expertise in AWS, Databricks, Apache Spark, Python, and SQL.
- Strong background in cloud-native data platforms, real-time processing, and data lakes.
- Hands-on experience with tools such as Airflow, Kafka, Docker, and GitLab CI/CD.
- Excellent stakeholder engagement and leadership skills.

What's on offer
- £84,000 salary + 10% bonus
- 6% pension contribution
- Private medical & flexible benefits package
- 25 days annual leave (plus buy/sell options)
- Hybrid working - travel to London or Winchester once or twice per week

Join a company at the forefront of media, connectivity, and smart technology, where your work directly powers millions of daily connections across the UK.
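The pipeline work this role describes follows the classic extract-transform-load shape: read from a source, validate and normalise, append to a target table. As a language-level illustration only — no Spark or Databricks APIs, and the record layout is invented — a batch step might look like:

```python
from datetime import date

def extract(rows):
    """Stand-in for a source read; in a real pipeline this would be a Spark read."""
    return list(rows)

def transform(rows):
    """Normalise types and drop records that fail basic validation."""
    out = []
    for r in rows:
        if r.get("amount") is None:          # simple data-quality gate
            continue
        out.append({
            "order_id": int(r["order_id"]),
            "amount_gbp": round(float(r["amount"]), 2),
            "order_date": date.fromisoformat(r["date"]),
        })
    return out

def load(rows, sink):
    """Append to the target table (a list standing in for a Delta table)."""
    sink.extend(rows)
    return len(rows)

raw = [
    {"order_id": "1", "amount": "19.991", "date": "2025-10-01"},
    {"order_id": "2", "amount": None,     "date": "2025-10-01"},  # rejected by the gate
]
table = []
loaded = load(transform(extract(raw)), table)
print(loaded, table[0]["amount_gbp"])  # 1 19.99
```

On Databricks the same three stages map to a read, a DataFrame transformation, and an append to Delta Lake; the validation-before-load discipline is what carries over.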
06/10/2025
Full time
La Fosse Associates Limited
ML Ops Engineer (Contract) - Financial Services - Outside IR35
ML Ops Lead (Engineer) required for a 6-month contract with a global financial services firm in London. Join a high-impact team building out scalable AI/ML solutions on an Azure and Databricks platform supporting various areas of the organisation.

- Establish a solid structure for Gen AI deployment and embed best practices within the team, leading on ML Ops frameworks and CI/CD pipeline design
- Build and scale AI/ML tooling across quant, risk, and trading desks
- Drive adoption of GenAI and LLMs with Azure OpenAI, Unity Catalog, and more
- Partner with quant teams and central AI functions

Required:
- 2+ years ML Ops experience plus 3+ years AI/ML engineering
- Excellent knowledge of ML frameworks and associated libraries in Python
- Expertise in Azure Databricks, Python, Azure DevOps, and the ML lifecycle
- Financial data/trading/derivatives domain experience is a plus

Two-stage interview process. Location: hybrid - Central London office and remote (flexible). Duration: 6 months (possible extension).
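One concrete piece of "ML Ops frameworks & CI/CD pipeline design" is a promotion gate: a pipeline step that only registers a new model if it clearly beats the one in production. A deliberately minimal sketch — the metric, threshold, and step names are invented for illustration, not any particular framework's API:

```python
def should_promote(candidate_auc: float, production_auc: float,
                   min_gain: float = 0.01) -> bool:
    """Gate used in a CI/CD step: promote only on a clear improvement."""
    return candidate_auc >= production_auc + min_gain

def ci_step(candidate_auc: float, production_auc: float) -> str:
    """Return the outcome a CI job would report for this evaluation."""
    if should_promote(candidate_auc, production_auc):
        return "register-and-deploy"
    return "keep-current-model"

print(ci_step(0.87, 0.85))   # register-and-deploy (beats production by >= 0.01)
print(ci_step(0.851, 0.85))  # keep-current-model (gain below threshold)
```

In an Azure Databricks setup this check would typically run in an Azure DevOps pipeline against metrics logged during training, with the registration step writing to a model registry such as Unity Catalog.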
03/10/2025
Contractor
Tenth Revolution Group
Lead Azure Data Engineer
Contract Opportunity: Lead Azure Data Engineer (Hybrid - £500/day Outside IR35)

We're hiring a Lead Azure Data Engineer to join our team on a hybrid contract based in London, supporting key finance stakeholders and transforming our data platform. This role offers the chance to shape the future of financial reporting through cutting-edge cloud engineering and data architecture.

- Location: Central London (UK-based candidates only)
- Rate: £500/day
- IR35 Status: Outside IR35
- Start Date: ASAP - interviews next week
- Duration: 6 months with potential for extension

The Role
You'll lead the design and delivery of scalable data products that support financial analysis and decision-making. Working closely with BI and Analytics teams, you'll help evolve our data warehouse and implement best-in-class engineering practices across Azure.

Key Responsibilities
- Build and enhance ETL/ELT pipelines in Azure Databricks
- Develop facts and dimensions for financial reporting
- Collaborate with cross-functional teams to deliver robust data solutions
- Optimise data workflows for performance and cost-efficiency
- Implement governance and security using Unity Catalog
- Drive automation and CI/CD practices across the data platform
- Explore new technologies to improve data ingestion and self-service

Essential Skills
- Azure Databricks: expert in Spark (SQL, PySpark) and Databricks Workflows
- Data pipeline design: proven experience in scalable ETL/ELT development
- Azure services: Data Lake, Blob Storage, Synapse
- Data governance: Unity Catalog, access control, metadata management
- Performance tuning: partitioning, caching, Spark job optimisation
- Cloud architecture: infrastructure-as-code, monitoring, automation
- Finance domain knowledge: experience with financial systems and reporting
- Data modelling: Kimball methodology, star schemas
- Retail experience: preferred but not essential

About the Team
We're a collaborative data function made up of BI Developers and Data Engineers. We work end-to-end on solutions, share knowledge, and support each other's growth. Our culture values curiosity, innovation, and continuous learning.

Interested? We have limited interview slots next week and aim to fill this role by the end of the month. Please send me a copy of your CV if you meet the requirements.
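"Developing facts and dimensions" in the Kimball sense means resolving each incoming business key to a surrogate key on the dimension before writing the fact row. A schematic, library-free sketch of that lookup-or-insert step — the table and column names here are made up for illustration:

```python
# Toy star schema: a customer dimension keyed by surrogate key,
# and a fact loader that resolves business keys against it.
dim_customer = {}   # business key -> surrogate key
next_surrogate = 1

def dim_lookup(business_key: str) -> int:
    """Return the surrogate key, inserting a new dimension row if unseen."""
    global next_surrogate
    if business_key not in dim_customer:
        dim_customer[business_key] = next_surrogate
        next_surrogate += 1
    return dim_customer[business_key]

def load_fact(sales):
    """Build fact rows that carry surrogate keys plus the measure."""
    return [
        {"customer_sk": dim_lookup(s["customer_id"]), "amount": s["amount"]}
        for s in sales
    ]

facts = load_fact([
    {"customer_id": "C-001", "amount": 120.0},
    {"customer_id": "C-002", "amount": 80.0},
    {"customer_id": "C-001", "amount": 40.0},  # same customer, same surrogate key
])
print([f["customer_sk"] for f in facts])  # [1, 2, 1]
```

In Databricks this pattern is usually expressed as a join or MERGE against the dimension table; the invariant is the same: facts never carry raw business keys, only surrogate keys.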
02/10/2025
Contractor
Client Server Ltd.
Python Software Engineer Databricks
Python Software Engineer / Developer (Azure, Databricks, FastAPI) - London, to £90k

Are you a data-centric Python Software Engineer? You could be progressing your career in a senior, hands-on role at a fast-growing, global insurance underwriting technology provider that focuses on developing a complex SaaS calculation platform for the rapidly growing cybersecurity insurance market.

As a Python Software Engineer you will design and develop data-centric backend services for the core platform, with a focus on Python coding and building FastAPI services within an Azure environment with Databricks. You'll participate in the design and implementation of advanced mechanisms for data ingestion, transformation, and massively parallel orchestration of network IO, and will model data for a variety of purposes. You'll also collaborate with the Infrastructure team to maintain Infrastructure as Code and develop new features on the engineering platform.

Location / WFH: There's a hybrid work-from-home model with three days a week in the high-spec London City office with rooftop bar.

About you:
- You have strong Python software engineering skills
- You have a good understanding of working with data at scale and strong SQL skills (PostgreSQL, SQL Server, and Databricks)
- You have experience of building FastAPI services
- You have a thorough knowledge of Computer Science fundamentals, including data structures, design patterns, and OOP
- You're collaborative and pragmatic with great communication skills

What's in it for you: As a Python Software Engineer you will receive a competitive package:
- Salary to £90k + bonus
- 25 days holiday
- Private Medical Insurance (including dental and optical cashback)
- Life Insurance, Income Protection
- Pension
- Subsidised gym membership
- 4 paid volunteering days per year
- Season ticket loan
- Employee Assistance Programme
- Impactful role with great career progression

Apply now to find out more about this Python Software Engineer / Developer opportunity.

At Client Server we believe in a diverse workplace that allows people to play to their strengths and continually learn. We're an equal opportunities employer whose people come from all walks of life and will never discriminate based on race, colour, religion, sex, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status. The clients we work with share our values.
01/10/2025
Full time
Experis IT
Databricks Engineer
Databricks Engineer - London - hybrid, 3 days per week on-site - 6 months+ - umbrella only, Inside IR35

Key Responsibilities
- Design, develop, and maintain ETL/ELT pipelines using Airflow for orchestration and scheduling.
- Build and manage data transformation workflows in dbt running on Databricks.
- Optimise data models in Delta Lake for performance, scalability, and cost efficiency.
- Collaborate with analytics, BI, and data science teams to deliver clean, reliable datasets.
- Implement data quality checks (dbt tests, monitoring) and ensure governance standards.
- Manage and monitor Databricks clusters and SQL Warehouses to support workloads.
- Contribute to CI/CD practices for data pipelines (version control, testing, deployments).
- Troubleshoot pipeline failures, performance bottlenecks, and scaling challenges.
- Document workflows, transformations, and data models for knowledge sharing.

Required Skills & Qualifications
- 3-6 years of experience as a Data Engineer (or similar).
- Hands-on expertise with dbt (dbt-core, the dbt-databricks adapter, testing, and documentation), Apache Airflow (DAG design, operators, scheduling, dependencies), and Databricks (Spark, SQL, Delta Lake, job clusters, SQL Warehouses).
- Strong SQL skills and understanding of data modelling (Kimball, Data Vault, or similar).
- Proficiency in Python for scripting and pipeline development.
- Experience with CI/CD tools (e.g. GitHub Actions, GitLab CI, Azure DevOps).
- Familiarity with cloud platforms (AWS, Azure, or GCP).
- Strong problem-solving skills and ability to work in cross-functional teams.

All profiles will be reviewed against the required skills and experience. Due to the high number of applications, we will only be able to respond to successful applicants in the first instance. We thank you for your interest and the time taken to apply!
01/10/2025
Contractor
Jacobs
Data Engineer
At Jacobs, we'll inspire and empower you to deliver your best work so you can evolve, grow and succeed - today and into tomorrow. With more than 55,000 people in 40 countries, working at Jacobs offers an exciting range of opportunities to develop your career within a supportive and diverse team who always strive to do the right thing for our people, clients and communities.

People are Jacobs' greatest asset, and we offer a competitive package to retain and attract the best talent. In addition to the benefits you'd expect, UK employees also receive free single medical cover and a digital GP service, family-friendly benefits such as enhanced parental leave pay, free membership of employee assistance and parental programmes, plus reimbursement towards relevant professional development and memberships. We also give back to our communities through our Collectively programme, which incorporates matched funding, paid volunteering time and charitable donations.

Work-life balance and flexibility is a key focus area for Jacobs. We're happy to discuss hybrid, part-time and flexible working hours, patterns and locations to suit you and our business.

About the Opportunity
We have a unique and exciting opportunity to join the Digital Factory as a Data Engineer (Azure). As a Data Engineer, you will be working closely with a team of data scientists and software developers to design, provision and implement the appropriate technologies for our solutions. You will assist our business in setting up systems for data ingestion from business operations. The successful candidate will need an up-to-date technical skill base and a willingness to regularly review and develop their skills to ensure the team is best positioned to respond to market changes and developments; this development will also be supported by the team. In this role you will be an expert who is one of the go-to people for queries relating to data engineering from across the business. Working closely with the team, you will consult, analyse the requirements and develop solutions to allow real business change.

What we're looking for:
- Good knowledge of Azure cloud services
- Experience in any of the Azure data solutions (Data Lake, Data Factory, Synapse Analytics, etc.)
- Experience in relational or non-relational database solutions
- Programming skills (Python, C#, etc.)
- Understanding of data architecture patterns, including data mesh, microservices, etc.
- Familiarity with Agile ways of working

Bonus points for:
- The ability to build data pipelines within Azure Data Factory
- Databricks/Spark
- Microsoft certifications
- Experience in designing and implementing distributed systems
- Experience with alternative cloud providers

We are a professional data and technology house; knowledge of engineering helps but is by no means required.

Our Culture
Our values stand on a foundation of safety, integrity, inclusion and diversity. We put people at the heart of our business and we truly believe that by supporting one another through our culture of caring, we all succeed. We value positive mental health and a sense of belonging for all employees. Find out more about life at Jacobs.

We aim to embed inclusion and diversity in everything we do. We know that if we are inclusive, we're more connected, and if we are diverse, we're more creative. We accept people for who they are, regardless of age, disability, gender identity, gender expression, marital status, mental health, race, faith or belief, sexual orientation, socioeconomic background, and whether you're pregnant or on family leave. This is reflected in our wide range of Global Employee Networks centred on inclusion and diversity - ACE, Careers, Enlace, Harambee, OneWorld, Prism, Vetnet, and Women's - find out more about our employee networks here. Jacobs partners with VERCIDA to help us attract and retain diverse talent and to provide greater online accessibility when viewing and accessing our roles.

As a Disability Confident employer, we will interview all disabled applicants who meet the minimum criteria for a vacancy. We welcome applications from candidates who are seeking flexible working and from those who may not meet all the listed requirements for a role. If you require further support or reasonable adjustments with regards to the recruitment process (for example, if you require the application form in a different format), please contact the team. Your application experience is important to us and we're keen to adapt to make every interaction even better. We welcome feedback on our recruitment process, and if you need more from us before deciding to join Jacobs then please let us know.
01/02/2022
Full time
Senior Data Engineer (Cloud, Azure, SQL, Python, Java)
Ampersand Consulting
The UK's leading provider of IT solutions and services is looking to hire an experienced Senior Data Engineer (Cloud, Azure, SQL, Python, Java) to join its rapidly expanding Data Analytics team, working closely with the Chief Data Scientist in Central London. The Senior Data Engineer will implement complex multi/hybrid-cloud big data projects, focusing on collecting, parsing, managing, analysing and visualising large data sets to turn information into insight across multiple technology platforms. The role is a hands-on technical contributor across all phases of building large-scale, cloud-based distributed data processing systems and applications.

Responsibilities for the Senior Data Engineer (Cloud, Azure, SQL, Python, Java):
  • Lead the design, implementation and continuous delivery of pipelines using distributed Azure-based big data technologies, supporting data processing initiatives across batch and streaming datasets
  • Apply experience with hybrid-cloud solutions and GCP resources
  • Develop using Scala and Python with big data frameworks such as Spark, EMR, Kafka, Storm, Jenkins, JFrog Artifactory and Databricks
  • Provide administrative support for deployed Azure platform components
  • Identify, evaluate and implement cutting-edge big data pipelines and frameworks to integrate external data sources and APIs
  • Review, analyse and evaluate market requirements, business requirements and project briefs to design the most appropriate end-to-end technology solutions
  • Process and manage high-volume, real-time customer interaction streams
  • Provide architectural support by building proofs of concept and prototypes
  • Be a self-starter, delivering data engineering solutions that optimise both the cost and performance of the existing solution

Requirements for the Senior Data Engineer (Cloud, Azure, SQL, Python, Java):
  • Extensive software industry experience
  • Development experience with Azure services (such as Data Factory, Data Lake Storage, SQL Elastic Pools, Data Pipeline, Databricks, Kubernetes Service, Apache NiFi, JFrog, etc.)
  • Development experience with GCP services (Cloud Storage, Cloud Spanner, BigQuery)
  • Experience with Apache Spark and NoSQL implementations
  • Extensive working knowledge of programming and scripting languages such as Scala, Java, Shell, SQL and Python, on Linux
  • Proficiency working with structured, semi-structured and unstructured data sets, including social, web-log and real-time streaming data feeds
  • Ability to tune big data solutions to improve performance and the end-user experience
  • Knowledge of visualisation and data science tools
  • Expert-level usage of Jenkins and GitHub preferred

If you would like to apply for the Senior Data Engineer (Cloud, Azure, SQL, Python, Java) role, please click the apply button.
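The requirement above to work with semi-structured web-log feeds comes down to turning raw lines into queryable records. A minimal stdlib-only sketch, assuming Apache common log format (the advert does not specify one):

```python
import re

# Apache common log format, e.g.:
# 127.0.0.1 - - [10/Oct/2000:13:55:36 -0700] "GET /index.html HTTP/1.0" 200 2326
LOG_PATTERN = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]+" (?P<status>\d{3}) (?P<size>\d+|-)'
)

def parse_log_line(line):
    """Parse one common-log-format line into a dict, or None if malformed."""
    m = LOG_PATTERN.match(line)
    if not m:
        return None
    record = m.groupdict()
    record["status"] = int(record["status"])
    record["size"] = 0 if record["size"] == "-" else int(record["size"])
    return record
```

At scale the same parse would usually be applied per line inside a Spark job, with malformed lines routed to a quarantine table rather than silently dropped.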
15/02/2019
