Sorry, that job is no longer available. Here are some results that may be similar to the job you were looking for.

122 jobs found

Current search: databricks data engineer
Staffworx Limited
Data & AI Senior Consultants - Dynamic AI Consulting firm
Data & AI Senior Consultants

Location: We are flexible: onsite, hybrid or fully remote, depending on what works for you and the client. UK or Netherlands based.

What you will actually be doing
This is not a role where you build clever models that never get used. Your focus is on creating measurable value for clients using data science, machine learning and GenAI, in a consulting and advisory context. You will own work from the very beginning, asking questions like "What value are we trying to create here?" and "Is this the right problem to solve?" through to "It is live, stakeholders are using it and we can see the impact in the numbers." You will work fairly independently, and you will also be someone that more junior team members look to for help and direction. A big part of the job is taking messy, ambiguous business and technical problems and turning them into clear, valuable solutions that make sense to the client. You will do this in a client-facing role: you will be in the room for key conversations, providing honest advice, managing expectations and helping clients make good decisions about where and how to use AI.

What your day to day might look like

Getting to the heart of the problem
• Meeting with stakeholders who may not be clear on what they really need
• Using discovery sessions, workshops and structured questioning to uncover the real business problem
• Framing success in terms of value, for example higher revenue, lower cost, reduced risk, increased efficiency or better customer experience
• Translating business goals into a clear roadmap of data and AI work that everyone can understand
• Advising clients when AI is not the right solution and suggesting simpler or more cost-effective alternatives

Consulting and advisory work
• Acting as a trusted advisor to product owners, heads of department and executives
• Helping clients prioritise use cases based on value, feasibility and risk
• Communicating trade-offs in a simple way, for example accuracy versus speed, innovation versus compliance, cost versus impact
• Preparing and delivering client presentations, proposals and updates that tell a clear story
• Supporting pre-sales activities where needed, such as scoping work, estimating effort and defining outcomes
• Managing client expectations, risks and dependencies so there are no surprises

Building things that actually work
Once the problem and value are clear, you will design and deliver production-ready ML and GenAI solutions. That includes:
• Designing and building data pipelines, batch or streaming, that support the desired outcomes
• Working with engineers and architects so your work fits cleanly into existing systems
• Making sure what you build is reliable in production and moves the needle on agreed metrics, not just offline benchmarks
• Explaining design decisions to both technical and non-technical stakeholders

GenAI work
You will work with GenAI in ways that are grounded in real use cases and business value:
• Building RAG systems that improve search, content discovery or productivity rather than existing for their own sake
• Implementing guardrails so models do not leak PII or generate harmful or off-brand content
• Defining and tracking the right metrics so you and the client can see whether a GenAI solution is useful and cost-effective
• Fine-tuning and optimising models so they perform well for the use case and budget
• Designing agentic workflows where they genuinely improve outcomes rather than add complexity
• Helping clients understand what GenAI can and cannot do in practice

Keeping it running
You will set up the foundations that protect value over time:
• Experiment tracking and model versioning so you know what works and can roll back safely
• CI/CD pipelines for ML so improvements reach users quickly and reliably
• Monitoring and alerting for models and data so you can catch issues before they damage trust or results
• Communicating operational risks and mitigations to non-technical stakeholders in plain language

Security, quality and compliance
You will help make sure:
• Data is accurate, traceable and well managed so decisions are sound
• Sensitive data is handled correctly, protecting users and the business
• Regulatory and compliance requirements are met, avoiding costly mistakes
• Clients understand the risk profile of AI solutions and the controls in place

Working with people
You will be a bridge between technical and non-technical teams, inside our organisation and on the client side. That means:
• Explaining complex ML and GenAI ideas in plain language, always tied to business outcomes
• Working closely with product managers, engineers and business stakeholders to prioritise work that matters
• Facilitating workshops, playback sessions and show-and-tells that build buy-in and understanding
• Coaching and supporting junior colleagues so the whole team can deliver more value
• Representing the company professionally in client meetings and at industry events

What we are looking for

Experience
• Around 3 to 6 years of experience shipping ML or GenAI solutions into production
• A track record of seeing projects through from discovery to delivery, with clear impact
• Experience working directly with stakeholders or clients in a consulting, advisory or product-facing role

Education
• A Bachelor's or Master's degree in a quantitative field such as Computer Science, Data Science, Statistics, Mathematics or Engineering, or equivalent experience that shows you can deliver results

Technical skills

Core skills
• Strong Python and SQL, with clean, maintainable code
• Solid understanding of ML fundamentals, for example feature engineering, model selection, handling imbalanced data, choosing and interpreting metrics
• Experience with PyTorch or TensorFlow

GenAI specific
• Hands-on experience with LLM APIs or open-source models such as Llama or Mistral
• Experience building RAG systems with vector databases such as FAISS, Pinecone or Weaviate
• Ability to evaluate and improve prompts and retrieval quality using clear metrics
• Understanding of safety practices such as PII redaction and content filtering
• Exposure to agentic frameworks

Cloud and infrastructure
• Comfortable working in at least one major cloud provider: AWS, GCP or Azure
• Familiar with Docker and CI/CD pipelines
• Experience with managed ML platforms such as SageMaker, Vertex AI or Azure ML

Data engineering and MLOps
• Experience with data warehouses such as Snowflake, BigQuery or Redshift
• Workflow orchestration using tools like Airflow or Dagster
• Experience with MLOps tools such as MLflow, Weights & Biases or similar
• Awareness of data and model drift, and how to monitor and respond to it before it erodes value

Soft skills, the things that really matter
• You are comfortable in client-facing settings and can build trust quickly
• You can talk with anyone from a CEO to a new data analyst, and always bring the conversation back to business value
• You can take a vague, messy business problem and turn it into a clear technical plan that links to outcomes and metrics
• You are happy to push back and challenge assumptions respectfully when it is in the client's best interest
• You like helping other people grow and are happy to mentor junior colleagues
• You communicate clearly in writing and in person

Nice to have, not required
Do not rule yourself out if you do not have these. They are a bonus, not a checklist.
• Experience with Delta Lake, Iceberg, Spark, Databricks or Palantir
• Experience optimising LLM serving with tools such as vLLM, TGI or TensorRT-LLM
• Search and ranking experience, for example Elasticsearch or rerankers
• Background in time series forecasting, causal inference, recommender systems or optimisation
• Experience managing cloud costs and IAM so value is not lost to waste
• Ability to work in other languages where needed, for example Java, Scala, Go or Bash
• Experience with BI tools such as Looker or Tableau
• Prior consulting experience or leading client projects end to end
• Contributions to open source, conference talks or published papers that show your ability to share ideas and influence the wider community

Got a background that fits and you're up for a new challenge? Send over your latest CV, expectations and availability. Staffworx Limited is a UK-based recruitment consultancy partnering with leading global brands across digital, AI, software, and business consulting. Let's talk about what you could add to the mix.
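The RAG systems mentioned above follow a retrieve-then-generate pattern: embed the query, find the most similar stored documents, and ground the LLM prompt in them. A minimal, self-contained sketch of that pattern; the hand-made 3-dimensional "embeddings" and document texts are invented, standing in for a real embedding model and a vector database such as FAISS, Pinecone or Weaviate:

```python
import math

# Toy in-memory vector store: document text -> pre-computed embedding.
# These vectors are made up for illustration; a real system would use
# model-generated embeddings in a dedicated vector database.
DOCS = {
    "refund policy: refunds within 30 days": [0.9, 0.1, 0.0],
    "shipping times: 3-5 business days":     [0.1, 0.9, 0.1],
    "warranty covers manufacturing defects": [0.0, 0.2, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def retrieve(query_vec, k=2):
    """Return the k documents most similar to the query embedding."""
    ranked = sorted(DOCS, key=lambda d: cosine(query_vec, DOCS[d]), reverse=True)
    return ranked[:k]

def build_prompt(query_text, query_vec):
    """Ground the LLM prompt in retrieved context (the 'augmented' step)."""
    context = "\n".join(retrieve(query_vec))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query_text}"

# A query whose embedding is close to the refund document retrieves it first.
print(build_prompt("How long do I have to return an item?", [0.8, 0.2, 0.1]))
```

The "evaluate retrieval quality using clear metrics" bullet above is about measuring exactly this step, for example recall@k over a labelled set of query/document pairs.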
22/01/2026
Full time
TEKsystems
Solutions Architect
Job Title: Adobe Solution Architect (No Sponsorship Available, Inside IR35)

Job Description
We are seeking an experienced Adobe Solution Architect to lead end-to-end design and delivery across Adobe Experience Platform (AEP), including Real-Time Customer Data Platform (RTCDP), Adobe Campaign v8 (migration from v7), Adobe Journey Optimizer (AJO), and Adobe Experience Manager Assets (AEM Assets). The role involves owning the solution architecture from discovery through to production, ensuring implementations are performant, secure, and scalable, and aligning with enterprise standards to deliver measurable business outcomes.

Responsibilities
• Own end-to-end solution architecture from discovery and non-functional requirements (NFRs) to high-level design (HLD), low-level design (LLD), reference patterns, and transition to build/run, ensuring successful delivery and adoption.
• Define integration and data flows across AEP/RTCDP, AJO, Campaign v8, and AEM Assets, including identity resolution, consent, destinations, and downstream activation.
• Design AEP schemas (XDM), identities, datasets, sources/destinations, and RTCDP segmentation and governance.
• Establish real-time event ingestion, source connectors, and destination patterns.
• Define the migration strategy for Adobe Campaign v7 to v8, including data model, workflows, deliveries, typologies, dependency mapping, and coexistence/cutover plans.
• Architect real-time, triggered, and scheduled journeys using AEP profiles, decisions, and offers.
• Design AEM Assets taxonomy, metadata strategy, and lifecycle workflows to support the omnichannel content supply chain and activation.
• Embed data privacy, consent, and data residency controls, and define NFRs and observability metrics.
• Lead design reviews with the Architecture Review Board and business/IT stakeholders; secure sign-offs and maintain design traceability.
• Provide architecture runway and coaching to engineering squads, supporting backlog refinement and release planning.

Essential Skills (please ensure your CV lists these to be considered)
• Hands-on architecture and delivery experience across AEP/RTCDP, Adobe Campaign v8, and Adobe Journey Optimizer.
• Working knowledge of AEM Assets, metadata models, and workflow automation for omnichannel content.
• Strong data architecture skills with event streaming, APIs, SFTP/batch, and identity/consent models.
• Experience designing real-time activation, profile stitching, segment governance, and destination patterns.
• Proven end-to-end design and architecture ownership with successful go-lives at enterprise scale.
• Creation of HLD/LLD, sequence/data flow diagrams, and architecture decision records.
• Agile delivery experience with multidisciplinary teams.
• Excellent communication skills, translating complex architecture into clear outcomes for both technical and non-technical audiences.

Additional Skills & Qualifications
• Adobe certifications such as AEP Architect, RTCDP, Campaign, AJO, or AEM Assets.
• Experience with Offer Decisioning/RTCDP B2B, Snowflake/Databricks, and paid media destination ecosystems.
• Prior work experience in high-scale B2C/B2B2C environments such as media, telco, retail, and financial services.

Why Work Here?
Join a dynamic and collaborative team focused on cutting-edge technologies and innovation. Enjoy opportunities for professional growth and development, as well as a supportive work environment that values work-life balance. Be part of a culture that encourages creativity and continuous learning.

Work Environment
You will work in a modern, technology-driven environment with access to the latest tools and platforms. The role involves collaboration with cross-functional teams across product, data, marketing operations, and engineering disciplines. Expect a flexible work schedule that supports a healthy work-life balance.
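The "identity resolution" and "profile stitching" mentioned above refer to collapsing the different identifiers one person accumulates (cookie, email, CRM ID) into a single unified profile. AEP does this with its own identity graph service; purely to illustrate the underlying idea, here is a union-find sketch with invented identifiers, not AEP's actual implementation:

```python
# Illustrative only: identifiers seen together on the same event are
# linked, so transitively connected IDs resolve to one profile.
parent = {}

def find(x):
    """Return the canonical root identity for x (with path compression)."""
    parent.setdefault(x, x)
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def stitch(id_a, id_b):
    """Link two identities observed on the same event."""
    parent[find(id_a)] = find(id_b)

# Events: a cookie seen with an email, that email linked to a CRM record,
# and an unrelated visitor. All IDs are made up.
stitch("cookie:123", "email:a@example.com")
stitch("email:a@example.com", "crm:0042")
stitch("cookie:999", "email:b@example.com")

# cookie:123 and crm:0042 now resolve to the same unified profile,
# while the unrelated visitor stays separate.
print(find("cookie:123") == find("crm:0042"))   # True
print(len({find(x) for x in parent}))            # 2 distinct profiles
```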
Location
London, UK. Two days a week on site at one of our client's UK hubs, with occasional travel across the UK (travel costs to be covered).

Trading as TEKsystems. Allegis Group Limited, Maxis 2, Western Road, Bracknell, RG12 1RT, United Kingdom. No. (phone number removed). Allegis Group Limited operates as an Employment Business and Employment Agency as set out in the Conduct of Employment Agencies and Employment Businesses Regulations 2003. TEKsystems is a company within the Allegis Group network of companies (collectively referred to as "Allegis Group"). Aerotek, Aston Carter, EASi, Talentis Solutions, TEKsystems, Stamford Consultants and The Stamford Group are Allegis Group brands. If you apply, your personal data will be processed as described in the Allegis Group Online Privacy Notice, available at (url removed). The notice explains what information we may collect, use, share, and store about you, and describes your rights and choices about this. We are part of a global network of companies and, as a result, the personal data you provide will be shared within Allegis Group and transferred and processed outside the UK, Switzerland and the European Economic Area, subject to the protections described in the Allegis Group Online Privacy Notice. We store personal data in the UK, EEA, Switzerland and the USA. If you would like to exercise your privacy rights, please visit the "Contacting Us" section of our Online Privacy Notice at (url removed)/en-gb/privacy-notices for details on how to contact us. To protect your privacy and security, we may take steps to verify your identity, such as a password and user ID if there is an account associated with your request, or identifying information such as your address or date of birth, before proceeding with your request. If you are resident in the UK, EEA or Switzerland, we will process any access request you make in accordance with our commitments under the UK Data Protection Act, EU-U.S. Privacy Shield or the Swiss-U.S. Privacy Shield.
21/01/2026
Contractor
Data Idols
Staff Data Engineer
Staff Data Engineer

Salary: £85,000 - £95,000
Location: London, hybrid

Data Idols are working with one of the best-known retail brands in the UK, which is investing heavily in its data platform. They are looking for a Staff Data Engineer to play a key role in scaling production data systems and raising engineering standards across the wider data function. This role sits at the centre of a major data transformation and offers the chance to work on high-impact data platforms used across the business.

The Opportunity
As a Staff Data Engineer, you'll take ownership of complex, production-grade data pipelines and act as a technical leader within the data engineering team. You'll work on cloud-native solutions built on Azure and Databricks, making key decisions around data processing, modelling, and performance. Alongside hands-on delivery, you'll help set best practices, support other engineers, and influence how data engineering is done across the organisation.

Skills & Experience
• Strong hands-on experience with Azure data platforms
• Advanced SQL skills
• Commercial experience using Databricks and PySpark
• Proven background building and maintaining scalable data pipelines

If you're looking for a role where you can combine technical depth, ownership, and influence, please submit your CV for initial screening and further details.
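"Advanced SQL" in pipeline work like the above usually includes window functions, for example deduplicating a raw feed down to the latest record per key. A small illustration using SQLite so it is self-contained; the table and data are invented, and on Databricks the same query would run against a Delta table via Spark SQL:

```python
import sqlite3

# Raw feed with duplicate and superseded rows per order_id.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders_raw (order_id INT, status TEXT, updated_at TEXT);
    INSERT INTO orders_raw VALUES
        (1, 'created',   '2026-01-01'),
        (1, 'shipped',   '2026-01-03'),
        (2, 'created',   '2026-01-02'),
        (2, 'cancelled', '2026-01-04'),
        (2, 'created',   '2026-01-02');
""")

# ROW_NUMBER() picks one winner per order_id, newest first.
latest = conn.execute("""
    SELECT order_id, status FROM (
        SELECT *,
               ROW_NUMBER() OVER (
                   PARTITION BY order_id
                   ORDER BY updated_at DESC
               ) AS rn
        FROM orders_raw
    ) AS ranked
    WHERE rn = 1
    ORDER BY order_id
""").fetchall()

print(latest)  # [(1, 'shipped'), (2, 'cancelled')]
```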
21/01/2026
Full time
Adecco
Azure Data Engineer X3
Adecco
Azure Data Engineer X3
UK Wide - 90% Remote
Salary: £65,000 - £85,000 per annum + benefits (permanent)

About the Role
We are looking for experienced Azure Data Engineers to design, build, and optimise scalable data platforms in Microsoft Azure. You will play a key role in delivering reliable, high-quality data solutions that support analytics, reporting, and data-driven decision-making across the business. Working closely with data analysts, data scientists, and stakeholders, you'll help shape our modern data architecture and ensure best practices across data engineering, security, and performance.

Key Responsibilities
- Design, develop, and maintain Azure-based data pipelines and data platforms
- Build and optimise ETL/ELT processes using Azure Data Factory and related services
- Develop data solutions using Azure Synapse Analytics, Azure SQL, and Data Lake
- Implement data modelling solutions for analytics and reporting
- Ensure data quality, reliability, and performance across data systems
- Collaborate with analytics and business teams to understand data requirements
- Apply best practices for security, governance, and cost optimisation in Azure
- Monitor, troubleshoot, and optimise data workflows

Required Skills & Experience
- Strong experience as a Data Engineer in an Azure environment
- Hands-on expertise with Azure Data Factory, Azure Synapse Analytics, Azure Data Lake (Gen2), and Azure SQL / SQL Server
- Advanced SQL skills
- Experience with Python or Scala for data processing
- Solid understanding of data warehousing, data modelling, and ETL/ELT patterns
- Familiarity with CI/CD pipelines and source control (e.g. Azure DevOps, Git)

Desirable Skills
- Experience with Databricks and Spark
- Knowledge of Power BI and analytics workloads
- Understanding of DevOps and Infrastructure as Code (e.g. ARM, Bicep, Terraform)
21/01/2026
Full time
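The ETL/ELT responsibilities above follow the standard extract-transform-load shape. A minimal, hedged sketch in Python, with the standard-library sqlite3 module standing in for Azure SQL; all table and column names are invented for illustration, and a real pipeline would orchestrate this through Azure Data Factory or Synapse:

```python
import sqlite3

# Hypothetical example: sqlite3 stands in for Azure SQL; table and
# column names are invented for illustration only.

def extract(rows):
    # In Azure this step would read from a source system or Data Lake file.
    return rows

def transform(rows):
    # Clean and standardise: strip whitespace, drop rows missing a key.
    cleaned = []
    for row in rows:
        if row.get("order_id") is None:
            continue
        cleaned.append({
            "order_id": row["order_id"],
            "amount_gbp": round(float(row["amount"]), 2),
            "region": row["region"].strip().upper(),
        })
    return cleaned

def load(conn, rows):
    # Idempotent load into the warehouse table.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id INTEGER PRIMARY KEY, amount_gbp REAL, region TEXT)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO orders VALUES (:order_id, :amount_gbp, :region)",
        rows,
    )
    conn.commit()

def run_pipeline(conn, source_rows):
    load(conn, transform(extract(source_rows)))

conn = sqlite3.connect(":memory:")
source = [
    {"order_id": 1, "amount": "19.99", "region": " uk-south "},
    {"order_id": None, "amount": "5.00", "region": "uk-north"},  # dropped
]
run_pipeline(conn, source)
print(conn.execute("SELECT order_id, amount_gbp, region FROM orders").fetchall())
```

The `INSERT OR REPLACE` keeps re-runs idempotent, which is the property that matters most when a scheduled pipeline retries after a partial failure.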
Emtec Software Solutions Ltd
Data Engineer
Emtec Software Solutions Ltd
I am looking for a Data Engineer (potentially 2-3) to work on an enterprise-level, large-scale Databricks project. The client is building brand-new products (think AI-powered agentic workflows) on top of an existing Databricks solution, and needs a Data Engineer with deep Databricks knowledge to manage performance, cost, security, and data freshness. Any AI/ML knowledge, specifically around LLMs/RAG, would also be useful.

Data Engineer Key Skills:
- Python / PySpark
- Databricks
- Delta Lake
- Azure
- LLMs / RAG

Location: Remote
Rate: Circa £450 / Day (Outside IR35)
Contract Length: 6-12 Months Initial

If this Data Engineer role sounds of interest please let me know and we can set up an initial chat so I can share the details.
21/01/2026
Contractor
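"Data freshness", mentioned above, usually comes down to comparing each table's last-update timestamp against an agreed SLA. A minimal, hypothetical sketch in plain Python; in Databricks you would read the timestamp from the Delta Lake table history rather than pass it in, and the table names and thresholds here are invented:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical freshness check; table names and SLA thresholds are invented.
FRESHNESS_SLA = {
    "bronze.raw_events": timedelta(hours=1),
    "gold.daily_sales": timedelta(hours=26),
}

def check_freshness(last_updated, now):
    """Return (table, age) pairs for tables whose last update breaches the SLA."""
    stale = []
    for table, sla in FRESHNESS_SLA.items():
        age = now - last_updated[table]
        if age > sla:
            stale.append((table, age))
    return stale

now = datetime(2026, 1, 21, 12, 0, tzinfo=timezone.utc)
last_updated = {
    "bronze.raw_events": now - timedelta(minutes=30),  # within SLA
    "gold.daily_sales": now - timedelta(hours=30),     # breaches SLA
}
for table, age in check_freshness(last_updated, now):
    print(f"STALE: {table} last updated {age} ago")
```

In practice a check like this would run on a schedule and feed alerting, so stale gold tables are caught before dashboards go wrong.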
Head Resourcing
Senior Power BI Engineer
Head Resourcing
Senior Power BI Report Engineer (Azure / Databricks)
Glasgow based only - 4 days onsite
No visa restrictions please

Are you a Senior Power BI specialist who loves clean, governed data and high-performance semantic models? Do you want to work with a business that's rebuilding its entire BI estate the right way: proper Lakehouse architecture, curated Gold tables, PBIP, Git, and end-to-end governance? If so, this is one of the most modern, forward-thinking Power BI engineering roles in Scotland.

Our Glasgow-based client is transforming its reporting platform using Azure + Databricks, with Power BI sitting on top of a fully curated Gold Layer. They develop everything using PBIP + Git + Tabular Editor 3, and semantic modelling is treated as a first-class engineering discipline. This is your chance to own the creation of high-quality datasets and dashboards used across Operations, Finance, Sales, Logistics and Customer Care, turning trusted Lakehouse data into insights the business relies on every day.

Why This Role Exists
To turn clean, curated Gold Lakehouse data into trusted, enterprise-grade Power BI insights. You'll own semantic modelling, dataset optimisation, governance and best-practice delivery across a modern BI ecosystem.

What You'll Do

Semantic Modelling with PBIP + Git
- Build and maintain enterprise PBIP datasets fully version-controlled in Git.
- Use Tabular Editor 3 for DAX, metadata modelling, calc groups and object governance.
- Manage branching, pull requests and releases via Azure DevOps.

Lakehouse-Aligned Reporting (Gold Layer Only)
- Develop semantic models exclusively on top of curated Gold Databricks tables.
- Work closely with Data Engineering on schema design and contract-first modelling.
- Maintain consistent dimensional modelling aligned to the enterprise Bus Matrix.

High-Performance Power BI Engineering
- Optimise performance: aggregations, composite models, incremental refresh, DirectQuery/Import strategy.
- Tune Databricks SQL Warehouse queries for speed and cost efficiency.
- Monitor PPU capacity performance, refresh reliability and dataset health.

Governance, Security & Standards
- Implement RLS/OLS, naming conventions, KPI definitions and calc groups.
- Apply dataset certification, endorsements and governance metadata.
- Align semantic models with lineage and security policies across the Azure/Databricks estate.

Lifecycle, Release & Best-Practice Delivery
- Use Power BI Deployment Pipelines for Dev → UAT → Prod releases.
- Enforce semantic CI/CD patterns with PBIP + Git + Tabular Editor.
- Build reusable, certified datasets and dataflows enabling scalable self-service BI.

Adoption, UX & Collaboration
- Design intuitive dashboards with consistent UX across multiple business functions.
- Support BI adoption through training, documentation and best-practice guidance.
- Use telemetry to track usage, performance and improve user experience.

What We're Looking For

Required Certifications
To meet BI engineering standards, candidates must hold:
- PL-300: Power BI Data Analyst Associate
- DP-600: Fabric Analytics Engineer Associate

Skills & Experience
- Commercial experience building enterprise Power BI datasets and dashboards.
- Strong DAX and semantic modelling expertise (calc groups, conformed dimensions, role-playing dimensions).
- Strong SQL skills; comfortable working with Databricks Gold-layer tables.
- Proven ability to optimise dataset performance (aggregations, incremental refresh, DirectQuery/Import).
- Experience working with Git-based modelling workflows and PR reviews via Tabular Editor.
- Excellent design intuition: clean layouts, drill paths, and KPI logic.

Nice to Have
- Python for automation or ad-hoc prep; PySpark familiarity.
- Understanding of Lakehouse patterns, Delta Lake, metadata-driven pipelines.
- Unity Catalog / Purview experience for lineage and governance.
- RLS/OLS implementation experience.
21/01/2026
Full time
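Incremental refresh, which the listing above calls out twice, reduces to a high-watermark pattern: only process rows newer than the last successfully loaded timestamp. Power BI and Databricks implement this natively; the following is a generic, hypothetical Python sketch of the idea, with all field names invented:

```python
# Hypothetical high-watermark incremental load; all names are illustrative.

def incremental_load(source_rows, watermark):
    """Return rows newer than the watermark, plus the advanced watermark."""
    new_rows = [r for r in source_rows if r["updated_at"] > watermark]
    # ISO-8601 timestamps compare correctly as strings.
    new_watermark = max((r["updated_at"] for r in new_rows), default=watermark)
    return new_rows, new_watermark

source = [
    {"id": 1, "updated_at": "2026-01-19T09:00:00"},
    {"id": 2, "updated_at": "2026-01-20T09:00:00"},
    {"id": 3, "updated_at": "2026-01-21T09:00:00"},
]

# First run: an initial low watermark means everything is loaded.
rows, wm = incremental_load(source, "1970-01-01T00:00:00")
# Second run: nothing newer than the stored watermark, nothing reloaded.
rows2, wm2 = incremental_load(source, wm)
print(len(rows), len(rows2))  # 3 0
```

Persisting the watermark between runs is what makes the refresh cheap: only the changed partition is reprocessed, which is exactly what Power BI's incremental refresh policies automate.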
Hunter Bond
Lead DataOps Engineer - Big Data
Hunter Bond
My leading tech client are looking for a talented and motivated individual to ensure the resilience, performance, and cost-effectiveness of their Azure-based data platform. This role is essential to their data ecosystem, combining platform reliability, incident response, SLA management, cost optimisation (FinOps), and deployment oversight. You will be the single point of contact for operational issues, driving rapid resolution during outages, leading communications with stakeholders, and shaping the processes that keep their platform running smoothly and efficiently. This is a newly created role in a growing business. A brilliant opportunity!

The following skills and experience are required:
- Proven operational leadership for large-scale data platforms.
- Expertise in incident management, SLA enforcement, and stakeholder communication.
- Hands-on experience with Azure Synapse, Databricks, ADF, Power BI.
- Familiarity with CI/CD and automation.
- Strong FinOps mindset and cost management experience.
- Knowledge of monitoring and observability frameworks.

Salary: Up to £90,000 + bonus + package
Level: Lead Engineer
Location: London (good work from home options available)

If you are interested in this Lead DataOps Engineer (Big Data) position and meet the above requirements, please apply immediately.
21/01/2026
Full time
Harnham - Data & Analytics Recruitment
Senior Data Engineer
Harnham - Data & Analytics Recruitment
SENIOR DATA ENGINEER
UP TO £90,000 + BENEFITS
Remote (UK based)

As a Senior Data Engineer, you'll take ownership of the data ingestion layer, working with complex and sometimes messy healthcare data sources. You'll play a key role in shaping how data is collected, transformed, and made available across the organisation. This is a hands-on role suited to someone who enjoys autonomy, ambiguity, and rolling up their sleeves.

THE COMPANY:
We're partnering with a fast-growing digital health provider operating in the mental health and neurodiversity space. The business is mission-driven, highly regulated, and expanding its services across healthcare and corporate wellbeing. This is a rare opportunity to join at a pivotal stage, helping to build a modern data platform from scratch in a small, trusted team where your work will have immediate impact.

THE ROLE:
A Senior Data Engineer will need to:
- Build and maintain Python-based batch ingestion pipelines from a variety of healthcare systems
- Work extensively with Databricks on AWS
- Contribute to data modelling and transformations using dbt
- Collaborate with analysts and stakeholders to support reporting and insights (Power BI)
- Help establish best practices around data quality, governance, and reliability in a regulated environment

THE BENEFITS:
You will receive a salary of up to £90,000, dependent on experience, along with some fantastic extra benefits.

HOW TO APPLY:
Please register your interest by sending your CV to Molly Bird via the apply link on this page.
21/01/2026
Full time
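The "Python-based batch ingestion pipelines" this role centres on typically start by normalising messy source records and quarantining anything unusable. A minimal, hedged sketch using only the standard library; the field names, date formats and rules are invented for illustration, not taken from the client's systems:

```python
import csv
import io

# Hypothetical batch-ingestion step: normalise messy source records before
# landing them. Field names and cleaning rules are invented for illustration.

RAW_CSV = """patient_ref,appointment_date,status
 P001 ,2026-01-05,ATTENDED
P002,05/01/2026,dna
,2026-01-06,attended
"""

def normalise_date(value):
    # Accept ISO (YYYY-MM-DD) or UK (DD/MM/YYYY) input and emit ISO.
    if "/" in value:
        day, month, year = value.split("/")
        return f"{year}-{month}-{day}"
    return value

def ingest(raw_text):
    good, rejected = [], []
    for row in csv.DictReader(io.StringIO(raw_text)):
        ref = row["patient_ref"].strip()
        if not ref:
            rejected.append(row)  # quarantine rows missing the key field
            continue
        good.append({
            "patient_ref": ref,
            "appointment_date": normalise_date(row["appointment_date"].strip()),
            "status": row["status"].strip().lower(),
        })
    return good, rejected

good, rejected = ingest(RAW_CSV)
print(len(good), len(rejected))  # 2 1
```

Keeping a rejected-rows lane instead of silently dropping records matters in a regulated healthcare setting, where every source record needs to be accounted for.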
Tria Recruitment
Digital Analyst
Tria Recruitment
Digital Analytics & CRO Specialist
London - Hybrid - 2-3 days

About the Role
We're seeking a data-driven specialist to own digital analytics and conversion optimisation across global platforms. You'll implement robust tracking, deliver actionable insights, and drive a culture of experimentation.

Key Responsibilities
- Configure and optimise analytics tools (GA4, Adobe Analytics, GTM).
- Develop tagging specs, data layers, and event taxonomies.
- Build dashboards (Looker Studio, Power BI) and deliver performance insights.
- Lead CRO strategies: design A/B and multivariate tests, analyse results, and recommend improvements.
- Ensure compliance with GDPR and data governance standards.
- Collaborate with product, UX, and engineering teams to embed analytics early.

What We're Looking For
- Hands-on experience with GA4, Adobe Analytics, GTM.
- Strong CRO and experimentation knowledge.
- Skilled in dashboarding tools and data storytelling.
- Familiarity with GDPR and data governance.
- Bonus: Databricks, CRM integrations, behavioural tools (Hotjar, Contentsquare).

If this role sounds like a good fit, then please apply today!
20/01/2026
Full time
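Analysing an A/B test, as the CRO responsibilities above describe, usually comes down to a two-proportion z-test on conversion rates. A minimal standard-library sketch; the traffic and conversion numbers are invented for illustration:

```python
from math import erf, sqrt

# Hypothetical two-proportion z-test for an A/B conversion experiment.
# Visitor and conversion counts below are invented for illustration.

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) for the difference in conversion rate."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Variant B converts at 5.5% vs the control's 5.0%, 20,000 users per arm.
z, p = two_proportion_z(1000, 20000, 1100, 20000)
print(f"z={z:.2f}, p={p:.3f}")
```

With p below the usual 0.05 threshold, this hypothetical uplift would count as statistically significant; a real programme would also pre-register the sample size and guard against peeking.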
DCS Recruitment Limited
Data Engineer
DCS Recruitment Limited City, Sheffield
Data Engineer
Location: Sheffield (Hybrid - 3 days per week onsite)
Salary: £50,000 - £60,000 depending on experience

DCS Tech are searching for an experienced Data Engineer to join our client's growing team! You will play a crucial part in designing, building, and optimising the data infrastructure that underpins the organisation.

Key responsibilities
- Design, develop, and deploy scalable, secure, and reliable data pipelines using modern cloud and data engineering tools.
- Consolidate data from internal systems, APIs, and third-party sources into a unified data warehouse or data lake environment.
- Build and maintain robust data models to ensure accuracy, consistency, and accessibility across the organisation.
- Work closely with Data Analysts, Data Scientists, and business stakeholders to translate data requirements into effective technical solutions.
- Optimise data systems to deliver fast and accurate insights supporting dashboards, KPIs, and reporting frameworks.
- Implement monitoring, validation, and quality checks to ensure high levels of data accuracy and trust.
- Support compliance with relevant data standards and regulations, including GDPR.
- Maintain strong data security practices relating to access, encryption, and storage.
- Research and recommend new tools, technologies, and processes to improve performance, scalability, and efficiency.
- Contribute to migrations and modernisation projects across cloud and data platforms (e.g. AWS, Azure, GCP, Snowflake, Databricks).
- Create and maintain documentation aligned with internal processes and change management controls.

Experience & Technical Skills
- Proven hands-on experience as a Data Engineer or in a similar data-centric role.
- Strong proficiency in SQL and Python.
- Solid understanding of ETL/ELT pipelines, data modelling, and data warehousing principles.
- Experience working with cloud platforms such as AWS, Azure, or GCP.
- Exposure to modern data tools such as Snowflake, Databricks, or BigQuery.
- Familiarity with streaming technologies (e.g. Kafka, Spark Streaming, Flink) is an advantage.
- Experience with orchestration and infrastructure tools such as Airflow, dbt, Prefect, CI/CD pipelines, and Terraform.

What you get in return:
- Up to £60,000 per annum + benefits
- Hybrid working (3 days in office)
- Opportunity to lead and mentor within a growing team!
- Professional development and training support

This company is an equal opportunity employer and values diversity. We do not discriminate on the basis of race, religion, colour, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

Interested? Please submit your CV to Meg Kewley at DCS Recruitment via the link provided. Alternatively, email me at or call (phone number removed). DCS Recruitment and all associated companies are committed to creating a working environment where diversity is celebrated and everyone is treated fairly, regardless of gender, gender identity, disability, ethnic origin, religion or belief, sexual orientation, marital or transgender status, age, or nationality.
20/01/2026
Full time
TRIA
Digital Analyst
TRIA
Digital Analytics & CRO Specialist
London - Hybrid - 2-3 days

About the Role
We're seeking a data-driven specialist to own digital analytics and conversion optimisation across global platforms. You'll implement robust tracking, deliver actionable insights, and drive a culture of experimentation.

Key Responsibilities
- Configure and optimise analytics tools (GA4, Adobe Analytics, GTM).
- Develop tagging specs, data layers, and event taxonomies.
- Build dashboards (Looker Studio, Power BI) and deliver performance insights.
- Lead CRO strategies: design A/B and multivariate tests, analyse results, and recommend improvements.
- Ensure compliance with GDPR and data governance standards.
- Collaborate with product, UX, and engineering teams to embed analytics early.

What We're Looking For
- Hands-on experience with GA4, Adobe Analytics, and GTM.
- Strong CRO and experimentation knowledge.
- Skilled in dashboarding tools and data storytelling.
- Familiarity with GDPR and data governance.
- Bonus: Databricks, CRM integrations, behavioural tools (Hotjar, Contentsquare).

If this role sounds like a good fit then please apply today!
19/01/2026
Full time
BMR Associates
Azure Infrastructure Engineer - Permanent, Hybrid
BMR Associates City, Birmingham
Azure Infrastructure Engineer, Permanent, Hybrid - West Midlands
Azure, App Service, Azure SQL Database, IaC, Terraform, Application Gateway.

Due to continued growth, this leading-edge organisation is looking to the market for an Azure Specialist to join their expanding infrastructure team. On this occasion they are actively seeking an AZURE INFRASTRUCTURE ENGINEER who is ready to hit the ground running and has a proven track record of implementing Azure infrastructure solutions from the ground up.

To be considered you must ideally be Azure accredited and come with 3-5 years' demonstrable hands-on Azure infrastructure engineering experience, specifically App Service, API Management, Azure SQL Database, Databricks, Storage Accounts, and Service Bus, as well as strong knowledge of Infrastructure as Code using Terraform and scripting.

Working closely with key stakeholders, you will be responsible for translating the solution design to create, implement, and maintain the Azure infrastructure and network solution, deploying new features and continuously improving performance and automation. At this level you will be expected to demonstrate superb communication skills, both verbal and written, along with the ability to work collaboratively in agile, cross-functional teams and develop excellent working relationships with key stakeholders at all levels.
19/01/2026
Full time
Adecco
Senior Data Engineer Contract Dublin Contract
Adecco
Senior Data Engineer - Contract - Dublin - 6-24 months.

My client, a leading name in their respected industry, is in urgent need of a talented and experienced Senior Data Engineer to join them on a contract basis.

You will design and implement scalable and efficient data pipelines, ETL processes, and data integration solutions to collect, process, and store large volumes of data. You will collaborate with others in the development of data models, schema designs, and data architecture frameworks to support diverse analytical and reporting needs. You will build and optimise data processing workflows using distributed computing frameworks available on Azure, our preferred cloud provider, and integrate data from various internal and external sources, including databases, APIs, and streaming platforms, into centralised data repositories and data warehouses.

Successful candidates will have 8 years' commercial experience working as an analyst developer or data engineer in a data-centric environment. You will have proven experience in designing and implementing end-to-end data solutions from ingestion to consumption, and strong experience with Azure data PaaS services and data pipeline delivery on the Azure platform with Databricks. You will have experience delivering data platforms with C#, Python, JSON, XML, APIs, and message bus technology, or similar technologies, plus strong knowledge of database systems, data modelling, and data integration technologies.

This is a unique opportunity to work with a team that is at the beginning phase of the projects. If this sounds of interest, drop me a CV so we can speak in more detail.
16/01/2026
Contractor
Datatech
Senior Data Engineer - (ML and AI Platform)
Datatech
Senior Data Engineer (ML and AI Platform)
Location: London, with hybrid working (Monday to Wednesday in the office)
Salary: £65,000 to £80,000 depending on experience
Reference: J13026

We are partnering with an AI-first SaaS business that turns complex first-party data into trusted, decision-ready insight at scale. You will join a collaborative data and engineering team building a modern, cloud-agnostic data and AI platform. This role is well suited to an experienced data engineer who enjoys working thoughtfully with real-world data, contributing to reliable production systems, and developing clear and well-structured Python and SQL.

Why join:
- Supportive and inclusive culture where people are encouraged to contribute and be heard
- Clear progression with space to develop your skills at a sustainable pace
- An environment where collaboration, learning, and thoughtful engineering are genuinely valued

What you will be doing:
- Contributing to the design and delivery of cloud-based data and machine learning pipelines
- Working with Python, PySpark, and SQL to build clear and maintainable data transformations
- Helping shape scalable data models that support analytics, machine learning, and product features
- Collaborating closely with Product, Engineering, and Data Science teams to deliver meaningful production outcomes

What we are looking for:
- Experience using Python for data transformation, ideally alongside PySpark
- Confidence working with SQL and production data models
- Experience working with at least one modern cloud data platform such as GCP, AWS, Azure, Snowflake, or Databricks
- Experience contributing to data pipelines that run reliably in production environments
- A collaborative mindset with clear and thoughtful communication

Right to work in the UK is required. Sponsorship is not available now or in the future. Apply to learn more and see if this could be the next step for you. If you have a friend or colleague who may be interested, referrals are welcome. For each successful placement, you will be eligible for our general gift or voucher scheme. Datatech is one of the UK's leading recruitment agencies specialising in analytics and is the host of the critically acclaimed Women in Data event. For more information, visit (url removed).
16/01/2026
Full time
Tenth Revolution Group
Data Engineering Lead
Tenth Revolution Group Oxford, Oxfordshire
Data Engineering Lead

I am working with a forward-thinking professional services organisation that is expanding its Data & Innovation capabilities and looking for a Data Engineering Lead to join their team. This is a fantastic opportunity to take on a hands-on leadership role where you will guide a small team of technical specialists while shaping the future of data engineering, automation, and systems development across the business. You'll be at the heart of strategic technical delivery, blending architectural oversight with hands-on execution, and mentoring others to build scalable, modern solutions using technologies like Databricks and the Azure tech stack.

Key Responsibilities:
- Provide technical leadership to a small team focusing on data engineering and development
- Define and maintain scalable internal data and systems architecture aligned with business needs
- Lead the design and delivery of complex engineering solutions, ensuring best practices
- Guide the team on prioritised initiatives, ensuring timely and high-quality delivery
- Deliver hands-on technical change and development where required

Skills & Experience Required:
- Strong data engineering experience
- Hands-on expertise with the Azure tech stack (Synapse, Data Factory, Data Lake Storage) and Databricks
- Proven ability to lead and mentor technical teams in agile environments
- Ability to work closely with other leaders across the business and input into strategy planning sessions focusing on technical execution

Benefits:
- Salary of up to £95,000 per year
- 25 days' annual leave, plus bank holidays
- Private healthcare schemes and life insurance policies
- Enhanced parental leave policies
- Lifestyle benefits such as cycle-to-work schemes, retail discounts, and vehicle salary sacrifice schemes
16/01/2026
Full time
Liberty CL Recruitment
IT Data Engineer
Liberty CL Recruitment Chandler's Ford, Hampshire
Job Title: IT Data Engineer
Location: Southampton, Hampshire
Salary: £40,000 - £50,000

Are you an experienced IT Data Engineer with experience in the professional services industry? If so, we may just have the perfect role for you!

Role Overview:
Based in Southampton, our client is a leading law firm looking to hire an IT Data Engineer to help aid their expansion plans. Your role will be structured around project and business support tasks, and feedback will be used to drive innovation and business growth. Although the role is primarily an IT Data Engineer, you will also need to display BI Developer and strong analytical skills, playing a crucial role in designing, building, and maintaining data pipelines using Microsoft Azure tools and platforms, as well as presenting information to the end user.

Your responsibilities:
- Work with Microsoft Azure technologies (e.g., Data Factory, Databricks, Synapse) to orchestrate data loading and workflows and manage data pipelines.
- Maintain, support, and build data warehouses using Azure SQL technologies.
- Collaborate with analysts and business stakeholders to understand data requirements and translate them into technical solutions.
- Develop and implement data validation and reconciliation processes to ensure data quality and consistency across the data platforms.
- Troubleshoot and resolve issues related to data transformation, data loading, and data quality, while proactively identifying opportunities for process optimisation and performance tuning.
- Contribute to the development, support, and maintenance of reports and dashboards using Power BI.
- Troubleshoot and resolve data-related issues and provide support for data-related projects.
- Innovate on existing solutions and look to help maximise efficiency in the platform.

The ideal candidate:
- Good understanding of SQL and relational databases; these are the key assets within our organisation.
- Familiarity with Microsoft Azure services (e.g., Azure SQL, Azure Synapse, Azure Databricks).
- Understanding of data warehousing concepts and data architecture.
- Familiarity with any programming or scripting language (e.g., Python, R, JavaScript).
- Strong analytical and problem-solving skills.
- Excellent communication and teamwork abilities.
- Eagerness to learn and adapt to new technologies and methodologies.

What's in it for you?
- 26 days' holiday + the option to buy up to a further 5 days
- A day off for your birthday
- Life assurance
- Employee assistance programme
- Enhanced maternity, adoption, and paternity pay
- Private medical insurance
- Healthcare cash plan
- Annual discretionary bonus scheme
- Employee retail discounts

If you would like to discuss this opportunity in more detail, please reach out to the team at Liberty Recruitment Group.
14/01/2026
Full time
Adecco
Azure Data Architect - SC CLEARANCE
Adecco
Azure Data Architect
Location: UK-wide - mainly remote, with travel to office and client site when required
Clearance Requirement: Eligible for SC clearance (must have lived in the UK for the past 5 years)
Salary: £80,000 - £95,000 per annum + permanent benefits

About the Role
We're looking for an experienced Azure Data Architect who's passionate about delivering cutting-edge cloud data solutions and driving digital transformation. You'll join a high-performing team of architects, engineers, and analysts who specialise in helping organisations unlock the value of their data using modern cloud technologies. This is an opportunity to work across diverse industries, shaping and delivering data architectures that power smarter decision-making and innovation.

What You'll Do
- Design Modern Azure Data Architectures: Lead the design and implementation of scalable, secure, and efficient data solutions using Azure PaaS and IaaS services.
- Collaborate Across Teams: Work closely with sales, delivery, and client stakeholders to align solutions with business objectives and technical best practices.
- Solution Leadership: Partner with other architects to ensure solution designs align with enterprise architecture blueprints and standards.
- Pre-Sales and Proposal Support: Support sales teams in defining technical strategies, solution proposals, pricing, and bid responses.
- Hands-On Expertise: Provide technical guidance on Azure components such as Synapse Analytics, Data Factory, Data Lake, Databricks, Purview, Azure SQL, and Storage.

What We're Looking For
We're seeking someone who combines deep technical capability with strong consulting and leadership skills. You'll ideally have:
- Azure Solution Architect certification (essential).
- Proven experience designing data-focused solutions in Azure environments.
- Expertise across Azure PaaS and IaaS, including App Services, Synapse Analytics, Azure Data Lake, and Data Factory.
- Strong understanding of CI/CD and Infrastructure as Code (IaC) using tools such as Azure DevOps.
- Familiarity with containerisation and orchestration (Docker, Kubernetes).
- Knowledge of networking and security fundamentals in Azure (ExpressRoute, load balancing, DNS, availability sets).
- Awareness of other cloud platforms (AWS, GCP) is a plus.
14/01/2026
Full time

© 2008-2026 IT Job Board