Senior AWS Data Engineer (LDW Data Warehouse Discovery)
Max Supplier Rate: £483
Clearance Required: SC (active)
Duration: 6 months
Location: Telford, with 2 days/week in office
IR35 Status: Inside

The role falls within the Data Contract Delivery Area of the client's contract. The group provides a wide range of data and analytics solutions in support of our client's business priorities: maximising revenue, bearing down on fraud, and cloud migration. This role involves migrating data from legacy on-premise systems (primarily Oracle and Informatica) to a new AWS cloud-native architecture. You will be part of an Agile software delivery team, working closely with other engineers and supported by project managers, business analysts and architects, with additional client and key stakeholder interaction as required.

We are looking for strong Senior AWS Data Engineers who can design and deliver cloud transformation projects. Your work will be to:
- Support the technical lead with design and client interactions, and support junior engineers with their development, as part of a cloud transformation team.
- Design, develop and test data pipelines: create robust pipelines to ingest, process, and transform data, ensuring it is ready for analytics and reporting.
- Implement ETL/ELT processes: develop and test Extract, Transform, Load (ETL) or Extract, Load, Transform (ELT) workflows to move data seamlessly from source systems to data warehouses, data lakes and lakehouses using open-source and AWS tools.
- Adopt DevOps practices: use DevOps methodologies and tools for continuous integration and deployment (CI/CD).

Must-have skills:
- Proficiency with core AWS tools (AWS Glue, Lambda, S3, Redshift)
- Programming skills (Python)
- SQL and data storage technologies: some knowledge of data warehouse and database technologies (AWS Redshift, AWS RDS)
- AWS data lakes: some experience with data lakes on AWS S3 to store and process both structured and unstructured data sets
Nice-to-have skills:
- Knowledge of open table formats (Iceberg/Delta)
- AWS tools: experience with Amazon CloudWatch, SNS, Athena, DynamoDB, EMR, Kinesis
- Data modelling
- Job scheduling/orchestration
- Data virtualisation tools (Denodo)
- ALM tooling (Jira, Confluence)
- CI/CD toolsets (GitLab, Terraform)
- Reporting tools (Business Objects, Power BI, Pentaho BA)
- Data analytics toolsets (SAS Viya)
- Observability tools (Grafana, Dynatrace)

Experience: You should have experience as a senior data engineer delivering large-scale data analytics solutions and the ability to operate at all stages of the software engineering lifecycle, as well as some experience in the following:
- Awareness of DevOps culture and modern engineering practices
- Experience of Agile, Scrum-based delivery
- A proactive nature, personal drive, enthusiasm and willingness to learn
- Excellent communication skills, including stakeholder management
- Developing solutions within the given architecture and adhering to specified NFRs
- Supporting other engineers within your team
- Continually looking for ways to improve
17/04/2026
Contractor
Collibra Integration Engineer - Hybrid/Canary Wharf - 6-Month Contract - Banking

Tier 1 bank - Regulatory Reporting team
Role: Collibra Integration Engineer
Duration: 6 months, with very likely extension
Location: Hybrid - 3 days per week in a Canary Wharf office
Rate: £490 per day (inside IR35)

Tasks:
- Administer and maintain the Collibra platform from an IT engineering perspective, including configuration, health monitoring, troubleshooting, and performance management across EMEA environments.
- Configure, schedule, and manage integrations using DIP/CDQ, Edge, and the Lineage Harvester, ensuring consistent ingestion of metadata, schemas, and lineage from multiple data sources.
- Perform connectivity and prerequisite validation with application teams, ETL teams, and DBAs, including driver installation, credential management, network/firewall requirements and technical onboarding for new data sources.
- Investigate and resolve incidents related to Collibra integrations, lineage ingestion, and platform availability; collaborate with infrastructure teams to address root causes and prevent recurrence.
- Ensure configuration consistency across environments (DEV/QA/PROD), maintaining runbooks, documentation, and standard operating procedures.
- Provide regular, clear updates on operational status, risks, and issues to development managers and stakeholders.
Additional tasks:
- Support EMEA-wide Collibra platform stability by monitoring ingestion pipelines, schema drift, job failures, and performance degradation, ensuring reliable metadata and lineage availability.
- Maintain technical documentation, including integration configurations, onboarding guides, troubleshooting logs and operational checklists.
- Assist with access control operations, ensuring proper provisioning, least-privilege consistency, and audit readiness for regulatory requirements.
- Participate in technical testing for new features, patches, and platform improvements, raising risks or gaps early in collaboration with vendor support and internal IT teams.

GCS is acting as an Employment Business in relation to this vacancy.
17/04/2026
Contractor
A leading organisation undergoing major investment in its data and engineering capabilities is looking for a Senior Automation QA Engineer to strengthen quality across complex, distributed systems. This role sits at the intersection of software engineering and data engineering, supporting the delivery of high-quality services and reliable data products at scale. You'll work closely with cross-functional teams, driving automation, improving test maturity, and ensuring the accuracy and resilience of data pipelines and backend services. This is a hands-on role suited to someone who enjoys ownership, technical depth, and influencing engineering best practice.

Essential skills & experience:
- Strong background in automation testing for microservices-based architectures.
- Hands-on experience building automation frameworks using Java or similar languages.
- Solid experience testing REST APIs, backend services, and service-to-service integrations.
- Proven experience validating data engineering pipelines (ETL/ELT, batch jobs, scheduled data builds).
- Strong SQL skills for data validation, reconciliation, and analysis.
- Experience with data warehouses, data lakes, or big data platforms such as Snowflake, Redshift, BigQuery, or Spark.
- Familiarity with CI/CD tooling (Jenkins, Azure DevOps, GitHub Actions, GitLab CI).
- Experience working with Git-based version control.
- Exposure to cloud environments (Azure, AWS, or GCP).
- Understanding of distributed systems and microservices architecture.

This contract is inside IR35. You will be required to work through an umbrella company for the duration of the project.
16/04/2026
Contractor
Job Title: Data Engineer (ITEC CRS/FATCA) - Sage
Max Supplier Rate: £500 (Inside IR35)
Clearance Required: SC
Duration: 6 months
Location: Telford - requires attendance for occasional workshops (typically a couple of days per month) at one of our sites.

Job description: Demand on the ITEC CRS/FATCA project has increased dramatically, necessitating a new team to deliver the CRS/FATCA work. This developer role will primarily work on Talend and Oracle RDS systems, within our existing Talend framework and patterns. Experience of ETL tooling will be needed, preferably Talend, but Pentaho/Informatica experience will be transferable. Experience working with Oracle RDS databases will also be required.

Job spec:
- Create and maintain design documentation
- Develop project artefacts including (but not limited to) Talend jobs, Oracle DDL and SQL, and GitLab pipeline enhancements
- Review, capture and critique project requirements
- Investigate and resolve any defects/issues raised during testing phases
- Review colleagues' development and test artefacts
- Support production promotion and warranty support
- Raise and maintain a log of risks, assumptions, issues, and dependencies (RAID) identified during development
- Conduct component testing of developed artefacts (including test data creation)
- Support QAs in conducting test phases and debugging issues identified throughout the development lifecycle
- Create and execute test packs for component test automation
- Support the performance testing phase
- Provide support and guidance to colleagues and mentor junior engineers
- Report to the lead engineer and in any scrum-of-scrums discussions

Skills:
- SC clearance
- Data ETL product experience - Talend preferred
- Oracle RDS

Additional requirements:
- Hybrid role: requires attendance for occasional workshops (typically a couple of days per month) at our Telford site.
- Security clearance: candidates must hold active SC clearance for a UK government department.
15/04/2026
Contractor
This is a fantastic opportunity to work for an established ecommerce company on a long-term remote contract as a Data Engineer, inside IR35. The key skills required for this Data Engineer position are:
- Python
- SQL
- ETL
- Azure Databricks

If you have the relevant experience for this remote Data Engineer position, please do apply.
02/04/2026
Contractor
Robert Half is working with a market-leading SaaS organisation to recruit a Junior Data Engineer for an initial 6-month contract. The role will involve building and optimising data pipelines, ensuring data quality, and enabling analytics and reporting across the business.

Responsibilities:
- Develop and maintain data pipelines in Snowflake using Python and SQL.
- Write efficient SQL queries and stored procedures for data processing and reporting.
- Automate ingestion of structured and semi-structured data sources.
- Optimise data models and pipelines for scalability, performance, and cost efficiency.
- Collaborate with analysts and business stakeholders to deliver reliable datasets.
- Implement best practices in data quality, governance, and security.

Experience:
- 2-3 years' experience as a Data Engineer, with strong hands-on Snowflake expertise.
- Proven experience with SQL (advanced query writing, optimisation, performance tuning).
- Strong Python skills for data transformation, automation, and pipeline development.
- Experience with ETL/ELT tools (e.g. dbt, Matillion, Talend, or similar).
- Exposure to cloud platforms (AWS preferred; Azure/GCP also considered).
- Familiarity with Git/version control and CI/CD pipelines.
- Knowledge of modern data warehousing concepts and best practices.

Contract:
- Initial 6-month contract
- Fully remote - UK based
- Day rate - inside IR35

Robert Half Ltd acts as an employment business for temporary positions and an employment agency for permanent positions. Robert Half is committed to diversity, equity and inclusion. Suitable candidates with equivalent qualifications and more or less experience can apply. Rates of pay and salary ranges are dependent upon your experience, qualifications and training. If you wish to apply, please read our Privacy Notice describing how we may process, disclose and store your personal data:
07/10/2025
Contractor
Data Architect - Data Vault, Snowflake, ADF
£Market rate (Inside IR35)
London/Hybrid
6 months

My client is an instantly recognisable insurance brand that urgently requires a Data Architect, with expertise in Data Vault, data modelling, data engineering design, the Azure stack (including Data Factory (ADF), Data Lake Storage, SQL, Azure ML and Power BI), Snowflake and enterprise-scale data sources, to join a business-critical programme ASAP.

Key requirements:
- Proven expertise in data architecture on large, complex data programmes
- Strong data modelling experience (especially conceptual data models)
- Demonstrable background in data engineering design processes (ETL, ELT), with good knowledge of Data Vault, Inmon and Kimball
- Strong knowledge of the MS Azure stack (including Data Factory (ADF), Data Lake Storage, SQL, Azure ML and Power BI)
- Proven experience with Snowflake architecture, including DevOps integration and CI/CD practices
- Previous experience with large, enterprise-scale data sets and sources
- Strong understanding of data governance principles (data stewardship and compliance guardrails, as well as data quality, lineage, lifecycle management and security)
- Excellent communication and stakeholder management skills
- Flexible approach to hybrid working (attending workshops, possibly across different locations if needed)

Nice to have:
- Insurance industry experience
- Previous work on data strategies
- Exposure to collaborative modelling tools such as Ellie.ai
- Immediate availability

If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now. Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers. By applying for this job you accept the T&Cs, Privacy Policy and Disclaimers, which can be found on our website.
03/10/2025
Contractor
Data Engineer - 3-Month Contract - Hybrid - Cardiff

VIQU have partnered with an NHS client who are seeking a Data Engineer to support an ongoing project. The successful Data Engineer will support the modernisation of the data estate, migrating legacy SQL Server warehouses into Azure. You will play a key role in shaping the new cloud data platform.

Responsibilities:
- Build and optimise ETL/ELT pipelines with Azure Data Factory, Synapse, and SQL Database.
- Lead the migration of on-premises SQL Server/SSIS workloads into Azure.
- Design data lakes, marts, and models to support analytics and reporting.
- Integrate external sources, including Google Cloud Platform.
- Drive data governance with tools like Azure Purview.
- Collaborate with architects, BI teams, and stakeholders to ensure seamless data access.
- Apply DevOps (ADO, GitHub) and Agile practices for delivery.

Key skills & experience:
- Strong experience with SQL Server, T-SQL, and SSIS.
- Hands-on with Azure SQL, Data Factory, Synapse, and Data Lake.
- Track record in data warehouse development and cloud migrations.
- Proficient with Azure DevOps/GitHub.
- Strong problem-solving and documentation skills.
- Python for automation (desirable)
- Knowledge of Microsoft Fabric and Azure Purview (desirable)
- Exposure to GCP (desirable)
- NHS experience (desirable)

Contract overview:
Role: Data Engineer
Duration: 3-month contract
IR35: Inside IR35
Rate: £400-£420 per day
Location: Hybrid - Cardiff

Apply now to speak with VIQU IT in confidence, or reach out to Suzie Stone via the VIQU IT website. Do you know someone great? We'll thank you with up to £1,000 if your referral is successful (terms apply). For more exciting roles and opportunities like this, please follow us on IT Recruitment.
03/10/2025
Full time
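SQL Server-to-Azure migrations like the one above typically lean on watermark-based incremental loads (the standard Data Factory copy pattern: pull only rows changed since the last run). A rough pure-Python sketch of the watermark logic; the table and column names are placeholders:

```python
from datetime import datetime

def incremental_extract_query(table: str, watermark_col: str,
                              last_watermark: datetime) -> str:
    """Build the source query for a watermark-based incremental copy.

    Only rows changed since the previous run's high-water mark are
    selected, mirroring the usual copy-activity pattern; the new
    high-water mark is recorded after a successful load.
    """
    return (
        f"SELECT * FROM {table} "
        f"WHERE {watermark_col} > '{last_watermark.isoformat()}'"
    )

# Placeholder table/column: an admissions feed last loaded at New Year.
q = incremental_extract_query("dbo.Admissions", "ModifiedDate",
                              datetime(2025, 1, 1))
```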
Data Engineer 12 months Remote £470 per day inside IR35 - Umbrella only Active SC clearance and eligible candidates will be considered Project description Our client is playing a crucial role in the ESN project in 2025, which aims to modernize and enhance communication for frontline emergency services in the UK. This initiative will greatly enhance the capabilities of emergency services, allowing them to share vital data and information quickly and securely, improving response times and overall public safety. Required Skills DataStage, Redshift, QuickSight, S3 data migration/ETL, both batch and real-time data warehouse development DevSecOps Java SQL relational databases GitHub/GitLab experience Nice to have: Data quality XML AWS Data Specialty certification All profiles will be reviewed against the required skills and experience. Due to the high number of applications we will only be able to respond to successful applicants in the first instance. We thank you for your interest and the time taken to apply!
02/10/2025
Contractor
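Bulk S3-to-Redshift loads like those implied above normally go through the COPY command rather than row-by-row inserts. A minimal sketch of assembling that statement; the table, bucket and IAM role are placeholders:

```python
def redshift_copy_statement(table: str, s3_uri: str, iam_role: str) -> str:
    """Build a Redshift COPY statement for a batch load from S3.

    COPY reads files from S3 in parallel across the cluster, which is
    why it is the usual bulk-ingest path for batch pipelines.
    """
    return (
        f"COPY {table} FROM '{s3_uri}' "
        f"IAM_ROLE '{iam_role}' "
        "FORMAT AS CSV IGNOREHEADER 1 TIMEFORMAT 'auto'"
    )

# Placeholder names: a staging table fed from a dated S3 prefix.
stmt = redshift_copy_statement(
    "staging.incidents",
    "s3://example-bucket/incidents/2025/",
    "arn:aws:iam::123456789012:role/redshift-load",
)
```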
Senior Full Stack & AI Automation Engineer - London Rate: £500 - £530 per day Duration: 1-year contract 3 days onsite per week in London office (Battersea) As a key member of the Automation Team, you'll design and build next-generation automation platforms, blending Python, React, Postgres, and modern DevOps practices with cutting-edge AI/ML techniques such as RAG, vector databases, and multi-agent systems. You'll work across the full software lifecycle, from system design and test automation to deployment and performance monitoring, collaborating with some of the brightest minds in the industry. What You'll Work On Designing and developing automation solutions aligned with Apple's product requirements. Building agentic AI/ML systems (MCP servers/clients, RAG pipelines, vector DBs). Developing both front-end (React, JS) and back-end (Python, APIs, Postgres) features. Driving test automation with Selenium and iOS functional frameworks. Implementing CI/CD pipelines (Jenkins), ensuring high-quality releases. Partnering with cross-functional teams, reporting on sprint progress, and demoing new features. What We're Looking For: 10+ years building operational support systems and automation in Python. Proven leadership in code reviews and mentoring. Hands-on experience with Selenium and iOS functional automation. Strong React/JavaScript UI development background. Back-end engineering with Python + Postgres. AI/ML expertise in LLMs, embeddings, and RAG systems. Experience with ETL pipelines, vector databases, and modern DevOps (Jenkins). Bonus points for: Multi-agent system design. Graph database experience. Familiarity with Charles Proxy, Git, Jenkins. Role Details Location: London (3 days onsite per week) Engagement: Contract - Inside IR35 Environment: Agile/Scrum, cross-functional, highly collaborative Rates depend on experience and client requirements
02/10/2025
Full time
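The RAG retrieval step this role describes reduces to nearest-neighbour search over embeddings. A toy sketch with hand-made three-dimensional vectors; a real system would use learned embeddings and a vector database:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def retrieve(query_vec, corpus):
    """Rank documents by similarity to the query vector (the RAG
    retrieval step, before the LLM sees any context)."""
    return sorted(corpus, key=lambda d: cosine(query_vec, d["vec"]),
                  reverse=True)

# Made-up documents with made-up embeddings.
corpus = [
    {"text": "reset a device", "vec": [0.9, 0.1, 0.0]},
    {"text": "billing question", "vec": [0.0, 0.2, 0.9]},
]
top = retrieve([1.0, 0.0, 0.0], corpus)[0]["text"]
```

The top-ranked text would then be injected into the LLM prompt as grounding context.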
Databricks Engineer London - hybrid - 3 days per week on-site 6 months+ UMBRELLA only - Inside IR35 Key Responsibilities Design, develop, and maintain ETL/ELT pipelines using Airflow for orchestration and scheduling. Build and manage data transformation workflows in dbt running on Databricks. Optimize data models in Delta Lake for performance, scalability, and cost efficiency. Collaborate with analytics, BI, and data science teams to deliver clean, reliable datasets. Implement data quality checks (dbt tests, monitoring) and ensure governance standards. Manage and monitor Databricks clusters & SQL Warehouses to support workloads. Contribute to CI/CD practices for data pipelines (version control, testing, deployments). Troubleshoot pipeline failures, performance bottlenecks, and scaling challenges. Document workflows, transformations, and data models for knowledge sharing. Required Skills & Qualifications 3-6 years of experience as a Data Engineer (or similar). Hands-on expertise with: dbt (dbt-core, dbt-databricks adapter, testing & documentation). Apache Airflow (DAG design, operators, scheduling, dependencies). Databricks (Spark, SQL, Delta Lake, job clusters, SQL Warehouses). Strong SQL skills and understanding of data modelling (Kimball, Data Vault, or similar). Proficiency in Python for scripting and pipeline development. Experience with CI/CD tools (eg, GitHub Actions, GitLab CI, Azure DevOps). Familiarity with cloud platforms (AWS, Azure, or GCP). Strong problem-solving skills and ability to work in cross-functional teams. All profiles will be reviewed against the required skills and experience. Due to the high number of applications we will only be able to respond to successful applicants in the first instance. We thank you for your interest and the time taken to apply!
01/10/2025
Contractor
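The Airflow DAG design this role calls for is, at heart, dependency resolution; the same idea can be sketched with the standard library alone, independent of Airflow itself (the task names are illustrative):

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks that must finish before it,
# mirroring Airflow's upstream/downstream dependency resolution:
# extract -> dbt_run -> dbt_test -> publish.
pipeline = {
    "extract": set(),
    "dbt_run": {"extract"},
    "dbt_test": {"dbt_run"},
    "publish": {"dbt_run", "dbt_test"},
}
order = list(TopologicalSorter(pipeline).static_order())
```

In Airflow proper, the same structure would be declared with operators and `>>` dependencies; the scheduler performs essentially this topological ordering when deciding what can run.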
Strategic Staffing Solutions International
Glasgow, Lanarkshire
I am looking for a SQL Server Developer - AHS Software Engineering Senior Advisor to work for a world-leading Health Insurance organization located in Glasgow for a 3-month contract. This role is inside IR35/Umbrella. You will be an experienced SQL Server Developer/Programmer with strong skills in Data Modelling, Data Mining, and Data import and extraction processes related to SQL Server Databases. The ideal candidate would have more of a full-stack skill set. Experience: Improving system quality by identifying issues and common patterns Maintaining and improving existing codebases and Database structures Optimizing code and queries Working on ETL data transformation and integration using SSIS packages This is a fully remote role, but there is an option to work from the office if you wish, located in Glasgow Centre. Please send me your CV for an immediate review.
23/09/2022
Contractor
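A common optimisation in the SQL Server import/extraction work described above is to load rows in batches rather than one INSERT per row. A minimal stdlib sketch of just the chunking step (batch size and data are illustrative):

```python
from itertools import islice

def batched(rows, size):
    """Yield rows in fixed-size batches, as you would feed them to a
    bulk INSERT or an SSIS data-flow buffer instead of row-by-row."""
    it = iter(rows)
    while batch := list(islice(it, size)):
        yield batch

# Ten illustrative rows, loaded four at a time.
batches = list(batched(range(10), 4))
```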
I am partnered with a Global Technology Business in their search for a Senior Data Engineering contractor. Following the business being acquired, a number of new workflows have become available. This project is focused on moving on-premise Hadoop Data Systems over to Google Cloud Platform. For this we need a Data Engineer who is comfortable working on the following: Java Software Engineering ETL/Pipeline Experience Working with Spark and/or Databricks Experience with On-Prem to Cloud Migrations CI/CD and Container experience Knowledge of Google Cloud Platform If you are a Senior Data Engineer with experience in the Google Cloud Platform, please get in touch; a Java background is preferred, but Python will also be considered. The role is long term, 6 months with the view to extend; the contract is Inside IR35 with a rate of up to £700 per day. To be considered, please send through an up-to-date copy of your CV.
05/02/2022
Contractor
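Much of the transform logic in a Hadoop-to-GCP migration is Spark-style map/reduce aggregation; a pure-Python sketch of the same shape (a real job would express this over Spark RDDs or DataFrames, with the work distributed across executors):

```python
from collections import defaultdict

def word_count(lines):
    """Map each line to (word, 1) pairs, then reduce by key -
    the same shape as a Spark flatMap + reduceByKey job."""
    counts = defaultdict(int)
    for line in lines:            # map phase: split lines into words
        for word in line.split():
            counts[word] += 1     # reduce phase: sum counts per key
    return dict(counts)

result = word_count(["cloud data", "cloud migration"])
```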
Senior Data Engineer/Developer required by a global blue chip to join a cross-functional team in an Agile/Scrum/Kanban environment working on the architecture, design and maintenance of data-led solutions. You will need to have in-depth experience of some of the following: SQL, eg T-SQL, PL/SQL, PostgreSQL. Extensive experience in a NoSQL implementation, eg Hadoop, Mongo, S3, Azure Table/BLOB storage. At least one programming language, eg C#, Java, Python, R. Cloud storage structures, formats, and transformation/querying techniques, eg JSON, XML, XSLT. Experience with SSIS, SSRS beneficial. In addition you should have experience of relational database design, including at least one enterprise SQL implementation. Non-relational database design (NoSQL). Cloud data storage formats and techniques. Development of ETL, data warehouse and data lake solutions. Commercial exposure to at least one enterprise cloud environment, eg AWS, Azure, Oracle, Google. Exposure to AWS products (S3, SQS, Athena, SNS, Lambda) beneficial. Familiarity with BI/MI concepts, multi-dimensional data repositories, warehouses, lakes. Experience of transforming and aggregating data for business use in offline tools, eg Excel, Tableau. Experience of transferring high-volume data between database and business systems, eg Salesforce, and of automating such processes. You should also have familiarity with current software development practices: CI/CD, automated testing, blue/green deployment, feature toggles. Mixture of homeworking and on-site, with 2-3 days a week working from home. £540/day inside IR35.
04/10/2021
Contractor
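The "transforming and aggregating data for business use" requirement is, at its core, group-and-aggregate over semi-structured records. A small stdlib sketch with made-up sales data (field names are illustrative):

```python
import json
from collections import defaultdict

# Made-up semi-structured input, as it might arrive from an API or S3.
raw = json.loads('''[
  {"region": "North", "amount": 120.0},
  {"region": "South", "amount": 80.0},
  {"region": "North", "amount": 30.0}
]''')

def totals_by_region(records):
    """Group records by region and sum amounts, as a BI extract or
    warehouse aggregate table might."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec["region"]] += rec["amount"]
    return dict(totals)

summary = totals_by_region(raw)
```

The same shape scales up as a GROUP BY in SQL or a pivot in Excel/Tableau; the sketch only shows the logic.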