Senior Data Engineer

Opportunity: Permanent
Salary: Competitive
Hybrid: 2 days per week onsite
Office Location: Central London

Our client, a UK-based fintech operating an AI-powered intelligent commerce platform, is seeking a Senior Data Engineer to build and maintain the data infrastructure that drives AI-enabled decision-making across their global operations. You'll design the data systems that underpin both strategic business intelligence and advanced AI capabilities.

Key Responsibilities:
- Design and build data warehouse solutions that support business objectives with optimal performance
- Develop, implement, and maintain robust ELT pipelines consuming data from internal systems and external partners
- Partner with business stakeholders and customers to unlock data insights and build impactful analytical models
- Investigate and resolve system defects, respond to production incidents, and conduct thorough code reviews
- Provide technical leadership by staying current with emerging technologies and mentoring team members

Technical Skills:
- Strong expertise in SQL and database architecture
- Practical experience with data warehousing platforms (Snowflake, PostgreSQL, MySQL)
- Solid Python programming capabilities for data engineering workflows
- Proven track record designing and maintaining production-grade ELT/ETL solutions

Services offered by Computappoint Limited are those of an Employment Business and/or Employment Agency in relation to this vacancy.
10/10/2025
Full time
Lead Data Engineer

Opportunity: Permanent
Salary: Up to £85k
Hybrid: 2 days per week onsite
Office Location: Central London

Our client, a UK-based fintech, is seeking a Lead Data Engineer to take ownership of their data infrastructure and play a pivotal role in powering AI-driven decision-making across the organisation. You'll be the architect behind the data systems that fuel both business intelligence and cutting-edge AI capabilities.

Key Responsibilities:
- Architect and own the data warehouse infrastructure, ensuring optimal performance and accessibility
- Design, build, and maintain ELT pipelines that integrate internal systems and external partner data
- Collaborate with stakeholders and customers to deliver actionable insights and refined data models
- Resolve production issues and bugs, and participate in code reviews to maintain engineering quality
- Mentor team members and champion emerging technologies to advance our data capabilities

Technical Skills:
- Deep expertise in SQL and database design principles
- Hands-on experience with modern data warehousing platforms (Snowflake, PostgreSQL, MySQL)
- Proficiency in Python for data engineering tasks
- Experience building and maintaining production ELT/ETL pipelines

Services offered by Computappoint Limited are those of an Employment Business and/or Employment Agency in relation to this vacancy.
10/10/2025
Full time
Senior Analytics Engineer

Location: London (Hybrid - 1 day per month)
Team: Data & Analytics

About the Role:
We're looking for a Senior Analytics Engineer who combines strong technical expertise with business awareness and a passion for clean, scalable data. In this role, you'll design and maintain robust data models, support data infrastructure, and work closely with stakeholders to deliver trusted, self-serve analytics solutions. You'll take ownership of how data is modelled and shared across the business, enabling insight and supporting strategic decisions. This role is ideal for someone who thrives in fast-paced, evolving environments and wants to have a lasting impact on data culture and capability. You'll be based in London, with an expectation to work from the office once a week.

Requirements:
- Proven experience as an Analytics Engineer or in a similar data engineering/BI role
- Advanced SQL skills with hands-on experience using dbt for data modelling
- Strong familiarity with modern data platforms such as Snowflake, Looker, AWS/GCP, Tableau, or Airflow
- Experience with version control tools (e.g., Git)
- Ability to design, build, and document scalable, reliable data models
- Comfortable gathering business requirements and translating them into data architecture
- Strong problem-solving skills and attention to detail
- Clear communicator with the ability to support non-technical users and promote self-serve analytics

What Will Make You Stand Out:
- Exposure to Python for analytics engineering or automation tasks
- Background in the insurance or financial services sector (a plus, not essential)
- A commercial mindset: you understand how data supports business goals and can design solutions with impact in mind
- Curiosity and initiative around using AI tools to streamline workflows, increase productivity, or enhance analysis
- Passion for improving processes and evolving best practices across the data team

The client is looking to move quickly; if you would like to follow up on your application, please email or message me on LinkedIn.
09/10/2025
Full time
Data Engineer

£700/day outside IR35
3-month initial contract
Remote working - occasional site visits

We're working with a leading financial services client who are looking for a Data Engineering Consultant: someone who can act independently and deliver complex data products - working with stakeholders to understand requirements, then engineering and delivering products.

Data Engineering Consultant, key responsibilities:
- Work cross-functionally with non-technical stakeholders to understand data requirements
- Expert knowledge of SQL to query, analyse, and model data
- Experience with Snowflake
- Using dbt for data transforms and modelling
- Data ingestion using Python
- Build foundations for business functions to be able to access the correct data and insights
- Experience working in large development teams

Reasonable Adjustments:
Respect and equality are core values to us. We are proud of the diverse and inclusive community we have built, and we welcome applications from people of all backgrounds and perspectives. Our success is driven by our people, united by the spirit of partnership to deliver the best resourcing solutions for our clients. If you need any help or adjustments during the recruitment process for any reason, please let us know when you apply or talk to the recruiters directly so we can support you.
09/10/2025
Full time
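The ingestion-and-modelling loop this role describes - pull data in with Python, then query and model it with SQL - can be sketched in miniature. Everything below is invented for illustration (table, columns, sample rows), and sqlite3 stands in for a warehouse such as Snowflake:

```python
import csv
import io
import sqlite3

# Hypothetical sample extract from an upstream system; a real feed would
# arrive via files or APIs rather than an inline string.
RAW_CSV = """account_id,balance,updated_at
A001,1250.50,2025-10-01
A002,90.00,2025-10-02
A001,1300.00,2025-10-03
"""

def ingest(conn: sqlite3.Connection, raw: str) -> int:
    """Load raw CSV rows into a staging table, returning the row count."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS stg_accounts "
        "(account_id TEXT, balance REAL, updated_at TEXT)"
    )
    rows = [
        (r["account_id"], float(r["balance"]), r["updated_at"])
        for r in csv.DictReader(io.StringIO(raw))
    ]
    conn.executemany("INSERT INTO stg_accounts VALUES (?, ?, ?)", rows)
    return len(rows)

def latest_balances(conn: sqlite3.Connection) -> dict:
    """Model step: keep only each account's most recent balance."""
    cur = conn.execute(
        """
        SELECT account_id, balance FROM stg_accounts s
        WHERE updated_at = (
            SELECT MAX(updated_at) FROM stg_accounts
            WHERE account_id = s.account_id
        )
        """
    )
    return dict(cur.fetchall())

conn = sqlite3.connect(":memory:")
loaded = ingest(conn, RAW_CSV)
balances = latest_balances(conn)
```

In a production pipeline the same split holds: a thin Python layer lands raw data, and the "correct data and insights" come from SQL models layered on top.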
Data Engineer

£500 - £560 per day
London - 1 day per week in office

We're working with a leading global healthcare technology company who are building out their next-generation data platform, with a strong emphasis on automation, testing, and cloud-native engineering, and are looking for an experienced Data Engineer to join their team.

The Role:
You'll be part of a modern data engineering function that's implementing best-in-class data practices across ingestion, transformation, and orchestration layers. The environment is highly technical, collaborative, and fast-paced, giving you the opportunity to work across a wide variety of data sources and tools.

Day-to-day responsibilities include:
- Designing and developing dbt models and Airflow pipelines within a modern data stack.
- Building robust data ingestion pipelines across multiple sources - including external partners, internal platforms, and APIs.
- Implementing automated testing and CI/CD pipelines for data workflows.
- Performing data extraction and enrichment, including web scraping and parsing of unstructured text (e.g., scanned forms and documents).
- Collaborating on forecasting and predictive analytics initiatives.
- Bringing modern engineering practices, testing frameworks, and design patterns to the wider data function.

Tech Stack & Skills

Core skills:
- Strong experience with dbt, Airflow, Snowflake, and Python
- Proven background in automated testing, CI/CD, and test-driven development
- Experience building and maintaining data pipelines and APIs in production environments

Nice to have:
- Knowledge of Snowflake infrastructure and data architecture design
- Experience using LLMs or MLOps frameworks for data extraction or model training
- Familiarity with cloud-agnostic deployments and version control best practices

What You'll Bring:
- A proactive, hands-on approach to engineering challenges
- A passion for data quality, scalability, and performance
- The ability to influence best practices and introduce modern standards across a data estate
- Strong problem-solving skills and the confidence to work across multiple complex data sources

Why Join?
This is an opportunity to help shape the data foundations of a high-impact healthcare technology business - one that's actively exploring the intersection of data engineering, MLOps, and AI. You'll have ownership of end-to-end data workflows, work with a world-class tech stack, and join a forward-thinking team that values automation, collaboration, and innovation.
09/10/2025
Full time
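The unstructured-text parsing this role mentions (e.g., scanned forms) often starts as little more than disciplined pattern-matching plus normalisation, wrapped in automated tests. A rough sketch - the form layout and field names are invented:

```python
import re
from datetime import date

# Hypothetical OCR output from a scanned form; real documents would be far
# messier, and these field names are illustrative only.
SCANNED = """
Patient Ref:  HX-20931
Visit Date :  03/10/2025
Outcome    :  follow-up required
"""

# Loosely formatted "Key : value" lines, one per row.
FIELD_RE = re.compile(r"^(?P<key>[A-Za-z ]+?)\s*:\s*(?P<value>.+?)\s*$", re.MULTILINE)

def parse_form(text: str) -> dict:
    """Turn 'Key : value' lines into a dict with normalised keys and dates."""
    fields = {
        m["key"].strip().lower().replace(" ", "_"): m["value"]
        for m in FIELD_RE.finditer(text)
    }
    # Normalise DD/MM/YYYY into ISO format so downstream models can sort it.
    if "visit_date" in fields:
        d, mth, y = fields["visit_date"].split("/")
        fields["visit_date"] = date(int(y), int(mth), int(d)).isoformat()
    return fields

parsed = parse_form(SCANNED)
```

Parsers like this are exactly where the posting's emphasis on test-driven development pays off: each newly observed form quirk becomes a fixture and an assertion.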
Robert Half Technology are assisting a cutting-edge AI organisation to recruit a Data Engineer on a contract basis - remote working - UK based. We're looking for a Junior Data Engineer who's excited to help build the data foundations that power cutting-edge AI solutions. You'll join a high-impact team working at the intersection of data, analytics, and machine learning - designing pipelines and infrastructure that make innovation possible at scale.

Role:
- Design, build, and maintain scalable data pipelines that fuel AI and analytics initiatives.
- Partner closely with data scientists, analysts, and engineers to deliver clean, structured, and reliable data.
- Develop robust data transformations in Python and SQL, ensuring performance and accuracy.
- Work hands-on with Snowflake to model, optimise, and manage data flows.
- Continuously improve data engineering practices - from automation to observability.
- Bring ideas to the table: help shape how data is collected, processed, and leveraged across the business.

Profile:
- The Data Engineer will ideally have 2-5 years of experience in data engineering or a similar role.
- Strong Python skills for data processing and pipeline development.
- Proven experience writing complex and efficient SQL.
- Hands-on Snowflake experience (data modelling, performance tuning, pipelines).
- Familiarity with orchestration tools (e.g., Airflow, dbt) is a plus.
- A solid understanding of data best practices, version control (Git), and CI/CD.

Company:
Rapidly growing, cutting-edge AI organisation. Remote working - offices in London if preferred.

Salary & Benefits:
The salary range/rates of pay are dependent upon your experience, qualifications and training.

Robert Half Ltd acts as an employment business for temporary positions and an employment agency for permanent positions. Robert Half is committed to diversity, equity and inclusion. Suitable candidates with equivalent qualifications and more or less experience can apply. If you wish to apply, please read our Privacy Notice describing how we may process, disclose and store your personal data: gb/en/privacy-notice.
09/10/2025
Full time
Senior Data Platform Engineer

Hybrid London (3 days per week)
Up to £95,000 + Benefits

Are you an experienced Data Engineer ready to take ownership of a cutting-edge cloud data platform? We're partnering with a high-growth fintech company that's transforming how analytics powers decision-making - and they're looking for a Senior Data Platform Engineer to help design, scale, and optimise their AWS and Snowflake-based infrastructure.

Why this role?
- Join one of the UK's fastest-growing fintechs, where data is at the heart of every decision.
- Work with a modern stack - AWS, Snowflake, Python, Terraform, CI/CD, and Fivetran.
- Shape the future of a scalable, secure, and cost-efficient data platform used across the business.
- Hybrid working: 3 days a week in the London office, combining collaboration with flexibility.

What you'll be doing:
- Designing, building, and evolving data infrastructure across AWS and Snowflake.
- Developing tooling and automation to improve analytics efficiency.
- Onboarding new data sources and ensuring smooth, reliable ingestion and processing.
- Implementing monitoring, testing, and observability to guarantee platform robustness.
- Driving improvements in CI/CD pipelines for faster, more reliable deployments.
- Managing infrastructure as code with Terraform, ensuring security and compliance.
- Collaborating with analysts, engineers, and business teams to turn data into action.
- Troubleshooting production issues and continuously improving performance and cost efficiency.

What we're looking for:
- Strong software engineering background - ideally with Python or similar languages.
- Proven experience working with AWS and Snowflake in production environments.
- Solid grasp of data engineering best practices - testing, deployment, and configuration as code.
- Experience developing containerised services and cloud-based data applications.
- Proficiency in Terraform or other IaC tools.
- Familiarity with CI/CD pipelines (GitHub Actions preferred).
- Excellent communicator - able to work closely with technical and business stakeholders.

Nice-to-haves:
- Experience with Kubernetes, Airflow/Prefect, or data observability tools (Monte Carlo, Great Expectations).
- Exposure to data cataloguing tools (Amundsen, OpenMetadata).

Package & Benefits:
- Salary up to £95,000, depending on experience.
- Hybrid working - 3 days per week in the London office.
09/10/2025
Full time
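The monitoring and observability work described above usually boils down to codified checks over table metadata (row counts, load times) that run on a schedule and alert on failure. A minimal sketch - the thresholds, table names, and stats are all invented:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Illustrative thresholds; a real platform would load these from config
# (or a tool like Great Expectations) rather than hard-coding them.
MAX_STALENESS = timedelta(hours=24)
MIN_ROWS = 1

@dataclass
class TableStats:
    name: str
    row_count: int
    last_loaded_at: datetime

def check_table(stats: TableStats, now: datetime) -> list:
    """Return human-readable failures; an empty list means healthy."""
    failures = []
    if stats.row_count < MIN_ROWS:
        failures.append(f"{stats.name}: empty table")
    if now - stats.last_loaded_at > MAX_STALENESS:
        failures.append(
            f"{stats.name}: stale (last load {stats.last_loaded_at:%Y-%m-%d %H:%M})"
        )
    return failures

now = datetime(2025, 10, 9, 12, 0, tzinfo=timezone.utc)
fresh = TableStats("payments", 120, now - timedelta(hours=2))
stale = TableStats("refunds", 0, now - timedelta(days=3))
fresh_failures = check_table(fresh, now)
stale_failures = check_table(stale, now)
```

Dedicated observability tools layer anomaly detection and lineage on top, but the contract is the same: each check either passes silently or produces an actionable message.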
Data Engineer (with Data Analytics Background)

Location: City of London
Employment Type: Full-time
Salary: £90,000 - £100,000
Sector: Fintech

We're looking for a well-rounded, communicative Data Engineer with a strong background in data analytics and experience within the Fintech sector. This role is ideal for someone who began their career as a Data Analyst and has since transitioned into a more engineering-focused position - someone who enjoys understanding the business context just as much as building the data solutions behind it. You'll work extensively with Python, Snowflake, SQL, and dbt to design, build, and maintain scalable, high-quality data pipelines and models that support decision-making across the business. This is a hands-on, collaborative role, suited to someone who's confident communicating with data, product, and engineering teams - not a "heads-down coder" type.

Top 4 Core Skills:
- Python - workflow automation, data processing, and ETL/ELT development.
- Snowflake - scalable data architecture, performance optimisation, and governance.
- SQL - expert-level query writing and optimisation for analytics and transformations.
- dbt (Data Build Tool) - modular data modelling, testing, documentation, and version control.

Key Responsibilities:
- Design, build, and maintain dbt models and SQL transformations to support analytical and operational use cases.
- Develop and maintain Python workflows for data ingestion, transformation, and automation.
- Engineer scalable, performant Snowflake pipelines and data models aligned with business and product needs.
- Partner closely with analysts, product managers, and engineers to translate complex business requirements into data-driven solutions.
- Write production-grade SQL and ensure data quality through testing, documentation, and version control.
- Promote best practices around data reliability, observability, and maintainability.
- (Optional but valued) Contribute to Infrastructure as Code and CI/CD pipelines (e.g., Terraform, GitHub Actions).

Skills & Experience:
- 5+ years of experience in data-focused roles, ideally progressing from Data Analyst to Data Engineer.
- Proven Fintech or Payments industry experience - strong understanding of the data challenges and regulatory context within the sector.
- Deep proficiency in Python, Snowflake, SQL, and dbt.
- Excellent communication and collaboration skills, with the ability to work effectively across data, product, and business teams.
- Solid grasp of modern data modelling techniques (star/snowflake schemas, data contracts, documentation).
- Experience working in cloud-based environments; familiarity with Terraform or similar IaC tools is a plus.
- Proactive, delivery-focused, and able to contribute quickly in a fast-moving environment.

Nice to Have:
- Experience with Power BI or other data visualisation tools.
- Familiarity with orchestration tools such as Airflow, Prefect, or Dagster.
- Understanding of CI/CD practices in data and analytics engineering.
- Knowledge of data governance, observability, and security best practices in cloud environments.
09/10/2025
Full time
Robert Half is working with a market-leading SaaS organisation to recruit a Junior Data Engineer for an initial 6-month contract. The role will involve building and optimising data pipelines, ensuring data quality, and enabling analytics and reporting across the business. Responsibilities: Develop and maintain data pipelines in Snowflake using Python and SQL. Write efficient SQL queries and stored procedures for data processing and reporting. Automate ingestion of structured and semi-structured data sources. Optimise data models and pipelines for scalability, performance, and cost efficiency. Collaborate with analysts and business stakeholders to deliver reliable datasets. Implement best practices in data quality, governance, and security. Experience: 2-3 years' experience as a Data Engineer, with strong hands-on Snowflake expertise. Proven experience with SQL (advanced query writing, optimisation, performance tuning). Strong Python skills for data transformation, automation, and pipeline development. Experience with ETL/ELT tools (e.g., dbt, Matillion, Talend, or similar). Exposure to cloud platforms (AWS preferred, Azure/GCP also considered). Familiarity with Git/version control and CI/CD pipelines. Knowledge of modern data warehousing concepts and best practices. Contract: Initial 6-Month Contract Fully Remote - UK Based Day Rate - Inside IR35 Robert Half Ltd acts as an employment business for temporary positions and an employment agency for permanent positions. Robert Half is committed to diversity, equity and inclusion. Suitable candidates with equivalent qualifications and more or less experience can apply. Rates of pay and salary ranges are dependent upon your experience, qualifications and training. If you wish to apply, please read our Privacy Notice describing how we may process, disclose and store your personal data:
07/10/2025
Contractor
Senior Data Engineer - York - Hybrid/Remote Senior Data Engineer - Data Warehouse / Microsoft Fabric Roc Search are currently recruiting for a Senior Data Engineer to join an expanding company based in York. They have a continuous programme of growth at the moment surrounding their Data Warehouse, which you will be working on. Day to day you will be working with 3rd parties in the business, as well as other stakeholders responsible for making certain decisions surrounding data engineering. You may also have to deputise for the data engineering manager from time to time. This is a really great time to join as they are investing heavily in their IT systems at the moment and rapidly expanding, so there is plenty of opportunity to develop and utilise your skills and career with them. They are happy for you to work predominantly remotely, coming into the office once every 4-6 weeks. Essential skills include experience with: Data Warehousing At least one of the following: Microsoft Fabric, Synapse, Snowflake or AWS Python SQL Advantageous skills include experience with: Power BI Microsoft Fabric Leadership European language skills This is an excellent opportunity for the successful applicant to gain exposure to a company with great rewards and career progression, with continual professional development and training. Apply now for immediate consideration. As a professional company we gladly welcome applications from persons of any age and background and do not intend to discriminate with advert text and terminology.
07/10/2025
Full time
A global investment bank is seeking an experienced Database Developer to support the design, automation, and scaling of its distributed data platforms, including Snowflake, PostgreSQL, and Greenplum. You'll work within the enterprise data infrastructure team, driving performance optimization and modernization across cloud and on-premise environments. Database Developer - Snowflake & PostgreSQL (Contract) Global Investment Bank Glasgow £400-£500/day (via umbrella) Hybrid - 3 Days Onsite Duration: 12 Months Initial Term Key Responsibilities Design, deploy, and manage distributed database systems across Snowflake, PostgreSQL, and Greenplum. Develop infrastructure-as-code and automation for provisioning, scaling, and monitoring clusters. Optimize platform performance and support high availability, disaster recovery, and cost efficiency. Lead multi-cloud database deployments across AWS and Azure environments. Collaborate with data engineers and platform teams to ensure secure, scalable database solutions. Drive modernization efforts including cloud-native adoption, serverless design, and AI/ML integration. Implement governance policies around access, encryption, and compliance standards. Mentor junior team members and support cross-training initiatives. Required Skills & Experience Proven expertise in PostgreSQL and Snowflake administration, tuning, and scaling. Experience managing large-scale distributed systems (10K+ clusters). Strong skills in Python, Terraform, Ansible, or similar automation tools. Familiarity with CI/CD pipelines, infrastructure-as-code, and DevOps practices. Hands-on experience in AWS and/or Azure cloud platforms. Understanding of modern data architecture, data warehousing, and cloud-native patterns. Nice to Have Experience with Snowflake design patterns and migration workflows. Knowledge of geo-redundant deployments and clustering tools (e.g., Patroni, GoldenGate).
Awareness of Snowflake Cortex AI, SaaS offerings, and third-party integrations. Familiarity with data governance, security, and compliance frameworks. Contract Details Rate: £400-£500/day (via umbrella) Location: Glasgow (Hybrid - 3 days onsite) Duration: 12 months initially with potential extension Robert Walters Operations Limited is an employment business and employment agency and welcomes applications from all candidates
07/10/2025
Full time
Fully Remote Up to £120k Base + Bonus + Stock Options We're partnered with a high-growth FinTech scale-up at the forefront of building next-generation market surveillance and risk monitoring technology. Backed by leading investors and trusted by global financial institutions, regulators, and crypto-native firms, they process trillions of market events daily to protect investors, detect manipulation, and ensure integrity across both traditional finance and digital asset markets. As a Senior Software Engineer, you'll play a key role in designing and scaling advanced systems and algorithms that safeguard some of the most dynamic and complex markets in the world. This is a chance to build high-performance, low-latency, and cloud-native systems while tackling challenges in fraud detection, data engineering, and market integrity. What you'll do Design, implement, and optimise algorithms for detecting manipulation, fraud, and compliance breaches in real-time and batch data. Build and scale high-volume data pipelines handling billions of daily events across distributed systems. Work with cutting-edge tech including Java, Kafka, Spark, Kubernetes, Clickhouse, Snowflake, Redis. Collaborate with quants, data scientists, and compliance experts to improve surveillance strategies. Contribute to system architecture for low-latency monitoring and high-throughput processing. Continuously improve data quality, reliability, and query optimisation at scale. What we're looking for 5+ years' experience in software engineering, ideally in financial markets, market surveillance, or large-scale data systems. Strong skills in Java (or similar: Kotlin, C#, C++); Python is a plus. Proven experience with real-time, low-latency systems and distributed data pipelines. Hands-on with Kafka, Spark, K8s, Clickhouse, Snowflake, and modern data architectures. Solid foundations in algorithms, system design, and optimisation.
Curious, proactive, and comfortable working in a fast-moving, scaling environment. Why apply? Fully remote role with global reach. Up to £120k base + bonus + stock options. Build mission-critical systems at the intersection of finance, crypto, and AI/ML-driven surveillance. Work alongside world-class engineers, quants, and technologists shaping the future of market integrity. McGregor Boyall is an equal opportunity employer and does not discriminate on any grounds.
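The real-time manipulation-detection work this role describes typically begins with simple streaming statistics: flag an event that deviates sharply from a rolling baseline. A hedged, self-contained sketch of that idea in plain Python — the window size, threshold, and trade sizes are invented for illustration, and a production system would run an equivalent check over Kafka/Spark streams rather than a list:

```python
from collections import deque

def spike_detector(window=5, threshold=3.0):
    """Return a checker that flags a trade size exceeding `threshold`
    times the rolling mean of the last `window` observations."""
    recent = deque(maxlen=window)

    def check(size):
        # Compare against the baseline *before* admitting the new observation,
        # so a spike cannot dilute the mean it is judged against.
        is_spike = bool(recent) and size > threshold * (sum(recent) / len(recent))
        recent.append(size)
        return is_spike

    return check

check = spike_detector()
stream = [100, 110, 95, 105, 100, 900, 98]
flags = [check(s) for s in stream]
print(flags)  # only the 900 stands out against its rolling baseline
```

Real surveillance engines layer many such detectors (wash trading, spoofing, layering) and tune thresholds per instrument, but the rolling-baseline comparison is the common core.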
06/10/2025
Full time
I am recruiting for a Data Engineer to work in Glasgow 3 days a week, 2 days remote. The role falls inside IR35, so you will have to work through an umbrella company. You will have several years of experience supporting Software Engineering, Data Engineering, or Data Analytics projects. You will have experience of designing and implementing tailored data solutions to meet customer needs and use cases, spanning from streaming to data lakes, analytics, and beyond within a dynamically evolving technical stack. Experience in data development and solutions in highly complex data environments with large data volumes. SQL/PLSQL experience with the ability to write ad-hoc and complex queries to perform data analysis. Databricks experience is essential. Experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, PySpark, etc. You will be able to develop solutions in a hybrid data environment (on-prem and cloud). You must be able to collaborate seamlessly across diverse technical stacks, including Cloudera, Databricks, Snowflake, Azure, AWS, etc. Hands-on experience with developing data pipelines for structured, semi-structured, and unstructured data and experience integrating with their supporting stores (e.g. RDBMS, NoSQL DBs, Document DBs, log files, etc.). Please apply ASAP to find out more!
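"Semi-structured data" in ads like this usually means nested JSON from logs or document stores that must be flattened into tabular form before loading into a warehouse. A minimal stdlib-only sketch of that step — the sample records and the dotted-column naming convention are illustrative assumptions, not anything specified by the role:

```python
import json

# Hypothetical raw events, as they might arrive from a log file or document DB.
raw = [
    '{"id": 1, "user": {"name": "amy", "country": "UK"}, "amount": 12.5}',
    '{"id": 2, "user": {"name": "bob"}, "amount": 7.0, "tags": ["refund"]}',
]

def flatten(record, prefix=""):
    """Recursively flatten nested dicts into dotted column names."""
    out = {}
    for key, value in record.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            out.update(flatten(value, prefix=name + "."))
        else:
            out[name] = value
    return out

rows = [flatten(json.loads(line)) for line in raw]
# The union of keys across rows gives a stable tabular schema;
# missing fields become NULLs, as they would in a warehouse table.
columns = sorted({k for row in rows for k in row})
table = [[row.get(c) for c in columns] for row in rows]
print(columns)
```

Platforms such as Snowflake and Databricks can query nested JSON directly, but an explicit flattening pass like this is still the common route into conventional relational models.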
03/10/2025
Contractor
Lead Data Engineer, Snowflake, 4 months, Remote, CO7368 A leading software consultancy requires a Lead Data Engineer / Tech Lead with strong Snowflake skills - for the next 4 months with the likelihood of significant extension - to provide technical leadership for a prestigious client project to deliver an innovative data platform. Working remotely and outside of IR35, the Snowflake Lead Data Engineer / Tech Lead will actively work on delivering the technical solutions whilst carrying out code reviews, setting best practices, ensuring code quality, keeping everyone aligned from a technical standpoint, and fostering good communication and collaboration. This will involve: Working with the project owner to gather ongoing requirements from the client. Improving the transparency and traceability of data through well-documented pipelines and clear data lineage. Empowering internal teams with cleaner, structured data, reducing dependence on tribal knowledge and manual interventions. Enabling future scalability by establishing robust, modular foundations. Experience Required: Snowflake expertise Solid data engineering expertise including strong SQL and Python Proficiency in DBT for transformation tasks Experience with Dagster or a similar orchestration tool Familiarity with Azure So What's Next? If you are an experienced Snowflake-focused Lead Data Engineer / Tech Lead and are available for a prompt start, apply now for immediate consideration! Corriculo Ltd acts as an employment agency and an employment business.
03/10/2025
Full time
Snowflake Data Engineer, 4 months, Remote, CO7369 A leading software consultancy requires a Snowflake Data Engineer - for the next 4 months - to work on a prestigious client project to deliver an innovative data platform. Working remotely and outside of IR35, the Snowflake Data Engineer will: Improve the transparency and traceability of data through well-documented pipelines and clear data lineage. Ensure seamless data integration and architecture. Optimise data processing performance. Collaborate with development and data teams to support data-driven features and insights. Experience Required: Snowflake expertise Solid data engineering expertise including strong SQL and Python Proficiency in DBT for transformation tasks Experience with Dagster or a similar orchestration tool Familiarity with Azure So What's Next? If you are an experienced Snowflake-focused Data Engineer and are available for a prompt start, apply now for immediate consideration! Corriculo Ltd acts as an employment agency and an employment business.
03/10/2025
Full time
Snowflake Data Modeller Location: Remote (mostly remote with occasional travel to Hemel Hempstead) Contract: Outside IR35 Day rate: Up to £550 per day Duration: 6 months Start date: ASAP Key skills: Snowflake, DBT, SQL, Python, AWS and Kimball The client is in the process of migrating to Snowflake and therefore requires extra support. As a result, they require someone from a strong SQL and Python development background with excellent working knowledge of Data Build Tool (DBT). You will be undertaking aspects of the development life cycle and be experienced in data modelling, process design, development, and testing. And whilst this company is going through a large-scale migration, this will present you with an opportunity to be at the cutting edge of data engineering. YOUR SKILLS AND EXPERIENCE A successful Senior Data Engineer here will have experience in the following: - Advanced SQL knowledge - Snowflake (ideally certified) - Python development - AWS cloud experience essential, relating to data tooling and development - Working knowledge of Data Build Tool (DBT): o Develop staging, intermediate, and mart models in DBT to meet analytics requirements o Optimise existing models to make them more reusable by following DBT best practices o Spot opportunities for models to run incrementally rather than as a full refresh o Organise and document sources in DBT o Develop DBT macros for the most frequently repeated tasks - Knowledge of Continuous Integration and Continuous Deployment (CI/CD) - Development experience in one or more object-oriented programming languages (e.g. Python) - Ability to work with senior stakeholders across multiple verticals
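The "incremental over full refresh" point above refers to the high-water-mark pattern: each run loads only source rows newer than the latest value already in the target, instead of rebuilding the whole table. A small illustrative sketch using SQLite in place of Snowflake — the table names and timestamp column are invented, and in DBT this logic is what an incremental materialisation with `is_incremental()` generates for you:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_orders (order_id INTEGER, updated_at TEXT)")
conn.execute("CREATE TABLE stg_orders (order_id INTEGER, updated_at TEXT)")
conn.executemany("INSERT INTO src_orders VALUES (?, ?)",
                 [(1, "2025-01-01"), (2, "2025-01-02")])

def incremental_load(conn):
    """Copy only source rows newer than the target's high-water mark."""
    hwm = conn.execute(
        "SELECT COALESCE(MAX(updated_at), '') FROM stg_orders"
    ).fetchone()[0]
    conn.execute(
        "INSERT INTO stg_orders "
        "SELECT * FROM src_orders WHERE updated_at > ?", (hwm,)
    )
    return conn.execute("SELECT COUNT(*) FROM stg_orders").fetchone()[0]

print(incremental_load(conn))   # first run loads both rows -> 2
conn.execute("INSERT INTO src_orders VALUES (3, '2025-01-03')")
print(incremental_load(conn))   # second run picks up only the new row -> 3
```

The trade-off the ad alludes to: incremental runs are far cheaper on large tables, but require a reliable monotonic column and separate handling for late-arriving updates.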
03/10/2025
Contractor
I am recruiting for a Data Engineer to work in Glasgow 3 days a week, 2 days remote. The role falls inside IR35, so you will have to work through an umbrella company. You will have several years of experience supporting Software Engineering, Data Engineering, or Data Analytics projects. You will have experience of designing and implementing tailored data solutions to meet customer needs and use cases, spanning from streaming to data lakes, analytics, and beyond within a dynamically evolving technical stack. Experience in data development and solutions in highly complex data environments with large data volumes. SQL/PLSQL experience with the ability to write ad-hoc and complex queries to perform data analysis. Databricks experience is essential. Experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, PySpark, etc. You will be able to develop solutions in a hybrid data environment (on-prem and cloud). You must be able to collaborate seamlessly across diverse technical stacks, including Cloudera, Databricks, Snowflake, Azure, AWS, etc. Hands-on experience with developing data pipelines for structured, semi-structured, and unstructured data and experience integrating with their supporting stores (e.g. RDBMS, NoSQL DBs, Document DBs, log files, etc.). Please apply ASAP to find out more!
03/10/2025
Full time
Senior AWS Data Engineer - London - £125,000 Please note - this role will require you to work from the London based office. You must have the unrestricted right to work in the UK to be eligible for this role. This organisation is not able to offer sponsorship. An exciting opportunity to join a greenfield initiative focused on transforming how market data is accessed and utilised. As a Senior AWS Data Engineer, you'll play a key role in designing and building a cutting-edge data platform using technologies like Databricks, Snowflake, and AWS Glue. Key Responsibilities: Build and maintain scalable data pipelines, warehouses, and lakes. Design secure, high-performance data architectures. Develop processing and analysis algorithms for complex data sets. Collaborate with data scientists to deploy machine learning models. Contribute to strategy, planning, and continuous improvement. Required Experience: Hands-on experience with AWS data tools: Glue, PySpark, Athena, Iceberg, Lake Formation. Strong Python and SQL skills for data processing and analysis. Deep understanding of data governance, quality, and security. Knowledge of market data and its business applications. Desirable Experience: Experience with Databricks and Snowflake. Familiarity with machine learning and data science concepts. Strategic thinking and ability to influence cross-functional teams. This role offers the chance to work across multiple business areas, solve complex data challenges, and contribute to long-term strategic goals. You'll be empowered to lead, collaborate, and innovate in a dynamic environment. To apply for this role please submit your CV or contact David Airey on or at . Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK offering more opportunities across the country than any other recruitment agency. We're the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.
03/10/2025
Full time
Senior Python Software Engineer required by my asset management client in London. You MUST have the following: Advanced ability as a Senior Python Software Engineer, Technical Lead, Solutions Architect or Principal Engineer. Good design and architecture ability. Java. Three or more of the following: Iceberg, Dremio, DBT, Arrow, Snowflake, Glue, Athena, Airflow, Agile. The following is DESIRABLE, not essential: Trading, Front Office finance. Spark. Buy-side asset management (hedge fund, asset manager, investment management). Role: You will join a relatively new department that is responsible for the data used across the Front Office. The data is sourced from a variety of external vendors and internal departments and held in an AWS data lake, although this is being migrated to a data mesh architecture. They are working heavily with Python, Java and AWS. Any experience with Iceberg, Dremio, DBT, Arrow, Spark, Snowflake, Glue, Athena, Airflow or related tools would also be very advantageous.
You will join a team of five responsible for pricing data for the Front Office. This is a senior role in the team and will demand strong design and architecture ability. If you come in at the right level, you could deputise for the team manager. They have a very flexible hybrid working set-up. Salary: £120-150k + 15% Bonus + 10% Pension
03/10/2025
Full time
Senior Java Software Engineer required by my asset management client in London. You MUST have the following: Advanced ability as a Senior Java Software Engineer, Technical Lead, Solutions Architect or Principal Engineer. Good design and architecture ability. Python. Three or more of the following: Iceberg, Dremio, DBT, Arrow, Snowflake, Glue, Athena, Airflow, Agile. The following is DESIRABLE, not essential: Trading, Front Office finance. Spark. Buy-side asset management (hedge fund, asset manager, investment management). Role: You will join a relatively new department that is responsible for the data used across the Front Office. The data is sourced from a variety of external vendors and internal departments and held in an AWS data lake, although this is being migrated to a data mesh architecture. They are working heavily with Java, Python and AWS. Any experience with Iceberg, Dremio, DBT, Arrow, Spark, Snowflake, Glue, Athena, Airflow or related tools would also be very advantageous.
You will join a team of five responsible for pricing data for the Front Office. This is a senior role in the team and will demand strong design and architecture ability. If you come in at the right level, you could deputise for the team manager. They have a very flexible hybrid working set-up. Salary: £90-120k + 15% Bonus + 10% Pension
03/10/2025
Full time