  • Home
  • Find IT Jobs
  • Register CV
  • Career Advice
  • Contact us
  • Employers
    • Register as Employer
    • Pricing Plans
  • Recruiting? Post a job
Sorry, that job is no longer available. Here are some results that may be similar to the job you were looking for.

49 jobs found

Current search: data engineer python airflow
Senior Data Engineer
ISR Recruitment Ltd Manchester, Lancashire
Senior Data Engineer
Hybrid working (Manchester + home-based)
c£60,000 to £75,000 per year (DOE), plus an excellent company benefits package (including private healthcare, bonuses, professional accreditations and subscriptions, 25 days' annual leave + bank holidays, etc.)

The Opportunity: We are supporting a leading IT consultancy, operating at the forefront of digital services and transformation across the UK public sector, which is seeking an experienced Senior Data Engineer to play a key role in designing and delivering modern, scalable data platforms that support critical national services. Working within collaborative, multi-disciplinary teams, you will take ownership of end-to-end data engineering delivery across greenfield and transformation initiatives. You will influence technical direction, guide engineering best practice, and support the development of high-quality, robust data services that operate at enterprise scale. You will be consulting across modern cloud ecosystems and data technologies, with opportunities to deepen expertise in Python, SQL, cloud-native data tooling, orchestration platforms and streaming technologies across AWS, Azure and GCP.

Skills and Experience:
  • Proven experience delivering production-grade data engineering solutions within complex environments
  • Strong Python skills for building, testing and operating scalable data pipelines
  • Experience with at least one major cloud platform (AWS, Azure or GCP)
  • Strong SQL expertise and experience with relational databases such as PostgreSQL or Microsoft SQL Server
  • Experience with NoSQL technologies such as DynamoDB, MongoDB or similar
  • Hands-on Kafka (or equivalent streaming) and workflow orchestration (Airflow) experience
  • Strong understanding of data architecture patterns, including data lakes, warehouses and event-driven architectures
  • Experience consulting across Agile delivery environments, implementing data quality, validation and monitoring frameworks

Role and Responsibilities:
  • Lead the design, build and delivery of data platforms and services across the full engineering life cycle
  • Own technical delivery of data pipelines, models and platform components, ensuring solutions are robust, scalable and maintainable
  • Design, develop and deploy ETL/ELT pipelines to ingest, transform and optimise large-scale datasets
  • Build and operate event-driven architectures (Kafka) and orchestrate workflows (Airflow)
  • Apply strong data architecture principles across data lakes, warehouses and event-driven solutions
  • Develop and maintain streaming pipelines using technologies such as Kafka
  • Implement monitoring and observability solutions using tooling such as Prometheus and Grafana
  • Ensure data quality, validation and governance processes are built into engineering workflows
  • Act as a trusted technical advisor to clients and stakeholders (client-facing), translating business requirements into robust engineering solutions
  • Support delivery planning activities, including estimation, risk identification and dependency management
  • Mentor and support other engineers, contributing to a culture of continuous improvement and engineering excellence
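Workflow orchestration with Airflow, called out repeatedly above, boils down to declaring tasks and their dependencies as a DAG and executing them in dependency order. A minimal pure-Python sketch of that idea (illustrative only, using the standard library's `graphlib` rather than Airflow's actual API; the task names are invented):

```python
from graphlib import TopologicalSorter  # stdlib DAG resolver (Python 3.9+)

def run_pipeline(tasks, dependencies):
    """Run zero-arg callables in dependency order, Airflow-style.

    tasks: mapping of task name -> callable
    dependencies: mapping of task name -> set of upstream task names
    """
    order = list(TopologicalSorter(dependencies).static_order())
    results = {}
    for name in order:
        results[name] = tasks[name]()  # upstream tasks are guaranteed done
    return order, results

# Hypothetical extract -> transform -> load chain
tasks = {
    "extract": lambda: [3, 1, 2],
    "transform": lambda: sorted([3, 1, 2]),
    "load": lambda: "loaded",
}
deps = {"extract": set(), "transform": {"extract"}, "load": {"transform"}}
order, results = run_pipeline(tasks, deps)
# order == ["extract", "transform", "load"]
```

Real Airflow adds scheduling, retries and observability on top, but the dependency-ordered execution shown here is the core contract.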
Applications: Please contact Edward Laing at ISR to learn more about our client and how they are leading the way in developing the next generation of technical solutions through innovation and transformational technology.
03/04/2026
Full time
Tech Mahindra
Network Automation Engineers
Platforms: Itential Automation Platform, Cisco NSO (service development and lifecycle), Apache Airflow, HashiCorp Vault, OpenShift
Automation development: Python, FastAPI, REST APIs, NETCONF, YANG modelling, XML/JSON, Jinja/TextFSM, Pytest, unit testing
Database: MongoDB
DevOps: Docker, Git, CI/CD, Kubernetes
Networking: Ciena 517x/8180/8114, Nokia BNG/OLT, Infinera GTX Series, XGS-PON, Metro Ethernet, G.8032 rings
Switching and Routing: SR-MPLS, BGP, L3VPN, EVPN, IS-IS
02/04/2026
Full time
Opus Recruitment Solutions Ltd
SC Cleared Python Engineer
I am currently working with a public sector consultancy who are looking for an SC-cleared Python Engineer to support cloud-based data and analytics solutions in AWS.
Key skills:
  • Strong Python experience
  • AWS, Apache Spark, Airflow
  • Terraform, Docker, GitLab
  • Experience with security scanning tools (e.g. Trivy, Wiz, Trend Micro)
  • Active SC clearance - essential
02/04/2026
Full time
Snowflake Data Architect
Hirexa Solutions UK Hemel Hempstead, Hertfordshire
Experience: 12+ years of experience in Data Engineering, Data Warehousing, Cloud Data Platforms, and Enterprise Analytics solutions, with strong expertise in modern cloud data architectures.

Job Summary: We are seeking an experienced Data Architect with strong expertise in Snowflake on Amazon Web Services and DBT to design, architect, and optimize scalable enterprise data platforms. The role involves defining data platform architecture, governance standards, and scalable data transformation frameworks, while ensuring high performance, security, and cost efficiency. The architect will provide technical leadership to data engineering and analytics teams and ensure the platform supports enterprise reporting, advanced analytics, and AI/ML initiatives. The ideal candidate should also have exposure to AI/ML data platforms and experience in the hospitality domain, supporting systems such as reservations, guest management, and operational analytics.

Key Responsibilities:
  • Define and lead the architecture and design of enterprise data platforms using Snowflake on AWS
  • Architect scalable data ingestion frameworks for integrating multiple source systems into the cloud data platform
  • Design and govern data transformation frameworks using DBT
  • Define and enforce data modelling standards, including dimensional modelling, star schema, and enterprise data models
  • Lead architecture reviews and solution design discussions for new data initiatives
  • Optimize Snowflake performance, workload management, and cost governance
  • Establish data governance frameworks, including access control, data security, and compliance standards
  • Design and support AI/ML-ready data architecture for advanced analytics and predictive modelling
  • Provide architectural guidance to data engineering, BI, and analytics teams
  • Design architecture to support data consumption for reporting systems, operational applications, and analytics platforms
  • Implement automation, orchestration, and scalable pipeline frameworks using tools such as Apache Airflow
  • Collaborate with business stakeholders and technical teams to align the data platform with enterprise data strategy
  • Support hospitality analytics use cases, including guest behaviour analysis, booking trends, revenue analytics, and operational reporting

Required Technical Skills:
  • Data Platform: strong expertise in Snowflake; deep knowledge of Snowflake architecture, performance tuning, data sharing, security, and workload optimization
  • Cloud Platform: strong experience with Amazon Web Services, including S3, IAM, AWS Glue, Lambda, and CloudWatch
  • Data Transformation: strong experience with DBT for enterprise-scale data modelling, testing, and transformation pipelines
  • Programming/Query: strong expertise in SQL for data transformation and performance optimization; Python (preferred) for automation and data engineering tasks
  • Data Engineering: enterprise ETL/ELT pipeline architecture; data warehousing and enterprise data modelling; dimensional modelling (star schema, snowflake schema); data pipeline scalability and reliability design
  • AI/Data Science Exposure: experience supporting AI/ML data pipelines and data preparation for machine learning models; understanding of predictive analytics, recommendation engines, and customer behaviour analytics; ability to design AI-ready data platforms for future analytics use cases

Preferred Skills:
  • Experience with Apache Airflow for pipeline orchestration
  • Knowledge of CI/CD pipelines, DevOps, and Git-based development workflows
  • Experience with data governance, metadata management, and enterprise data catalog tools
  • Experience with BI tools such as Tableau or Microsoft Power BI
  • Domain experience in hospitality, travel, or hotel systems, including reservation systems, guest analytics, and operational reporting
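The dimensional-modelling standard this role asks for (a star schema: a central fact table whose keys point into dimension tables) can be illustrated with a tiny in-memory sketch. In Snowflake these would be warehouse tables; here they are plain Python dicts, and all table, column and hotel names are invented for illustration:

```python
# Toy star schema: a bookings fact table keyed to a hotel dimension.
dim_hotel = {
    1: {"name": "Harbour View", "city": "London"},
    2: {"name": "Hilltop Inn", "city": "Leeds"},
}
fact_bookings = [
    {"hotel_id": 1, "revenue": 120.0},
    {"hotel_id": 1, "revenue": 80.0},
    {"hotel_id": 2, "revenue": 95.0},
]

def revenue_by_city(facts, dim):
    """Aggregate a fact measure grouped by a dimension attribute --
    the fact-to-dimension join a star schema is designed to make cheap."""
    totals = {}
    for row in facts:
        city = dim[row["hotel_id"]]["city"]  # surrogate-key lookup
        totals[city] = totals.get(city, 0.0) + row["revenue"]
    return totals

totals = revenue_by_city(fact_bookings, dim_hotel)
# {'London': 200.0, 'Leeds': 95.0}
```

The same shape in SQL is a `JOIN` from the fact table to each dimension followed by a `GROUP BY` on the dimension attributes.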
Education Bachelor's or Master's degree in Computer Science, Information Technology, Data Engineering, Data Science, or a related field.
01/04/2026
Contractor
Sanderson
Data Engineer
You will be responsible for:
  • Designing, implementing, and maintaining scalable data pipelines and ETL processes
  • Developing and optimising databases and data storage solutions for structured and unstructured data
  • Collaborating with data scientists and analysts to deliver reliable, high-quality data for analytics and reporting
  • Ensuring data quality, integrity, and compliance with security and governance standards
  • Supporting the adoption of best practices for data engineering and contributing to technical decision-making
Candidates should demonstrate:
  • A BEng/BSc or Master's degree in Computer Science, Data Engineering, Mathematics, or a related discipline
  • Strong programming skills in languages such as Python, SQL, and Java or Scala
  • Experience with relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra)
  • Expertise in building and maintaining ETL pipelines and data workflows
  • Familiarity with cloud data platforms (AWS, Azure, GCP) and data pipeline orchestration tools (Airflow, Prefect, etc.)
  • Understanding of data modelling, schema design, and performance optimisation
  • Experience with agile development methodologies, including Scrum and Kanban
  • Familiarity with version control tools such as Git
Reasonable Adjustments: Respect and equality are core values to us. We are proud of the diverse and inclusive community we have built, and we welcome applications from people of all backgrounds and perspectives. Our success is driven by our people, united by the spirit of partnership to deliver the best resourcing solutions for our clients. If you need any help or adjustments during the recruitment process for any reason, please let us know when you apply or talk to the recruiters directly so we can support you.
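The data-quality duties this role lists usually reduce to row-level checks applied before load. A minimal sketch of that pattern, with invented column names and rules (real pipelines would typically use a framework such as Great Expectations or dbt tests):

```python
def validate_rows(rows, required, not_null):
    """Split rows into good/bad against simple quality rules:
    required columns must be present; not_null columns must have values."""
    good, bad = [], []
    for row in rows:
        missing = [c for c in required if c not in row]
        nulls = [c for c in not_null if row.get(c) in (None, "")]
        (bad if missing or nulls else good).append(row)
    return good, bad

rows = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},        # fails the not-null rule
    {"email": "c@example.com"},      # fails the required-column rule
]
good, bad = validate_rows(rows, required=["id", "email"], not_null=["email"])
# 1 good row, 2 quarantined rows
```

Failed rows are typically quarantined and reported rather than silently dropped, so the checks double as monitoring.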
01/04/2026
Full time
Harnham - Data & Analytics Recruitment
Machine Learning Engineer
ML Engineer £500-£560
The Company: They are a well-established consumer technology business with a mission to create a safe, trusted environment for millions of users. Data, experimentation, and scalable engineering are core to how they operate. Their Trust function is expanding, and they are investing heavily in modern ML infrastructure to support rapid growth. You will join a collaborative environment where ML Engineers, Scientists, Backend Engineers, and MLOps specialists work closely to deliver measurable impact.
The Role: In this role, you will:
  • Design and implement pipelines for training, evaluating, deploying, and monitoring detection models
  • Productionise detection models in partnership with ML Scientists, improving reliability and latency across asynchronous inference workflows
  • Collaborate with backend and product teams to define integration requirements for trust detection services
  • Extend ML infrastructure with MLOps teams, including reproducible training workflows, CI/CD for model deployment, batch and real-time model serving, feature consistency, and monitoring
  • Uphold strong standards around testing, observability, and operational excellence
  • Contribute to a scaling engineering culture where experimentation and measurable outcomes are central
Your Skills and Experience:
  • Strong commercial experience building and deploying ML pipelines in production
  • Experience with asynchronous ML inference pipelines
  • Deep understanding of end-to-end ML workflows, from research through to deployment
  • Ability to operate with ownership in complex, fast-moving environments
  • Strong communication skills for working with both technical and non-technical stakeholders
  • Experience designing systems within modern cloud environments such as AWS or GCP
  • Proficiency in Python and common ML frameworks such as PyTorch, TensorFlow, or scikit-learn
  • Exposure to ML/MLOps tooling such as SageMaker, MLflow, or TF Serving
  • Experience with Spark, Databricks, CI/CD tools, and streaming or orchestration systems like Kafka or Airflow
How to Apply: If you are interested in this Machine Learning Engineer position, please apply with your CV.
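The asynchronous inference workflows this role emphasises typically mean many in-flight model calls served concurrently under a concurrency cap. A stdlib-only asyncio sketch of that shape, with a stand-in scoring function in place of a real model server call:

```python
import asyncio

async def predict(item):
    """Stand-in for a model call (e.g. an HTTP request to a model server)."""
    await asyncio.sleep(0)  # yield control, as real network I/O would
    return {"item": item, "score": len(str(item)) / 10}

async def serve_batch(items, concurrency=4):
    """Score items concurrently with a bounded worker pool."""
    sem = asyncio.Semaphore(concurrency)  # cap in-flight requests

    async def bounded(item):
        async with sem:
            return await predict(item)

    return await asyncio.gather(*(bounded(i) for i in items))

results = asyncio.run(serve_batch(["ham", "spam", "eggs!"]))
# scores: 0.3, 0.4, 0.5
```

The semaphore is what keeps latency predictable: without it, a burst of requests would all hit the model server at once.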
01/04/2026
Contractor
Data Engineer
Oscar Technology Manchester, Lancashire
Data Engineer £500 - £600 p/day Contract (Inside IR35) 6+ Months Manchester (Hybrid)
Our client is looking for a talented Data Engineer to help design, build and maintain reliable, scalable data pipelines that power analytics, reporting and operational insights.
What you'll do:
  • Design and maintain robust data pipelines integrating data from operational systems, APIs and third-party platforms
  • Build batch and near-real-time ingestion processes with strong monitoring, logging and error handling
  • Develop and maintain data models to support reporting and analytics
  • Manage workflow orchestration and scheduling (e.g. Airflow or similar)
  • Support CI/CD, version control and deployments across dev, test and production environments
  • Implement data quality checks, governance and documentation
  • Collaborate with analysts, architects and technology teams to deliver end-to-end data solutions
What we're looking for:
  • Strong experience building data pipelines in modern data platforms
  • Solid data modelling knowledge
  • Experience with orchestration and scheduling tools
  • Strong SQL and Python skills
  • Experience with Azure data technologies (Azure Synapse, Azure Data Factory)
  • Understanding of data quality, reliability and CI/CD practices
If this sounds like you, apply now!
Oscar Associates (UK) Limited is acting as an Employment Business in relation to this vacancy. To understand more about what we do with your data, please review our privacy policy in the privacy section of the Oscar website.
01/04/2026
Contractor
Lead Python Data Engineer - Leading Technology AI Brand
MLR Associates
Senior Engineer/Architect - Leading Technology AI Brand
SaaS - platform-based technology services
London/City £70-100k salary + equity package
Our client, a global technology leader, is currently looking for a Senior/Lead Data Engineer to work with the dev team to guide the provision of software development for an exciting new AI product.
Key Responsibilities:
  • Architect and build scalable data pipelines and infrastructure
  • Design and maintain data ingestion, transformation, and storage architectures for operational and AI workloads
  • Develop and manage batch and real-time data pipelines
  • Build and optimize systems for vector search, retrieval, and ML data pipelines
  • Ensure data reliability, security, and governance across the platform
  • Collaborate with AI and back-end engineering teams to support training, inference, and product features
  • Implement monitoring, observability, and data quality frameworks
Core Experience:
  • 7+ years of experience in data engineering or back-end engineering roles
  • Strong experience designing and building data pipelines and distributed data systems
  • Experience working with relational databases (PostgreSQL preferred, but MySQL or similar is acceptable)
  • Experience with NoSQL databases
  • Experience with vector databases used in modern AI systems
  • Strong programming experience in Python
Frameworks/Infrastructure: Apache Spark, Apache Airflow, Kafka, Elasticsearch/OpenSearch
31/03/2026
Full time
CoreCom Consulting
Senior Founding Engineer
Founding Engineer (EnergyTech / AI) London Hybrid (3 days onsite) 70k-120k + Equity
A venture-backed EnergyTech start-up is building a new type of power company designed for the electrified future. As renewable energy adoption accelerates, the challenge is no longer just generating power, but storing, managing, and intelligently dispatching it. This team is developing software that sits at the centre of that transition, enabling a new generation of energy suppliers built around flexibility, storage, and intelligent automation. Backed by leading investors and founded by operators with experience from some of the most respected names in the European energy ecosystem, the business has already launched its first product and is growing rapidly month-on-month. The next phase is building the core technology platform that will power a fully integrated energy supplier launching later this year. This is an opportunity to join a small, highly capable engineering team building core infrastructure for a next-generation energy platform.
The Role: You will work across the full product stack, helping design and build the systems that power both the customer experience and the operational backbone of the platform. Engineers here take ownership of problems end-to-end, from shaping early ideas through to delivering production features used by customers.
Responsibilities include:
  • Building full-stack product features across web, backend services and data systems
  • Designing APIs and integrating with hardware and energy data platforms
  • Developing AI-enabled capabilities, including LLM-powered workflows and operational tooling
  • Contributing to core supplier infrastructure such as billing systems, operational tooling, and internal platforms
  • Working closely with founders and customers to shape the product direction
Tech Stack:
  • Frontend: React, React Native, TypeScript
  • Backend: Python, FastAPI, PostgreSQL
  • Infrastructure: GCP, Terraform, Airflow
  • AI: LLM integrations and AI-driven automation
What They're Looking For:
  • Experience as a Full-Stack Engineer building production systems
  • Strong engineering fundamentals and the ability to work across backend and frontend systems
  • Comfort operating in a fast-moving start-up environment where priorities evolve quickly
  • A proactive mindset with a bias towards ownership and solutions
  • Interest in the energy transition or climate technology is a strong plus
Engineers who thrive here tend to enjoy working close to the problem, communicating clearly across technical and non-technical teams, and taking initiative rather than waiting for direction.
Package:
  • 70k-120k salary depending on experience
  • Meaningful equity
  • Private health insurance
  • Meal allowance and team dinners
  • Hybrid working (London office 3 days per week)
31/03/2026
Full time
Akkodis
Data Engineer
Akkodis
Data Engineer - Full Time / Permanent - £55,000-£60,000 plus up to 20% bonus, private medical and other extensive benefits - Hybrid, 1-2 days a week in the North Oxfordshire head office

The Company: My client is an industry-leading and award-winning financial services organisation operating on a global scale, headquartered in North Oxfordshire, UK. This is a hybrid role requiring 1-2 days a week in the North Oxfordshire head office.

The Role: I am looking for a driven and experienced Data Engineer to help design, build and maintain a data lakehouse in Databricks, pulling data from core platforms and external sources and refining it into well-curated, analysis-ready datasets. As a Data Engineer you will operate within an Agile delivery environment, working closely with other Data Engineers, Data Analysts and a Data Architect to deliver against the backlog, providing vital insight from a wide-ranging dataset to support executive and operational decision-making that will underpin sustained growth of business units domestically and internationally.

The Person: The ideal candidate will possess a strong background in Data Engineering with a proven ability to design, build, and maintain scalable data pipelines and solutions. From a technical standpoint you will ideally possess:
- Proven experience with Databricks
- Proficiency in programming languages such as Python, Spark and SQL
- Strong experience with SQL databases
- Expertise in data pipeline and workflow management tools (e.g., Apache Airflow, ADF)
- Experience with cloud platforms (Azure preferred) and related data services
- Knowledge of big data technologies (e.g., Hadoop, Spark, Kafka)
- Experience of Waterfall and Agile delivery methodologies

Contact: Please apply via the link or contact (url removed) for more information. Modis International Ltd acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers in the UK.
Modis Europe Ltd provide a variety of international solutions that connect clients to the best talent in the world. For all positions based in Switzerland, Modis Europe Ltd works with its licensed Swiss partner Accurity GmbH to ensure that candidate applications are handled in accordance with Swiss law. Both Modis International Ltd and Modis Europe Ltd are Equal Opportunities Employers. By applying for this role your details will be submitted to Modis International Ltd and/ or Modis Europe Ltd. Our Candidate Privacy Information Statement which explains how we will use your information is available on the Modis website.
30/03/2026
Full time
Damia Group LTD
Data Architect
Damia Group LTD Hounslow, London
Data Architect - £70-90K base (DOE) - West London (hybrid)

We are recruiting a Data Architect for one of our clients based in West London on a permanent basis. The Data Architect is responsible for leading the definition, standardization, and governance of data architecture across platforms and products. This role balances technical leadership, data architecture, and collaboration with engineering, product, and security teams to ensure scalable, reliable, and secure systems.

Key responsibilities:
- Enforce data architectural guidelines and consistency across development teams and services
- Support established Data Governance and Data Quality frameworks, including tooling, policy enforcement, and stewardship models
- Ensure robust metadata management, lineage tracking, and data cataloguing using business glossaries and modern catalog tools
- Review and approve data architecture for major features, platforms, and technical initiatives
- Collaborate with technical leads and DevOps on system scalability, performance, and reliability
- Ensure data platforms are AI/ML-ready, with scalable infrastructure and clean, well-structured data pipelines
- Collaborate with data science and analytics teams to enable model deployment, automation, and MLOps best practices
- Promote innovation in generative AI, predictive analytics, and real-time decision support
- Align data architecture with security, compliance, and data governance requirements
- Lead the evolution of technical architecture documentation, models, and decision records
- Conduct architecture and design reviews with cross-functional teams
- Guide teams in the adoption of best practices in API design, modularity, cloud-native patterns, and event-driven systems
- Recommend data management best practices, covering data flows, architecture patterns, retention, archival, and purging strategies
- Coach and mentor engineers on data design, refactoring, and architectural reasoning

Essential skills and experience:
- Proven experience designing and scaling enterprise-grade cloud data platforms (AWS preferred)
- Deep experience with AWS, Databricks, Power Platform, and Redshift (Snowflake a plus)
- Proficiency in AWS Glue, Qlik Talend, dbt, Airflow, and modern data integration tools
- Excellent knowledge of Python, SQL, Power Query (M), and preferably Scala or PySpark
- Working knowledge of enterprise architecture frameworks (e.g., TOGAF), MLOps, and BI tools like Power BI and QuickSight
- Experience of generative AI platforms (e.g., Amazon Bedrock, Anthropic)
- Familiarity with infrastructure as code (Terraform), CI/CD practices (Jenkins, GitHub Actions), and observability (Grafana, Kibana)
- Proficiency in scripting and automation using Bash, Groovy, or equivalent
- Ability to balance long-term architectural vision with immediate delivery constraints

The advertised salary range is dependent on experience and the required qualifications. Damia Group Limited acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers. By applying for this permanent job, you accept our Data Protection Policy, which can be found on our website. Please note that no terminology in this advert is intended to discriminate on the grounds of a person's gender, marital status, race, religion, colour, age, disability or sexual orientation. Every candidate will be assessed only in accordance with their merits, qualifications and ability to perform the duties of the job. Damia Group is acting as an Employment Business in relation to this vacancy and in accordance with the Conduct Regulations 2003.
26/03/2026
Full time
Involved Productions Ltd
Data Engineer
Involved Productions Ltd London
We’re looking for a Data Engineer to work across the Involved Group, the collective behind globally renowned dance and electronic music labels including Anjunabeats and Anjunadeep, spanning label services and distribution, music publishing, events promotion and artist management. This is a key role within our Technology Department, responsible for developing and managing data pipelines, automating data collection processes, and creating analytics dashboards that provide actionable insights across the company, directly impacting strategy. The role involves working closely with a variety of departments to understand their data needs and developing solutions that streamline data analysis and reporting processes. Reporting to the Head of Technology, our Data Engineer ensures that data analytics initiatives are strategically aligned, efficiently executed, and contribute to the company's overall objectives.

Location: Bermondsey, London
Working pattern: Part-time (3 days/week), either in-person at our lively Bermondsey office, hybrid, or home-working.

Who we are:
Based in Bermondsey, the Involved group of companies includes:
- Involved Productions, home of globally renowned independent dance and electronic music labels Anjunabeats, Anjunadeep and Anjunachill, as well as our label and distribution services.
- Involved Live, the touring and events company responsible for a portfolio of international events, festivals, and all-night-long showcases, creating unforgettable experiences for fans globally.
- Involved Publishing, a progressive independent music publisher, representing cutting-edge producers, writers and artists from around the world.
- Involved Management, a boutique artist management company responsible for steering the careers of Above & Beyond, Lane 8, Le Youth and Dusky.

We offer careers, not just jobs, and our team embraces the entrepreneurial spirit, independent mindset and respectful culture we have created, building community and connection through music.

Our Data Engineer is responsible for:
- Analytics Dashboard Creation: Developing and optimising Tableau dashboards that provide clear, actionable insights to various teams, including Streaming & Promotions, Label Directors, and Publishing.
- Data Pipeline Development: Designing, building, and maintaining efficient and scalable data pipelines to automate the collection, transformation, and delivery of data to and from various sources, including DSPs, FUGA Analytics, Google Analytics, Chartmetric, Curve, etc.
- Database Management: Developing and maintaining the company’s database structure, ensuring data accuracy, security, and accessibility for analytics purposes.
- Teaching: Providing support and training to ensure teams are making effective use of analytics tools and dashboards.
- Tailoring: Collaborating with different departments to understand their data needs, and working creatively to provide tailored analytics solutions.
- Building: Supporting the Head of Technology in building and maintaining cross-platform automations.
- Innovation and Research: Staying up to date with the latest trends and technologies in data engineering and analytics, exploring new tools and methodologies that can enhance our data capabilities.

This list is not exhaustive: we may ask you to go beyond your job description on occasion, and we hope the role will change and develop with you.

About you:
The ideal candidate for this role will likely have:
- a solid foundation in Python and JavaScript, ideally with proficiency in other programming languages.
- experience designing and implementing ETL pipelines, specifically using Apache Airflow (Astronomer).
- hands-on experience with ETL frameworks, particularly dbt (data build tool).
- SQL and various database management system skills.
- a good understanding of different database types, designs, and data modelling systems.
- experience with cloud platforms like AWS and GCP, including services such as BigQuery, RDS, and Athena.
- familiarity with Tableau and project management tools like monday.com and Notion.
- knowledge of APIs from music Digital Service Providers (e.g., Spotify, Apple Music).
- previous experience at a record label, music distributor, or music publisher.
- an understanding of the music industry.
- excellent analytical, problem-solving, and communication skills.
- a proactive approach to learning, excitement about problem-solving, and an open mind when approaching new projects.
- strong accuracy and attention to detail.
- good written and verbal communication skills, and the ability to explain complex ideas in non-technical language.
- the ability to prioritise and manage their time independently.

What we offer:
- A competitive salary (£50-60k pro rata)
- Participation in our Profit Share Scheme
- 20 days annual leave
- A benefits package to support your wellbeing, including access to local gyms and fitness classes, and subscriptions to health apps including Calm, Headspace and Strava
- A collection of enhanced family policies to support your family life
- The opportunity to attend a variety of live events
- Cycle to work scheme
- Season ticket loans
- A lively, collaborative office environment, and a flexible hybrid working policy
- Paid time off to volunteer with our local charitable initiatives

Applications
The closing date for applications is 21 November 2025, although we may close applications earlier. If you need more information before applying, email us at people@anjunabeats.com. We are committed to inclusion, and encourage applications from anyone with relevant experience and skills. If you require any adjustments throughout the application process to meet your needs and help you perform at your best, please let us know.
28/10/2025
Part time
Barclays Bank Plc
Solution Architect - Component Designer Assurance & AI
Barclays Bank Plc Tower Hamlets, London
Join us on a transformative journey as a Solution Architect - Component Designer Assurance & AI, where you'll spearhead the evolution of our digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences. Operational Support Systems (OSS) & Tools is a newly formed functional unit in the Network Product domain at Barclays. The Barclays OSS & Tools Engineering team is responsible for the design, build and operation of the underlying OSS infrastructure and toolchain across cloud, data centre, campus and branch required to run the Barclays Global Network at scale.

To be successful as a Solution Architect, you should have:
- Demonstrable experience of building high-scale observability solutions using open-source tooling like ELK, Grafana, Prometheus, Nagios, Telegraf and others.
- Knowledge and demonstrable hands-on experience with middleware technologies (Kafka, API gateways and others) and data engineering tools/frameworks like Apache Spark, Airflow, Flink and the Hadoop ecosystem.
- Understanding of network technology fundamentals, data structures and scalable system design, and the ability to present information in a structured manner that wider product and engineering teams can translate into working solutions.

Some other highly valued skills may include:
- Knowledge of DevOps tooling, GitOps, CI/CD, configuration management, Jenkins, build pipelines and source control systems.
- Working knowledge of cloud infrastructure services: compute, storage, networking, hybrid connectivity, monitoring/logging, security and IAM.
- Programming experience in one of the high-level languages like Python, Java or Golang; NetDevOps automation and AI engineering (GenAI/Agentic AI); API service development.
- Proficiency in Agile methodologies (Scrum/Kanban), backlog and workflow management, and SRE-specific reporting (MTTR, deployment frequency, SLOs and others).

You may be assessed on the key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. Location is Knutsford.

Purpose of the role
To design, develop, and implement solutions to complex business problems, collaborating with stakeholders to understand their needs and requirements, designing and implementing solutions that meet those needs, and creating solutions that balance technology risks against business delivery, driving consistency.

Accountabilities
- Design and development of solutions as products that can evolve, meeting business requirements that align with modern software engineering practices and automated delivery tooling. This includes identification and implementation of the technologies and platforms.
- Targeted design activities that apply an appropriate workload placement strategy and maximise the benefit of cloud capabilities such as elasticity, serverless and containerisation.
- Best practice designs incorporating security principles (such as defence in depth and reduction of blast radius) that meet the Bank's resiliency expectations.
- Solutions that appropriately balance risks and controls to deliver the agreed business and technology value.
- Adoption of standardised solutions where they fit; if no standard solutions fit, feed into their ongoing evolution where appropriate.
- Fault-finding and performance-issue support for operational support teams, leveraging available tooling.
- Solution design impact assessment in terms of risk, capacity and cost impact, including estimation of project change and ongoing run costs.
- Development of the requisite architecture inputs required to comply with the Bank's governance processes, including design artefacts required for architecture, privacy, security and records management governance processes.
Vice President Expectations: To contribute to or set strategy, drive requirements and make recommendations for change. Plan resources, budgets, and policies; manage and maintain policies/processes; deliver continuous improvements and escalate breaches of policies/procedures. If managing a team, they define jobs and responsibilities, planning for the department's future needs and operations, counselling employees on performance and contributing to employee pay decisions/changes. They may also lead a number of specialists to influence the operations of a department, in alignment with strategic as well as tactical priorities, while balancing short- and long-term goals and ensuring that budgets and schedules meet corporate requirements. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others. Alternatively, as an individual contributor, they will be a subject matter expert within their own discipline and will guide technical direction. They will lead collaborative, multi-year assignments, guide team members through structured assignments, and identify the need for the inclusion of other areas of specialisation to complete assignments. They will train, guide and coach less experienced specialists and provide information affecting long-term profits, organisational risks and strategic decisions. Advise key stakeholders, including functional leadership teams and senior management, on functional and cross-functional areas of impact and alignment. Manage and mitigate risks through assessment, in support of the control and governance agenda. Demonstrate leadership and accountability for managing risk and strengthening controls in relation to the work your team does. 
Demonstrate a comprehensive understanding of the organisation's functions to contribute to achieving the goals of the business. Collaborate with other areas of work and business-aligned support areas to keep up to speed with business activity and business strategies. Create solutions based on sophisticated analytical thought, comparing and selecting complex alternatives. In-depth analysis with interpretative thinking will be required to define problems and develop innovative solutions. Adopt and include the outcomes of extensive research in problem-solving processes. Seek out, build and maintain trusting relationships and partnerships with internal and external stakeholders in order to accomplish key business objectives, using influencing and negotiating skills to achieve outcomes. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship - our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset - to Empower, Challenge and Drive - the operating manual for how we behave.
06/10/2025
Full time
Lorien
Data Engineer - AI & Machine Learning Focus
Data Engineer - AI & Machine Learning Focus. Location: Kings Cross, London. Hybrid Working: 2-3 days in the office, remainder from home. Duration: 6 Months. Contract Type: Inside IR35 (via Umbrella). Are you passionate about building data solutions that drive cutting-edge AI and Machine Learning applications? We're looking for a talented Data Engineer to join a forward-thinking team developing transformative therapies for both existing and emerging diseases. This is your opportunity to work on impactful projects at the intersection of data engineering and life sciences, using modern cloud technologies to deliver scalable, reliable data pipelines that support scientific innovation. What You'll Be Doing: Designing and building data pipelines using tools like Python, Spark, SQL, BigQuery, and Google Cloud Storage; ensuring pipelines meet the specific needs of data-driven scientific applications; writing high-quality, well-documented code with automated testing; monitoring and improving key metrics across tools and services; collaborating with cross-functional teams to build end-to-end data solutions; participating in code reviews and contributing to team-wide engineering standards. What We're Looking For: Data engineering experience; Python and SQL skills; experience with cloud platforms (Google Cloud preferred); familiarity with unit testing (e.g. pytest) and agile development practices; proficiency with modern software development tools (e.g. Git/GitHub, DevOps). Desirable Experience: Experience with biological or scientific datasets (e.g. genomics, proteomics); bioinformatics expertise and familiarity with large-scale datasets; knowledge of NLP techniques and unstructured data processing; experience with orchestration tools (e.g. Airflow, Google Workflows); exposure to AI/ML-powered applications; familiarity with containerization (e.g. Docker) and Nextflow pipelines. If you're excited by the idea of using data engineering to help shape the future of healthcare and AI, we'd love to hear from you. Carbon60, Lorien & SRG - The Impellam Group STEM Portfolio are acting as an Employment Business in relation to this vacancy.
06/10/2025
Full time
Stott and May
Principal Data Engineer
Principal Data Engineer - Hybrid (London/Winchester). We're seeking a hands-on Principal Data Engineer to design and deliver enterprise-scale, cloud-native data platforms that power analytics, reporting, and real-time decision-making. This is a strategic technical leadership role where you'll shape architecture, mentor engineers, and deliver end-to-end solutions across a modern AWS/Databricks stack. What you'll do: Lead the design of scalable, secure data architectures on AWS. Build and optimise ETL/ELT pipelines for batch and streaming data. Deploy and manage Apache Spark jobs on Databricks and Delta Lake. Write production-grade Python and SQL for large-scale data transformations. Drive data quality, governance, and automation through CI/CD and IaC. Collaborate with data scientists, analysts, and business stakeholders. Mentor and guide data engineering teams. What we're looking for: Proven experience in senior/principal data engineering roles. Expertise in AWS, Databricks, Apache Spark, Python, and SQL. Strong background in cloud-native data platforms, real-time processing, and data lakes. Hands-on experience with tools such as Airflow, Kafka, Docker, GitLab CI/CD. Excellent stakeholder engagement and leadership skills. What's on offer: £84,000 salary + 10% bonus; 6% pension contribution; private medical & flexible benefits package; 25 days annual leave (plus buy/sell options); hybrid working - travel to London or Winchester once/twice per week. Join a company at the forefront of media, connectivity, and smart technology, where your work directly powers millions of daily connections across the UK.
06/10/2025
Full time
Tenth Revolution Group
Data Engineer
Data Engineer - Portsmouth - £60,000 A rapidly growing software company in Portsmouth is looking for a Data Engineer to lead the build of their first-ever data platform. This is a greenfield opportunity to architect and implement scalable data infrastructure from scratch, with the potential to grow into a team lead role. With over 100 employees and continued expansion, they're now investing in data to support smarter decision-making across their subscription-based business. This is a hands-on role with strategic impact. You'll have the autonomy to choose the right tools and shape the data architecture. If successful, you'll have the opportunity to grow the data function and step into a leadership role. The Role: Designing and building robust ETL pipelines using tools like dbt or Apache Airflow Integrating data from APIs, databases, and SaaS platforms into BigQuery Structuring clean, queryable data models to support analytics and reporting Collaborating with analysts to deliver insightful dashboards via Looker Establishing data governance and quality processes Requirements: GCP (BigQuery), but open to other cloud backgrounds ETL: dbt, Apache Airflow, or similar BI: Looker (preferred), or other BI tools Languages: SQL, Python, Java Experienced data engineer, with strong ETL and cloud data warehouse experience Proficiency in SQL and data modelling best practices Experience with BI tools and dashboard creation Self-starter attitude with minimal supervision required Package: Competitive salary up to £70,000 + discretionary bonus (performance reviewed twice a year) Discounts, perks, and 20 days holiday Please Note: This is a permanent role for UK residents only. This role does not offer Sponsorship. You must have the right to work in the UK with no restrictions. Some of our roles may be subject to successful background checks including a DBS and Credit Check. Contact me: (url removed)
04/10/2025
Full time
Expleo UK LTD
Data Engineer
Are you a Data Engineer seeking a new challenge? Do you have a passion for future vehicle development? If so, Expleo have an opportunity for you! Our client, a prestigious Automotive Manufacturer, is currently recruiting for a Data Engineer to join their Customer Relationship Centre team. As a Data Engineer you will play a key part in enabling data-driven decision-making by building and maintaining robust data pipelines, integrating customer service systems, and ensuring high-quality data is available for analysis and insights. Based in Warwickshire, you will be supporting a dedicated and enthusiastic team on a contract basis. You will work closely with analysts, developers, and operational teams to support customer experience initiatives. Responsibilities of the Data Engineer will include: Design, build, and maintain scalable data pipelines to support customer service operations. Migrate and modernise legacy data systems to cloud-based solutions. Integrate data from CRM systems and customer touchpoints into cloud platforms. Ensure data quality, consistency, and availability for reporting and analytics. Collaborate with Data Analysts to deliver actionable insights. Develop and maintain documentation for data architecture, workflows, and processes. Troubleshoot and resolve data-related issues, ensuring minimal disruption to service operations. Support automation of reporting and data delivery. Design and implement API integrations. Qualifications and skills required for the Data Engineer position: Ideally degree educated in Computer Science, Data Engineering, or a related field. Experience as a Data Engineer, ideally within a customer service or contact centre environment. Knowledge of SQL, Python, and data pipeline development. Skilled in Google Cloud Platform (GCP) and cloud data tools. Background in CRM systems and customer data structures. Understanding of data warehousing concepts and cloud architecture. Experience with ETL tools and frameworks (Airflow), Git and CI/CD pipelines. Data insights reporting experience. Competent with real-time data processing and streaming technologies. Proficiency in Tableau or other data visualisation tools is desirable. PLEASE NOTE: To meet current legislation, right-to-work checks will be carried out to ensure candidates can work in the UK. Regretfully, we are unable to support applications that require sponsorship. If you are interested in applying for the role of Data Engineer or require further information, please contact: Jacquie Linton (phone number removed) (url removed)
03/10/2025
Contractor
Tenth Revolution Group
Data Engineer
Data Engineer - Portsmouth - £60,000 A rapidly growing software company in Portsmouth is looking for a Data Engineer to lead the build of their first-ever data platform. This is a greenfield opportunity to architect and implement scalable data infrastructure from scratch, with the potential to grow into a team lead role. With over 100 employees and continued expansion, they're now investing in data to support smarter decision-making across their subscription-based business. This is a hands-on role with strategic impact. You'll have the autonomy to choose the right tools and shape the data architecture. If successful, you'll have the opportunity to grow the data function and step into a leadership role. The Role: Designing and building robust ETL pipelines using tools like dbt or Apache Airflow Integrating data from APIs, databases, and SaaS platforms into BigQuery Structuring clean, queryable data models to support analytics and reporting Collaborating with analysts to deliver insightful dashboards via Looker Establishing data governance and quality processes Requirements: GCP (BigQuery), but open to other cloud backgrounds ETL: dbt, Apache Airflow, or similar BI: Looker (preferred), or other BI tools Languages: SQL, Python, Java Experienced data engineer, with strong ETL and cloud data warehouse experience Proficiency in SQL and data modelling best practices Experience with BI tools and dashboard creation Self-starter attitude with minimal supervision required Package: Competitive salary up to £70,000 + discretionary bonus (performance reviewed twice a year) Discounts, perks, and 20 days holiday Please Note: This is a permanent role for UK residents only. This role does not offer Sponsorship. You must have the right to work in the UK with no restrictions. Some of our roles may be subject to successful background checks including a DBS and Credit Check. Contact me:
03/10/2025
Full time
Joseph Harry Ltd
Python Software Engineer AWS Java Data Finance London
Senior Python Software Engineer (Senior Architecture Programmer Developer Python Java Software Engineer Data Enterprise Engineering Developer Programmer AWS Python Athena Glue Airflow Ignite JavaScript Agile Pandas NumPy SciPy Spark Dremio Snowflake Apache Iceberg Arrow DBT gRPC protobuf TypeScript Finance Trading Front Office Investment Banking Asset Manager Financial Services FX Fixed Income Equities Commodities Derivatives Hedge Fund) required by my asset management client in London. You MUST have the following: advanced ability as a Senior Python Software Engineer/Technical Lead/Solutions Architect/Principal Engineer; good design and architecture ability; Java; and three or more of the following: Iceberg, Dremio, DBT, Arrow, Snowflake, Glue, Athena, Airflow, Agile. The following is DESIRABLE, not essential: trading/Front Office finance; Spark; buy-side asset management (hedge fund, asset manager, investment management). Role: Senior Python Software Engineer, required by my asset management client in London. You will join a relatively new department that is responsible for the data used across the Front Office. The data is sourced from a variety of external vendors and internal departments and held in an AWS data lake, although this is being migrated to a data mesh architecture. They are working heavily with Python, Java and AWS. If you have any experience in Iceberg, Dremio, DBT, Arrow, Spark, Snowflake, Glue, Athena, Airflow or related tools, this would also be very advantageous. 
You will join a team of 5 that are responsible for pricing data for the Front Office. This is a senior role in the team and will demand a strong ability in design and architecture. If you come in at the right level, you could be the deputy for the team manager. They have a very flexible hybrid working set up. Salary: £120-150k + 15% Bonus + 10% Pension
03/10/2025
Full time
Joseph Harry Ltd
Senior Java Software Engineer AWS Python Data Finance London
Senior Java Software Engineer (Senior Architecture Programmer Developer Java Python Software Engineer Data Enterprise Engineering Developer Programmer AWS Athena Glue Airflow Ignite JavaScript Agile Pandas NumPy SciPy Spark Dremio Snowflake Apache Iceberg Arrow DBT gRPC protobuf TypeScript Finance Trading Front Office Investment Banking Asset Manager Financial Services FX Fixed Income Equities Commodities Derivatives Hedge Fund) required by my asset management client in London.

You MUST have the following:
  • Advanced ability as a Senior Java Software Engineer/Technical Lead/Solutions Architect/Principal Engineer
  • Good design and architecture ability
  • Python
  • Three or more of the following: Iceberg, Dremio, DBT, Arrow, Snowflake, Glue, Athena, Airflow, Agile

The following is DESIRABLE, not essential:
  • Trading, Front Office finance
  • Spark
  • Buy-side asset management (hedge fund, asset manager, investment management)

Role: You will join a relatively new department that is responsible for the data used across the Front Office. The data is sourced from a variety of external vendors and internal departments and held in an AWS data lake, although this is being migrated to a data mesh architecture. They are working heavily with Java, Python and AWS. Any experience in Iceberg, Dremio, DBT, Arrow, Spark, Snowflake, Glue, Athena, Airflow or related tools would also be very advantageous.

You will join a team of five responsible for pricing data for the Front Office. This is a senior role in the team and demands strong ability in design and architecture; if you come in at the right level, you could deputise for the team manager. They have a very flexible hybrid working set-up.

Salary: £90-120k + 15% Bonus + 10% Pension
03/10/2025
Full time