  • Home
  • Find IT Jobs
  • Register CV
  • Career Advice
  • Contact us
  • Employers
    • Register as Employer
    • Pricing Plans
  • Recruiting? Post a job
  • Sign in
  • Sign up
Sorry, that job is no longer available. Here are some results that may be similar to the job you were looking for.

4 jobs found

Current Search
pyspark developer
Lead Azure Data Engineer
Tenth Revolution Group
Contract Opportunity: Lead Azure Data Engineer (Remote - £500/day Outside IR35)

We're hiring a Lead Azure Data Engineer to join our team on a hybrid contract based in London, supporting key finance stakeholders and transforming our data platform. This role offers the chance to shape the future of financial reporting through cutting-edge cloud engineering and data architecture.

  • Location: Central London (UK-based candidates only)
  • Rate: £500/day
  • IR35 Status: Outside IR35
  • Start Date: ASAP - interviews next week
  • Duration: 6 months with potential for extension

The Role
You'll lead the design and delivery of scalable data products that support financial analysis and decision-making. Working closely with BI and Analytics teams, you'll help evolve our data warehouse and implement best-in-class engineering practices across Azure.

Key Responsibilities
  • Build and enhance ETL/ELT pipelines in Azure Databricks
  • Develop facts and dimensions for financial reporting
  • Collaborate with cross-functional teams to deliver robust data solutions
  • Optimise data workflows for performance and cost-efficiency
  • Implement governance and security using Unity Catalog
  • Drive automation and CI/CD practices across the data platform
  • Explore new technologies to improve data ingestion and self-service

Essential Skills
  • Azure Databricks: expert in Spark (SQL, PySpark) and Databricks Workflows
  • Data Pipeline Design: proven experience in scalable ETL/ELT development
  • Azure Services: Data Lake, Blob Storage, Synapse
  • Data Governance: Unity Catalog, access control, metadata management
  • Performance Tuning: partitioning, caching, Spark job optimisation
  • Cloud Architecture: infrastructure-as-code, monitoring, automation
  • Finance Domain Knowledge: experience with financial systems and reporting
  • Data Modelling: Kimball methodology, star schemas
  • Retail Experience: preferred but not essential

About the Team
We're a collaborative data function made up of BI Developers and Data Engineers. We work end-to-end on solutions, share knowledge, and support each other's growth. Our culture values curiosity, innovation, and continuous learning.

Interested? We have limited interview slots next week and aim to fill this role by the end of the month. Please send me a copy of your CV if you meet the requirements.
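The "facts and dimensions" work in this listing follows the Kimball star-schema pattern: fact rows hold measures plus surrogate keys that resolve against dimension tables. As a rough illustration only (plain Python rather than PySpark, with invented table and column names; a Databricks pipeline would express the same join in Spark SQL or PySpark):

```python
# Hypothetical star-schema lookup: a fact table of transactions keyed to
# date and account dimensions. All names are invented for illustration.

dim_date = {1: {"date": "2025-10-01", "quarter": "Q4"}}
dim_account = {10: {"account": "Operating Expenses", "category": "OPEX"}}

fact_transactions = [
    {"date_key": 1, "account_key": 10, "amount": 1250.00},
    {"date_key": 1, "account_key": 10, "amount": 340.50},
]

def denormalise(fact_rows, dates, accounts):
    """Resolve surrogate keys into reporting-ready rows."""
    out = []
    for row in fact_rows:
        out.append({
            **dates[row["date_key"]],        # attach date attributes
            **accounts[row["account_key"]],  # attach account attributes
            "amount": row["amount"],         # keep the measure
        })
    return out

report = denormalise(fact_transactions, dim_date, dim_account)
total = sum(r["amount"] for r in report)
```

The same separation of measures (facts) from descriptive attributes (dimensions) is what makes the financial reporting queries mentioned above cheap to slice by date, account, or any other dimension.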
02/10/2025
Contractor
Contract Python Engineer - DV Cleared
Searchability
CONTRACT PYTHON/DATA ENGINEER - DV CLEARED

New Outside IR35 contract opportunity available within a leading National Security SME for a Python/Data Engineer with DV Clearance.

  • Contract job opportunity for a Data Engineer
  • National Security & Defence client
  • Outside IR35
  • 12 month rolling contract
  • Central London based organisation in an easily accessible location
  • To apply please call or email (see below)

WHO ARE WE?
We are recruiting a contract Python/Data Engineer to work with a National Security & Defence SME in central London. Due to the nature of the work, you must hold UKSV/MOD or Enhanced DV Clearance.

WE NEED THE PYTHON/DATA ENGINEER TO HAVE:
  • Current DV Security Clearance (Standard or Enhanced)
  • Experience with big data tools such as Hadoop, Cloudera or Elasticsearch
  • Python/PySpark experience
  • Experience with Palantir Foundry (nice to have)
  • Experience working in an Agile Scrum environment with tools such as Confluence/Jira
  • Experience in design, development, test and integration of software
  • Willingness to learn new technologies

TO BE CONSIDERED:
Please either apply online or email me directly (see below). For further information please call me; I can make myself available outside of normal working hours, from 7am until 10pm. If I am unavailable, please leave a message and either I or one of my colleagues will respond. By applying for this role, you give express consent for us to process and submit (subject to required skills) your application to our client in conjunction with this vacancy only. I look forward to hearing from you.

PYTHON/DATA ENGINEER - DV CLEARED KEY SKILLS: BIG DATA DEVELOPER/BIG DATA ENGINEER/SENIOR BIG DATA DEVELOPER/SENIOR BIG DATA ENGINEER/DATA ENGINEER/DATA DEVELOPER/SENIOR SOFTWARE DEVELOPER/LEAD SOFTWARE ENGINEER/LEAD SOFTWARE DEVELOPER/DV CLEARED/DV CLEARANCE/DEVELOPED VETTING/DEVELOPED VETTED/SC CLEARED/SC CLEARANCE/SECURITY CLEARED/SECURITY CLEARANCE/NIFI/CLOUDERA/HADOOP/KAFKA/ELASTICSEARCH/LEAD BIG DATA ENGINEER/LEAD BIG DATA DEVELOPER
01/10/2025
Contractor
Python Developer
Brio Digital Leeds, Yorkshire
Python Developer - Contract - Fully Remote - Inside IR35 - 2 opportunities available

Brio Digital are partnered with a consultancy who work closely with the NHS to deliver their data processing platforms. We have opportunities for two Python Developers to come onboard ASAP and work across two specific projects. The platform is built on Python and PySpark on AWS, including the use of Docker and Terraform. You'll ideally have a strong Linux background, and any experience with Databricks would be highly advantageous. The positions can all be fully remote and are all inside IR35.

Python Developer key skills:
  • Python
  • Spark/PySpark
  • AWS, Terraform, Docker
  • Linux
  • Databricks

Apply now.
05/11/2021
Contractor
Data Engineer
Jefferson Frank International
An AWS consulting partner has a 6 month, fully remote and outside IR35 contract opportunity for an AWS Data Engineer with strong developer/programmatic skills in Python and PySpark, working for a large and recognisable UK-based Financial Services client. Experience across the AWS data stack with Glue, Lambda and S3 is essential, as the project environment is built on SDLF (Serverless Data Lake Framework). Redshift modelling or star schema experience is also essential, as the successful candidate will be making data available to Tableau dashboards. The successful candidate will also have excellent communication skills and hands-on experience migrating and running complex transformations of large and varied data formats using AWS Glue and Lambda. If interested, please send me an up-to-date version of your CV and a contact number, or reach out to me. E - (see below)
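The "transformations of large and varied formats" this role describes usually reduce to mapping differently-shaped source records onto one consistent layout before loading into the star schema. A loose, library-free sketch of that idea (record shapes and field names are invented; a real Glue job would do this over DynamicFrames or PySpark DataFrames):

```python
# Hypothetical sketch: normalise two differently-shaped source record
# formats into one target row layout, the way an AWS Glue transform
# reshapes varied inputs before a Redshift load. Names are invented.

def normalise(record):
    """Map known source record shapes onto one consistent row layout."""
    if "txn_amount" in record:  # hypothetical flat format A
        return {"amount": float(record["txn_amount"]),
                "currency": record.get("ccy", "GBP")}
    if "value" in record:       # hypothetical nested format B
        return {"amount": float(record["value"]["amount"]),
                "currency": record["value"]["currency"]}
    raise ValueError("unrecognised record shape")

rows = [normalise(r) for r in [
    {"txn_amount": "99.99", "ccy": "GBP"},
    {"value": {"amount": 10, "currency": "EUR"}},
]]
```

Once every record shares one shape, downstream modelling (Redshift facts/dimensions, Tableau extracts) no longer needs to know which source it came from.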
04/11/2021
Contractor


  • Home
  • Contact
  • About Us
  • FAQs
  • Terms & Conditions
  • Privacy
  • Employer
  • Post a Job
  • Search Resumes
  • Sign in
  • Job Seeker
  • Find Jobs
  • Create Resume
  • Sign in
  • IT blog
  • Facebook
  • Twitter
  • LinkedIn
  • Youtube
© 2008-2026 IT Job Board