  • Home
  • Find IT Jobs
  • Register CV
  • Register as Employer
  • Contact us
  • Career Advice
  • Recruiting? Post a job
Sorry, that job is no longer available. Here are some results that may be similar to the job you were looking for.

352 jobs found

Current search: systems engineer python and sql
Data Engineer
Involved Productions Ltd London
We’re looking for a Data Engineer to work across the Involved Group, the collective behind globally renowned dance and electronic music labels including Anjunabeats and Anjunadeep, spanning label services and distribution, music publishing, events promotion and artist management. This is a key role within our Technology Department, responsible for developing and managing data pipelines, automating data collection processes, and creating analytics dashboards to provide actionable insights across the company, directly impacting strategy. The role involves working closely with a variety of departments to understand their data needs and developing solutions that streamline data analysis and reporting. Reporting to the Head of Technology, our Data Engineer ensures that data analytics initiatives are strategically aligned, efficiently executed, and contribute to the company's overall objectives.

Location: Bermondsey, London
Working pattern: Part-time (3 days/week), either in person at our lively Bermondsey office, hybrid, or home-working.

Who we are:
Based in Bermondsey, the Involved group of companies includes:
  • Involved Productions, home of globally renowned independent dance and electronic music labels Anjunabeats, Anjunadeep and Anjunachill, as well as our label and distribution services.
  • Involved Live, the touring and events company responsible for a portfolio of international events, festivals, and all-night-long showcases, creating unforgettable experiences for fans globally.
  • Involved Publishing, a progressive independent music publisher representing cutting-edge producers, writers and artists from around the world.
  • Involved Management, a boutique artist management company responsible for steering the careers of Above & Beyond, Lane 8, Le Youth and Dusky.
We offer careers, not just jobs, and our team embrace the entrepreneurial spirit, independent mindset and respectful culture we have created, building community and connection through music.

Our Data Engineer is responsible for:
  • Analytics dashboard creation: developing and optimising Tableau dashboards that provide clear, actionable insights to various teams, including Streaming & Promotions, Label Directors, and Publishing.
  • Data pipeline development: designing, building, and maintaining efficient and scalable data pipelines to automate the collection, transformation, and delivery of data to and from various sources, including DSPs, FUGA Analytics, Google Analytics, Chartmetric, Curve, etc.
  • Database management: developing and maintaining the company’s database structure, ensuring data accuracy, security, and accessibility for analytics purposes.
  • Teaching: providing support and training to ensure teams are making effective use of analytics tools and dashboards.
  • Tailoring: collaborating with different departments to understand their data needs, and working creatively to provide tailored analytics solutions.
  • Building: supporting the Head of Technology in building and maintaining cross-platform automations.
  • Innovation and research: staying up to date with the latest trends and technologies in data engineering and analytics, exploring new tools and methodologies that can enhance our data capabilities.
This list is not exhaustive; we may ask you to go beyond your job description on occasion, and we hope the role will change and develop with you.

About you:
The ideal candidate for this role will likely have:
  • a solid foundation in Python and JavaScript, ideally with proficiency in other programming languages
  • experience designing and implementing ETL pipelines, specifically using Apache Airflow (Astronomer)
  • hands-on experience with ETL frameworks, particularly dbt (data build tool)
  • SQL and database management system skills
  • a good understanding of different database types, designs, and data modelling systems
  • experience with cloud platforms like AWS and GCP, including services such as BigQuery, RDS, and Athena
  • familiarity with Tableau and project management tools like monday.com and Notion
  • knowledge of APIs from music Digital Service Providers (e.g. Spotify, Apple Music)
  • previous experience at a record label, music distributor, or music publisher
  • an understanding of the music industry
  • excellent analytical, problem-solving, and communication skills
  • a proactive approach to learning, excitement about problem-solving, and an open mind when approaching new projects
  • strong accuracy and attention to detail
  • good written and verbal communication skills, and the ability to explain complex ideas in non-technical language
  • the ability to prioritise and manage their time independently

What we offer:
  • A competitive salary (£50-60k pro rata)
  • Participation in our Profit Share Scheme
  • 20 days annual leave
  • A benefits package to support your wellbeing, including access to local gyms and fitness classes, and subscriptions to health apps including Calm, Headspace and Strava
  • A collection of enhanced family policies to support your family life
  • The opportunity to attend a variety of live events
  • Cycle to work scheme
  • Season ticket loans
  • A lively, collaborative office environment, and a flexible hybrid working policy
  • Paid time off to volunteer with our local charitable initiatives

Applications
The closing date for applications is 21 November 2025, although we may close applications earlier. If you need more information before applying, email us at people@anjunabeats.com. We are committed to inclusion, and encourage applications from anyone with relevant experience and skills. If you require any adjustments throughout the application process to meet your needs and help you perform at your best, please let us know.
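To illustrate the kind of pipeline this role describes, here is a deliberately small extract-transform-load sketch in plain Python. All source names and figures are invented for the example, and a production pipeline would typically run as scheduled tasks in an orchestrator such as Apache Airflow rather than as three plain functions:

```python
# Minimal ETL sketch: pull raw play counts, aggregate per track, and
# write to a stand-in "warehouse". Everything here is hypothetical.

def extract():
    """Pretend to pull raw play counts from two streaming sources."""
    return [
        {"track": "Track A", "plays": "1204", "source": "dsp_x"},
        {"track": "Track B", "plays": "987", "source": "dsp_y"},
        {"track": "Track A", "plays": "311", "source": "dsp_y"},
    ]

def transform(rows):
    """Cast string counts to integers and total plays per track."""
    totals = {}
    for row in rows:
        totals[row["track"]] = totals.get(row["track"], 0) + int(row["plays"])
    return totals

def load(totals, warehouse):
    """Write aggregated rows into an in-memory 'warehouse' dict."""
    warehouse.update(totals)
    return warehouse

warehouse = {}
load(transform(extract()), warehouse)
print(warehouse)  # {'Track A': 1515, 'Track B': 987}
```

In a real deployment each function would be a separate, retryable task so a failure in one source does not re-run the whole flow.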
28/10/2025
Part time
Data Science Internship 2026
National Audit Office
Data Science Internship
Contract: 1-year fixed-term
Location: London or Newcastle office, with a minimum of 2 days per week in the office in line with our hybrid working policy
Salary: £27,811 in London and £25,089 in Newcastle

Job description:
We welcome applications to participate in our year-long Data Science Internship Scheme starting from September 2026. This is an entry-level position in our Analysis Hub, aimed at supporting and developing people looking to start a career in data science. As part of the scheme, you will benefit from dedicated training to develop skills in R, SQL and Python, mixed with the opportunity to put those skills into practice. For example, you will spend your time:
  • applying your quantitative and qualitative skills to large, messy datasets to derive new insights;
  • building and implementing data tools to support our range of assurance work (for example, see the data visualisations presented on our website, such as Waste Management or Integrated Care Systems);
  • collaborating with our financial auditors to review high-value, high-risk quantitative models which underpin accounting estimates: for example, the value of the student loans book, the rate of fraud and error in tax credit and benefit payments, and the money needed in the future to compensate people who have experienced clinical negligence.

All applicants will be invited to complete a numerical reasoning assessment. Following completion of the assessment, screening of your CV, and a review of your application form, we will invite you to complete a technical exercise in January. This will be followed by an in-person interview in February. The closing deadline for applications is 23:59 on Sunday 14th December 2025. The internship starts in September 2026.

Equal opportunities and diversity
Disability and reasonable adjustments: applicants with a disability who wish their application to be considered under the Disability Confident scheme should confirm this when submitting their application. Under this scheme we guarantee an interview to an applicant with a disability who meets the minimum requirements for the role. Applicants will not be discriminated against on the grounds of any protected characteristic.

Nationality requirement:
  • UK Nationals
  • Nationals of Commonwealth countries who have the right to work in the UK
  • Nationals from the EU, EEA or Switzerland with (or eligible for) status under the European Union Settlement Scheme (EUSS)

Responsibilities
You'll work as a team member in our Analysis Hub. In your first month you will undertake intensive training in our programming and development tools and languages. From your second month you will start to apply the skills you have learned to real-life problems, helping the office produce insights into the data we receive from government departments. You will also get the opportunity to collaborate in the review of government models which produce estimates for the audited financial accounts. You'll benefit from dedicated training to develop skills in R, SQL and Python, mixed with the opportunity to put those skills into practice developing new applications, as well as getting your hands dirty doing analysis and reviewing models produced by government. By the time you have completed your year, you will have outstanding professional experience in the application of data science and modelling skills.

Skills required
No previous experience of audit is necessary, and training will be provided in key data science skills such as R and Python. An interest in learning programming skills is required; previous programming experience is desirable but not essential.

Educational requirements
We do have some minimum criteria which you will need to meet:
  • A minimum of 120 UCAS points (or 300 based on the pre-2017 UCAS tariff) or equivalent from your top 3 A-Levels, not including General Studies. If you have 104 UCAS points
  • An undergraduate degree course with a substantive quantitative component such as data science, operational research, mathematics, statistics, physics, engineering, management science or economics (the data science internship is unlikely to be suitable for people studying accounting and finance degree courses), where:
    o You are in your second year of a sandwich degree course
    o You are in your final year and are expecting a 2.1 degree or better, or
    o You have completed your undergraduate degree course and achieved a 2.1 degree or better
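As a toy illustration of the "large, messy datasets" work this internship describes, the sketch below normalises inconsistent department names before aggregating spend. The department names and figures are invented, and real analysis at this scale would use R or pandas rather than plain dictionaries:

```python
# Hypothetical messy input: inconsistent names, spend as strings.
raw = [
    {"dept": "Dept. of Health ", "spend_gbp": "1,200"},
    {"dept": "dept of health", "spend_gbp": "800"},
    {"dept": "HM Treasury", "spend_gbp": "2,500"},
]

def normalise(name):
    """Collapse inconsistent spellings to one canonical form."""
    return (name.strip().lower()
            .replace("dept.", "dept")
            .replace("dept of", "department of"))

def parse(value):
    """Turn a formatted figure like '1,200' into an integer."""
    return int(value.replace(",", ""))

totals = {}
for row in raw:
    key = normalise(row["dept"])
    totals[key] = totals.get(key, 0) + parse(row["spend_gbp"])

print(totals)  # {'department of health': 2000, 'hm treasury': 2500}
```

The point of the exercise is that the two "health" rows only aggregate correctly after cleaning, which is where most of the effort in messy-data work goes.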
11/12/2025
Full time
Software Engineer
City Football Manchester, Lancashire
Our Story
Established in 2013, City Football Group is the world's leading private owner and operator of football clubs, with total or partial ownership of thirteen clubs across the world. City Football Group also invests in other football-related businesses and serves as a global commercial platform for our partners, whilst fulfilling our purpose of empowering better lives through football on a local and global scale, consistent with what City football has meant to people for over a century.

Purpose
Join our technology team as a Software Engineer. In this role you will play a critical part in designing, deploying and maintaining scalable systems, software solutions and data pipelines. Your expertise will drive the transformation of our technology landscape to further our clubs' success both on and off the field. This position offers an enriching collaborative environment, opportunities for career growth, and a chance to contribute to exciting projects in the football industry.

Your Impact
  • Develop, test, and maintain high-quality software solutions based on user requirements and design specifications.
  • Apply data engineering skills to configure and maintain data pipelines, providing a seamless flow of data from various sources while monitoring and ensuring system performance.
  • Take an interest in emerging technologies, including generative AI, and their application to software solutions.
  • Assist in troubleshooting, identifying issues, and proposing solutions.
  • Collaborate with cross-functional teams to ascertain and refine specifications and requirements.
  • Utilise project management skills to monitor project progress and adherence to quality standards, and to identify potential improvement opportunities.
  • Employ DevOps best practices to increase software deployment speed and reliability in a cloud-based environment.
  • Participate in code reviews to uphold code quality and foster knowledge sharing across the team.
  • Prepare and deliver progress and outcome reports to stakeholders.
  • Facilitate effective team coordination and communication to achieve project objectives.

What we are looking for
Essential:
  • Bachelor's degree in Computer Science, Software Engineering, or a related field, or equivalent work experience.
  • Experience in software development (Python, Java, Scala, etc.) and big data tools (e.g. Hadoop, Spark), preferably in a cloud-based environment (AWS, Azure, or GCP).
  • Proficiency in SQL.
  • Robust problem-solving skills and an analytical mindset.
  • Solid understanding of version control and DevOps principles, including CI/CD pipelines.
  • Strong written and verbal communication skills.
  • Ability to manage multiple tasks and projects under tight deadlines.
Desirable:
  • Exposure to project management practices such as Agile, Scrum and Waterfall.

To find out more and to apply, please click APPLY NOW. Closing date of applications: 30/12/25
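As a small illustration of the SQL proficiency listed above, the following sketch runs an aggregate query against an in-memory SQLite database from Python. The table, club names and figures are invented for the example:

```python
import sqlite3

# Build a throwaway in-memory database with hypothetical match data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE matches (club TEXT, goals INTEGER)")
conn.executemany(
    "INSERT INTO matches VALUES (?, ?)",
    [("club_a", 3), ("club_a", 1), ("club_b", 2)],
)

# Aggregate goals per club, highest total first.
rows = conn.execute(
    "SELECT club, SUM(goals) AS total FROM matches "
    "GROUP BY club ORDER BY total DESC"
).fetchall()

print(rows)  # [('club_a', 4), ('club_b', 2)]
```

The same GROUP BY / ORDER BY pattern carries over unchanged to warehouse engines such as BigQuery or Redshift; only the connection layer differs.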
11/12/2025
Full time
Data & AI Senior Consultants - Dynamic AI Consulting firm
Staffworx Limited
Data & AI Senior Consultants
Location: We are flexible: onsite, hybrid or fully remote, depending on what works for you and the client; UK or Netherlands based.

What you will actually be doing
This is not a role where you build clever models that never get used. Your focus is on creating measurable value for clients using data science, machine learning and GenAI, in a consulting and advisory context. You will own work from the very beginning, asking questions like "What value are we trying to create here?" and "Is this the right problem to solve?", through to "It is live, stakeholders are using it and we can see the impact in the numbers." You will work fairly independently, and you will also be someone that more junior team members look to for help and direction. A big part of the job is taking messy, ambiguous business and technical problems and turning them into clear, valuable solutions that make sense to the client. You will do this in a client-facing role. That means you will be in the room for key conversations, providing honest advice, managing expectations and helping clients make good decisions about where and how to use AI.

What your day to day might look like

Getting to the heart of the problem
  • Meeting with stakeholders who may not be clear on what they really need
  • Using discovery sessions, workshops and structured questioning to uncover the real business problem
  • Framing success in terms of value, for example higher revenue, lower cost, reduced risk, increased efficiency or better customer experience
  • Translating business goals into a clear roadmap of data and AI work that everyone can understand
  • Advising clients when AI is not the right solution, and suggesting simpler or more cost-effective alternatives

Consulting and advisory work
  • Acting as a trusted advisor to product owners, heads of department and executives
  • Helping clients prioritise use cases based on value, feasibility and risk
  • Communicating trade-offs in a simple way, for example accuracy versus speed, innovation versus compliance, cost versus impact
  • Preparing and delivering client presentations, proposals and updates that tell a clear story
  • Supporting pre-sales activities where needed, such as scoping work, estimating effort and defining outcomes
  • Managing client expectations, risks and dependencies so there are no surprises

Building things that actually work
Once the problem and value are clear, you will design and deliver production-ready ML and GenAI solutions. That includes:
  • Designing and building data pipelines, batch or streaming, that support the desired outcomes
  • Working with engineers and architects so your work fits cleanly into existing systems
  • Making sure what you build is reliable in production and moves the needle on agreed metrics, not just offline benchmarks
  • Explaining design decisions to both technical and non-technical stakeholders

GenAI work
You will work with GenAI in ways that are grounded in real use cases and business value:
  • Building RAG systems that improve search, content discovery or productivity, rather than existing for their own sake
  • Implementing guardrails so models do not leak PII or generate harmful or off-brand content
  • Defining and tracking the right metrics so you and the client can see whether a GenAI solution is useful and cost-effective
  • Fine-tuning and optimising models so they perform well for the use case and budget
  • Designing agentic workflows where they genuinely improve outcomes, rather than add complexity
  • Helping clients understand what GenAI can and cannot do in practice

Keeping it running
You will set up the foundations that protect value over time:
  • Experiment tracking and model versioning, so you know what works and can roll back safely
  • CI/CD pipelines for ML, so improvements reach users quickly and reliably
  • Monitoring and alerting for models and data, so you can catch issues before they damage trust or results
  • Communicating operational risks and mitigations to non-technical stakeholders in plain language

Security, quality and compliance
You will help make sure:
  • Data is accurate, traceable and well managed, so decisions are sound
  • Sensitive data is handled correctly, protecting users and the business
  • Regulatory and compliance requirements are met, avoiding costly mistakes
  • Clients understand the risk profile of AI solutions and the controls in place

Working with people
You will be a bridge between technical and non-technical teams, inside our organisation and on the client side. That means:
  • Explaining complex ML and GenAI ideas in plain language, always tied to business outcomes
  • Working closely with product managers, engineers and business stakeholders to prioritise work that matters
  • Facilitating workshops, playback sessions and show-and-tells that build buy-in and understanding
  • Coaching and supporting junior colleagues so the whole team can deliver more value
  • Representing the company professionally in client meetings and at industry events

What we are looking for

Experience
  • Around 3 to 6 years of experience shipping ML or GenAI solutions into production
  • A track record of seeing projects through from discovery to delivery, with clear impact
  • Experience working directly with stakeholders or clients in a consulting, advisory or product-facing role

Education
  • A Bachelor's or Master's degree in a quantitative field such as Computer Science, Data Science, Statistics, Mathematics or Engineering, or
  • Equivalent experience that shows you can deliver results

Technical skills

Core skills
  • Strong Python and SQL, with clean, maintainable code
  • Solid understanding of ML fundamentals, for example feature engineering, model selection, handling imbalanced data, and choosing and interpreting metrics
  • Experience with PyTorch or TensorFlow

GenAI specific
  • Hands-on experience with LLM APIs or open-source models such as Llama or Mistral
  • Experience building RAG systems with vector databases such as FAISS, Pinecone or Weaviate
  • Ability to evaluate and improve prompts and retrieval quality using clear metrics
  • Understanding of safety practices such as PII redaction and content filtering
  • Exposure to agentic frameworks

Cloud and infrastructure
  • Comfortable working in at least one major cloud provider: AWS, GCP or Azure
  • Familiar with Docker and CI/CD pipelines
  • Experience with managed ML platforms such as SageMaker, Vertex AI or Azure ML

Data engineering and MLOps
  • Experience with data warehouses such as Snowflake, BigQuery or Redshift
  • Workflow orchestration using tools like Airflow or Dagster
  • Experience with MLOps tools such as MLflow, Weights & Biases or similar
  • Awareness of data and model drift, and how to monitor and respond to it before it erodes value

Soft skills, the things that really matter
  • You are comfortable in client-facing settings and can build trust quickly
  • You can talk with anyone from a CEO to a new data analyst, and always bring the conversation back to business value
  • You can take a vague, messy business problem and turn it into a clear technical plan that links to outcomes and metrics
  • You are happy to push back and challenge assumptions respectfully when it is in the client's best interest
  • You like helping other people grow and are happy to mentor junior colleagues
  • You communicate clearly in writing and in person

Nice to have, not required
Do not rule yourself out if you do not have these. They are a bonus, not a checklist.
  • Experience with Delta Lake, Iceberg, Spark or Databricks, or Palantir
  • Experience optimising LLM serving with tools such as vLLM, TGI or TensorRT-LLM
  • Search and ranking experience, for example Elasticsearch or rerankers
  • Background in time series forecasting, causal inference, recommender systems or optimisation
  • Experience managing cloud costs and IAM, so value is not lost to waste
  • Ability to work in other languages where needed, for example Java, Scala, Go or bash
  • Experience with BI tools such as Looker or Tableau
  • Prior consulting experience, or leading client projects end to end
  • Contributions to open source, conference talks or published papers that show your ability to share ideas and influence the wider community

Got a background that fits and you're up for a new challenge? Send over your latest CV, expectations and availability. Staffworx Limited is a UK-based recruitment consultancy partnering with leading global brands across digital, AI, software, and business consulting. Let's talk about what you could add to the mix.
11/12/2025
Full time
Data & AI Senior Consultants

Location - We are flexible: onsite, hybrid or fully remote, depending on what works for you and the client; UK or Netherlands based.

What you will actually be doing
This is not a role where you build clever models that never get used. Your focus is on creating measurable value for clients using data science, machine learning and GenAI, in a consulting and advisory context. You will own work from the very beginning, asking questions like "What value are we trying to create here?" and "Is this the right problem to solve?", through to "It is live, stakeholders are using it and we can see the impact in the numbers." You will work fairly independently, and you will also be someone that more junior team members look to for help and direction. A big part of the job is taking messy, ambiguous business and technical problems and turning them into clear, valuable solutions that make sense to the client. You will do this in a client-facing role: you will be in the room for key conversations, providing honest advice, managing expectations and helping clients make good decisions about where and how to use AI.

What your day to day might look like

Getting to the heart of the problem
- Meeting with stakeholders who may not be clear on what they really need
- Using discovery sessions, workshops and structured questioning to uncover the real business problem
- Framing success in terms of value, for example higher revenue, lower cost, reduced risk, increased efficiency or better customer experience
- Translating business goals into a clear roadmap of data and AI work that everyone can understand
- Advising clients when AI is not the right solution and suggesting simpler or more cost-effective alternatives

Consulting and advisory work
- Acting as a trusted advisor to product owners, heads of department and executives
- Helping clients prioritise use cases based on value, feasibility and risk
- Communicating trade-offs in a simple way, for example accuracy versus speed, innovation versus compliance, cost versus impact
- Preparing and delivering client presentations, proposals and updates that tell a clear story
- Supporting pre-sales activities where needed, such as scoping work, estimating effort and defining outcomes
- Managing client expectations, risks and dependencies so there are no surprises

Building things that actually work
Once the problem and value are clear, you will design and deliver production-ready ML and GenAI solutions. That includes:
- Designing and building data pipelines, batch or streaming, that support the desired outcomes
- Working with engineers and architects so your work fits cleanly into existing systems
- Making sure what you build is reliable in production and moves the needle on agreed metrics, not just offline benchmarks
- Explaining design decisions to both technical and non-technical stakeholders

GenAI work
You will work with GenAI in ways that are grounded in real use cases and business value:
- Building RAG systems that improve search, content discovery or productivity rather than existing for their own sake
- Implementing guardrails so models do not leak PII or generate harmful or off-brand content
- Defining and tracking the right metrics so you and the client can see whether a GenAI solution is useful and cost-effective
- Fine-tuning and optimising models so they perform well for the use case and budget
- Designing agentic workflows where they genuinely improve outcomes rather than add complexity
- Helping clients understand what GenAI can and cannot do in practice

Keeping it running
You will set up the foundations that protect value over time:
- Experiment tracking and model versioning so you know what works and can roll back safely
- CI/CD pipelines for ML so improvements reach users quickly and reliably
- Monitoring and alerting for models and data so you can catch issues before they damage trust or results
- Communicating operational risks and mitigations to non-technical stakeholders in plain language

Security, quality and compliance
You will help make sure:
- Data is accurate, traceable and well managed so decisions are sound
- Sensitive data is handled correctly, protecting users and the business
- Regulatory and compliance requirements are met, avoiding costly mistakes
- Clients understand the risk profile of AI solutions and the controls in place

Working with people
You will be a bridge between technical and non-technical teams, inside our organisation and on the client side. That means:
- Explaining complex ML and GenAI ideas in plain language, always tied to business outcomes
- Working closely with product managers, engineers and business stakeholders to prioritise work that matters
- Facilitating workshops, playback sessions and show-and-tells that build buy-in and understanding
- Coaching and supporting junior colleagues so the whole team can deliver more value
- Representing the company professionally in client meetings and at industry events

What we are looking for

Experience
- Around 3 to 6 years of experience shipping ML or GenAI solutions into production
- A track record of seeing projects through from discovery to delivery, with clear impact
- Experience working directly with stakeholders or clients in a consulting, advisory or product-facing role

Education
- A Bachelor's or Master's degree in a quantitative field such as Computer Science, Data Science, Statistics, Mathematics or Engineering, or equivalent experience that shows you can deliver results

Technical skills

Core skills
- Strong Python and SQL, with clean, maintainable code
- Solid understanding of ML fundamentals, for example feature engineering, model selection, handling imbalanced data, and choosing and interpreting metrics
- Experience with PyTorch or TensorFlow

GenAI specific
- Hands-on experience with LLM APIs or open-source models such as Llama or Mistral
- Experience building RAG systems with vector databases such as FAISS, Pinecone or Weaviate
- Ability to evaluate and improve prompts and retrieval quality using clear metrics
- Understanding of safety practices such as PII redaction and content filtering
- Exposure to agentic frameworks

Cloud and infrastructure
- Comfortable working in at least one major cloud provider: AWS, GCP or Azure
- Familiar with Docker and CI/CD pipelines
- Experience with managed ML platforms such as SageMaker, Vertex AI or Azure ML

Data engineering and MLOps
- Experience with data warehouses such as Snowflake, BigQuery or Redshift
- Workflow orchestration using tools like Airflow or Dagster
- Experience with MLOps tools such as MLflow, Weights & Biases or similar
- Awareness of data and model drift, and how to monitor and respond to it before it erodes value

Soft skills, the things that really matter
- You are comfortable in client-facing settings and can build trust quickly
- You can talk with anyone from a CEO to a new data analyst, and always bring the conversation back to business value
- You can take a vague, messy business problem and turn it into a clear technical plan that links to outcomes and metrics
- You are happy to push back and challenge assumptions respectfully when it is in the client's best interest
- You like helping other people grow and are happy to mentor junior colleagues
- You communicate clearly in writing and in person

Nice to have, not required
Do not rule yourself out if you do not have these. They are a bonus, not a checklist.
- Experience with Delta Lake, Iceberg, Spark, Databricks or Palantir
- Experience optimising LLM serving with tools such as vLLM, TGI or TensorRT-LLM
- Search and ranking experience, for example Elasticsearch or rerankers
- Background in time-series forecasting, causal inference, recommender systems or optimisation
- Experience managing cloud costs and IAM so value is not lost to waste
- Ability to work in other languages where needed, for example Java, Scala, Go or Bash
- Experience with BI tools such as Looker or Tableau
- Prior consulting experience or leading client projects end to end
- Contributions to open source, conference talks or published papers that show your ability to share ideas and influence the wider community

Got a background that fits and you're up for a new challenge? Send over your latest CV, expectations and availability. Staffworx Limited is a UK-based recruitment consultancy partnering with leading global brands across digital, AI, software, and business consulting. Let's talk about what you could add to the mix.
Panoramic Associates
GIS developer
Panoramic Associates
Role Overview
We are looking for a capable and motivated GIS Developer to build, optimise and maintain the council's spatial data systems, helping us to make better decisions and deliver high-value services for residents. You will play a key technical role within the GIS and Data function - designing mapping tools, managing spatial data pipelines, and delivering clear, functional geospatial solutions used across planning, highways, environment, housing and wider council services. This position suits someone who enjoys solving problems, working with code, and turning spatial data into practical intelligence.

Key Responsibilities

GIS Development & Technical Delivery
- Develop, maintain and enhance GIS applications using modern frameworks and mapping technologies.
- Build web-mapping tools, dashboards and geospatial interfaces for operational and strategic use.
- Create automated ETL workflows and spatial data integrations between key council systems.
- Design, optimise and maintain spatial databases with a focus on availability, performance and data quality.

Spatial Data Management & Analysis
- Manage, transform and analyse spatial datasets from multiple internal and external sources.
- Develop scripted data quality improvements, metadata standards and geospatial governance controls.
- Produce spatial models, reports and analytical outputs to support decision-making.
- Maintain core datasets including LLPG, LSG, UPRN, OS mapping layers, asset registers and statutory feeds.

Collaboration & Support
- Work closely with service areas to understand requirements and build effective geospatial solutions.
- Provide technical guidance, documentation and support for GIS products and users at all levels.
- Deliver ad-hoc troubleshooting, system configuration and small enhancements where required.

Innovation & Continuous Improvement
- Research new technologies, automation techniques and geospatial methods to enhance capability.
- Contribute to GIS strategy, roadmap planning and service improvement initiatives.
- Ensure compliance with data protection, PSN standards and relevant geospatial legislation.

Essential Skills & Experience
- Commercial or public-sector experience as a GIS Developer, Analyst/Programmer or similar role.
- Strong technical grounding in: ESRI ArcGIS / ArcGIS Online / ArcGIS Enterprise; QGIS and open-source GIS tools; spatial databases (PostgreSQL/PostGIS, SQL Server Spatial, Oracle Spatial); Python, JavaScript, REST APIs, HTML/CSS.
- Experience developing web-mapping applications (ArcGIS API, Leaflet, OpenLayers, Mapbox, etc.).
- Solid understanding of geoprocessing, coordinate systems, topology and data automation.
- Experience with ETL pipelines (FME, ArcGIS ModelBuilder, Python GDAL/OGR, GeoPandas).
- Able to break down complex requirements and deliver practical, well-engineered solutions.
- Confident communicator capable of working with technical and non-technical stakeholders.

Desirable Skills
- Local government or wider public-sector experience.
- Knowledge of UPRN/USRN, INSPIRE regulations, BS7666 and national geospatial standards.
- Cloud-based GIS hosting experience (Azure, AWS, ESRI Cloud).
- Integration with asset/estate/plan-based or highways systems.
- Spatial visualisation in Power BI/Tableau.
- Version control (Git), CI/CD, automated deployment pipelines.
- Experience with live geospatial feeds, remote sensing or IoT mapping.

What We Offer
- Flexible hybrid work
- Strong emphasis on technical development and progression
- Opportunity to work on high-impact geospatial projects across the authority
- A focused, collaborative team environment
11/12/2025
Full time
Barclays Bank Plc
PostgreSQL SRE
Barclays Bank Plc Tower Hamlets, London
Join us as a PostgreSQL SRE at Barclays, where you'll effectively monitor and maintain the bank's critical technology infrastructure and resolve more complex technical issues, whilst minimising disruption to operations. In this role you will assume a key technical leadership position, shaping the direction of our database administration and ensuring our technological approaches are innovative and aligned with the Bank's business goals.

To be successful as a PostgreSQL SRE, you should have:
- Experience as a Database Administrator, with a focus on PostgreSQL and similar database technologies such as Oracle or MS SQL.
- A background in implementing and leading SRE practices across large organisations or complex teams.
- Hands-on experience with containers and Kubernetes.
- Experience with DevOps automation tools, such as code versioning (Git), JIRA, Ansible and database CI/CD tools, and their implementation.

Some other highly valued skills may include:
- Expertise with scripting languages (e.g. PowerShell, Python, Bash) for automation and migration tasks.
- Experience working with data migration tools and software.
- Expertise in system configuration management tools such as Chef and Ansible for database server configurations.

You may be assessed on the key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role can be based in our Knutsford or Glasgow office.

Purpose of the role
To apply software engineering techniques, automation, and best practices in incident response, to ensure the reliability, availability, and scalability of the bank's systems, platforms, and technology.

Accountabilities
- Availability, performance, and scalability of systems and services through proactive monitoring, maintenance, and capacity planning.
- Resolution, analysis and response to system outages and disruptions, and implementation of measures to prevent similar incidents from recurring.
- Development of tools and scripts to automate operational processes, reducing manual workload, increasing efficiency, and improving system resilience.
- Monitoring and optimisation of system performance and resource usage, identifying and addressing bottlenecks, and implementing best practices for performance tuning.
- Collaboration with development teams to integrate best practices for reliability, scalability, and performance into the software development lifecycle, working closely with other teams to ensure smooth and efficient operations.
- Staying informed of industry technology trends and innovations, and actively contributing to the organisation's technology communities to foster a culture of technical excellence and growth.

Assistant Vice President Expectations
- Advise and influence decision making, contribute to policy development and take responsibility for operational effectiveness. Collaborate closely with other functions and business divisions.
- Lead a team performing complex tasks, using well-developed professional knowledge and skills to deliver work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, appraise performance relative to objectives, and determine reward outcomes.
- If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others.
- Alternatively, as an individual contributor, lead collaborative assignments and guide team members through structured assignments, identifying the need to include other areas of specialisation to complete assignments. Identify new directions for assignments and/or projects, identifying a combination of cross-functional methodologies or practices to meet required outcomes.
- Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues.
- Identify ways to mitigate risk, and develop new policies/procedures in support of the control and governance agenda.
- Take ownership for managing risk and strengthening controls in relation to the work done.
- Perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function.
- Collaborate with other areas of work, particularly business-aligned support areas, to keep up to speed with business activity and the business strategy.
- Engage in complex analysis of data from multiple internal and external sources of information, such as procedures and practices (in other areas, teams, companies, etc.), to solve problems creatively and effectively.
- Communicate complex information. 'Complex' information could include sensitive information or information that is difficult to communicate because of its content or its audience.
- Influence or convince stakeholders to achieve outcomes.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship - our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset - to Empower, Challenge and Drive - the operating manual for how we behave.
10/12/2025
Full time
Data Integration Engineer
Halliburton Abingdon, Oxfordshire
We are looking for the right people - people who want to innovate, achieve, grow and lead. We attract and retain the best talent by investing in our employees and empowering them to develop themselves and their careers. Experience the challenges, rewards and opportunity of working for one of the world's largest providers of products and services to the global energy industry. Job Duties We are seeking a skilled and proactive Data Integration Engineer to join the Neftex Technical Services team. Reporting to the Team Lead the Data Integration Engineer will be responsible for designing, building, and maintaining robust data pipelines and integration frameworks that connect diverse systems including LLMs and a proprietary Data Integration solution. Successful candidates will be evidently enthusiastic and motivated people who we can train up in our processes and ultimately play a key role in quality assurance initiatives across different stakeholder groups. This role is based in our Abingdon, Oxfordshire office. 
Key Responsibilities: Design and implement scalable data integration solutions using ETL/ELT tools and APIs Develop and maintain data pipelines that include Large Language Models (LLMs) Build solutions that include cloud and on-premises environments Collaborate with data architects, analysts, and business stakeholders to understand data requirements Integrate data from various sources including databases, SaaS platforms, APIs, and flat files Monitor and optimize data flows for performance, reliability, and cost-efficiency Ensure data quality, consistency, and governance across integrated systems Automate data workflows and support real-time data streaming Document integration processes and maintain technical specification Qualifications Qualifications & Experience: 3+ years' experience working with database and related tools Strong proficiency with data virtualisation platforms and tools such as Teiid or similar Solid understanding of SQL, relational databases, and data modelling Experience with cloud platforms (AWS, Azure) and cloud-native data services Familiarity with RESTful APIs, JSON, XML, OData, and message queues (Kafka) Knowledge of data governance, security, and compliance best practices Preferred Skills: Experience with cloud-based database solutions. Understanding of data lifecycle management and SOC2 security standards. Familiarity with geoscience disciplines, geospatial data and GIS tools (e.g., ArcGIS, QGIS) is advantageous. Scripting and automation (e.g., PowerShell, Python, Java). Experience with Gitlab. Knowledge of Spotfire data visualization platform or alternative dashboard solutions. Awareness of Agile delivery methodologies. Halliburton is an Equal Opportunity Employer. 
Employment decisions are made without regard to race, color, religion, disability, genetic information, pregnancy, citizenship, marital status, sex/gender, sexual preference/orientation, gender identity, age, veteran status, national origin, or any other status protected by law or regulation.
Location: 97 Jubilee Avenue, Milton Park, Abingdon, Oxfordshire, OX14 4RW, United Kingdom
Job Details
Requisition Number: 204269
Experience Level: Entry-Level
Job Family: Engineering/Science/Technology
Product Service Line: division
Full Time / Part Time: Full Time
Additional Locations for this position:
Compensation Information: Compensation is competitive and commensurate with experience.
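The "integrate data from various sources including databases, SaaS platforms, APIs, and flat files" responsibility above often begins with flattening a JSON API response into rows for a relational target. A minimal stdlib sketch, with every field name invented for illustration (this is not any real Neftex or Halliburton schema):

```python
import json

# Hypothetical payload shaped like a typical REST/OData response;
# field names here are illustrative only.
payload = json.loads("""
{
  "value": [
    {"wellId": "W-001", "basin": "North Sea", "depth_m": 3120},
    {"wellId": "W-002", "basin": "Permian", "depth_m": 2450}
  ]
}
""")

def to_rows(response: dict, record_key: str = "value") -> list[tuple]:
    """Flatten an OData-style JSON response into tuples ready for a
    parameterised relational INSERT."""
    return [
        (rec["wellId"], rec["basin"], rec["depth_m"])
        for rec in response[record_key]
    ]

rows = to_rows(payload)
print(rows[0])  # ('W-001', 'North Sea', 3120)
```

In practice the same shape extends to flat files and message-queue payloads: parse, project the fields the target model needs, then hand the tuples to a bulk loader.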
10/12/2025
Full time
Head Resourcing
Data Engineer
Head Resourcing
Mid-Level Data Engineer (Azure / Databricks) - NO VISA REQUIREMENTS
Location: Glasgow (3+ days onsite)
Reports to: Head of IT
My client is undergoing a major transformation of their entire data landscape, migrating from legacy systems and manual reporting into a modern Azure + Databricks Lakehouse. They are building a secure, automated, enterprise-grade platform powered by Lakeflow Declarative Pipelines, Unity Catalog and Azure Data Factory. They are looking for a Mid-Level Data Engineer to help deliver high-quality pipelines and curated datasets used across Finance, Operations, Sales, Customer Care and Logistics.
What You'll Do
Lakehouse Engineering (Azure + Databricks): Build and maintain scalable ELT pipelines using Lakeflow Declarative Pipelines, PySpark and Spark SQL. Work within a Medallion architecture (Bronze → Silver → Gold) to deliver reliable, high-quality datasets. Ingest data from multiple sources including ChargeBee, legacy operational files, SharePoint, SFTP, SQL, REST and GraphQL APIs using Azure Data Factory and metadata-driven patterns. Apply data quality and validation rules using Lakeflow Declarative Pipelines expectations.
Curated Layers & Data Modelling: Develop clean and conforming Silver & Gold layers aligned to enterprise subject areas. Contribute to dimensional modelling (star schemas), harmonisation logic, SCDs and business marts powering Power BI datasets. Apply governance, lineage and permissioning through Unity Catalog.
Orchestration & Observability: Use Lakeflow Workflows and ADF to orchestrate and optimise ingestion, transformation and scheduled jobs. Help implement monitoring, alerting, SLAs/SLIs and runbooks to support production reliability. Assist in performance tuning and cost optimisation.
DevOps & Platform Engineering: Contribute to CI/CD pipelines in Azure DevOps to automate deployment of notebooks, Lakeflow Declarative Pipelines, SQL models and ADF assets.
Support secure deployment patterns using private endpoints, managed identities and Key Vault. Participate in code reviews and help improve engineering practices.
Collaboration & Delivery: Work with BI and Analytics teams to deliver curated datasets that power dashboards across the business. Contribute to architectural discussions and the ongoing data platform roadmap.
Tech You'll Use
Databricks: Lakeflow Declarative Pipelines, Lakeflow Workflows, Unity Catalog, Delta Lake
Azure: ADLS Gen2, Data Factory, Event Hubs (optional), Key Vault, private endpoints
Languages: PySpark, Spark SQL, Python, Git
DevOps: Azure DevOps Repos & Pipelines, CI/CD
Analytics: Power BI, Fabric
What We're Looking For
Experience: Commercial and proven data engineering experience. Hands-on experience delivering solutions on Azure + Databricks. Strong PySpark and Spark SQL skills within distributed compute environments. Experience working in a Lakehouse/Medallion architecture with Delta Lake. Understanding of dimensional modelling (Kimball), including SCD Type 1/2. Exposure to operational concepts such as monitoring, retries, idempotency and backfills.
Mindset: Keen to grow within a modern Azure Data Platform environment. Comfortable with Git, CI/CD and modern engineering workflows. Able to communicate technical concepts clearly to non-technical stakeholders. Quality-driven, collaborative and proactive.
Nice to Have: Databricks Certified Data Engineer Associate. Experience with streaming ingestion (Auto Loader, event streams, watermarking). Subscription/entitlement modelling (e.g., ChargeBee). Unity Catalog advanced security (RLS, PII governance). Terraform or Bicep for IaC. Fabric Semantic Models or Direct Lake optimisation experience.
Why Join? Opportunity to shape and build a modern enterprise Lakehouse platform. Hands-on work with Azure, Databricks and leading-edge engineering practices. Real progression opportunities within a growing data function.
Direct impact across multiple business domains.
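The SCD Type 1/2 modelling this role asks for can be sketched in miniature. In a Lakehouse it would normally be a Delta Lake MERGE inside a pipeline; the pure-Python version below only illustrates the Type 2 bookkeeping (close the old row, append a new current row), and every column name is invented:

```python
from datetime import date

# Illustrative SCD Type 2 dimension held as plain dicts; in the role
# described, this logic would typically be a Delta Lake MERGE, not Python.
dim = [
    {"key": 1, "customer": "C-100", "plan": "basic",
     "valid_from": date(2024, 1, 1), "valid_to": None, "current": True},
]

def scd2_upsert(dim, customer, plan, as_of):
    """Close the current row if the tracked attribute changed, then
    append a new current row (SCD Type 2)."""
    for row in dim:
        if row["customer"] == customer and row["current"]:
            if row["plan"] == plan:
                return dim  # no change, nothing to do
            row["valid_to"] = as_of   # close out the old version
            row["current"] = False
    dim.append({"key": len(dim) + 1, "customer": customer, "plan": plan,
                "valid_from": as_of, "valid_to": None, "current": True})
    return dim

scd2_upsert(dim, "C-100", "premium", date(2025, 6, 1))
current = [r for r in dim if r["current"]]
print(current[0]["plan"])  # premium
```

Type 1 is the degenerate case: overwrite the attribute in place and keep no history, which is why Type 2 is the one that needs the validity-window columns.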
10/12/2025
Full time
Tenth Revolution Group
Senior Developer - £400PD - Remote
Tenth Revolution Group City, London
Senior Developer - £400PD - Remote
We are seeking a skilled and collaborative Senior Developer to join our engineering team. In this role, you will contribute to the design, development, and maintenance of high-quality backend services and infrastructure platforms. You will work across a modern technology stack and help shape best practices for delivery, automation, and open ways of working.
Key Responsibilities
Backend Development: Develop and maintain backend services, with Python as the preferred language for new systems. Provide support for services built in Java and .NET, ensuring seamless integration and continued reliability. Enhance and maintain existing solutions using Oracle PL/SQL.
Database & Data Services: Work with relational databases including Oracle, SQL Server, and Postgres. Optimise data models, queries, and stored procedures to improve performance and maintainability.
Infrastructure & DevOps: Build, deploy, and manage applications using cloud platforms such as AWS and Azure. Use modern DevOps tools and practices, including GitHub, Azure DevOps, Docker, Kubernetes, and Linux-based systems, to deliver scalable, reliable services. Implement and maintain CI/CD pipelines with a strong focus on automation, continuous deployment, testing, and monitoring.
Quality & Testing: Develop software using Test-Driven Development (TDD), writing automated tests before implementing code. Ensure high quality across the stack through continuous testing, observability, and feedback loops.
Ways of Working: Champion open and transparent engineering practices, including maintaining visible codebases, documentation, design histories, and roadmaps. Collaborate closely with cross-functional teams and contribute to a culture of knowledge sharing and continuous improvement.
What We're Looking For
Strong experience developing backend services with Python, plus working knowledge of Java and/or .NET. Proficiency in SQL and experience with Oracle PL/SQL or similar technologies.
Hands-on experience with cloud platforms (AWS, Azure), containerisation, DevOps tooling, and CI/CD automation. Familiarity with TDD, automated testing frameworks, and modern monitoring/observability tools. A commitment to open, transparent, and collaborative engineering practices. To apply for this role please submit your CV or contact Dillon Blackburn on (phone number removed) or at (url removed). Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK offering more opportunities across the country than any other recruitment agency. We're the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.
10/12/2025
Contractor
Lorien
Junior Python Developer (FinTech, FX Trading) - Perm
Lorien
Junior Python Developer (FinTech, FX Trading) - London Hybrid
Employment: Permanent, Full-time
Location: Bank, London - 3 days on-site (Tue/Wed/Thu) + 2 days WFH
Salary: £40,000 + training & benefits
About: A startup fintech building a next-generation FX trading & settlement platform, connecting onboarding, dealing, risk, settlement, reporting, and client portals into one system. This is an awesome opportunity to help build an FX trading platform from the ground up.
The Opportunity: You'll create, build and deploy using Infrastructure-as-Code (IaC) while working across the full stack. Expect extensive training and hands-on exposure to Azure and Salesforce, plus mentorship from experienced engineers.
What you'll do: Develop and optimise platform components in Python. Write and tune SQL Server queries, views & stored procedures. Build REST APIs for internal/external integrations. Learn and contribute to event-driven pipelines (e.g., Kafka/RabbitMQ). Apply IaC and contribute to CI/CD releases. Collaborate with Product, Operations & Compliance on mission-critical features.
What you'll bring: 1-2 years' commercial experience with Python and SQL (or strong projects/internships). Understanding of APIs, data modelling and secure access controls. Curiosity, a problem-solving mindset, and clear communication.
Nice to have: Kafka or message queues, Azure, Salesforce (Apex/Flows), Docker, Azure DevOps/GitHub Actions. GitHub/portfolio link welcome (not mandatory).
Why join: Build high-impact fintech systems that move money & markets. Modern stack (Python, Kafka, SQL Server, REST, Azure, Salesforce). Professional training & certification support. Clear growth path as the platform scales globally.
Eligibility: Right to work in the UK (sponsorship not available). Commutable to London for 3 on-site days (Tue/Wed/Thu). Permanent only (no contractors).
Apply now: Send your CV (and GitHub if available) with the subject line: Junior Python Developer - Fiscal FX.
Guidant, Carbon60, Lorien & SRG - The Impellam Group Portfolio are acting as an Employment Business in relation to this vacancy.
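The "write and tune SQL Server queries" part of a role like this comes down to parameterised, plan-friendly SQL. The sketch below uses stdlib SQLite as a stand-in for SQL Server, and the table and trade data are made up:

```python
import sqlite3

# In-memory SQLite standing in for SQL Server; schema and values
# are invented for illustration.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE fx_trades (id INTEGER PRIMARY KEY, pair TEXT, notional REAL)"
)
con.executemany(
    "INSERT INTO fx_trades (pair, notional) VALUES (?, ?)",
    [("GBPUSD", 1_000_000), ("EURUSD", 250_000), ("GBPUSD", 500_000)],
)

# Parameterised aggregate query: placeholders keep the query plan
# reusable and guard against SQL injection.
row = con.execute(
    "SELECT COUNT(*), SUM(notional) FROM fx_trades WHERE pair = ?",
    ("GBPUSD",),
).fetchone()
print(row)  # (2, 1500000.0)
```

The same habit carries straight over to SQL Server via a driver such as pyodbc: never interpolate values into the SQL string, always bind them as parameters.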
10/12/2025
Full time
Tenth Revolution Group
Senior Developer
Tenth Revolution Group
Senior Software Developers - 6-Month Contract - Inside IR35
£400 per day | Remote/UK-Based | NHS Experience Essential | BPSS Eligible
We're seeking three Senior Software Developers to join a high-performing team delivering modern digital services across the NHS. You will help build, support, and evolve critical platforms using modern engineering practices, test-driven development, and open, transparent delivery. This is a 6-month Inside IR35 contract, with strong potential for extension. Candidates must be UK-based, BPSS eligible, and have previous NHS experience.
Role Overview
You will work within a collaborative, multidisciplinary environment to design, build, and maintain high-quality digital services. The role involves contributing to both new and existing systems, supporting multiple languages and platforms, and applying strong DevOps, TDD, and modern engineering practices throughout.
Key Responsibilities
Develop new services using Python while supporting existing components written in Java and .NET. Work with Oracle PL/SQL to support legacy and operational systems. Engineer solutions across cloud and infrastructure environments including AWS, Azure, Linux, SQL Server, Postgres, Docker, and Kubernetes. Implement CI/CD pipelines using GitHub or Azure DevOps. Deliver high-quality software using Test-Driven Development (TDD). Contribute to open, transparent delivery with shared documentation, codebases, design histories, and roadmaps. Collaborate with engineers, designers, and product teams to deliver secure, scalable, user-centred services.
Required Technical Skills
Strong experience building digital services with Python. Familiarity with Java and .NET for supporting existing systems. Skilled in Oracle PL/SQL. Experience with AWS and Azure. Knowledge of GitHub, Azure DevOps, Docker, Kubernetes, and Linux. Experience with SQL Server and Postgres. Strong TDD practice and automated testing pipelines.
Non-Technical Skills
Excellent communication and collaboration. Comfortable working openly and transparently. Strong problem-solving abilities. Adaptable, proactive, and able to work in complex environments. Positive, delivery-focused attitude.
Additional Requirements
Must be UK-based. BPSS eligible. Previous NHS experience is essential.
To discuss this role further please submit your CV or contact Brandon Forbes via email at (url removed). Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK offering more opportunities across the country than any other recruitment agency. We're the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.
09/12/2025
Contractor
Akkodis
Data Engineer - SC Cleared. Stevenage/Hybrid £80k
Akkodis Stevenage, Hertfordshire
Data Engineer (Strong SQL, ETL, Python) - SC Cleared or Eligible
Stevenage (Hybrid) - 2-3 days onsite
Up to £80,000
High-impact programme - Revolutionary platform
I am looking for a Security-Cleared Data Engineer to take the reins on a range of highly ambitious data migration projects supporting truly high-impact programmes across the UK. This is a unique opportunity to work on cutting-edge cloud, software, and infrastructure projects that shape the future of technology in both public and private sectors. You'll be part of a collaborative team delivering scalable, next-generation digital ecosystems.
What you'll be doing
As a Data Engineer within our Centre of Excellence, you will play a critical role in delivering complex data migration and data engineering projects for our clients. This position focuses on the planning, execution, and optimisation of data migrations, from legacy platforms to modern cloud-based environments, ensuring accuracy, consistency, security, and continuity throughout the process.
Key Responsibilities
Analyse existing data structures and understand business and technical requirements for migration initiatives. Design and deliver robust data migration strategies and ETL solutions. Develop automated data extraction, transformation, and loading (ETL) processes using industry-standard tools and scripts. Work closely with stakeholders to ensure seamless migration and minimal business disruption. Plan, coordinate, and execute data migration projects within defined timelines. Ensure the highest standards of data quality, integrity, and security. Troubleshoot and resolve data-related issues promptly. Collaborate with wider engineering and architecture teams to ensure migrations align with organisational and regulatory standards.
Relevant experience:
Expert-level SQL skills for complex query development, performance tuning, indexing, and data transformation across on-premise databases and AWS cloud environments.
Strong hands-on experience with ETL processes and tools (Talend, Informatica, Matillion, Pentaho, MuleSoft, Boomi) or scripting using Python, PySpark, and SQL. Solid understanding of data warehousing and modelling techniques (Star Schema, Snowflake Schema). Familiarity with security frameworks such as GDPR, HIPAA, ISO 27001, NIST, SOX, and PII, as well as AWS security features including IAM, KMS, and RBAC. Ability to identify and resolve data quality issues across migration projects. Strong track record of delivering end-to-end data migration projects and working effectively with both technical and non-technical stakeholders. Due to the nature of the work, SC Clearance is required or candidates must be eligible to obtain it. Salary up to 80,000 plus wider benefits - Contact me today for further insight on (phone number removed) or (url removed). Modis International Ltd acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers in the UK. Modis Europe Ltd provide a variety of international solutions that connect clients to the best talent in the world. For all positions based in Switzerland, Modis Europe Ltd works with its licensed Swiss partner Accurity GmbH to ensure that candidate applications are handled in accordance with Swiss law. Both Modis International Ltd and Modis Europe Ltd are Equal Opportunities Employers. By applying for this role your details will be submitted to Modis International Ltd and/ or Modis Europe Ltd. Our Candidate Privacy Information Statement which explains how we will use your information is available on the Modis website.
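The data-quality work this listing describes (type mismatches, missing values, integrity checks before load) usually reduces to small rule functions run over each record. An illustrative sketch, with the records and rules entirely invented:

```python
# Hypothetical migration records; in a real migration these would come
# from the legacy extract, and the rules from the target schema.
records = [
    {"id": "1001", "name": "Pump A", "installed": "2019-04-01"},
    {"id": "10x2", "name": "", "installed": "2020-07-15"},
]

def validate(rec: dict) -> list[str]:
    """Return a list of data-quality issues for one record."""
    issues = []
    if not rec["id"].isdigit():
        issues.append("id: type mismatch (expected integer-like)")
    if not rec["name"]:
        issues.append("name: missing value")
    return issues

# Build a per-record issue report; clean records map to an empty list.
report = {rec["id"]: validate(rec) for rec in records}
print(report["10x2"])  # both rules fire for the second record
```

Running such checks before the load, rather than after, is what keeps a migration's reconciliation step from becoming a forensic exercise.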
09/12/2025
Full time
SF Recruitment
Data Engineer
SF Recruitment
We're supporting a large-scale data programme that requires an experienced Data Engineer to help transform complex, unstructured information into clean, reliable datasets suitable for analysis and reporting. The project involves working with sizeable JSON files and other mixed-format sources, standardising them, and preparing them for downstream use across several internal systems. You'll be responsible for shaping the structure, improving data quality, and ensuring outputs can be easily consumed by non-technical teams. What You'll Work On Converting varied and unstructured data (including JSON) into well-defined relational formats. Designing data models that ensure consistency and interoperability across tools. Preparing datasets for use in spreadsheets, reporting environments, and CRM systems. Resolving data quality issues: type mismatches, missing values, integrity checks, and formatting problems. Building repeatable processes and validation steps to support accurate, sustainable reporting. Partnering with operational and business teams to understand requirements and ensure outputs are fit for purpose. Skills & Experience Needed Strong SQL abilities and experience designing relational schemas. Hands-on Python skills (preferably pandas) for data wrangling and transformation. Solid understanding of data modelling principles and best practices. Good working knowledge of Excel and awareness of CRM/enterprise data structures. Experience with business intelligence/reporting tools (Power BI, Tableau, etc.) is beneficial. Able to interpret complex datasets, identify patterns/issues, and communicate findings clearly to non-technical users. Nice to Have Background in sensitive or regulated data environments. Understanding of data protection considerations. Exposure to ETL or data pipeline development.
09/12/2025
Seasonal
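For a sense of the work this listing describes, converting nested JSON into a well-defined relational shape with pandas typically looks something like the sketch below. The records and field names here are invented for illustration, not taken from the project:

```python
import json
import pandas as pd

# Hypothetical nested JSON of the mixed-format kind described in the role
raw = json.loads("""
[
  {"id": 1, "name": "Acme", "contact": {"email": "a@acme.test", "phone": null}},
  {"id": 2, "name": "Beta", "contact": {"email": null, "phone": "0123"}}
]
""")

# Flatten nested objects into tabular, relational-friendly columns
df = pd.json_normalize(raw, sep="_")

# Basic quality steps: enforce types and surface missing values
df["id"] = df["id"].astype("int64")
missing_per_column = df.isna().sum()

print(df.columns.tolist())  # ['id', 'name', 'contact_email', 'contact_phone']
print(missing_per_column)
```

From a frame like this, the outputs mentioned above (spreadsheets, reporting tools, CRM loads) follow naturally via `df.to_csv`, `df.to_excel`, or a database write.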
Lead Software Engineer
SR2 - Socially Responsible Recruitment Swindon, Wiltshire
Lead Software Developer | Swindon | Hybrid | £75,000 - £85,000 | SR2 are supporting an organisation on an exciting transformation journey, who are now looking for a Lead Software Developer to help shape the future of their digital platforms. This hands-on technical leadership role combines development, architecture, and mentoring, supporting their transition toward cloud-native, scalable systems. The Role Lead the design, development, and implementation of software applications. Set engineering standards, conduct code reviews, and guide architectural decisions. Modernise legacy systems and champion CI/CD, automation, and cloud technologies. Develop and optimise applications, ensuring performance, security, and compliance. Mentor developers and support agile delivery across cross-functional teams. About You Experienced Senior/Lead Developer in a multi-system environment. Experience delivering solutions collaboratively within agile teams. Strong background in cloud-based development (Azure, AWS, or GCP). Proficient in modern programming languages and frameworks (e.g. C#, .NET, JavaScript/TypeScript, Python). Experience in technical leadership. Skilled in API development, Front End and Back End engineering, and CI/CD pipelines. Excellent problem-solving, communication, and leadership skills. Technical Knowledge Modern Front End (React, Angular, Vue, Blazor) and Back End (e.g. .NET Core, Node.js) frameworks. Cloud and DevOps tools (Docker, Kubernetes, GitHub Actions, Jenkins). Strong understanding of architecture, design patterns, SQL/NoSQL, and security best practices. Apply now and take the next step in your technical leadership career!
09/12/2025
Full time
Fruition Group
Senior Backend Developer
Fruition Group
Senior Backend Developer Location: London - 1x a month Salary: Up to £95,000 (D.O.E) + benefits Fruition Group are working with a leading Insurtech unicorn on a mission to transform the insurance industry. Their products have already generated millions in revenue, and now they're investing heavily in new innovations that will shape the sector for years to come. This is an exciting opportunity for a motivated Senior Engineer who wants to grow in a high-performing, forward-thinking business. As a Senior Backend Engineer, you'll play a central role in building and scaling the systems that power their insurance platform. You'll focus on developing Python-based cloud microservices (FastAPI preferred), contributing to the architecture of resilient, high-performance systems, and collaborating with a cross-functional team to deliver features at pace. This is a hands-on engineering position with plenty of scope to influence technical direction, improve processes, and grow your expertise in distributed systems. What will I be doing: Design, develop, and maintain Python microservices (FastAPI) running in production. Take ownership of features end-to-end, from design to deployment and monitoring. Write high-quality, testable code with a focus on scalability and resilience. Collaborate closely with engineers, product managers, and designers to deliver quickly. Contribute to evolving architecture, tooling, and CI/CD practices. What experience do I need: Strong background in Python development (FastAPI, Flask, or Django). Good knowledge of microservices, APIs, and cloud-based infrastructure. Experience with SQL and NoSQL databases, Git, and CI/CD workflows. Solid engineering fundamentals - data structures, OOP, debugging, and testing. Collaborative, curious, and eager to experiment with AI tools to improve productivity. If this position sounds of interest, please apply and a member of the team will be in touch to discuss further.
We are an equal opportunities employer and welcome applications from all suitably qualified persons regardless of their race, sex, disability, religion/belief, sexual orientation, or age.
09/12/2025
Full time
Fruition Group
Lead Backend Developer
Fruition Group
Lead Backend Developer Location: London - 1x a month Salary: Up to £120,000 (D.O.E) + benefits Fruition Group are partnering with a high-growth Insurtech unicorn that's scaling its engineering function. This is a unique chance to work across both proven, revenue-generating products and greenfield initiatives that are reshaping the future of insurance. It's an ideal role for a driven Lead Engineer who thrives in ambitious environments and wants to make a tangible impact. As a Lead Backend Engineer, you'll take ownership of designing and scaling cloud-native Back End systems. You'll work primarily with Python (FastAPI) and play a key role in shaping the architecture of microservices that support millions of users. Beyond hands-on development, you'll provide technical leadership, mentor team members, and influence strategic engineering decisions. This is a high-impact role where your work directly drives product growth, system resilience, and platform evolution. What will I be doing: Design, develop, and optimise scalable Back End services in Python, leveraging FastAPI. Lead architectural discussions with a focus on performance, scalability, and reliability. Deliver complex features end-to-end - from design through deployment and monitoring. Provide mentorship through code reviews, technical guidance, and best practices. Collaborate with Product, Design, and Engineering teams to deliver at pace. Continuously raise the bar for engineering standards, code quality, and delivery. Shape the long-term direction of the platform's service-oriented architecture. Champion the use of AI and automation to enhance productivity across the team. What experience do I need: Strong background building and scaling Python-based systems (FastAPI, Flask, or Django REST). Proven leadership experience in a development environment. Solid expertise in microservices, APIs, messaging patterns, and distributed systems.
Proficient with cloud platforms (AWS, GCP, Azure) and containerisation (Docker; Kubernetes preferred). Strong engineering fundamentals - testing, clean code, performance tuning, and algorithms. Experience with relational and non-relational databases (PostgreSQL, MongoDB). Comfortable working in agile, fast-moving environments with high ownership. Curious about new technology, with a growth mindset and interest in AI-driven tools. If this role sounds of interest, please apply and a member of the team will be in touch to discuss your application. We are an equal opportunities employer and welcome applications from all suitably qualified persons regardless of their race, sex, disability, religion/belief, sexual orientation, or age.
09/12/2025
Full time
Lead Data Engineer
TPXImpact Holdings PLC
About The Role Job level: 10 We're looking for a Lead Data Engineer to join our Data Engineering and Analytics practice. In this role, you will: Lead the design, development, management and optimisation of data pipelines to ensure efficient data flows, recognising and sharing opportunities to reuse data flows where possible. Coordinate teams and set best practices and standards when it comes to data engineering principles. Champion data engineering across projects and clients. Responsibilities Lead by example, holding responsibilities for team culture, and how projects deliver the most impact and value to our clients. Be accountable for the strategic direction, delivery and growth of our work. Lead teams, strands of work and outcomes, owning commercial responsibilities. Hold and manage uncertainty and ambiguity on behalf of clients and our teams. Ensure teams and projects are inclusive through how you lead and manage others. Effectively own and hold the story of our work, ensuring we measure progress against client goals and our DT missions. Work with our teams to influence and own how we deliver more value to clients, working with time and budget constraints. Strategically plan the overall project and apply methods and approaches. Demonstrably share work with wider audiences. Elevate ideas through how you write, speak and present. Dimensions Headcount: Typically leads a multidisciplinary team or multiple workstreams (team size 5-15) Resource complexity: Provides leadership across multiple workstreams or technical domains within a project or programme. Responsible for delivery coordination, prioritisation, and quality, often overseeing more junior leads or specialists. Problem-solving responsibility: Solves highly complex problems, balancing technical, user, business, and operational needs. Applies expert judgement to make decisions, manage risks, and guide teams through ambiguity. Change management requirements: Leads or co-leads significant change initiatives. 
Responsible for managing stakeholder expectations, supporting adoption, and embedding sustainable ways of working. Internal/External interactions: Acts as a trusted partner to client and internal stakeholders at multiple levels. Leads workshops, presentations, and stakeholder engagement to ensure buy-in, alignment, and delivery clarity. Strategic timeframe working towards: Works across mid- to long-term delivery cycles (6-12 months), ensuring that near-term work supports broader programme and client objectives. About You Professional knowledge and experience Essential Proven experience in data engineering, data integration and data modelling Expertise with cloud platforms (e.g. AWS, Azure, GCP) Expertise with modern cloud data platforms (e.g. Microsoft Fabric, Databricks) Expertise with multiple data analytics tools (e.g. Power BI) Deep understanding of data warehousing concepts, ETL/ELT pipelines and dimensional modelling Proficiency in advanced programming languages (Python/PySpark, SQL) Experience in data pipeline orchestration (e.g. Airflow, Data Factory) Familiarity with DevOps and CI/CD practices (Git, Azure DevOps etc.) Ability to communicate technical concepts to both technical and non-technical audiences Proven experience in delivery of complex projects in a fast-paced environment with tight deadlines Desirable Advanced knowledge of data governance, data standards and best practices. Experience in a consultancy environment, demonstrating flexibility and adaptability to client needs. Experience defining and enforcing data engineering standards, patterns, and reusable frameworks Professional certifications in relevant technologies (e.g. 
Microsoft Azure Data Engineer, AWS Data Analytics, Databricks Certified Professional Data Engineer) Skills Data Development Process Design, build and test data products that are complex or large scale Build and lead teams to deliver data integration services and reusable pipelines that meet performance, quality and scalability standards Collaborate with architects to align solutions with enterprise data strategy and target architectures Data Engineering and Manipulation Work with data analysts, engineers and data science and AI specialists to design and deliver products into the organisation effectively. Understand the reasons for cleansing and preparing data before including it in data products and can put reusable processes and checks in place. Access and use a range of architectures (including cloud and on-premise) and data manipulation and transformation tools deployed within the organisation. Optimise data pipelines and queries for performance and cost efficiency in distributed environments Testing (Data) Review requirements and specifications, and define system integration testing conditions for complex data products and support others to do the same Identify and manage issues and risks associated with complex data products and support others to do the same Analyse and report system test activities and results for complex data products and support others to do the same Other Skills Proficiency in developing and maintaining complex data models (conceptual, logical and physical). Strong skills in data governance and metadata management. Experience with data integration design and implementation. Ability to write efficient, maintainable code for large scale data systems. Experience with CI/CD pipelines, version control, and infrastructure-as-code (e.g. Git, Azure DevOps). Strong stakeholder communication skills, with the ability to translate technical concepts into business terms. 
Ability to mentor junior engineers, foster collaboration, and build a high-performing data engineering culture. Behaviours and PACT values Purpose: Be values-driven, recognising that our client's needs are paramount. Approach client engagements with professionalism and creativity, balancing commercial and operational needs. Accountability: Be accountable for delivering your part of a project on time and under budget and working well with other leaders. Lead by example, promoting a culture where quality and client experience are foremost. Craft: Balance multiple priorities while leading high-performing teams. Navigate ambiguity and set the technical direction and approach to support positive outcomes. Togetherness: Collaborate effectively with others across TPXimpact. Build strong relationships with colleagues and clients. About Us People-Powered Transformation We're a purpose-driven organisation, supporting organisations to build a better future for people, places and the planet. Combining vast experience in the public, private and third sectors and expertise in human-centred design, data, experience and technology, we're creating sustainable solutions ready for an ever-evolving world. At the heart of TPXimpact, we're collaborative and empathetic. We're a team of passionate people who care deeply about the work we do and the impact we have in the world. We know that change happens through people, with people and for people. That's why we believe in people-powered transformation. Working in close collaboration with our clients, we seek to understand their unique challenges, questioning assumptions and building in their teams the capabilities and confidence to continue learning, iterating and adapting. 
Benefits Include: 30 days holiday + bank holidays 2 volunteer days for causes that you are passionate about Maternity/paternity - 6 months Maternity Leave, 3 months Paternity Leave Life assurance Employer pension contribution of 5% Health cash plan Personal learning and development budget Employee Assistance Programme Access to equity in the business through a Share Incentive Plan Green incentive programmes including Electric Vehicle Leasing and the Cycle to Work Scheme Financial advice Health assessments About TPXimpact - Digital Transformation We drive fundamental change in approaches to product and service development, delivery and technology. Our agile, multidisciplinary teams use technology, design and data to deliver better results, improving outcomes for individuals, organisations and communities. By working in the open, in partnership with our clients, we not only transform their systems and services but also build the capability of their teams, so work can continue without us in the longer term. Our focus is sustainable change, always delivered with positive impact. We're an inclusive employer, and we care about diversity in our teams. Let us know in your application if you have accessibility requirements during the interview.
09/12/2025
Full time
About The Role Job level: 10 Were looking for a Lead Data Engineer to join our Data Engineering and Analytics practice. In this role, you will: Lead the design, development, management and optimisation of data pipelines to ensure efficient data flows, recognising and sharing opportunities to reuse data flows where possible. Coordinate teams and set best practices and standards when it comes to data engineering principles. Champion data engineering across projects and clients. Responsibilities Lead by example, holding responsibilities for team culture, and how projects deliver the most impact and value to our clients. Be accountable for the strategic direction, delivery and growth of our work. Lead teams, strands of work and outcomes, owning commercial responsibilities. Hold and manage uncertainty and ambiguity on behalf of clients and our teams. Ensure teams and projects are inclusive through how you lead and manage others. Effectively own and hold the story of our work, ensuring we measure progress against client goals and our DT missions. Work with our teams to influence and own how we deliver more value to clients, working with time and budget constraints. Strategically plan the overall project and apply methods and approaches. Demonstrably share work with wider audiences. Elevate ideas through how you write, speak and present. Dimensions Headcount:Typically leads a multidisciplinary team or multiple workstreams (team size 515) Resource complexity:Provides leadership across multiple workstreams or technical domains within a project or programme. Responsible for delivery coordination, prioritisation, and quality, often overseeing more junior leads or specialists. Problem-solving responsibility:Solves highly complex problems, balancing technical, user, business, and operational needs. Applies expert judgement to make decisions, manage risks, and guide teams through ambiguity. Change management requirements:Leads or co-leads significant change initiatives. 
Responsible for managing stakeholder expectations, supporting adoption, and embedding sustainable ways of working. Internal/External interactions:Acts as a trusted partner to client and internal stakeholders at multiple levels. Leads workshops, presentations, and stakeholder engagement to ensure buy-in, alignment, and delivery clarity. Strategic timeframe working towards:Works across mid- to long-term delivery cycles (612 months), ensuring that near-term work supports broader programme and client objectives. About YouProfessional knowledge and experience Essential Proven experience in data engineering, data integration and data modelling Expertise with cloud platforms (e.g. AWS, Azure, GCP) Expertise with modern cloud data platforms (e.g. Microsoft Fabric, Databricks) Expertise with multiple data analytics tools (e.g. Power BI) Deep understanding of data warehousing concepts, ETL/ELT pipelines and dimensional modelling Proficiency in advanced programming languages (Python/PySpark, SQL) Experience in data pipeline orchestration (e.g. Airflow, Data Factory) Familiarity with DevOps and CI/CD practices (Git, Azure DevOps etc) Ability to communicate technical concepts to both technical and non-technical audiences Proven experience in delivery of complex projects in a fast paced environment with tight deadlines Desirable Advanced knowledge of data governance, data standards and best practices. Experience in a consultancy environment, demonstrating flexibility and adaptability to client needs. Experience defining and enforcing data engineering standards, patterns, and reusable frameworks Professional certifications in relevant technologies (e.g. 
Microsoft Azure Data Engineer, AWS Data Analytics, Databricks Certified Professional Data Engineer) Skills Data Development Process Design, build and test data products that are complex or large scale Build and lead teams to complete data integration services integration and reusable pipelines that meet performance, quality and scalability standards Collaborate with architects to align solutions with enterprise data strategy and target architectures Data Engineering and Manipulation Work with data analysts, engineers and data science and AI specialists to design and deliver products into the organisation effectively. Understand the reasons for cleansing and preparing data before including it in data products and can put reusable processes and checks in place. Access and use a range of architectures (including cloud and on-premise) and data manipulation and transformation tools deployed within the organisation. Optimise data pipelines and queries for performance and cost efficiency in distributed environments Testing (Data) Review requirements and specifications, and define system integration testing conditions for complex data products and support others to do the same Identify and manage issues and risks associated with complex data products and support others to do the same Analyse and report system test activities and results for complex data products and support others to do the same Other Skills Proficiency in developing and maintaining complex data models (conceptual, logical and physical). Strong skills in data governance and metadata management. Experience with data integration design and implementation. Ability to write efficient, maintainable code for large scale data systems. Experience with CI/CD pipelines, version control, and infrastructure-as-code (e.g. Git, Azure DevOps). Strong stakeholder communication skills, with the ability to translate technical concepts into business terms. 
- Ability to mentor junior engineers, foster collaboration, and build a high-performing data engineering culture

Behaviours and PACT values

Purpose: Be values-driven, recognising that our client's needs are paramount. Approach client engagements with professionalism and creativity, balancing commercial and operational needs.
Accountability: Be accountable for delivering your part of a project on time and under budget, and for working well with other leaders. Lead by example, promoting a culture where quality and client experience are foremost.
Craft: Balance multiple priorities while leading high-performing teams. Navigate ambiguity and set the technical direction and approach to support positive outcomes.
Togetherness: Collaborate effectively with others across TPXimpact. Build strong relationships with colleagues and clients.

About Us

People-Powered Transformation

We're a purpose-driven organisation, supporting organisations to build a better future for people, places and the planet. Combining vast experience in the public, private and third sectors with expertise in human-centred design, data, experience and technology, we're creating sustainable solutions ready for an ever-evolving world. At the heart of TPXimpact, we're collaborative and empathetic. We're a team of passionate people who care deeply about the work we do and the impact we have in the world. We know that change happens through people, with people and for people. That's why we believe in people-powered transformation. Working in close collaboration with our clients, we seek to understand their unique challenges, questioning assumptions and building in their teams the capabilities and confidence to continue learning, iterating and adapting. 
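As a rough illustration of the "reusable processes and checks" the skills list calls for, a cleansing gate that runs before data enters a data product might look something like this in plain Python (the field names and rules here are invented for the example, not taken from any TPXimpact project):

```python
# Illustrative only: a reusable data-cleansing check. The required fields
# ("customer_id", "amount") are hypothetical.

def check_rows(rows, required=("customer_id", "amount")):
    """Split rows into clean and rejected, recording why each row failed."""
    clean, rejects = [], []
    for row in rows:
        missing = [f for f in required if row.get(f) in (None, "")]
        if missing:
            rejects.append({"row": row, "reason": "missing: " + ", ".join(missing)})
        else:
            clean.append(row)
    return clean, rejects

clean, rejects = check_rows([
    {"customer_id": "C1", "amount": 10.0},
    {"customer_id": "", "amount": 5.0},
])
print(len(clean), rejects[0]["reason"])  # 1 missing: customer_id
```

Keeping the check as a small, parameterised function is what makes it reusable across pipelines, and returning the rejects with reasons is what makes the process auditable.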
Benefits include:
- 30 days holiday + bank holidays
- 2 volunteer days for causes that you are passionate about
- Maternity/paternity: 6 months maternity leave, 3 months paternity leave
- Life assurance
- Employer pension contribution of 5%
- Health cash plan
- Personal learning and development budget
- Employee Assistance Programme
- Access to equity in the business through a Share Incentive Plan
- Green incentive programmes including Electric Vehicle Leasing and the Cycle to Work Scheme
- Financial advice
- Health assessments

About TPXimpact - Digital Transformation

We drive fundamental change in approaches to product and service development, delivery and technology. Our agile, multidisciplinary teams use technology, design and data to deliver better results, improving outcomes for individuals, organisations and communities. By working in the open, in partnership with our clients, we not only transform their systems and services but also build the capability of their teams, so work can continue without us in the longer term. Our focus is sustainable change, always delivered with positive impact. We're an inclusive employer, and we care about diversity in our teams. Let us know in your application if you have accessibility requirements during the interview.
Tenth Revolution Group
Senior Data Engineer
Tenth Revolution Group Havant, Hampshire
Senior Data Engineer
Salary: Up to £70,000

I am working with a forward-thinking organisation that is modernising its data platform to support scalable analytics and business intelligence across the Group. With a strong focus on Microsoft technologies and cloud-first architecture, they are looking to bring on a Data Engineer to help design and deliver impactful data solutions using Azure. This is a hands-on role where you will work across the full data stack, collaborating with architects, analysts, and stakeholders to build a future-ready platform that drives insight and decision-making.

In this role, you will be responsible for:
- Building and managing data pipelines using Azure Data Factory and related services.
- Building and maintaining data lakes, data warehouses, and ETL/ELT processes.
- Designing scalable data solutions and models for reporting in Power BI.
- Supporting data migration from legacy systems into the new platform.
- Ensuring data models are optimised for performance and reusability.

To be successful in this role, you will have:
- Hands-on experience creating data pipelines using Azure services such as Synapse and Data Factory.
- Reporting experience with Power BI.
- A strong understanding of SQL, Python, or PySpark.
- Knowledge of the Azure data platform, including Azure Data Lake Storage, Azure SQL Data Warehouse, or Azure Databricks.

Some of the package/role details include:
- Salary up to £70,000
- Hybrid working model (twice per week in Portsmouth)
- Pension scheme and private healthcare options
- Opportunities for training and development

This is just a brief overview of the role. For the full details, simply apply with your CV and I'll be in touch to discuss it further.
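"Models for reporting in Power BI", as described above, is usually star-schema work: a fact table aggregated over dimension attributes. A toy Python sketch of that shape (the product dimension, sales fact rows, and figures are all invented for illustration):

```python
# Toy star-schema aggregation, not any employer's actual model: sum a
# "fact" table's measure over an attribute of a "dimension" table.
from collections import defaultdict

dim_product = {1: {"name": "Widget", "category": "Hardware"},
               2: {"name": "Gadget", "category": "Hardware"},
               3: {"name": "Licence", "category": "Software"}}

fact_sales = [{"product_id": 1, "revenue": 100.0},
              {"product_id": 2, "revenue": 250.0},
              {"product_id": 3, "revenue": 400.0}]

def revenue_by_category(facts, dim):
    """Join each fact row to its dimension row and total revenue by category."""
    totals = defaultdict(float)
    for row in facts:
        totals[dim[row["product_id"]]["category"]] += row["revenue"]
    return dict(totals)

print(revenue_by_category(fact_sales, dim_product))
# {'Hardware': 350.0, 'Software': 400.0}
```

In a real Azure build the same join-and-aggregate would live in SQL or PySpark and surface as a Power BI measure; the point is only the fact/dimension shape the ad is referring to.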
09/12/2025
Full time
Raynet Recruitment
Data Architect
Raynet Recruitment Grays, Essex
Data Architect
Thurrock RM17 6SL

Analysis and synthesis of data: You will apply basic techniques for the analysis of data from a variety of internal and external sources and synthesise your findings. Your analysis will support both service improvement and wider strategy development, policy, and service design work across the organisation. You will effectively involve a variety of data professionals and domain experts in this analysis and synthesis and will present clear findings that colleagues can understand and use.

Communication: You will communicate effectively with technical and non-technical stakeholders in a variety of roles. You will build strong collaborative relationships with colleagues from front line to senior leadership and host discussions that help define needs, generate new insights, improve data literacy, and promote data culture. You will be an advocate for the team and can manage differing perspectives and potentially difficult dynamics.

Data management: You will understand data governance and how it works in relation to other organisational governance structures and will be a proactive participant in and promoter of Thurrock's data governance practices. You will use your experience to manage data, ensuring adherence to standards and maintaining data dictionaries. You will effectively manage risk to privacy in adherence to national legislation and local practices.

Data modelling, cleansing and enrichment: You will be able to either produce or maintain data models and understand where to use different types of data models, developing Thurrock's business intelligence architecture in collaboration with our Data Engineers and Data Architects. You will also have some understanding of reverse-engineering data models from live systems. You will have an understanding of different tools and industry-recognised data-modelling patterns and standards, comparing different data models and communicating data structures using documentation such as schema diagrams. 
Data quality assurance, validation and linkage: You will identify appropriate ways to collect, collate and prepare data as set by the Data Architecture team and Data Engineers. This will involve informing the design of front-end systems and surveys to ensure an enhanced user experience and data quality. You will make judgements as to whether data are accurate and fit for purpose and will support services in maintaining good data quality through the development of data quality auditing systems. You will define and implement batch cleansing processes where appropriate with limited guidance.

Data visualisation: You will use the most appropriate medium to visualise data to tell compelling stories that are relevant to business goals and can be acted upon. Your work will take advantage of a wide variety of data visualisation tools and methodologies, presenting complex information in a way that is engaging, useful and readily intelligible to a range of audiences such as front-line staff, managers, and senior leadership. You will present, communicate, and disseminate data appropriately and with influence in settings ranging from operational meetings to high-profile strategic partnerships.

IT and mathematics: You will apply your knowledge and experience of IT and mathematical skills, including tools and techniques, adopting those most appropriate for the environment and always working in a manner that is sensitive to information security. You will draw on your experience with a variety of tools such as MS Excel, Qlik, SQL, R, Python, QGIS and Tableau.

Logical and creative thinking: You will respond effectively to problems in databases, data processes, data products and services as they occur. You will initiate actions, monitor services, and identify trends to resolve problems.
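The "batch cleansing processes" mentioned above typically standardise fields and then remove duplicates. A minimal, hypothetical sketch in Python (the record fields are invented; a real council dataset would have far more rules):

```python
# Hypothetical batch-cleansing step: normalise string fields, then drop
# exact duplicates of the normalised record, keeping the first seen.

def cleanse(records):
    seen, out = set(), []
    for rec in records:
        # Standardise: trim whitespace and title-case every string value.
        norm = {k: v.strip().title() if isinstance(v, str) else v
                for k, v in rec.items()}
        key = tuple(sorted(norm.items()))  # hashable identity for dedup
        if key not in seen:
            seen.add(key)
            out.append(norm)
    return out

rows = [{"name": "  jane smith ", "ward": "west"},
        {"name": "Jane Smith", "ward": "West"}]
print(cleanse(rows))
# [{'name': 'Jane Smith', 'ward': 'West'}]
```

Note that the two input rows only collapse into one because standardisation runs first; ordering the steps the other way round would leave both.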
08/12/2025
Seasonal
Trek Recruitment Ltd
IT Engineer
Trek Recruitment Ltd Wrexham, Clwyd
C# Software Developer (with Networking & IT Systems exposure)
Location: Wrexham, North Wales
Salary: £35,000 + excellent benefits

Our client is a leader in developing manufacturing tech, and on their behalf we're looking for a sharp, hands-on C# developer who loves building real-world applications that make a genuine difference on the factory floor and in the office. This isn't a typical helpdesk role. Yes, you'll be the go-to person for the company's IT systems, but the main focus is writing clean, maintainable code, primarily in C# .NET Core, to extend and improve the client's in-house manufacturing and business platforms. Networking and infrastructure are part of the mix (because everything you build has to talk to the real world), but they're secondary to solid programming skills.

THE ROLE
- Designing and building new features and tools in C# .NET Core (this is the bulk of the role)
- Maintaining and enhancing our existing bespoke applications
- Writing scripts and automation tools in Python/PHP when it makes sense
- Occasional Laravel/PHP work on internal web tools
- Helping keep the infrastructure running smoothly (Windows/Linux servers, Active Directory, VMware, Office 365, backups, Cisco Meraki Wi-Fi, Fortinet firewalls, etc.)
- Configuring switches, firewalls, laptops, Raspberry Pis and label printers when needed
- Supporting users (mostly remotely via Teams), but this is light compared to the development work
- Creating BI dashboards and playing with AI/automation ideas

YOU
- First and foremost: strong C# .NET Core skills; you enjoy writing code more than resetting passwords
- Comfortable with modern development tools (GitHub, VS Code, Jira, etc.)
- Happy to touch infrastructure when required; you understand networking basics, Active Directory, servers, VMware, etc. 
(you don't need to be a CCNA, just not scared of it)
- Bonus points if you've worked with SQL Server, IIS, REST APIs, or have any manufacturing/warehouse exposure
- Minimum HNC/HND or degree in Computer Science / Software Development (or equivalent experience)
- 1-2 years+ commercial programming experience (more is fine too)

Why you'll love it here
- You get to own and shape real products used every day by hundreds of people
- Proper variety: one day you're adding a new feature in C#, the next you're deploying a Raspberry Pi on the shop floor
- Forward-thinking company that's genuinely investing in digital transformation and AI
- Small, friendly IT team; no corporate red tape
- Great package: enhanced pension, private healthcare, heavily subsidised canteen, 25 days holiday + banks, hybrid flexibility

If you're a C# developer who wants to stay close to the metal, see your code make an immediate impact, and doesn't mind rolling up your sleeves on the occasional bit of networking or infrastructure, this could be perfect.
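The "scripts and automation tools in Python" side of the role might, in practice, be small utilities like the following log summariser. This is purely illustrative: the log format and field names are invented, not the client's actual systems.

```python
# Hypothetical shop-floor automation: total units produced per machine
# from a small CSV production log.
import csv, io

LOG = """machine,units
press-1,120
press-2,95
press-1,40
"""

def units_per_machine(log_text):
    """Parse the CSV log and sum the units column per machine."""
    totals = {}
    for row in csv.DictReader(io.StringIO(log_text)):
        totals[row["machine"]] = totals.get(row["machine"], 0) + int(row["units"])
    return totals

print(units_per_machine(LOG))
# {'press-1': 160, 'press-2': 95}
```

The same script pointed at a real file (or a SQL Server export) is the kind of quick, useful glue code the ad seems to have in mind.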
07/12/2025
Full time

© 2008-2025 IT Job Board