  • Home
  • Find IT Jobs
  • Register CV
  • Career Advice
  • Contact us
  • Employers
    • Register as Employer
    • Pricing Plans
  • Recruiting? Post a job
Sorry, that job is no longer available. Here are some results that may be similar to the job you were looking for.

15 jobs found

Current search: senior ai engineer databricks uk
Senior Python Developer
Experis City, London
Senior Python Developer
Duration: 14 months | Location: London, UK (3 days in office) | Active SC Clearance required | Inside IR35 (Umbrella only)

About the Role:
We are seeking a highly skilled and versatile Senior Developer to join our team and contribute to the development and maintenance of our cutting-edge Azure Databricks platform for economic data. This platform is critical for our Monetary Analysis, Forecasting, and Modelling activities. The Senior Developer will be responsible for both front-end and back-end development, with a focus on Python, including the challenging but rewarding task of reverse engineering an existing codebase to integrate new features and improvements. This role requires a strong understanding of software development principles, experience with various programming languages and frameworks, and a passion for building high-quality, scalable, and maintainable software. You will drive the development of MVPs, ensuring timely delivery and alignment with business goals.

Key Responsibilities:
  • Full-Stack Development: Contribute to both front-end and back-end development of applications and APIs interacting with the Azure Databricks platform. Develop user interfaces using modern front-end frameworks (e.g., React, Angular, Vue.js) and ensure a seamless user experience. Develop robust and efficient back-end services and APIs using Python.
  • Python Development: Develop and maintain Python code for data processing, API development, and integration with the Azure Databricks environment. Utilise relevant Python libraries and frameworks (e.g., Flask, Django, Pandas, NumPy). Collaborate with cross-functional teams to build and enhance banking applications. Work closely with UI/UX designers to integrate visualisations seamlessly into web applications and other platforms. Work on data interfaces to connect various systems within the bank. Write unit and integration tests to ensure code quality and reliability.
  • .NET Development (Optional): Develop and maintain .NET code for back-end services, APIs, and integrations with other systems. Utilise relevant .NET frameworks and technologies (e.g., ASP.NET Core, C#, Entity Framework). Write unit and integration tests to ensure code quality and reliability.
  • Reverse Engineering: Analyse and understand existing codebases (potentially Python) to identify areas for improvement, bug fixes, and new feature implementation. Document findings and create clear specifications for changes. Implement changes while maintaining the stability and functionality of the existing system.

Join us to work on impactful projects, utilise cutting-edge tech, and be part of a collaborative team shaping public sector innovation. Apply now to make a difference!
31/03/2026
Contractor
Senior Data Engineer
83Zero Ltd
Company Overview
We are working with an innovative organisation that recognises the increasing complexity of project delivery. Since 2013, our client has been helping companies of all sizes improve the way projects are delivered. Their mission is to become the number one provider of innovative project solutions, driven by a community of experienced, caring, and passionate professionals who are committed to improving project delivery.

Why Join Our Client?
Our client is currently in an exciting phase of growth, making this an excellent time to join their journey. They are building something special: scaling the business while maintaining a strong people-first approach. Investment in their teams is a key priority, creating an environment where development is encouraged and individuals are supported to grow with the organisation. Their culture sets them apart from other consulting practices, and they are looking to build a team that is equally ambitious.

Position Overview
Our client is seeking a Senior Data Engineer who thrives on building scalable, cloud-first data systems. In this role, you will design and manage data pipelines that support analytics, AI, and automation across complex infrastructure programmes. Your work will play a key part in enabling data-driven transformation across critical UK industries.

Core Responsibilities
  • Design, build, and optimise data pipelines using Azure Data Factory, Synapse, and Databricks
  • Develop and maintain ETL/ELT workflows to ensure high data quality and reliability
  • Collaborate with analysts and AI engineers to deliver robust and reusable data products
  • Manage data lakes and warehouses using formats such as Delta Lake and Parquet
  • Implement best practices for data governance, performance, and security
  • Continuously evaluate and adopt new technologies to evolve the organisation's data platform
  • Provide technical guidance to junior engineers and contribute to team capability building

Technical Stack
  • Core: Azure Data Factory, Azure Synapse Analytics, Azure Data Lake Storage Gen2, SQL Server, Databricks
  • Enhancements: Python (PySpark, Pandas), CI/CD (Azure DevOps), Infrastructure as Code (Terraform, Bicep), REST APIs, GitHub Actions
  • Desirable: Microsoft Fabric, Delta Live Tables, Power BI dataset automation, DataOps practices

What You'll Bring
  • Professional experience in data engineering or cloud data development
  • Strong understanding of data architecture, APIs, and modern data pipeline design
  • Hands-on experience within Microsoft's Azure ecosystem, with an interest in emerging technologies such as Fabric, AI-enhanced ETL, and real-time data streaming
  • Proven ability to lead technical workstreams and mentor junior team members
  • A strong alignment with the organisation's IDEAL values: Integrity, Drive, Empathy, Adaptability, and Loyalty

Ready to Apply?
This is a fantastic opportunity to join a forward-thinking organisation at a key stage of growth, working on impactful projects across critical industries. If you're looking to take the next step in your career within a collaborative and innovative environment, we'd love to hear from you.
31/03/2026
Full time
Principal Data Engineer
Stott and May
Principal Data Engineer - Hybrid (London/Winchester)
We're seeking a hands-on Principal Data Engineer to design and deliver enterprise-scale, cloud-native data platforms that power analytics, reporting, and real-time decision-making. This is a strategic technical leadership role where you'll shape architecture, mentor engineers, and deliver end-to-end solutions across a modern AWS/Databricks stack.

What you'll do
  • Lead the design of scalable, secure data architectures on AWS.
  • Build and optimise ETL/ELT pipelines for batch and streaming data.
  • Deploy and manage Apache Spark jobs on Databricks and Delta Lake.
  • Write production-grade Python and SQL for large-scale data transformations.
  • Drive data quality, governance, and automation through CI/CD and IaC.
  • Collaborate with data scientists, analysts, and business stakeholders.
  • Mentor and guide data engineering teams.

What we're looking for
  • Proven experience in senior/principal data engineering roles.
  • Expertise in AWS, Databricks, Apache Spark, Python, and SQL.
  • Strong background in cloud-native data platforms, real-time processing, and data lakes.
  • Hands-on experience with tools such as Airflow, Kafka, Docker, and GitLab CI/CD.
  • Excellent stakeholder engagement and leadership skills.

What's on offer
  • £84,000 salary + 10% bonus
  • 6% pension contribution
  • Private medical & flexible benefits package
  • 25 days annual leave (plus buy/sell options)
  • Hybrid working - travel to London or Winchester once or twice per week

Join a company at the forefront of media, connectivity, and smart technology, where your work directly powers millions of daily connections across the UK.
06/10/2025
Full time
Senior AWS Data Engineer - London - £125,000
Tenth Revolution Group
Senior AWS Data Engineer - London - £125,000
Please note: this role will require you to work from the London-based office. You must have the unrestricted right to work in the UK to be eligible for this role. This organisation is not able to offer sponsorship.

An exciting opportunity to join a greenfield initiative focused on transforming how market data is accessed and utilised. As a Senior AWS Data Engineer, you'll play a key role in designing and building a cutting-edge data platform using technologies like Databricks, Snowflake, and AWS Glue.

Key Responsibilities:
  • Build and maintain scalable data pipelines, warehouses, and lakes.
  • Design secure, high-performance data architectures.
  • Develop processing and analysis algorithms for complex data sets.
  • Collaborate with data scientists to deploy machine learning models.
  • Contribute to strategy, planning, and continuous improvement.

Required Experience:
  • Hands-on experience with AWS data tools: Glue, PySpark, Athena, Iceberg, Lake Formation.
  • Strong Python and SQL skills for data processing and analysis.
  • Deep understanding of data governance, quality, and security.
  • Knowledge of market data and its business applications.

Desirable Experience:
  • Experience with Databricks and Snowflake.
  • Familiarity with machine learning and data science concepts.
  • Strategic thinking and ability to influence cross-functional teams.

This role offers the chance to work across multiple business areas, solve complex data challenges, and contribute to long-term strategic goals. You'll be empowered to lead, collaborate, and innovate in a dynamic environment. To apply for this role please submit your CV or contact David Airey on or at . Tenth Revolution Group are the go-to recruiter for Data & AI roles in the UK, offering more opportunities across the country than any other recruitment agency. We're the proud sponsor and supporter of SQLBits, Power Platform World Tour, and the London Fabric User Group. We are the global leaders in Data & AI recruitment.
03/10/2025
Full time
Senior Data Engineer
Greencore Worksop, Nottinghamshire
Why Greencore?
We're a leading manufacturer of convenience food in the UK and our purpose is to make everyday taste better! We're a vibrant, fast-paced leading food manufacturer, employing 13,300 colleagues across 16 manufacturing units and 17 distribution depots across the UK. We supply all the UK's food retailers with everything from sandwiches, soups, and sushi to cooking sauces, pickles, and ready meals, and in FY24 we generated revenues of £1.8bn. Our vast direct-to-store (DTS) distribution network, comprising 17 depots nationwide, enables us to make over 10,500 daily deliveries of our own chilled and frozen produce and that of third parties.

Why is this exciting for your career as a Senior Data Engineer?
The MBE Programme presents a huge opportunity for colleagues across the technology function to play a central role in the design, shape, delivery, and execution of an enterprise-wide digital transformation programme. The complexity of the initiative, within a FTSE 250 business, will allow for large-scale problem solving, group-wide impact assessment, and supporting the delivery of an enablement project to future-proof the business.

Why we embarked on Making Business Easier
Over time, processes have become increasingly complex, increasing both the risk and cost they pose whilst restricting our agility. At the same time, our customers and the market expect more from us than ever before. Making Business Easier forms a fundamental foundation for our commercial and operational excellence agendas, whilst supporting managing our cost base effectively in the future. The MBE Programme will streamline and simplify core processes, provide easier access to quality business data, and will invest in the right technology to enable these processes.

What you'll be doing:
As a Senior Data Engineer, you will play a key role in shaping and delivering enterprise-wide data solutions that translate complex business requirements into scalable, high-performance data platforms. In this role, you will help define and guide the structure of data systems, focusing on seamless integration, accessibility, and governance, while optimising data flows to support both analytics and operational needs. Collaborating closely with business stakeholders, data engineers, and analysts, you will ensure that data platforms are robust, efficient, and adaptable to evolving business priorities. You will also support the usage, alignment, and consistency of data models, and will therefore have a wide-ranging role across many business projects and deliverables.
  • Shape and implement data solutions that align with business objectives and leverage both cloud and on-premise technologies
  • Translate complex business needs into scalable, high-performing data solutions
  • Support the development and application of best practices in data governance, security, and system design
  • Collaborate closely with business stakeholders, product teams, and engineers to design and deliver effective, integrated data solutions
  • Optimise data flows and pipelines to enable a wide range of analytical and operational use cases
  • Promote data consistency across transactional and analytical systems through well-designed integration approaches
  • Contribute to the design and ongoing improvement of data platforms, including data lakes, data warehouses, and other distributed storage environments, focused on efficiency, scalability, and ease of maintenance
  • Mentor and support junior engineers and analysts in applying best practices in data engineering and solution design

What you'll need:
  • 5+ years of data engineering experience, with expertise in Azure data services and/or Microsoft Fabric
  • Strong expertise in designing scalable data platforms and managing cloud-based data ecosystems
  • Proven track record in data integration, ETL processes, and optimising large-scale data systems
  • Expertise in cloud-based data platforms (AWS, Azure, Google Cloud) and distributed storage solutions
  • Proficiency in Python, PySpark, SQL, NoSQL, and data processing frameworks (Spark, Databricks)
  • Expertise in ETL/ELT design and orchestration in Azure, as well as pipeline performance tuning and optimisation
  • Competent in integrating relational, NoSQL, and streaming data sources
  • Management of CI/CD pipelines and Git-based workflows
  • Good knowledge of data governance, privacy regulations, and security best practices
  • Experience with modern data architectures, including data lakes, data mesh, and event-driven data processing
  • Strong problem-solving and analytical skills to translate complex business needs into scalable data solutions
  • Excellent communication and stakeholder management to align business and technical goals
  • High attention to detail and commitment to data quality, security, and governance
  • Ability to mentor and guide teams, fostering a culture of best practices in data architecture
  • Power BI and DAX for data visualisation (desirable)
  • Knowledge of Azure Machine Learning and AI services (desirable)
  • Experience with streaming platforms like Event Hub or Kafka
  • Familiarity with cloud cost optimisation techniques (desirable)

What you'll get:
  • Competitive salary and job-related benefits
  • 25 days holiday allowance plus bank holidays
  • Car allowance
  • Annual target bonus
  • Pension up to 8% matched
  • PMI cover
  • Individual life insurance up to 4x salary
  • Company share save scheme
  • Greencore Qualifications
  • Exclusive Greencore employee discount platform
  • Access to a full Wellbeing Centre platform
03/10/2025
Full time
Senior Data Engineer
Greencore Scofton, Nottinghamshire
Why Greencore? We're a leading manufacturer of convenience food in the UK and our purpose is to make everyday taste better! We're a vibrant, fast-paced leading food manufacturer. Employing 13,300 colleagues across 16 manufacturing units and 17 distribution depots across the UK. We supply all the UK's food retailers with everything from Sandwiches, soups and sushi to cooking sauces, pickles and ready meals, and in FY24, we generated revenues of 1.8bn. Our vast direct-to-store (DTS) distribution network, comprising of 17 depots nationwide, enables us to make over 10,500 daily deliveries of our own chilled and frozen produce and that of third parties. Why is this exciting for your career as a Senior Data Engineer? The MBE Programme presents a huge opportunity for colleagues across the technology function to play a central role in the design, shape, delivery and execution of an enterprise wide digital transformation programme. The complexity of the initiative, within a FTSE 250 business, will allow for large-scale problem solving, group wide impact assessment and supporting the delivery of an enablement project to future proof the business. Why we embarked on Making Business Easier? Over time processes have become increasingly complex, increasing both the risk and cost they pose, whilst restricting our agility. At the same time, our customers and the market expect more from us than ever before. Making Business Easier forms a fundamental foundation for our commercial and operational excellence agendas, whilst supporting managing our cost base effectively in the future. The MBE Programme will streamline and simplify core processes, provide easier access to quality business data and will invest in the right technology to enable these processes. What you'll be doing: As a Senior Data Engineer, you will play a key role in shaping and delivering enterprise-wide data solutions that translate complex business requirements into scalable, high-performance data platforms. 
In this role, you will help define and guide the structure of data systems, focusing on seamless integration, accessibility, and governance, while optimising data flows to support both analytics and operational needs. Collaborating closely with business stakeholders, data engineers, and analysts, you will ensure that data platforms are robust, efficient, and adaptable to evolving business priorities. You will also support the usage, alignment, and consistency of data models, and will therefore have a wide-ranging role across many business projects and deliverables.

- Shape and implement data solutions that align with business objectives and leverage both cloud and on-premise technologies
- Translate complex business needs into scalable, high-performing data solutions
- Support the development and application of best practices in data governance, security, and system design
- Collaborate closely with business stakeholders, product teams, and engineers to design and deliver effective, integrated data solutions
- Optimise data flows and pipelines to enable a wide range of analytical and operational use cases
- Promote data consistency across transactional and analytical systems through well-designed integration approaches
- Contribute to the design and ongoing improvement of data platforms - including data lakes, data warehouses, and other distributed storage environments - focused on efficiency, scalability, and ease of maintenance
- Mentor and support junior engineers and analysts in applying best practices in data engineering and solution design

What you'll need:
- 5+ years of Data Engineering experience, with expertise in Azure data services and/or Microsoft Fabric
- Strong expertise in designing scalable data platforms and managing cloud-based data ecosystems
- Proven track record in data integration, ETL processes, and optimising large-scale data systems
- Expertise in cloud-based data platforms (AWS, Azure, Google Cloud) and distributed storage solutions
- Proficiency in Python, PySpark, SQL, NoSQL, and data processing frameworks (Spark, Databricks)
- Expertise in ETL/ELT design and orchestration in Azure, as well as pipeline performance tuning and optimisation
- Competent in integrating relational, NoSQL, and streaming data sources
- Management of CI/CD pipelines and Git-based workflows
- Good knowledge of data governance, privacy regulations, and security best practices
- Experience with modern data architectures, including data lakes, data mesh, and event-driven data processing
- Strong problem-solving and analytical skills to translate complex business needs into scalable data solutions
- Excellent communication and stakeholder management to align business and technical goals
- High attention to detail and commitment to data quality, security, and governance
- Ability to mentor and guide teams, fostering a culture of best practices in data architecture
- Power BI and DAX for data visualisation (desirable)
- Knowledge of Azure Machine Learning and AI services (desirable)
- Experience with streaming platforms like Event Hubs or Kafka
- Familiarity with cloud cost optimisation techniques (desirable)

What you'll get:
- Competitive salary and job-related benefits
- 25 days holiday allowance plus bank holidays
- Car allowance
- Annual target bonus
- Pension matched up to 8%
- PMI cover: individual
- Life insurance up to 4x salary
- Company share save scheme
- Greencore Qualifications
- Exclusive Greencore employee discount platform
- Access to a full Wellbeing Centre platform
02/10/2025
Full time
Guidant Global
IT Data and Analytics Senior Development Operations Engineer
Guidant Global Reading, Oxfordshire
Base Location: Reading / Havant / Perth
Salary: £600 per day
Working Pattern: 40 hours per week / Full time

Embark on a transformative career journey with SSE, an energy company where innovation meets impact in the heart of the IT sector. As a pivotal player in our forward-thinking team, you'll harness cutting-edge technology to drive change and propel the UK towards its ambitious net-zero targets. Your expertise will not only shape the future of energy but also help carve out a sustainable world for generations to come. Join us and be at the forefront of the green revolution, where every line of code contributes to a cleaner, brighter future.

Key Responsibilities:
- Provide technical leadership and oversight to the group Data & Analytics platform team.
- Ensure the reliability, security and scalability of analytics platform services.
- Deliver full automation of the deployment of Data & Analytics platform services via infrastructure as code.
- Help to set development standards, configure operational support processes and provide technical assurance.
- Provide support to Data & Analytics platform users and internal development teams interacting with the Data & Analytics platform services.

What do you need?
- Extensive experience of deploying Azure (and ideally AWS) cloud resources, and full fluency in agile and DevOps development methodology.
- Extensive experience in using Terraform to deploy cloud resources as infrastructure as code.
- Excellent understanding of CI/CD principles and experience with related tools (e.g. Azure DevOps, GitHub Actions).
- Strong knowledge of scripting languages such as PowerShell, Python and the Azure CLI, and proven experience with automation runbooks, VM maintenance scripts and SQL.
- Strong understanding of cloud access control and governance, such as RBAC and IAM.
- Strong knowledge of cloud networking (Azure), such as private endpoints, firewalls, NSGs, NAT gateways and route tables.
- Good knowledge of Microsoft Entra ID, including managing app registrations, enterprise apps, AD groups, managed identities and Privileged Identity Management.
- Proven experience with IaaS such as virtual machines, both Windows and Linux, plus familiarity with server patching and maintenance.
- Strong understanding of security best practices within Azure and ideally AWS.
- Experience of configuring cloud data services (preferably Databricks) in Azure and ideally AWS.
- Excellent communication and collaboration skills, with the ability to work across multiple technical and non-technical teams.

What happens now? After submitting your application for the Data and Analytics Senior Development Operations Engineer role, we understand you're eager to hear back. We value your time and interest, and if your application is successful, you will be contacted directly by the team within 2 working days. We appreciate your patience and look forward to the possibility of welcoming you aboard.
01/10/2025
Contractor
Noir
Lead Data Engineer Databricks - Leeds
Noir Leeds, Yorkshire
Lead Data Engineer (Databricks) - Leeds (Lead Data Engineer, Team Lead, Technical Lead, Senior Data Engineer, Data Engineer, Python, PySpark, SQL, Big Data, Databricks, R, Machine Learning, AI, Agile, Scrum, TDD, BDD, CI/CD, SOLID principles, GitHub, Azure DevOps, Jenkins, Terraform, AWS CDK, AWS CloudFormation, Azure, Lead Data Engineer, Team Lead, Technical Lead, Senior Data Engineer, Data Engineer) Our client is a global innovator and world leader with one of the most recognisable names in technology. They are looking for a Lead Data Engineer with significant Databricks experience as well as leadership responsibility, to run an exceptional Agile engineering team and provide technical leadership through coaching and mentorship. We are seeking a Lead Data Engineer capable of leading client delivery to the highest standards. This will include working with architects, creating automated tests, instilling a culture of continuous improvement and setting standards for the team. You will be responsible for building a greenfield modern data platform using cutting-edge technologies, architecting big data solutions and developing complex enterprise data ETL and ML pipelines and projections. The successful candidate will have strong Python, PySpark and SQL experience, possess a clear understanding of Databricks, as well as a passion for Data Science (R, Machine Learning and AI). Database experience with SQL and NoSQL (Aurora, MS SQL Server, MySQL) is expected, as well as significant Agile and Scrum exposure along with SOLID principles. Continuous Integration tools, infrastructure as code and strong cloud platform knowledge, ideally with AWS, are also key. We are keen to hear from talented Lead Data Engineer candidates from all backgrounds. This is a truly amazing opportunity to work for a prestigious brand that will do wonders for your career. They invest heavily in training and career development, with unlimited career progression for top performers.
Location: Leeds Salary: £55k - £70k + Pension + Benefits To apply for this position please send your CV to Nathan Warner at Noir Consulting. (Lead Data Engineer, Team Lead, Technical Lead, Senior Data Engineer, Data Engineer, Python, PySpark, SQL, Big Data, Databricks, R, Machine Learning, AI, Agile, Scrum, TDD, BDD, CI/CD, SOLID principles, GitHub, Azure DevOps, Jenkins, Terraform, AWS CDK, AWS CloudFormation, Azure, Lead Data Engineer, Team Lead, Technical Lead, Senior Data Engineer, Data Engineer) NOIRUKTECHREC NOIRUKREC
01/10/2025
Full time
BBC
Lead MLOps Engineer
BBC
Job Introduction
BBC R&D has recently established an Automation Applied Research Area focussed on the use of Machine Learning across the BBC. Automation works closely with other BBC R&D Applied Research Areas, BBC Product and Technology Groups and senior business stakeholders across the BBC to accelerate Machine Learning based innovation. Reporting to the Head of Automation, this role will lead a team of experts exploring the ML platforms, tools, performance, and sustainability that will underpin the BBC's approach to Machine Learning innovation. It will ensure that best practice and correct technology choices flow downstream into R&D ML applications, as well as supporting the wider BBC in making the right strategic decisions for its future ML technology. BBC R&D has five applied research areas, focussed on Audiences, Automation, Distribution, Infrastructure and Production, looking to solve some of the most interesting challenges in Media and Broadcasting; as well as our Commercial, Partnerships & Engagement team, who ensure we're collaborating with the right external partners and optimising commercial returns through the exploitation of our Intellectual Property and grant funding. Our work supports the BBC's current ambition as well as informing future strategy. If you're excited by the prospect of working in an innovative environment with smart and supportive colleagues, then BBC R&D is the place for you.

Role Responsibility
This is a hands-on role. Your key responsibilities will be:
- Build and lead a team of ML engineers to develop infrastructure to manage the ML lifecycle through experimentation, deployment, and testing.
- Own the Automation MLOps strategy, roadmap, and backlog.
- Provide leadership and guidance on the delivery of ML models from prototypes to production; mentor and coach team members on ML engineering best practices; work alongside researchers to enable the BBC to benefit more rapidly from fundamental ML research.
- Contribute to the design of ML systems and infrastructure to shape how ML is used across the BBC.
- Develop relationships with pan-BBC and external contributors and stakeholders. You will need to bring long-term ambitions to life to secure the required support and buy-in for tangible and intangible benefits and outcomes.
- Focus on ensuring our ML technology delivers on performance, cost and sustainability goals and supports the BBC's responsible and ethical ML objectives.
- Work with our Technology Strategy and Governance team to identify and communicate the strategic investment decisions required to mature the BBC's ML technology in line with business needs.

Are you the right candidate?
- Solid understanding of machine learning concepts and algorithms
- Experience deploying machine learning solutions
- Expert knowledge of Python programming and machine learning libraries (scikit-learn, TensorFlow, Keras, PyTorch, MXNet, etc.)
- Experience implementing ML automation and MLOps (scalable deployment practices aimed at deploying and maintaining machine learning models in production reliably and efficiently) and related tools (e.g. MLflow, Kubeflow, Airflow, SageMaker)
- Experience working in accordance with DevOps principles, and with industry deployment best practices using CI/CD tools and infrastructure as code (e.g. Docker, Kubernetes, Terraform)
- Experience with at least one cloud platform (e.g. AWS, GCP, Azure) and associated machine learning services, e.g. Amazon SageMaker, Azure ML, Databricks.

Package Description
Band: E
Contract type: Permanent - Full time
Location: UK wide
We're happy to discuss flexible working. Please indicate your choice under the flexible working question in the application. There is no obligation to raise this at the application stage, but if you wish to do so, you are welcome to. Flexible working will be part of the discussion at offer stage.
Excellent career progression - the BBC offers great opportunities for employees to seek new challenges and work in different areas of the organisation. Unrivalled training and development opportunities - our in-house Academy hosts a wide range of internal and external courses and certifications. Benefits - we offer a competitive salary package, a flexible 35-hour working week for work-life balance, 26 days of leave (1 of which is a corporation day) with the option to buy an extra 5 days, a defined pension scheme, and discounted dental, health care, gym membership and much more.

The situation regarding the coronavirus outbreak is developing quickly and the BBC is keen to continue to ensure the safety and wellbeing of people across the BBC, while continuing to protect our services. To reduce the risk, access to BBC buildings is limited to those essential to our broadcast output. From Wednesday 18th March until further notice, all assessments and interviews will be conducted remotely.

About the BBC
We don't focus simply on what we do - we also care how we do it. Our values and the way we behave are important to us. Please make sure you've read about our values and behaviours in the document attached below. Diversity matters at the BBC. We have a working environment where we value and respect every individual's unique contribution, enabling all of our employees to thrive and achieve their full potential.
We want to attract the broadest range of talented people to be part of the BBC - whether that's to contribute to our programming or our wide range of non-production roles. The more diverse our workforce, the better able we are to respond to and reflect our audiences in all their diversity. We are committed to equality of opportunity and welcome applications from individuals, regardless of age, gender, ethnicity, disability, sexual orientation, gender identity, socio-economic background, religion and/or belief. We will consider flexible working requests for all roles, unless operational requirements prevent otherwise. To find out more about Diversity and Inclusion at the BBC, please click here
23/09/2022
Full time
Job Introduction BBC R&D has recently established an Automation Applied Research Area focussed on the use of Machine Learning across the BBC. Automation works closely with other BBC R&D Applied Research Areas, BBC Product and Technology Groups and senior business stakeholders across the BBC to accelerate Machine Learning based innovation. Reporting to the Head of Automation, this role will lead a team of experts exploring the ML platforms, tools, performance, and sustainability that will underpin the BBC's approach to Machine Learning innovation. It will ensure that best practice and correct technology choices are downstreamed into R&D ML applications as well as supporting the wider BBC in making the right strategic decisions for its future ML technology. BBC R&D has five applied research areas focussed on Audiences, Automation, Distribution, Infrastructure and Production who are looking to solve some of the most interesting challenges in Media and Broadcasting; as well as our Commercial, Partnerships & Engagement team who ensure we're collaborating with the right external partners and optimising commercial returns through the exploitation of our Intellectual Property and grant funding. Our work supports the BBC's current ambition as well as informing future strategy. If you're excited by the prospect of working in an innovative environment with smart and supportive colleagues, then BBC R&D is the place for you. Role Responsibility This is a hands-on role. Your key responsibilities will be: Build and lead a team of ML engineers to develop an infrastructure to manage ML lifecycle through experimentation, deployment, and testing. Own the Automation MLOps strategy, roadmap, and backlog. Provide leadership and guidance on the delivery of ML models from prototypes to production, mentor and coach team members on ML engineering best practises; work alongside researchers to enable BBC to benefit more rapidly from fundamental ML research. 
Contribute to the design of ML systems and infrastructure to shape how ML is used across the BBC. Develop relationships with pan-BBC and external contributors and stakeholders. You will need to bring to life long-term ambitions to secure required support and buy-in for tangible and intangible benefits and outcomes. Focus on ensuring our ML technology delivers on performance, cost and sustainability goals and is supportive of the BBC's responsible and ethical ML objectives. Work with our Technology Strategy and Governance team to identify and communicate strategic investment decisions required to mature the BBC's ML technology in line with business needs. Are you the right candidate? Solid understanding of machine learning concepts and algorithms Experience deploying machine learning solutions Expert knowledge of Python programming and machine learning libraries (Scikit-learn, TensorFlow, Keras, PyTorch, MxNet, etc.) Experience implementing ML automation, MLOps (scalable deployment practices aimed to deploy and maintain machine learning models in production reliably and efficiently) and related tools (e.g., MLflow, Kubeflow, Airflow, Sagemaker) Experience working in accordance with DevOps principles, and with industry deployment best practices using CI/CD tools and infrastructure as code (e.g., Docker, Kubernetes, Terraform) Experience in at least one cloud platform (e.g., AWS, GCP, Azure) and associated machine learning services, e.g., Amazon SageMaker, Azure ML, Databricks. Package Description Band: E Contract type: Permanent - Full time Location: UK wide We're happy to discuss flexible working. Please indicate your choice under the flexible working question in the application . There is no obligation to raise this at the application stage but if you wish to do so, you are welcome to. Flexible working will be part of the discussion at offer stage. 
Excellent career progression - the BBC offers great opportunities for employees to seek new challenges and work in different areas of the organisation. Unrivalled training and development opportunities - our in-house Academy hosts a wide range of internal and external courses and certification. Benefits - We offer a competitive salary package, a flexible 35-hour working week for work-life balance and 26 days (1 of which is a corporation day) with the option to buy an extra 5 days, a defined pension scheme and discounted dental, health care, gym and much more. The situation regarding the coronavirus outbreak is developing quickly and the BBC is keen to continue to ensure the safety and wellbeing of people across the BBC, while continuing to protect our services. To reduce the risk access to BBC buildings is limited to those essential to our broadcast output. From Wednesday 18 th March until further notice all assessments and interviews will be conducted remotely. For more information go to Mae'r sefyllfa gyda'r coronafeirws yn datblygu'n gyflym, ac mae'r BBC yn awyddus i barhau i sicrhau diogelwch a lles pobl ar draws y BBC, gan barhau i warchod ein gwasanaethau hefyd. I leihau'r risg, dim ond y bobl sy'n hanfodol i'n hallbwn darlledu fydd yn cael mynediad i adeiladau'r BBC. O ddydd Mercher 18 fed Mawrth ymlaen, bydd pob asesiad a chyfweliad yn cael ei gynnal o bell, nes rhoddir gwybod yn wahanol. I gael mwy o wybodaeth, ewch i About the BBC We don't focus simply on what we do - we also care how we do it. Our values and the way we behave are important to us. Please make sure you've read about our values and behaviours in the document attached below. Diversity matters at the BBC. We have a working environment where we value and respect every individual's unique contribution, enabling all of our employees to thrive and achieve their full potential. 
We want to attract the broadest range of talented people to be part of the BBC - whether that's to contribute to our programming or our wide range of non-production roles. The more diverse our workforce, the better able we are to respond to and reflect our audiences in all their diversity. We are committed to equality of opportunity and welcome applications from individuals, regardless of age, gender, ethnicity, disability, sexual orientation, gender identity, socio-economic background, religion and/or belief. We will consider flexible working requests for all roles, unless operational requirements prevent otherwise. To find out more about Diversity and Inclusion at the BBC, please click here
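The MLOps requirement above centres on tracking and deploying models reproducibly. As an illustration only (not the BBC's stack), here is a stdlib-only Python sketch of the core idea behind tools like MLflow: every training run logs its parameters, metrics and serialised model to a registry folder so it can be audited and redeployed later. The data, file names and registry layout are invented for the example.

```python
import json
import pickle
import statistics
import tempfile
from pathlib import Path

# Toy training data (invented): daily temperature vs. energy usage.
temps = [2.0, 5.0, 9.0, 14.0, 18.0, 22.0]
usage = [38.0, 33.0, 27.0, 20.0, 15.0, 10.0]

# Fit a least-squares line by hand (a real run would train a proper model).
mean_t, mean_u = statistics.fmean(temps), statistics.fmean(usage)
slope = (sum((t - mean_t) * (u - mean_u) for t, u in zip(temps, usage))
         / sum((t - mean_t) ** 2 for t in temps))
intercept = mean_u - slope * mean_t

# "Log" the run: params, metrics and the pickled model go into a run folder,
# mirroring what an MLflow tracking server records for you.
run_dir = Path(tempfile.mkdtemp()) / "registry" / "run-0001"
run_dir.mkdir(parents=True)
mae = statistics.fmean(abs(slope * t + intercept - u) for t, u in zip(temps, usage))
(run_dir / "params.json").write_text(json.dumps({"slope": slope, "intercept": intercept}))
(run_dir / "metrics.json").write_text(json.dumps({"mae": mae}))
(run_dir / "model.pkl").write_bytes(pickle.dumps((slope, intercept)))

# "Deploy": any later process can reload the registered artefact and predict.
loaded_slope, loaded_intercept = pickle.loads((run_dir / "model.pkl").read_bytes())
prediction = loaded_slope * 12.0 + loaded_intercept
```

In production this bookkeeping is what MLflow's tracking API provides, with orchestrators such as Kubeflow or Airflow scheduling the runs and promoting the best artefact.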
Senior DBA / Cloud Engineer
Ecotricity Group Limited
About The Role Ecotricity, the UK's first true green energy provider, has a strong internal team supporting and developing solutions across multiple mission-critical platforms including Databricks on AWS, SQL, Power BI and other cloud-centric data solutions. In this hands-on technical role as a Lead DBA / Cloud Engineer, you will contribute to the delivery of a central data platform, helping Ecotricity become more efficient by leveraging our data. The Ecotricity Technology department is a small, friendly team with a strong focus on getting results, with everyone committed to delivering both individually and as part of the group/project. We're proud to be an ethical company, and this naturally attracts ethical people, making for a good, safe working environment and a team that works and wins together. We also have a competitive benefits package and choose to invest in our people whenever we can. Responsibilities Managing, monitoring and maintaining on-premise SQL servers Review the existing infrastructure landscape and develop a future strategy and vision Consolidate and migrate existing on-prem servers to the cloud Build strong relationships with key stakeholders outside the department Be responsible for data quality and ensuring problems are resolved swiftly Identify and seek out technical debt, aiming to reduce this at each opportunity Contribute to organisational awareness of technical best practice Overall responsibility for ensuring that faults are resolved swiftly, and background processes are robust and actively monitored Seek day-to-day opportunities to upskill and cross-train with your peers About You You will have previous demonstrable experience in a SQL DBA role. You will have expert knowledge of MS SQL Server management (both on-prem and cloud) and be comfortable developing T-SQL. 
Your past projects include migrating on-prem servers to the cloud, and you would like to develop yourself to support cloud infrastructure. You will have a strong understanding of SQL internals, schemas, permissions, indexing and partitioning. You will have experience in performance tuning and optimisation techniques using native and 3rd-party tools. Knowledge of the energy industry would be useful, but not necessary. We will actively support you, but as a potentially remote role you should be self-motivated, delivery-driven, and not need to be led. You should strive for best practice and technical excellence and be a person who actively looks for continual improvement opportunities. Knowledge and skills Highly experienced as a DBA Microsoft SQL Server T-SQL BitBucket / GitHub Azure DevOps PowerShell Advantageous AWS Databricks Atlassian toolset in Jira, Confluence About Us What's in it for you... Healthcare plan, life assurance and generous pension contribution Volunteering day Hybrid working Various company discounts (including shops, days out and events) Holiday of 25 days (plus bank holidays) & ability to buy/sell days Cycle to work scheme, car pooling and onsite parking available As a valued member of the team you will be supporting the Group Environmental policy and its associated targets to make the Green Britain Group net carbon neutral by 2025 Flexibility statement The fast-moving nature of the company's business means that from time to time you may be asked to perform duties or tasks outside of your original job description on an ad hoc basis. This allows the company to use its people in the best possible way at all times and helps the employees to make their contribution in a changing environment. Ecotricity is Britain's greenest energy company. When we started back in 1995, we were the first company in the world to provide a new kind of electricity - the green kind. 
Our mission was, and remains, to change the way energy is made and used in Britain - by replacing fossil fuels with clean, renewable energy. We don't just supply green energy, we use the money from our customers' bills to make it ourselves too - we build windmills and sun parks in Britain. We call this 'bills into mills'. In 2021, we started work on building two new solar parks, and now, in 2022, we're bringing geothermal energy to our customers' fuel mix, a first in the UK. We're also developing green gas mills which will generate 100% green gas from a source that we will never run out of - grass. We don't just focus on energy though - we built Electric Highways, Britain's leading network of electric vehicle charging points, we helped Forest Green Rovers become the greenest football club in the world, and, in partnership with the RSPB, we launched Britain's greenest mobile phone service, Ecotalk + RSPB. Ecotricity is an equal opportunities employer and is committed to providing equality for all. Job Type: Permanent Benefits: Additional leave Casual dress Company events Cycle to work scheme Employee discount Life insurance On-site parking Referral programme Sick pay Store discounts Work from home Schedule: Monday to Friday Reference ID: 1418
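The indexing and performance-tuning skills this role calls for can be shown with a small, hedged sketch: SQLite (from the Python stdlib) stands in for SQL Server, and the `readings` table is invented for illustration. The point is how adding an index changes the query plan from a full table scan to an index search.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE readings (id INTEGER PRIMARY KEY, meter TEXT, kwh REAL)")
cur.executemany(
    "INSERT INTO readings (meter, kwh) VALUES (?, ?)",
    [(f"M{i % 100:03d}", i * 0.5) for i in range(1000)],
)

query = "SELECT * FROM readings WHERE meter = 'M007'"

# Without an index the planner has no choice but to scan every row.
plan_before = cur.execute("EXPLAIN QUERY PLAN " + query).fetchall()[0][-1]

# An index on the filtered column lets the planner seek directly.
cur.execute("CREATE INDEX idx_readings_meter ON readings (meter)")
plan_after = cur.execute("EXPLAIN QUERY PLAN " + query).fetchall()[0][-1]

print(plan_before)  # a SCAN of the table
print(plan_after)   # a SEARCH using idx_readings_meter
```

The same discipline applies on SQL Server via execution plans and `CREATE INDEX`, just with richer native tooling.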
22/09/2022
Full time
Omni RMS
Senior Data Scientist
Omni RMS
Ofcom is the regulator for the communications services that we use and rely on each day. We make sure people get the best from their broadband, home phone and mobile services, as well as keeping an eye on TV and radio. We also oversee the universal postal service, look after the airwaves used by wireless devices, and help to make sure people don't get scammed and are protected from bad practices. We have recently taken on the regulation of video-sharing platforms, and we are preparing for a role in protecting people from online harms. Protecting consumers is at the heart of what we do. Our culture is clear - we live by our values: Empowerment; Excellence; Collaboration; Agility and Respect. These define how we work to deliver our purpose, now and in the future. The behaviours which support these values set the path for a fully inclusive and innovative culture at Ofcom. We focus not only on what we do, but how we do it. We pride ourselves on being an organisation of people who genuinely care about helping others. The Data Innovation Hub works with colleagues across the organisation to improve Ofcom's capability to process, analyse and extract insights from data. For example, we crunch terabytes of data to give us an unparalleled understanding of the coverage and performance of the UK's broadband and mobile networks. And we're exploring how artificial intelligence and machine learning can help us extract more insights from complex data sets or help us do things more efficiently. Purpose of the role This role will support the delivery of our data strategy and growing data agenda by bringing an open mind and strong intellectual curiosity to develop our data analysis capability. You will identify and solve complex business problems with data science, ensuring that we continue to innovate and leverage our data. 
We are looking for talent to join a central team that can work with all areas of regulatory work, assisting and working alongside colleagues across the organisation, including those working on the new online safety regime - for example, by exploring the application of deep learning for anomaly detection, combining social media analysis and natural language processing to understand issues affecting consumers, or developing new ways to visualise and interact with our geospatial data. Expectations Work collaboratively with colleagues in the Data Innovation Hub, policy and operations teams to understand where data science can add value and deliver innovative projects Work with data engineers to source, access, manipulate and engineer data across a range of sources and data formats Build credible statistical models from a range of data sources and use best coding practices to generate reproducible work in an agile way, taking projects through to production where appropriate Explore and visualise data to present a story in a meaningful way, and communicate new insights to a range of technical and non-technical audiences Maintain an awareness of, and utilise, an evolving range of data analysis tools and techniques, including open source, some of which must be learnt quickly, as and when required Promote data science skills and inspire best practice in other teams, building an understanding of the variety of data science methods and how they can be effectively applied to solve problems Ensure compliance with our relevant data and information security principles Support colleagues in engagement with external stakeholders on a range of data and data science related topics. Essential Skills & Knowledge You will need to demonstrate knowledge or experience of: Translating complex business problems into analytical projects, then leading or delivering impactful data or analysis with tangible outcomes. 
A range of data and analytic tools (such as Python, R, Power BI, Azure Databricks, or equivalent), and cloud computing environments. Machine learning, statistics, or other data analysis tools and techniques. In particular, but not limited to, we would welcome applications from candidates with experience in the following areas of data science: Natural Language Processing Geospatial Data Science Graphs / Network Analytics Neural Networks and Deep Learning Additional experience: Applying scientific methods through experimental design, exploratory data analysis and hypothesis testing to reach robust conclusions, including reviewing the work of others Acting as a thought leader within the data science community, including through management and development of junior staff and others' projects Exploring and advocating ways of utilising new data science tools and techniques to tackle business and organisational challenges, drawing on the application of, and innovation in, communication services industries, academia, and other sectors Strong interpersonal skills and evidence of ability to interact effectively with a range of stakeholders, both internally and externally, to communicate technical concepts across organisational and technical boundaries Working independently, demonstrating flexibility and adaptability, and supporting a culture of collaboration and trust Previous experience in one of our regulatory areas, particularly in online, would be desirable, but is not essential.
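As a flavour of the "hypothesis testing to reach robust conclusions" requirement above, here is a hedged, stdlib-only Python sketch of a permutation test. The broadband-speed samples are invented for illustration, not Ofcom data: we ask whether the observed difference in mean download speed between two panels could plausibly arise by chance.

```python
import random
import statistics

random.seed(42)

# Invented example: download speeds (Mbps) measured on two broadband panels.
panel_a = [48.2, 51.0, 47.5, 50.3, 49.8, 52.1, 46.9, 50.7]
panel_b = [44.1, 45.8, 43.9, 46.5, 44.7, 45.2, 43.3, 46.0]

observed = statistics.mean(panel_a) - statistics.mean(panel_b)

# Permutation test: repeatedly shuffle the pooled sample and recompute the
# difference in means, building an empirical null distribution.
pooled = panel_a + panel_b
n_iter = 10_000
extreme = 0
for _ in range(n_iter):
    random.shuffle(pooled)
    diff = statistics.mean(pooled[:8]) - statistics.mean(pooled[8:])
    if abs(diff) >= abs(observed):
        extreme += 1

p_value = extreme / n_iter
print(f"observed difference: {observed:.2f} Mbps, p = {p_value:.4f}")
```

A small p-value here says the panels genuinely differ; the same reasoning scales up with SciPy or R, but needs no library at all to state correctly.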
09/11/2021
Full time
The Access group
Business Analyst Business Excellence
The Access group Loughborough, Leicestershire
** Please note this role can be homebased anywhere in the UK ** What are we all about? At Access we love software and how technology never stays the same. It is this obsession that drives us to work closely across sectors to understand the business needs of our customers - from professional services to manufacturing to not for profits and more. We are passionate about helping our customers stay one step ahead of the challenges facing their industry and business. That is why over 1 million users and over 35,000 organisations rely on Access software to help their organisation thrive. About you: This is a process and data-oriented role, and you will be a self-motivated, target driven individual with an eye for detail, strong numerical and technical skills and a keen interest in process engineering, data analysis and reporting. You will enjoy being part of a team that drives Business Excellence and delivers transformational programmes of work, accelerating business performance. Day-to-day, you will: • Be responsible for defining and documenting new and existing functional processes • Work closely with a variety of internal teams to understand the key KPIs used to drive our business and create a visualisation of this data in a manner that can be easily interpreted by Senior Leadership. • Working with key project stakeholders to communicate and formulate the business vision for transformational projects, the scope of the project, and to map out initial requirements. • Strive to ensure that data collection processes are automated and documented where possible to maximise efficiency. • Work with large data sets, using Excel and other data visualisation tools to deliver on programme and divisional objectives • Work closely with Programme and Project Managers in the Business Excellence team to deliver analysis required to drive programme outcomes. 
• Work with other Data Analysts on the team to share knowledge, expertise, and best practice • Work with managers to ensure that the effort and timeline to deliver on a particular piece of analysis is understood • Be proactive and recommend process improvements or new tools with the overall goal of helping managers to gain greater business insight through the exposure of data or metrics. As a well-rounded Business Analyst, your Skills and Experiences likely include: • Previous experience working in a similar role within a Business Excellence team • High level of communication and interpersonal skills, with the ability to work with a diverse range of internal stakeholders • Strong working knowledge of product lifecycles and sales processes • Highly analytical and data driven • Excellent Excel skills required • Excellent PowerPoint skills required • Experience with at least one data analytics tool such as Power BI, Tableau, Databricks or Alteryx is essential • Knowledge of Salesforce is an advantage • Ability to influence individuals/teams within Access to ensure that the goals of the team are delivered • Ability to prioritise, multi-task, and perform effectively under pressure • Ability to work on own initiative and drive tasks forward is essential • Strong team player who can contribute to the development of our strategy and flex as our team requirements change What does Access offer you? We are a growing software company, and we take the development of our people seriously. We will work with you to carve out your success plan and opportunity to accelerate your career and make a real difference. In addition to our standard benefits of 25 days holiday, a matched contributory pension and healthcare you will get: • A Competitive Salary • Giving Back/Charity days • Quarterly Socials • 6 weeks Sabbaticals (after 6 years of service) • The Access Group Big Break: our all-expenses-paid holiday to Spain
04/11/2021
Full time
Jefferson Frank
Senior Data Engineer AWS or Azure
Jefferson Frank
Senior Data Engineer | Remote, Full-Time | UK Based Candidates | Consultancy Our client is a UK-based Strategy, Transformation, Data and Technology consulting firm with a 30+ person engineering team who solve complex problems through advanced technical solutions. Working week - a mixture of client work (4 days) and internal products/training (on a Friday). The internal products/training day is a chance for you to work on your own personal/professional development, which may be exposure to new areas like ML, or participation in approved paid certifications related to Data or Cloud (e.g. AWS Solutions Architect or Databricks). The projects vary across a range of industries, but rest assured, you will never be expected to travel if on a remote contract (full-time contract), apart from the odd occasion to meet the client. They are very flexible in matching you onto opportunities within your local area as well, to keep commutes minimal without the need for a hotel or time away from your family. In terms of what you will be getting involved with, again, this is very much dependent on where your interests lie... be that something greenfield within financial services or elsewhere under a different industry/domain. In the past, they have created entire banks from scratch! So there is plenty of exciting work to get behind. Data Engineering - they are currently looking for Senior and Lead Data Engineers to come into the business and hit the ground running. As they are a smaller-sized company, your contributions will not go unnoticed, and there are clearly defined progression maps to give an understanding of what objectives need to be hit in order to gain promotion and elevate your career. You will be technically curious, innovative, and able to communicate effectively with the wider team and stakeholders. 
Requirements: Python or Scala programming Continuous integration, continuous deployment (CI/CD) Cloud technologies: AWS or Azure Pandas and ETL Terraform or CloudFormation Experience with Big Data (Spark etc.) Experience with Analytics and Visualisation SQL Databases Benefits: Full-time position DOE - from £60k to £80k Company bonus Remote working and flexible working Healthcare and medical benefits Company pension scheme benefits Well-being benefits and perks 25-30 days holiday Paid volunteering days For more information, please contact Alex from Jefferson Frank on or Jefferson Frank are the AWS recruiter of choice and an AWS Advanced Technology Partner helping organisations across the globe with their cloud requirements.
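To make the "Pandas and ETL" requirement concrete, here is a minimal extract-transform-load sketch in stdlib Python (in practice Pandas or Spark would do the heavy lifting); the CSV content and column names are invented for illustration:

```python
import csv
import io

# Hypothetical raw extract: meter readings with mixed-case IDs and blanks.
raw = """meter_id,kwh
m001,12.5
M002,
m001,7.25
M003,3.0
"""

# Extract: parse the raw feed into records.
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: normalise IDs, drop blank readings, aggregate per meter.
totals: dict[str, float] = {}
for row in rows:
    if not row["kwh"]:
        continue  # discard records with missing readings
    meter = row["meter_id"].upper()
    totals[meter] = totals.get(meter, 0.0) + float(row["kwh"])

# Load: write the cleaned aggregate back out as CSV.
out = io.StringIO()
writer = csv.writer(out)
writer.writerow(["meter_id", "total_kwh"])
for meter, total in sorted(totals.items()):
    writer.writerow([meter, total])

print(out.getvalue())
```

The same extract/transform/load shape carries over directly: `pd.read_csv`, a `groupby`/`sum`, and a write to a warehouse table, with CI/CD and Terraform wrapping the pipeline around it.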
10/09/2021
Full time
Senior Data Engineer | Remote, Full-Time | UK Based Candidates | Consultancy Our Client are a UK based Strategy, Transformation, Data and Technology Consulting firm with a 30+ person Engineering team who solve complex problems through advanced technical solutions. Working week - a mixture between Client work (4 days) and internal products/training (on a Friday) - the internal products / training is a chance for you to work on your own personal/professional development which may be exposure to new areas like ML, or participating in the approved paid certifications related to Data or Cloud (e.g. AWS Solutions Architect or Databricks, as an example) The projects vary across a range of industries, but rest assured, you will never be expected to travel if on a remote contract (full-time contract) apart from the odd occasion to meet the Client. They are very flexible in matching you onto opportunities within your local area as well to keep the commutes minimal without the need for a hotel or time away from your family. In terms of what you will be getting involved with, again, very much dependent on where your interests lay... be that something greenfield within the financial services or elsewhere under a different industry/domain. In the past, they have created entire banks from scratch!! So plenty of exciting work to get behind. Data Engineering - they are currently looking for Senior and Lead Data Engineers to come into the business and hit the ground running. As they are a smaller sized company, your contributions will not go unnoticed and there are clearly defined progression maps to have an understanding of what objectives need to be hit in order to gain promotion and elevate within your career. You will be technically curious, innovative, and able to work communicate effectively with the wider team and stakeholders. 
Ordnance Survey
Senior Engineer
Ordnance Survey Southampton, UK
We have an exciting opportunity available for a Senior Engineer to join our team. This role will be based at our state-of-the-art headquarters in Southampton; however, all our people, where they can, are currently working from home, and we recognise the future need to be flexible. As such, we are open to applications that can deliver work remotely where appropriate. In return, you will receive a competitive salary of £40,749 - £45,543 per annum. Ordnance Survey (OS) provides national mapping services for Great Britain and is a world-leading provider of geospatial solutions. For over 225 years, OS location data and expertise has helped governments make smarter decisions, businesses gain valuable data insight, and everyone experience the world outside. We offer fantastic benefits in return for joining us as our Senior Engineer: - Competitive salary plus performance-related bonus - Competitive pension - 37-hour working week (with flexible working options) - 25 days annual leave (30 days after five years), plus bank holidays and an extra 3 over Christmas - A suite of excellent additional benefits About the role: We’re recruiting for a highly motivated technical Senior Engineer to join Ordnance Survey’s Data Engineering team in developing next-generation Microsoft Azure-based systems. Using C# and Python to work with geospatial big data and machine learning technologies (in a forward-thinking agile environment), this role will challenge you to understand requirements and develop cutting-edge solutions for our customers. Your team will look to you for technical coaching and mentoring, as you work together to develop and deliver systems that are functional, performant, scalable, tested, secure, maintainable, and supportable.
What we’re looking for in our Senior Engineer: - Microsoft Azure Cloud Services (or similar) - C# or Python, with a track record in developing and delivering production-ready systems - Providing technical coaching and mentoring - Influencing key business and technical stakeholders - Advocating software engineering industry best practice (e.g. cloud design patterns, TDD, SOLID) - Line management (desirable) - Technologies such as Databricks, Spark, Elasticsearch, TensorFlow, Keras (desirable) Our growing Technology and Design team plays a key role in ensuring OS is at the cutting edge of geospatial capability and is looking for people to join them. Its mission is to work across the business to provide customer-centric design and technology services. If you would like to be part of this, please click apply now to be considered as our Senior Engineer – we’d love to hear from you! Closing date: Sunday 4th April 2021
12/03/2021
Full time
Ordnance Survey
Senior Data Engineer
Ordnance Survey Southampton, UK
We have an exciting position available for a Senior Data Engineer to join our team based at our HQ in Southampton; however, all our people, where they can, are currently working from home, and we recognise the future need to be flexible. As such, we are open to applications that can deliver work remotely where appropriate. You will join us on a full-time, permanent basis and in return, you will receive a competitive salary of £53,754 - £63,240 per annum. Ordnance Survey (OS) provides national mapping services for Great Britain and is a world-leading provider of geospatial solutions. For over 225 years, OS location data and expertise has helped governments make smarter decisions, businesses gain valuable data insight, and everyone experience the world outside. We offer fantastic benefits in return for joining us as our Senior Data Engineer: - Competitive salary plus performance-related bonus - Competitive pension - 37-hour working week (with flexible working options) - 25 days annual leave (30 days after five years), bank holidays, and an extra 3 over Christmas - Plus, a suite of excellent additional benefits About the role: Joining our Data Engineering team, our Senior Data Engineer will provide leadership and expertise in developing new techniques, tools and pipelines. You will transform and manipulate geospatial data into forms suitable for visualisation, analysis and further development with your Agile team. Providing insight and experience to inform data-driven decision-making, your focus will be on building, maintaining and troubleshooting data processing systems to meet business requirements, with an emphasis on security, reliability, fault tolerance, scalability, fidelity, and efficiency. Researching and utilising cutting-edge big data and machine learning technologies to automate and simplify processes is all part of the role.
You’ll be a key team member in one of our development teams, with the primary focus of your day-to-day activities being data manipulation and delivering data pipelines to aid development. What we’re looking for in our ideal Senior Data Engineer: - A degree-level qualification in a quantitative discipline, or significant experience in a relevant field such as statistical modelling or mathematics - Experience in Microsoft Azure Cloud Services (or similar) - Working knowledge of Databricks and Spark - Relevant programming language(s) and databases - Geospatial data processing and analysis techniques - Influencing key business and technical stakeholders, with strong presentation and consultancy skills - Providing technical coaching and mentoring Our growing Technology and Design team plays a key role in ensuring OS is at the cutting edge of geospatial capability and is looking for people to join them. Its mission is to work across the business to provide customer-centric design and technology services. If you would like to be part of this, click apply now to be considered as our Senior Data Engineer – we’d love to hear from you! Closing date: Sunday 28th February 2021
05/02/2021
Full time

© 2008-2026 IT Job Board