Sorry, that job is no longer available. Here are some results that may be similar to the job you were looking for.

16 jobs found

Current Search
senior data engineer aws airflow python
ISR Recruitment Ltd
Senior Data Engineer
ISR Recruitment Ltd Manchester, Lancashire
Senior Data Engineer | Hybrid working (Manchester + home-based) | c. £60,000 to £75,000 per year (DOE), plus an excellent company benefits package (including private healthcare, bonuses, professional accreditations and subscriptions, 25 days annual leave + bank holidays, etc.)

The Opportunity: We are supporting a leading IT consultancy operating at the forefront of digital services and transformation across the UK public sector, which is seeking an experienced Senior Data Engineer to play a key role in designing and delivering modern, scalable data platforms that support critical national services. Working within collaborative, multi-disciplinary teams, you will take ownership of end-to-end data engineering delivery across greenfield and transformation initiatives. You will influence technical direction, guide engineering best practice, and support the development of high-quality, robust data services that operate at enterprise scale. You will be consulting across modern cloud ecosystems and data technologies, with opportunities to deepen expertise in Python, SQL, cloud-native data tooling, orchestration platforms and streaming technologies across AWS, Azure and GCP.

Skills and Experience:
  • Proven experience delivering production-grade data engineering solutions within complex environments
  • Strong Python skills for building, testing and operating scalable data pipelines
  • Experience with at least one major cloud platform (AWS, Azure or GCP)
  • Strong SQL expertise and experience with relational databases such as PostgreSQL or Microsoft SQL Server
  • Experience with NoSQL technologies such as DynamoDB, MongoDB or similar
  • Hands-on Kafka (or equivalent streaming) and workflow orchestration (Airflow) experience
  • Strong understanding of data architecture patterns, including data lakes, warehouses and event-driven architectures
  • Experience consulting in Agile delivery environments, implementing data quality, validation and monitoring frameworks

Role and Responsibilities:
  • Lead the design, build and delivery of data platforms and services across the full engineering life cycle
  • Own technical delivery of data pipelines, models and platform components, ensuring solutions are robust, scalable and maintainable
  • Design, develop and deploy ETL/ELT pipelines to ingest, transform and optimise large-scale datasets
  • Build and operate event-driven architectures (Kafka) and orchestrate workflows (Airflow)
  • Apply strong data architecture principles across data lakes, warehouses and event-driven solutions
  • Develop and maintain streaming pipelines using technologies such as Kafka
  • Implement monitoring and observability solutions using tooling such as Prometheus and Grafana
  • Ensure data quality, validation and governance processes are built into engineering workflows
  • Act as a trusted technical advisor to clients and stakeholders (client-facing), translating business requirements into robust engineering solutions
  • Support delivery planning activities, including estimation, risk identification and dependency management
  • Mentor and support other engineers, contributing to a culture of continuous improvement and engineering excellence

Applications: Please contact Edward Laing at ISR to learn more about our client and how they are leading the way in developing the next generation of technical solutions through innovation and transformational technology.
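For candidates new to workflow orchestration, the idea behind the Airflow experience asked for above can be sketched in plain Python (a toy illustration, not the Airflow API; the task names and data are hypothetical): an orchestrator runs an extract → transform → load pipeline in dependency order.

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Toy sketch of DAG-style orchestration (not the Airflow API):
# tasks declare dependencies, and the runner executes them in
# topological order, passing results downstream.
def run_pipeline():
    results = {}

    def extract():
        # Stand-in for reading from a vendor feed or source system.
        return [{"symbol": "ABC", "price": "101.5"},
                {"symbol": "XYZ", "price": "98.0"}]

    def transform(rows):
        # Cast string prices to floats, as a minimal "T" step.
        return [{**r, "price": float(r["price"])} for r in rows]

    def load(rows):
        # Stand-in for a warehouse write; returns rows written.
        return len(rows)

    # Each key depends on the tasks in its value set.
    dag = {"transform": {"extract"}, "load": {"transform"}}
    for task in TopologicalSorter(dag).static_order():
        if task == "extract":
            results["extract"] = extract()
        elif task == "transform":
            results["transform"] = transform(results["extract"])
        elif task == "load":
            results["load"] = load(results["transform"])
    return results

if __name__ == "__main__":
    print(run_pipeline()["load"])  # 2
```

Real Airflow adds scheduling, retries, and observability on top of this core dependency-ordering idea.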
03/04/2026
Full time
Stott and May
Principal Data Engineer
Stott and May
Principal Data Engineer - Hybrid (London/Winchester)

We're seeking a hands-on Principal Data Engineer to design and deliver enterprise-scale, cloud-native data platforms that power analytics, reporting, and real-time decision-making. This is a strategic technical leadership role where you'll shape architecture, mentor engineers, and deliver end-to-end solutions across a modern AWS/Databricks stack.

What you'll do
  • Lead the design of scalable, secure data architectures on AWS.
  • Build and optimise ETL/ELT pipelines for batch and streaming data.
  • Deploy and manage Apache Spark jobs on Databricks and Delta Lake.
  • Write production-grade Python and SQL for large-scale data transformations.
  • Drive data quality, governance, and automation through CI/CD and IaC.
  • Collaborate with data scientists, analysts, and business stakeholders.
  • Mentor and guide data engineering teams.

What we're looking for
  • Proven experience in senior/principal data engineering roles.
  • Expertise in AWS, Databricks, Apache Spark, Python, and SQL.
  • Strong background in cloud-native data platforms, real-time processing, and data lakes.
  • Hands-on experience with tools such as Airflow, Kafka, Docker, and GitLab CI/CD.
  • Excellent stakeholder engagement and leadership skills.

What's on offer
  • £84,000 salary + 10% bonus
  • 6% pension contribution
  • Private medical & flexible benefits package
  • 25 days annual leave (plus buy/sell options)
  • Hybrid working - travel to London or Winchester once or twice per week

Join a company at the forefront of media, connectivity, and smart technology, where your work directly powers millions of daily connections across the UK.
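The "batch and streaming" pipelines this role calls for often reduce to windowed aggregation. The core idea can be sketched in pure Python (a toy example, not Spark/Databricks; the event shape and window size are hypothetical): group events into fixed, tumbling time windows and count per window.

```python
from collections import defaultdict

# Toy tumbling-window aggregation, the kind of computation a
# streaming ETL job (e.g. Spark Structured Streaming) performs:
# each event falls into exactly one fixed-width window.
def tumbling_window_counts(events, window_secs=60):
    counts = defaultdict(int)
    for ts, _user in events:
        # Align the timestamp down to the start of its window.
        window_start = (ts // window_secs) * window_secs
        counts[window_start] += 1
    return dict(counts)

events = [(5, "a"), (30, "b"), (65, "a"), (70, "c")]
print(tumbling_window_counts(events))  # {0: 2, 60: 2}
```

A production engine adds distribution, late-data handling, and checkpointing around this same per-window grouping.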
06/10/2025
Full time
Joseph Harry Ltd
Python Software Engineer AWS Java Data Finance London
Joseph Harry Ltd
Senior Python Software Engineer (Senior Architecture Programmer Developer Python Java Software Engineer Data Enterprise Engineering Developer Programmer AWS Python Athena Glue Airflow Ignite JavaScript Agile Pandas NumPy SciPy Spark Dremio Snowflake Apache Iceberg Arrow DBT gRPC protobuf TypeScript Finance Trading Front Office Investment Banking Asset Manager Financial Services FX Fixed Income Equities Commodities Derivatives Hedge Fund) required by my asset management client in London.

You MUST have the following:
  • Advanced ability as a Senior Python Software Engineer/Technical Lead/Solutions Architect/Principal Engineer
  • Good design and architecture ability
  • Java
  • Three or more of the following: Iceberg, Dremio, DBT, Arrow, Snowflake, Glue, Athena, Airflow, Agile

The following is DESIRABLE, not essential:
  • Trading, Front Office finance
  • Spark
  • Buy-side asset management (hedge fund, asset manager, investment management)

Role: You will join a relatively new department that is responsible for the data used across the Front Office. The data is sourced from a variety of external vendors and internal departments and held in an AWS data lake, although this is being migrated to a data mesh architecture. They are working heavily with Python, Java and AWS. Any experience in Iceberg, Dremio, DBT, Arrow, Spark, Snowflake, Glue, Athena, Airflow or related tools would be very advantageous.

You will join a team of five responsible for pricing data for the Front Office. This is a senior role in the team and will demand strong ability in design and architecture. If you come in at the right level, you could be the deputy for the team manager. They have a very flexible hybrid working set-up.

Salary: £120-150k + 15% Bonus + 10% Pension
03/10/2025
Full time
Joseph Harry Ltd
Senior Java Software Engineer AWS Python Data Finance London
Joseph Harry Ltd
Senior Java Software Engineer (Senior Architecture Programmer Developer Java Python Software Engineer Data Enterprise Engineering Developer Programmer AWS Python Athena Glue Airflow Ignite JavaScript Agile Pandas NumPy SciPy Spark Dremio Snowflake Apache Iceberg Arrow DBT gRPC protobuf TypeScript Finance Trading Front Office Investment Banking Asset Manager Financial Services FX Fixed Income Equities Commodities Derivatives Hedge Fund) required by my asset management client in London.

You MUST have the following:
  • Advanced ability as a Senior Java Software Engineer/Technical Lead/Solutions Architect/Principal Engineer
  • Good design and architecture ability
  • Python
  • Three or more of the following: Iceberg, Dremio, DBT, Arrow, Snowflake, Glue, Athena, Airflow, Agile

The following is DESIRABLE, not essential:
  • Trading, Front Office finance
  • Spark
  • Buy-side asset management (hedge fund, asset manager, investment management)

Role: You will join a relatively new department that is responsible for the data used across the Front Office. The data is sourced from a variety of external vendors and internal departments and held in an AWS data lake, although this is being migrated to a data mesh architecture. They are working heavily with Java, Python and AWS. Any experience in Iceberg, Dremio, DBT, Arrow, Spark, Snowflake, Glue, Athena, Airflow or related tools would be very advantageous.

You will join a team of five responsible for pricing data for the Front Office. This is a senior role in the team and will demand strong ability in design and architecture. If you come in at the right level, you could be the deputy for the team manager. They have a very flexible hybrid working set-up.

Salary: £90-120k + 15% Bonus + 10% Pension
03/10/2025
Full time
Joseph Harry Ltd
Senior Java Software Engineer AWS Python Data Finance London
Joseph Harry Ltd
Senior Java Software Engineer (Senior Architecture Programmer Developer Java Python Software Engineer Data Enterprise Engineering Developer Programmer AWS Python Athena Glue Airflow Ignite JavaScript Agile Pandas NumPy SciPy Spark Dremio Snowflake Apache Iceberg Arrow DBT gRPC protobuf TypeScript Finance Trading Front Office Investment Banking Asset Manager Financial Services FX Fixed Income Equities Commodities Derivatives Hedge Fund) required by my asset management client in London.

You MUST have the following:
  • Advanced ability as a Senior Java Software Engineer/Technical Lead/Solutions Architect/Principal Engineer
  • Good design and architecture ability
  • Python
  • Three or more of the following: Iceberg, Dremio, DBT, Arrow, Snowflake, Glue, Athena, Airflow, Agile

The following is DESIRABLE, not essential:
  • Trading, Front Office finance
  • Spark
  • Buy-side asset management (hedge fund, asset manager, investment management)

Role: You will join a relatively new department that is responsible for the data used across the Front Office. The data is sourced from a variety of external vendors and internal departments and held in an AWS data lake, although this is being migrated to a data mesh architecture. They are working heavily with Java, Python and AWS. Any experience in Iceberg, Dremio, DBT, Arrow, Spark, Snowflake, Glue, Athena, Airflow or related tools would be very advantageous.

You will join a team of five responsible for pricing data for the Front Office. This is a senior role in the team and will demand strong ability in design and architecture. If you come in at the right level, you could be the deputy for the team manager. They have a very flexible hybrid working set-up.

Salary: £120-150k + 15% Bonus + 10% Pension
03/10/2025
Full time
Joseph Harry Ltd
Python Software Engineer AWS Java Data Finance London
Joseph Harry Ltd
Senior Python Software Engineer (Senior Architecture Programmer Developer Python Java Software Engineer Data Enterprise Engineering Developer Programmer AWS Python Athena Glue Airflow Ignite JavaScript Agile Pandas NumPy SciPy Spark Dremio Snowflake Apache Iceberg Arrow DBT gRPC protobuf TypeScript Finance Trading Front Office Investment Banking Asset Manager Financial Services FX Fixed Income Equities Commodities Derivatives Hedge Fund) required by my asset management client in London.

You MUST have the following:
  • Advanced ability as a Senior Python Software Engineer/Technical Lead/Solutions Architect/Principal Engineer
  • Good design and architecture ability
  • Java
  • Three or more of the following: Iceberg, Dremio, DBT, Arrow, Snowflake, Glue, Athena, Airflow, Agile

The following is DESIRABLE, not essential:
  • Trading, Front Office finance
  • Spark
  • Buy-side asset management (hedge fund, asset manager, investment management)

Role: You will join a relatively new department that is responsible for the data used across the Front Office. The data is sourced from a variety of external vendors and internal departments and held in an AWS data lake, although this is being migrated to a data mesh architecture. They are working heavily with Python, Java and AWS. Any experience in Iceberg, Dremio, DBT, Arrow, Spark, Snowflake, Glue, Athena, Airflow or related tools would be very advantageous.

You will join a team of five responsible for pricing data for the Front Office. This is a senior role in the team and will demand strong ability in design and architecture. If you come in at the right level, you could be the deputy for the team manager. They have a very flexible hybrid working set-up.

Duration: 12-24 months | Rate: £600-900/day
03/10/2025
Contractor
Joseph Harry Ltd
Python Software Engineer AWS Java Data Finance London
Joseph Harry Ltd
Senior Python Software Engineer (Senior Architecture Programmer Developer Python Java Software Engineer Data Enterprise Engineering Developer Programmer AWS Python Athena Glue Airflow Ignite JavaScript Agile Pandas NumPy SciPy Spark Dremio Snowflake Apache Iceberg Arrow DBT gRPC protobuf TypeScript Finance Trading Front Office Investment Banking Asset Manager Financial Services FX Fixed Income Equities Commodities Derivatives Hedge Fund) required by my asset management client in London.

You MUST have the following:
  • Advanced ability as a Senior Python Software Engineer/Technical Lead/Solutions Architect/Principal Engineer
  • Good design and architecture ability
  • Java
  • Three or more of the following: Iceberg, Dremio, DBT, Arrow, Snowflake, Glue, Athena, Airflow, Agile

The following is DESIRABLE, not essential:
  • Trading, Front Office finance
  • Spark
  • Buy-side asset management (hedge fund, asset manager, investment management)

Role: You will join a relatively new department that is responsible for the data used across the Front Office. The data is sourced from a variety of external vendors and internal departments and held in an AWS data lake, although this is being migrated to a data mesh architecture. They are working heavily with Python, Java and AWS. Any experience in Iceberg, Dremio, DBT, Arrow, Spark, Snowflake, Glue, Athena, Airflow or related tools would be very advantageous.

You will join a team of five responsible for pricing data for the Front Office. This is a senior role in the team and will demand strong ability in design and architecture. If you come in at the right level, you could be the deputy for the team manager. They have a very flexible hybrid working set-up.

Salary: £90-120k + 15% Bonus + 10% Pension
03/10/2025
Full time
Senior Python Software Engineer (Senior Architecture Programmer Developer Python Java Software Engineer Data Enterprise Engineering Developer Programmer AWS Python Athena Glue Airflow Ignite JavaScript Agile Pandas NumPy SciPy Spark Dremio Snowflake Apache Iceburg Iceberg Arrow DBT gRPC protobuf TypeScript Finance Trading Front Office Investment Banking Asset Manager Financial Services FX Fixed Income Equities Commodities Derivatives Hedge Fund) required by my asset management client in London. You MUST have the following: Advanced ability as a Senior Python Software Engineer/Technical Lead/Solutions Architect/Principal Engineer Good design and architecture ability Java Three or more of the following: Iceberg Dremio DBT Arrow Snowflake Glue Athena Airflow Agile The following is DESIRABLE, not essential: Trading, Front Office finance Spark Buy-side asset management (hedge fund, asset manager, investment management) Role: Senior Python Software Engineer (Senior Architecture Programmer Developer Python Java Software Engineer Data Enterprise Engineering Developer Programmer AWS Python Athena Glue Airflow Ignite JavaScript Agile Pandas NumPy SciPy Spark Dremio Snowflake Apache Iceburg Iceberg Arrow DBT gRPC protobuf TypeScript Finance Trading Front Office Investment Banking Asset Manager Financial Services FX Fixed Income Equities Commodities Derivatives Hedge Fund Snowflake) required by my asset management client in London. You will join a relatively new department that is responsible for the data used across the Front Office. The data is sourced from a variety of external vendors and internal departments and held in an AWS data lake, although this is being migrated to a data mesh architecture. They are working heavily with Python, Java and AWS. If you have any experience in Iceberg, Dremio, DBT, Arrow, Spark, Snowflake, Glue, Athena, Airflow or related tools, this would also be very advantageous. 
Principal ML Engineer
Method-Resourcing
Senior Machine Learning Engineer - Behavioural Modelling & Threat Detection - £150,000 - £180,000 - Fully Remote. UK-BASED CANDIDATES ONLY.

My client is looking for an experienced Machine Learning Engineer ready to play a pivotal role in shaping the technical direction of their behavioural modelling and threat detection systems. This position offers the opportunity to influence not just their engineering roadmap, but how they fundamentally approach solving complex, real-world security challenges with data. You'll work at the intersection of data science, ML infrastructure, and product innovation, leading efforts to build and evolve ML-driven capabilities, while also ensuring the reliability and scalability of their models in production environments.

What You'll Do
- Spearhead the design and refinement of machine learning models focused on understanding behaviour patterns and identifying cybersecurity anomalies.
- Partner with product, engineering, and domain experts to translate strategic goals and customer needs into practical, scalable ML solutions.
- Drive model development end-to-end, from exploratory analysis, feature design, and prototyping to validation and deployment.
- Collaborate with platform and infrastructure teams to operationalise models and ship ML-powered features into production.
- Continuously assess and iterate on production models, balancing long-term ML strategy with tactical improvements.
- Champion code quality, observability, and resilience within their ML systems through reviews and hands-on contributions.
- Help shape their internal ML standards and practices, ensuring they stay ahead of industry advancements.
- Offer technical mentorship and be a thought partner to colleagues across data, ML, and engineering disciplines.

What We're Looking For
- Hands-on experience developing and deploying machine learning models at scale.
- Deep familiarity with core ML concepts (classification, time-series, statistical modelling) and their real-world trade-offs.
- Fluency in Python and commonly used ML libraries (e.g. pandas, scikit-learn; experience with PyTorch or TensorFlow is a plus).
- Experience with model lifecycle management (MLOps), including monitoring, retraining, and model versioning.
- Ability to work across data infrastructure, from SQL to large-scale distributed data tools (Spark, etc.).
- Strong written and verbal communication skills, especially in cross-functional contexts.

Bonus Experience (Nice to Have)
- Exposure to large language models (LLMs) or foundational model adaptation.
- Previous work in cybersecurity, anomaly detection, or behavioural analytics.
- Familiarity with orchestration frameworks (Airflow or similar).
- Experience with scalable ML systems, pipelines, or real-time data processing.
- Advanced degree or equivalent experience in ML/AI research or applied science.
- Cloud platform proficiency (AWS, GCP, Azure).

If this sounds like something you would be interested in, please apply with your latest CV and a number to reach you on, and I will be in touch. Alternatively, you can email me at . RSG Plc is acting as an Employment Agency in relation to this vacancy.
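The behavioural-modelling work described here rests on flagging deviations from a learned baseline. A deliberately minimal, stdlib-only sketch of that idea (a simple z-score rule, not the client's actual models, with made-up login counts as the behavioural signal):

```python
import statistics

def flag_anomalies(baseline, events, threshold=3.0):
    """Return events more than `threshold` standard deviations from the baseline mean."""
    mean = statistics.fmean(baseline)
    stdev = statistics.stdev(baseline)
    return [x for x in events if abs(x - mean) / stdev > threshold]

# Hypothetical baseline: a user's typical daily login counts.
baseline = [4, 5, 6, 5, 4, 6, 5, 5]

# 40 logins in a day deviates far from the baseline and gets flagged.
print(flag_anomalies(baseline, [5, 6, 40]))  # [40]
```

Production systems replace the z-score with richer models (time-series, classifiers) and per-entity baselines, but the detect-deviation-from-baseline shape is the same.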
02/10/2025
Full time
BBC
Data Engineer Mid Senior Glasgow
BBC
Job Introduction

Are you looking for a role that builds on your experience with engineering data systems? Are you keen to learn, grow, and develop? Do you want to work in a nurturing culture where you can flourish and be your best? Would you like to work on services that are used by key BBC products and millions of people each day? If so, this opportunity could be for you.

We are currently looking for talented Data Engineers to join our data teams here at the BBC. We are looking for talent at all levels with a passion for engineering software, with a focus on the huge amounts of data we manage and the systems that support it all. We currently have roles open in London, Manchester and Glasgow, working in a hybrid way. The BBC's world-class online products (iPlayer, News, Sport, Sounds, Bitesize and many others) reach millions of audience members every week and create billions of rows of data per day. Our data teams enable our audience to receive personalised experiences and enable us to understand our audience better. Data is crucial to our ambition of making a tailored BBC for everyone.

Some of the key benefits you'll get from working in this role are:
•A variety of challenging work - our data teams work on a wide variety of different products and services. We create and maintain numerous data services that scale to petabytes of data.
•Unrivalled training and development opportunities - we operate a people-first culture and pride ourselves on your development. Our in-house Academy hosts a wide range of internal and external courses and certification. We value our engineers, offering regular training and development opportunities, as well as '10% time' - dedicated time for self-improvement, learning and innovation.
•Excellent career progression - the BBC offers great opportunities for employees to seek new challenges and work in different areas of the organisation.
•Benefits - We offer a competitive salary package, a flexible 35-hour working week for work-life balance and 26 days holiday with the option to buy an extra 5 days, a defined pension scheme and discounted dental, health care, gym and much more.
•Working with cutting-edge technology - We are constantly looking to leverage new technologies to make our systems more effective.

Role Responsibility

You will be a passionate engineer with a background in either software or data engineering, keen to enhance your skills in data services and systems. You will be working within one of our Agile development teams to deliver new products, product improvements and enhanced technologies. You will be supported by your team and leaders to deliver value to our audience with an eye on quality, scale, and security. As a data engineer you will be responsible for helping maintain pipelines for the ingest, processing and summarisation of data, as well as improving our recommendation engines using machine learning. We don't expect you to have experience in all of these, but below are examples of the practices we value:
•Good communication skills - a great candidate will be able to talk to other developers and to non-developers, and is happy to communicate with people remotely across multiple BBC sites
•The ability to question the way we work, and the tools and processes we use - we're always aiming to make our team the best it can be
•An enthusiasm for writing clean, well-documented, and testable code
•Curious and embraces change - we're always learning new technologies and requirements often change; you'll enjoy this challenge
•Goal oriented - you'll enjoy finishing the job by developing the final details

The Ideal Candidate

You don't need to be an expert in all these areas, only some of them. If you have a base understanding of the areas and their underlying principles, you will shine. So don't feel that you can't apply if you don't have all these skills. After all, you'll work with, and be mentored by, a friendly development team, and the BBC will provide many opportunities for learning as you progress.
•Knowledge of Big Data pipelines and processing technologies such as Hadoop, Spark and Beam
•Experience with SQL and relational databases
•Familiarity with data warehousing technologies (such as Redshift, BigQuery or Snowflake)
•Experience of OO programming languages such as Java 8+ or Python
•Experience of cloud computing (preferably AWS or GCP)
•Experience of containerisation, e.g. Docker / Kubernetes
•Experience of infrastructure as code, e.g. Terraform
•Experience of pipeline orchestration, e.g. Airflow or Jenkins
•Software testing practices, including unit testing frameworks such as JUnit and/or Mockito
•Version control systems such as git

Package Description

Band: D
Contract type: Permanent
Location: London / Salford / Glasgow

We're happy to discuss flexible working. Please indicate your choice under the flexible working question in the application. There is no obligation to raise this at the application stage, but if you wish to do so, you are welcome to. Flexible working will be part of the discussion at offer stage.

Excellent career progression - the BBC offers great opportunities for employees to seek new challenges and work in different areas of the organisation. Unrivalled training and development opportunities - our in-house Academy hosts a wide range of internal and external courses and certification. Benefits - We offer a competitive salary package, a flexible 35-hour working week for work-life balance and 26 days holiday (1 of which is a corporation day) with the option to buy an extra 5 days, a defined pension scheme and discounted dental, health care, gym and much more.
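The "pipeline orchestration" skill above (Airflow or Jenkins) comes down to running dependent tasks in a valid order. The core idea can be sketched in a few lines of stdlib Python using `graphlib`; the task names are hypothetical, and this illustrates the concept rather than how the BBC's pipelines are actually built:

```python
from graphlib import TopologicalSorter

results = []

# Three toy pipeline stages; real ones would ingest, transform and write data.
def ingest():    results.append("ingest")
def process():   results.append("process")
def summarise(): results.append("summarise")

# Each task maps to the set of tasks it depends on (an Airflow-style DAG).
dag = {
    ingest: set(),
    process: {ingest},
    summarise: {process},
}

# static_order() yields tasks so that every dependency runs first.
for task in TopologicalSorter(dag).static_order():
    task()

print(results)  # ['ingest', 'process', 'summarise']
```

Orchestrators add scheduling, retries and monitoring on top, but the dependency graph and topological execution order are the heart of it.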
The situation regarding the coronavirus outbreak is developing quickly and the BBC is keen to continue to ensure the safety and wellbeing of people across the BBC, while continuing to protect our services. To reduce the risk, access to BBC buildings is limited to those essential to our broadcast output. From Wednesday 18th March until further notice, all assessments and interviews will be conducted remotely.
23/09/2022
Full time
BBC
Lead MLOps Engineer
BBC
Job Introduction

BBC R&D has recently established an Automation Applied Research Area focussed on the use of Machine Learning across the BBC. Automation works closely with other BBC R&D Applied Research Areas, BBC Product and Technology Groups and senior business stakeholders across the BBC to accelerate Machine Learning based innovation. Reporting to the Head of Automation, this role will lead a team of experts exploring the ML platforms, tools, performance, and sustainability that will underpin the BBC's approach to Machine Learning innovation. It will ensure that best practice and correct technology choices are downstreamed into R&D ML applications, as well as supporting the wider BBC in making the right strategic decisions for its future ML technology.

BBC R&D has five applied research areas - Audiences, Automation, Distribution, Infrastructure and Production - looking to solve some of the most interesting challenges in Media and Broadcasting, as well as our Commercial, Partnerships & Engagement team, who ensure we're collaborating with the right external partners and optimising commercial returns through the exploitation of our Intellectual Property and grant funding. Our work supports the BBC's current ambition as well as informing future strategy. If you're excited by the prospect of working in an innovative environment with smart and supportive colleagues, then BBC R&D is the place for you.

Role Responsibility

This is a hands-on role. Your key responsibilities will be:
•Build and lead a team of ML engineers to develop an infrastructure that manages the ML lifecycle through experimentation, deployment, and testing.
•Own the Automation MLOps strategy, roadmap, and backlog.
•Provide leadership and guidance on the delivery of ML models from prototypes to production; mentor and coach team members on ML engineering best practices; work alongside researchers to enable the BBC to benefit more rapidly from fundamental ML research.
•Contribute to the design of ML systems and infrastructure to shape how ML is used across the BBC.
•Develop relationships with pan-BBC and external contributors and stakeholders. You will need to bring long-term ambitions to life to secure the required support and buy-in for tangible and intangible benefits and outcomes.
•Focus on ensuring our ML technology delivers on performance, cost and sustainability goals and supports the BBC's responsible and ethical ML objectives.
•Work with our Technology Strategy and Governance team to identify and communicate the strategic investment decisions required to mature the BBC's ML technology in line with business needs.

Are you the right candidate?
•Solid understanding of machine learning concepts and algorithms
•Experience deploying machine learning solutions
•Expert knowledge of Python programming and machine learning libraries (scikit-learn, TensorFlow, Keras, PyTorch, MXNet, etc.)
•Experience implementing ML automation and MLOps (scalable deployment practices aimed at deploying and maintaining machine learning models in production reliably and efficiently) and related tools, e.g. MLflow, Kubeflow, Airflow, SageMaker
•Experience working in accordance with DevOps principles and with industry deployment best practices, using CI/CD tools and infrastructure as code (e.g. Docker, Kubernetes, Terraform)
•Experience in at least one cloud platform (e.g. AWS, GCP, Azure) and associated machine learning services, e.g. Amazon SageMaker, Azure ML, Databricks

Package Description

Band: E
Contract type: Permanent - Full time
Location: UK wide

We're happy to discuss flexible working. Please indicate your choice under the flexible working question in the application. There is no obligation to raise this at the application stage, but if you wish to do so, you are welcome to. Flexible working will be part of the discussion at offer stage.
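One concrete MLOps concern named in the requirements is model versioning: knowing whether a retrained model actually differs from what is already registered. A hedged, illustrative sketch (not any BBC system; the registry layout is invented) using a content hash of the parameters and training config:

```python
import hashlib
import json

def model_version(params: dict, config: dict) -> str:
    """Derive a reproducible version id from model parameters and config."""
    payload = json.dumps({"params": params, "config": config}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()[:12]

# Toy in-memory registry; real MLOps tools (MLflow etc.) persist this.
registry = {}

def register(name, params, config):
    version = model_version(params, config)
    registry.setdefault(name, []).append(version)
    return version

v1 = register("threat-model", {"w": [0.1, 0.2]}, {"lr": 0.01})
v2 = register("threat-model", {"w": [0.1, 0.2]}, {"lr": 0.01})  # unchanged retrain
v3 = register("threat-model", {"w": [0.3, 0.1]}, {"lr": 0.01})  # new weights

print(v1 == v2, v1 == v3)  # True False
```

Content-addressed versions like this make retraining idempotent: identical inputs map to the same version, so only genuinely changed models trigger redeployment.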
About the BBC

We don't focus simply on what we do - we also care how we do it. Our values and the way we behave are important to us. Please make sure you've read about our values and behaviours in the document attached below.

Diversity matters at the BBC. We have a working environment where we value and respect every individual's unique contribution, enabling all of our employees to thrive and achieve their full potential.
We want to attract the broadest range of talented people to be part of the BBC - whether that's to contribute to our programming or our wide range of non-production roles. The more diverse our workforce, the better able we are to respond to and reflect our audiences in all their diversity. We are committed to equality of opportunity and welcome applications from individuals, regardless of age, gender, ethnicity, disability, sexual orientation, gender identity, socio-economic background, religion and/or belief. We will consider flexible working requests for all roles, unless operational requirements prevent otherwise. To find out more about Diversity and Inclusion at the BBC, please click here
23/09/2022
Full time
Job Introduction BBC R&D has recently established an Automation Applied Research Area focussed on the use of Machine Learning across the BBC. Automation works closely with other BBC R&D Applied Research Areas, BBC Product and Technology Groups and senior business stakeholders across the BBC to accelerate Machine Learning based innovation. Reporting to the Head of Automation, this role will lead a team of experts exploring the ML platforms, tools, performance, and sustainability that will underpin the BBC's approach to Machine Learning innovation. It will ensure that best practice and correct technology choices are downstreamed into R&D ML applications as well as supporting the wider BBC in making the right strategic decisions for its future ML technology. BBC R&D has five applied research areas focussed on Audiences, Automation, Distribution, Infrastructure and Production who are looking to solve some of the most interesting challenges in Media and Broadcasting; as well as our Commercial, Partnerships & Engagement team who ensure we're collaborating with the right external partners and optimising commercial returns through the exploitation of our Intellectual Property and grant funding. Our work supports the BBC's current ambition as well as informing future strategy. If you're excited by the prospect of working in an innovative environment with smart and supportive colleagues, then BBC R&D is the place for you. Role Responsibility This is a hands-on role. Your key responsibilities will be: Build and lead a team of ML engineers to develop an infrastructure to manage ML lifecycle through experimentation, deployment, and testing. Own the Automation MLOps strategy, roadmap, and backlog. Provide leadership and guidance on the delivery of ML models from prototypes to production, mentor and coach team members on ML engineering best practises; work alongside researchers to enable BBC to benefit more rapidly from fundamental ML research. 
Contribute to the design of ML systems and infrastructure to shape how ML is used across the BBC. Develop relationships with pan-BBC and external contributors and stakeholders. You will need to bring to life long-term ambitions to secure required support and buy-in for tangible and intangible benefits and outcomes. Focus on ensuring our ML technology delivers on performance, cost and sustainability goals and is supportive of the BBC's responsible and ethical ML objectives. Work with our Technology Strategy and Governance team to identify and communicate strategic investment decisions required to mature the BBC's ML technology in line with business needs. Are you the right candidate? Solid understanding of machine learning concepts and algorithms Experience deploying machine learning solutions Expert knowledge of Python programming and machine learning libraries (Scikit-learn, TensorFlow, Keras, PyTorch, MxNet, etc.) Experience implementing ML automation, MLOps (scalable deployment practices aimed to deploy and maintain machine learning models in production reliably and efficiently) and related tools (e.g., MLflow, Kubeflow, Airflow, Sagemaker) Experience working in accordance with DevOps principles, and with industry deployment best practices using CI/CD tools and infrastructure as code (e.g., Docker, Kubernetes, Terraform) Experience in at least one cloud platform (e.g., AWS, GCP, Azure) and associated machine learning services, e.g., Amazon SageMaker, Azure ML, Databricks. Package Description Band: E Contract type: Permanent - Full time Location: UK wide We're happy to discuss flexible working. Please indicate your choice under the flexible working question in the application . There is no obligation to raise this at the application stage but if you wish to do so, you are welcome to. Flexible working will be part of the discussion at offer stage. 
Excellent career progression - the BBC offers great opportunities for employees to seek new challenges and work in different areas of the organisation. Unrivalled training and development opportunities - our in-house Academy hosts a wide range of internal and external courses and certification. Benefits - We offer a competitive salary package, a flexible 35-hour working week for work-life balance and 26 days (1 of which is a corporation day) with the option to buy an extra 5 days, a defined pension scheme and discounted dental, health care, gym and much more. The situation regarding the coronavirus outbreak is developing quickly and the BBC is keen to continue to ensure the safety and wellbeing of people across the BBC, while continuing to protect our services. To reduce the risk access to BBC buildings is limited to those essential to our broadcast output. From Wednesday 18 th March until further notice all assessments and interviews will be conducted remotely. For more information go to Mae'r sefyllfa gyda'r coronafeirws yn datblygu'n gyflym, ac mae'r BBC yn awyddus i barhau i sicrhau diogelwch a lles pobl ar draws y BBC, gan barhau i warchod ein gwasanaethau hefyd. I leihau'r risg, dim ond y bobl sy'n hanfodol i'n hallbwn darlledu fydd yn cael mynediad i adeiladau'r BBC. O ddydd Mercher 18 fed Mawrth ymlaen, bydd pob asesiad a chyfweliad yn cael ei gynnal o bell, nes rhoddir gwybod yn wahanol. I gael mwy o wybodaeth, ewch i About the BBC We don't focus simply on what we do - we also care how we do it. Our values and the way we behave are important to us. Please make sure you've read about our values and behaviours in the document attached below. Diversity matters at the BBC. We have a working environment where we value and respect every individual's unique contribution, enabling all of our employees to thrive and achieve their full potential. 
We want to attract the broadest range of talented people to be part of the BBC - whether that's to contribute to our programming or our wide range of non-production roles. The more diverse our workforce, the better able we are to respond to and reflect our audiences in all their diversity. We are committed to equality of opportunity and welcome applications from individuals, regardless of age, gender, ethnicity, disability, sexual orientation, gender identity, socio-economic background, religion and/or belief. We will consider flexible working requests for all roles, unless operational requirements prevent otherwise. To find out more about Diversity and Inclusion at the BBC, please click here
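The MLOps tooling the listing above names (MLflow, Kubeflow, SageMaker) centres on one idea: every training run logs its parameters, metrics and artifacts so models can be deployed and audited reliably. A minimal stdlib sketch of that idea follows; the class and field names are purely illustrative, not any real tracking API:

```python
import hashlib
import json
from dataclasses import asdict, dataclass, field
from datetime import datetime, timezone

@dataclass
class ModelRun:
    """One tracked training run: parameters, metrics and an artifact hash."""
    params: dict
    metrics: dict = field(default_factory=dict)
    artifact_hash: str = ""
    logged_at: str = ""

    def log_artifact(self, artifact_bytes: bytes) -> None:
        # Hash the serialised model so a later deployment can verify provenance.
        self.artifact_hash = hashlib.sha256(artifact_bytes).hexdigest()
        self.logged_at = datetime.now(timezone.utc).isoformat()

run = ModelRun(params={"model": "logreg", "C": 1.0})
run.metrics["accuracy"] = 0.93
run.log_artifact(b"serialized-model-bytes")
record = json.dumps(asdict(run))  # what a tracking server would persist
```

A real tracking server adds storage, UI and model-registry stages on top, but the run-as-record shape is the same.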
Senior Data Engineer - GCP
Randstad Tech IT Glasgow, Lanarkshire
Hi {firstName}, My client, a leading global technology organisation, is currently in the market for 2 Senior Data Engineers to come on board to build infrastructure and solutions for the management and organisation of the business's data. To succeed in this role, you will be required to gather requirements from stakeholders, design data structures and data models, design and build data pipelines, and understand how to build secure, scalable, efficient solutions. Responsibilities will include: - Build and maintain robust, scalable, automated cloud-based data pipelines - Recommend ways to improve data reliability, efficiency and quality - Take part in cross-team code reviews to ensure all solutions developed meet required standards - Analyse source systems, define underlying data sources and transformation requirements, design suitable data models and document the design/specifications - Provide insight into the changing data environment, data processing, data storage and utilisation requirements for the business. Essentials: - Solid experience working in a Big Data ecosystem processing data - including file systems, data structures, automation etc. - Well versed in working with cloud data - including design and implementation - ideally in GCP, but other cloud platforms (e.g. AWS) would be considered - A deep understanding of Python and SQL - Well versed in developing data pipelines with tools such as Apache Airflow or GCP Composer - Good experience in implementing ETL (or ELT) - Knowledge of CI/CD best practices. Desirables: - Experience with Terraform - Experience of developing ETL or ELT using Apache Beam, GCP Dataflow, Apache Spark or GCP Dataproc - Any knowledge of Salesforce The role is paying between £75k - £85k per annum and is fully remote. If you are interested in hearing more, please apply for the role and I will be in touch. Randstad Technologies Ltd is a leading specialist recruitment business for the IT industry. 
Please note that due to a high level of applications, we can only respond to applicants whose skills & qualifications are suitable for this position. No terminology in this advert is intended to discriminate against any of the protected characteristics that fall under the Equality Act 2010. For the purposes of the Conduct Regulations 2003, when advertising permanent vacancies we are acting as an Employment Agency, and when advertising temporary/contract vacancies we are acting as an Employment Business.
07/10/2021
Full time
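The pipeline work described in the listing above - ingesting, transforming and loading data as discrete, orchestratable steps - can be sketched in plain Python; an orchestrator such as Apache Airflow or GCP Composer would schedule each function as a separate task with its own retries and monitoring. All names and the in-memory "warehouse" below are illustrative:

```python
from typing import Iterable

# Each step is a plain function so an orchestrator (e.g. an Airflow task)
# can schedule, retry and monitor it independently.
def extract() -> list[dict]:
    # Stand-in for reading from a source system (files, APIs, databases).
    return [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": "3.0"}]

def transform(rows: Iterable[dict]) -> list[dict]:
    # Cast types and enforce a simple data-quality rule.
    out = []
    for row in rows:
        amount = float(row["amount"])
        if amount >= 0:  # drop invalid records
            out.append({"id": row["id"], "amount": amount})
    return out

def load(rows: list[dict], sink: list) -> int:
    # Stand-in for writing to a warehouse table; returns rows loaded.
    sink.extend(rows)
    return len(rows)

warehouse: list[dict] = []
loaded = load(transform(extract()), warehouse)
```

Keeping each stage pure and composable is what makes the pipeline testable and lets the orchestrator rerun any single step.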
Senior Data Engineer - GCP
Randstad Tech IT Manchester, Lancashire
Hi {firstName}, My client, a leading global technology organisation, is currently in the market for 2 Senior Data Engineers to come on board to build infrastructure and solutions for the management and organisation of the business's data. To succeed in this role, you will be required to gather requirements from stakeholders, design data structures and data models, design and build data pipelines, and understand how to build secure, scalable, efficient solutions. Responsibilities will include: - Build and maintain robust, scalable, automated cloud-based data pipelines - Recommend ways to improve data reliability, efficiency and quality - Take part in cross-team code reviews to ensure all solutions developed meet required standards - Analyse source systems, define underlying data sources and transformation requirements, design suitable data models and document the design/specifications - Provide insight into the changing data environment, data processing, data storage and utilisation requirements for the business. Essentials: - Solid experience working in a Big Data ecosystem processing data - including file systems, data structures, automation etc. - Well versed in working with cloud data - including design and implementation - ideally in GCP, but other cloud platforms (e.g. AWS) would be considered - A deep understanding of Python and SQL - Well versed in developing data pipelines with tools such as Apache Airflow or GCP Composer - Good experience in implementing ETL (or ELT) - Knowledge of CI/CD best practices. Desirables: - Experience with Terraform - Experience of developing ETL or ELT using Apache Beam, GCP Dataflow, Apache Spark or GCP Dataproc - Any knowledge of Salesforce The role is paying between £75k - £85k per annum and is fully remote. If you are interested in hearing more, please apply for the role and I will be in touch. Randstad Technologies Ltd is a leading specialist recruitment business for the IT industry. 
Please note that due to a high level of applications, we can only respond to applicants whose skills & qualifications are suitable for this position. No terminology in this advert is intended to discriminate against any of the protected characteristics that fall under the Equality Act 2010. For the purposes of the Conduct Regulations 2003, when advertising permanent vacancies we are acting as an Employment Agency, and when advertising temporary/contract vacancies we are acting as an Employment Business.
07/10/2021
Full time
BP
Senior Cloud Data Security Engineer - AWS
BP Blackburn, Lancashire
Job Profile Summary bp are looking for a Senior Cloud Data Security Engineer with a strong background in AWS, with experience in designing, engineering, and deploying security across Big Data cloud platforms. KEEP READING, IF YOU ARE AN ENGINEER WITH A PASSION FOR BIG DATA AND HANDS-ON SECURITY DEVELOPMENT EXPERIENCE, AS YOU WOULD BE A GREAT FIT. WHAT YOU'LL BE DOING · You'll have direct responsibility to design, engineer and deploy security features and processes responding to data platform development demands. · You'll be leading the ongoing development of Security Operations within the data services, drafting procedures and runbooks. · You'll be joining bp's Digital Security function, providing security engineering capability supporting products and services across the bp business. Job Advert · You'll be working closely with product & service owners and their teams who operate bp's data platforms. · Lead threat modelling, engineering planning workshops and security design reviews with engineering teams, providing subject matter expertise in resolving complex security problems. · You'll work in an agile environment, mastering continuous improvement deliveries. · As a senior team member, you will have the opportunity to mentor others and draft their learning paths. WHO ARE WE LOOKING FOR? · Excellent knowledge of cloud / data lake / data platform and associated technologies. · Experience working in digital security across cloud platforms, with a specific emphasis on big data / data analytics platforms. · Strong demonstrated 'hands-on' technical proficiency (see 'Technical competencies'). · Good grasp of the Software Development Lifecycle (SDLC) · Experience working across information lifecycle management · A good understanding and application of technology to support the regulatory environment for information management - particularly related to personal and high-value data. · A strong communicator, influencer and team player. 
· A passion for continuous learning, professional development and knowledge sharing. This role would suit a mid-level cloud data security engineer looking to advance into a senior position, taking responsibility for the team's deliverables and performance. TECHNICAL COMPETENCIES A range of core technical proficiency / knowledge across the following domains: GENERAL · Azure DevOps Pipelines or GitHub Actions · Infrastructure as Code tools PROGRAMMING · .NET, Java, Scala · Python 3 · PowerShell / Bash TOOLS · ETL tools (AWS Glue, Apache Airflow) · Logging capabilities (CloudTrail, CloudWatch) · Programming hosts (Lambda, EC2, EKS) · Data (Databricks, EMR, Redshift, DynamoDB, RDS, S3, Athena, Lake Formation) · Messaging (Kafka, SNS, SQS, ActiveMQ) · CloudFormation, Terraform · IAM, Config, SSM, Security Hub · Step Functions PROFICIENCY / KNOWLEDGE ACROSS SOME OF THE FOLLOWING DOMAINS WOULD ALSO BE PREFERABLE: · AWS Databricks · Microsoft / AWS Certified · CISSP, CCSP, CIPT At bp, we provide a phenomenal environment and benefits such as an open and inclusive culture, a great work-life balance, tremendous learning and development opportunities to craft your career path, life and health insurance, a medical care package and many others! Diversity sits at the heart of our company and as an equal opportunity employer, we stay true to our mission by ensuring that our place can be anyone's place. We do not discriminate based on race, religion, color, national origin, gender and gender identity, sexual orientation, age, marital status, veteran status or disability status. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application and interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Reinvent your career as you help our business to meet the challenges of the future. Apply now! 
#bpInformationSecurity Entity Innovation & Engineering Job Family Group IT&S Group Relocation available No Travel required Negligible travel Time Type Full time Country United Kingdom About BP INNOVATION & ENGINEERING Join us in creating, growing, and delivering innovation at pace, enabling us to thrive while transitioning to a net zero ‎world. All without compromising our operational risk management. Working with us, you can do this by: • deploying our integrated capability and standards in service of our net zero and ‎safety ambitions • driving our digital transformation and pioneering new business models • collaborating to deliver competitive customer-focused energy solutions • originating, scaling and commercialising innovative ideas, and creating ground-breaking new ‎businesses from them • protecting us by assuring management of our greatest physical and digital risks Because together we are: • Originators, builders, guardians and disruptors • Engineers, technologists, scientists and entrepreneurs‎ • Empathetic, curious, creative and inclusive
07/10/2021
Full time
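One concrete flavour of the security design-review work described in the listing above is scanning IAM policy documents for over-broad grants. A stdlib sketch of such a check follows; the policy document and checker are illustrative, not bp's tooling:

```python
import json

def overly_broad_statements(policy_json: str) -> list[dict]:
    """Return Allow statements that grant wildcard actions on wildcard resources."""
    policy = json.loads(policy_json)
    statements = policy.get("Statement", [])
    if isinstance(statements, dict):  # IAM permits a single statement object
        statements = [statements]
    flagged = []
    for stmt in statements:
        actions = stmt.get("Action", [])
        resources = stmt.get("Resource", [])
        # Both fields may be a string or a list; normalise to lists.
        actions = [actions] if isinstance(actions, str) else actions
        resources = [resources] if isinstance(resources, str) else resources
        if stmt.get("Effect") == "Allow" and "*" in actions and "*" in resources:
            flagged.append(stmt)
    return flagged

policy = '{"Version": "2012-10-17", "Statement": [{"Effect": "Allow", "Action": "*", "Resource": "*"}]}'
findings = overly_broad_statements(policy)
```

In practice checks like this run in CI against Terraform or CloudFormation output, alongside managed services such as AWS Config and Security Hub.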
BP
Senior Cloud Data Security Engineer - AWS
BP Wormley, Surrey
Job Profile Summary bp are looking for a Senior Cloud Data Security Engineer with a strong background in AWS, with experience in designing, engineering, and deploying security across Big Data cloud platforms. KEEP READING, IF YOU ARE AN ENGINEER WITH A PASSION FOR BIG DATA AND HANDS-ON SECURITY DEVELOPMENT EXPERIENCE, AS YOU WOULD BE A GREAT FIT. WHAT YOU'LL BE DOING · You'll have direct responsibility to design, engineer and deploy security features and processes responding to data platform development demands. · You'll be leading the ongoing development of Security Operations within the data services, drafting procedures and runbooks. · You'll be joining bp's Digital Security function, providing security engineering capability supporting products and services across the bp business. Job Advert · You'll be working closely with product & service owners and their teams who operate bp's data platforms. · Lead threat modelling, engineering planning workshops and security design reviews with engineering teams, providing subject matter expertise in resolving complex security problems. · You'll work in an agile environment, mastering continuous improvement deliveries. · As a senior team member, you will have the opportunity to mentor others and draft their learning paths. WHO ARE WE LOOKING FOR? · Excellent knowledge of cloud / data lake / data platform and associated technologies. · Experience working in digital security across cloud platforms, with a specific emphasis on big data / data analytics platforms. · Strong demonstrated 'hands-on' technical proficiency (see 'Technical competencies'). · Good grasp of the Software Development Lifecycle (SDLC) · Experience working across information lifecycle management · A good understanding and application of technology to support the regulatory environment for information management - particularly related to personal and high-value data. · A strong communicator, influencer and team player. 
· A passion for continuous learning, professional development and knowledge sharing. This role would suit a mid-level cloud data security engineer looking to advance into a senior position, taking responsibility for the team's deliverables and performance. TECHNICAL COMPETENCIES A range of core technical proficiency / knowledge across the following domains: GENERAL · Azure DevOps Pipelines or GitHub Actions · Infrastructure as Code tools PROGRAMMING · .NET, Java, Scala · Python 3 · PowerShell / Bash TOOLS · ETL tools (AWS Glue, Apache Airflow) · Logging capabilities (CloudTrail, CloudWatch) · Programming hosts (Lambda, EC2, EKS) · Data (Databricks, EMR, Redshift, DynamoDB, RDS, S3, Athena, Lake Formation) · Messaging (Kafka, SNS, SQS, ActiveMQ) · CloudFormation, Terraform · IAM, Config, SSM, Security Hub · Step Functions PROFICIENCY / KNOWLEDGE ACROSS SOME OF THE FOLLOWING DOMAINS WOULD ALSO BE PREFERABLE: · AWS Databricks · Microsoft / AWS Certified · CISSP, CCSP, CIPT At bp, we provide a phenomenal environment and benefits such as an open and inclusive culture, a great work-life balance, tremendous learning and development opportunities to craft your career path, life and health insurance, a medical care package and many others! Diversity sits at the heart of our company and as an equal opportunity employer, we stay true to our mission by ensuring that our place can be anyone's place. We do not discriminate based on race, religion, color, national origin, gender and gender identity, sexual orientation, age, marital status, veteran status or disability status. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application and interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Reinvent your career as you help our business to meet the challenges of the future. Apply now! 
#bpInformationSecurity Entity Innovation & Engineering Job Family Group IT&S Group Relocation available No Travel required Negligible travel Time Type Full time Country United Kingdom About BP INNOVATION & ENGINEERING Join us in creating, growing, and delivering innovation at pace, enabling us to thrive while transitioning to a net zero ‎world. All without compromising our operational risk management. Working with us, you can do this by: • deploying our integrated capability and standards in service of our net zero and ‎safety ambitions • driving our digital transformation and pioneering new business models • collaborating to deliver competitive customer-focused energy solutions • originating, scaling and commercialising innovative ideas, and creating ground-breaking new ‎businesses from them • protecting us by assuring management of our greatest physical and digital risks Because together we are: • Originators, builders, guardians and disruptors • Engineers, technologists, scientists and entrepreneurs‎ • Empathetic, curious, creative and inclusive
07/10/2021
Full time
Nutmeg
Senior Data Engineer
Nutmeg
Who we are: Nutmeg is Europe's leading Digital Wealth Manager, but we don't want to stop there. We're continuing to build our platform to help us achieve our mission of being the most trusted Digital Wealth Manager in the world. Since being founded in 2011 we've: Grown to 160+ employees Raised over £100M in funding Launched 4 amazing products including JISA and Lifetime ISA Won multiple awards including Best Online Stocks & Shares ISA Provider for the fifth year in a row! We hit the 130,000 investor milestone in early 2021 and now manage over £3 billion AUM. *We offer flexible working* Job in a nutshell: We run a pure AWS-based cloud environment and deliver features using a continuous delivery approach. Our data platform comprises a mix of services and open-source products running fully in Kubernetes and utilising AWS-native data solutions. Nutmeg's data solution is a mix of batch and streaming processes leveraging Airflow, Apache Kafka and AWS data tools. Our key characteristic is enabling a self-service experience for all data stakeholders. Nutmeg products are served by a polyglot mix of microservices designed following Domain-Driven Design principles and composing an Event-Driven Architecture powered by Apache Kafka. As a Senior Data Engineer, you will collaborate closely with technical and non-technical teams to deliver data solutions supporting Nutmeg's data strategy. We are looking for someone with previous experience as a senior engineer and a strong passion for data challenges. 
Requirements
Your skills:
  • Following Data engineering industry best practice
  • Full ownership of end-to-end Data pipelines
  • Designing, implementing, and maintaining Data models
  • Writing automated tests around Data models
  • Understanding of CI/CD principles
  • Experience with cloud platforms for Data (ideally AWS)
  • Experience in converting business requirements into technical deliverables
  • Previous experience with two or more of the following: Airflow, dbt, Kafka Connect, Looker, Python, and Redshift
You might also have:
  • DataOps best practice
  • Experience in collaborating with BI and Data Science teams
  • Use of agile/lean methodologies for continuous delivery and improvement
  • Knowledge of monitoring, metrics or Site Reliability Engineering
  • Understanding of Data governance and security standards
Benefits
  • 25 days' holiday
  • Birthday day off
  • 2 days' paid community leave
  • Competitive salary
  • Private healthcare with Vitality from day 1
  • Access to a digital GP and other healthcare resources
  • Season ticket and bike loans
  • Access to a wellbeing platform & regular knowledge sharing
  • Regular homeworking perks and rewards
  • Cycle storage and showers onsite
  • Discounted Nutmeg account for you and your family and friends
  • Part of an inclusive Nutmeg team
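The requirements above call for end-to-end ownership of Data pipelines and automated tests around Data models. As an illustration only, here is a minimal Python sketch of the kind of data-quality gate a pipeline task might run before loading rows downstream; the field names and rules are hypothetical, not Nutmeg's actual data model.

```python
# Hypothetical required fields for an ingested extract (illustrative only).
REQUIRED_FIELDS = {"account_id", "isin", "quantity", "trade_date"}

def validate_rows(rows):
    """Split rows into (valid, rejected) for a pipeline quality gate.

    A row is rejected if any required field is missing or None, or if
    quantity is not a positive number.
    """
    valid, rejected = [], []
    for row in rows:
        present = {k for k, v in row.items() if v is not None}
        missing = REQUIRED_FIELDS - present
        qty = row.get("quantity")
        if missing or not isinstance(qty, (int, float)) or qty <= 0:
            rejected.append(row)
        else:
            valid.append(row)
    return valid, rejected

rows = [
    {"account_id": "A1", "isin": "GB00B41YBW71", "quantity": 10, "trade_date": "2021-09-01"},
    {"account_id": "A2", "isin": None, "quantity": 5, "trade_date": "2021-09-01"},
    {"account_id": "A3", "isin": "GB00B41YBW71", "quantity": -2, "trade_date": "2021-09-01"},
]
valid, rejected = validate_rows(rows)
print(len(valid), len(rejected))  # → 1 2
```

In practice a check like this would sit inside an orchestrated task (e.g. an Airflow operator), with rejected rows routed to a quarantine table and surfaced via monitoring rather than silently dropped.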
15/09/2021
Full time
Nutmeg
Senior Data Engineer
Nutmeg
14/09/2021
Full time


© 2008-2026 IT Job Board