Sorry, that job is no longer available. Here are some results that may be similar to the job you were looking for.

12 jobs found

Current search: senior dataops engineer
Inspire People
Senior Data Engineer
Inspire People South Croydon, Surrey
HM Land Registry (HMLR) is undertaking one of the largest transformation programmes in government, modernising the digital systems that support over £7 trillion of property ownership. As a Senior Data Engineer, you will help establish a new data engineering capability, contributing to the development of reliable data pipelines and products that improve data access, integrity and value across the organisation. Your work will support programmes that shape how HMLR manages and uses its data for years to come.

Salary up to £60,800, 29% employer pension contribution plus full Civil Service benefits. Flexible, hybrid working from Plymouth, Croydon or Coventry.

About the role
This role has arisen as HMLR embarks on a significant modernisation of its core services and data infrastructure. With new funding secured and a dedicated Data Engineering capability being formed for the first time, there is a crucial need to build strong, reliable data systems that can support future services and national programmes. As a Senior Data Engineer, you'll design and deliver robust data systems, pipelines and products that support analytics and operational decision-making. Working in agile teams, you'll provide technical leadership, guide colleagues and help shape solutions across the organisation. You'll also support opportunity discovery, develop prototypes and production-ready solutions, and continually improve data engineering practices and systems in production.

If you would like to find out more about the role, the Data Engineering capability and what it's like to work at HMLR, a Hiring Manager Q&A session where you can virtually 'meet the team' will be held via Teams on Tuesday 6th January at 12:30pm. Follow the apply link to register your interest.

Key Responsibilities
  • Identify opportunities to reuse and optimise data flows, including building streaming systems, managing databases and improving code performance.
  • Lead the development and maintenance of data engineering solutions, advising teams as a subject matter expert and ensuring alignment with HMLR standards and approved technologies.
  • Collaborate with senior colleagues to understand where data engineering adds value, supporting strategic and operational decision-making.
  • Support and guide junior team members, contribute to the data engineering community and advocate for data quality, maintainability and reusable components.
  • Continuously improve data systems and maintain awareness of best practice and emerging approaches.

Essential Skills
  • Experience with large-scale analytics engines (e.g. Spark/PySpark) and scripting languages such as Python or Scala.
  • Hands-on use of cloud data stacks (e.g. SageMaker Notebooks, S3, Glue, Athena) and modern storage formats or frameworks (e.g. Parquet, Delta Lake, Fabric).
  • Experience with DevOps/DataOps tooling and practices (e.g. Terraform) and testing data pipelines, including end-to-end, data quality, monitoring, unit and contract testing.
  • Experience managing the full data lifecycle, including development, analysis, modelling, integration and metadata management.
  • Communicating clearly with technical and non-technical stakeholders, leading discussions in multidisciplinary teams and managing differing viewpoints.
  • Profiling data, analysing source systems and creating data models that suit different organisational needs.

Desirable Skills
  • IBM DB2/QREP
  • BMC Control-M
  • Experience of Talend Data Integration (e.g. ETL build and deployment using Talend Cloud)
  • Experience of leadership
  • Experience of metadata management
  • Experience of testing

Location
The expectation is to work from any of the advertised locations 60% of your time across the month (typically three days per week). Hours are flexible and condensed hours are an option. Locations available: Croydon, Coventry, Plymouth.

Salary
Civil Service Grade: SEO. Dependent upon assessment at interview, your starting salary will be one of the following:
  • Developing: £49,600
  • Proficient: £55,200
  • Accomplished: £60,800

Benefits
  • Over 29% employer pension contribution
  • Annual leave of 28.5 days per year plus 8 public holidays
  • A clear progression pathway including personalised training and development plans
  • Expensed accreditations with dedicated training days
  • Flexi-time scheme (you decide what working hours work best for you)
  • Opportunity to work condensed hours
  • Social and sports clubs
  • Access to an Employee Assistance Programme for counselling and support
  • Interest-free season ticket loan
  • Cycle to Work scheme (salary sacrifice)

HMLR has a strong and positive culture, a commitment to inclusivity, a focus on continuous learning and development, and flexible ways of working.

Further Information
Application deadline: 11:55pm Thursday 8th January 2026. Please apply with a CV that provides evidence against the essential skills. HMLR does not hold a UK Visa & Immigration (UKVI) Skilled Worker Licence and is unable to sponsor individuals for Skilled Worker sponsorship.

If you are a motivated and experienced data professional who enjoys leading on complex data challenges, shaping technical approaches and working collaboratively in agile teams, this is your chance to make a significant impact. Join HM Land Registry and play a crucial role in developing and guiding the data capabilities that support property ownership and public services across England and Wales. Apply now or contact Keesha in complete confidence.
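The essential skills above ask for experience testing data pipelines, including data quality and contract testing. A minimal standard-library sketch of what a record-level contract check can look like; the schema and field names (`EXPECTED_SCHEMA`, `title_number`, `price_paid`, `postcode`) are invented for illustration, not taken from the advert:

```python
# Illustrative contract test for a pipeline batch: each record must carry
# the expected fields with the expected types, or be routed to an error list.

EXPECTED_SCHEMA = {"title_number": str, "price_paid": int, "postcode": str}

def validate_batch(rows):
    """Return (valid_rows, errors), enforcing the contract on each record."""
    valid, errors = [], []
    for i, row in enumerate(rows):
        missing = set(EXPECTED_SCHEMA) - set(row)
        bad_types = {k: type(row[k]).__name__
                     for k, t in EXPECTED_SCHEMA.items()
                     if k in row and not isinstance(row[k], t)}
        if missing or bad_types:
            errors.append({"row": i, "missing": sorted(missing),
                           "bad_types": bad_types})
        else:
            valid.append(row)
    return valid, errors
```

In a real DataOps setup this kind of check would run inside the pipeline (and in CI against fixture data), with the error list feeding monitoring rather than a Python list.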
05/12/2025
Full time
TRIA
Data Architect
TRIA
Data Architect
£75,000 - £80,000 + benefits & pension
Central London - hybrid working, in the office 2 days per week

This is a great opportunity for a talented Data Architect / Senior Data Engineer ready to step up into a Data Architect role for a well-established, market-leading company. Reporting into the Head of Data Engineering & Data Platforms, you'll join an ambitious Data Engineering team who are designing and building robust, scalable foundations that power data-driven decision making, using the latest tech. This is a hands-on role with great scope for growth and impact.

You will bring solid experience:
  • Designing and building robust data pipelines to bring internal and external raw data sources into AWS cloud-based storage and processing platforms
  • Defining and designing scalable, future-ready solutions that deliver a strong and flexible data architecture supporting the whole business
  • Developing and executing a migration plan for legacy data into Snowflake, ensuring a smooth and efficient transition
  • Working in a modern Dev/DataOps culture

The tech stack will include: AWS, Fivetran, dbt, Snowflake, Power BI, Sifflet, PostgreSQL, SQL Server, Data Vault.

On offer is a salary up to £80,000 (depending on experience), plus benefits and a generous pension. If this is you, please apply. I look forward to talking through your CV, career ambitions and this great opportunity in more detail.
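The Snowflake migration work described above typically hinges on idempotent loads, so a backfill can be re-run without duplicating rows. A hedged sketch of rendering a Snowflake-style MERGE (upsert) statement; the table, key and column names are invented, and a real migration would more likely lean on Fivetran/dbt tooling than hand-built SQL:

```python
# Illustrative MERGE generation: staging table upserted into a target on a key,
# so repeated runs converge to the same state (idempotent load).

def build_merge_sql(table: str, key: str, columns: list[str]) -> str:
    """Render a Snowflake-style MERGE from <table>_staging into <table>."""
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in columns)
    cols = ", ".join([key] + columns)
    vals = ", ".join(f"s.{c}" for c in [key] + columns)
    return (
        f"MERGE INTO {table} t USING {table}_staging s ON t.{key} = s.{key} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({cols}) VALUES ({vals})"
    )
```

For example, `build_merge_sql("orders", "order_id", ["amount", "status"])` yields a single statement that updates matched rows and inserts unmatched ones.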
12/11/2025
Full time
Sanderson
Data Engineering Manager
Sanderson Edinburgh, Midlothian
Data Engineering Manager

Who are Diligenta?
Diligenta's vision is to be acknowledged as a best-in-class, platform-based Life and Pensions administration service provider. Customer service is at the heart of everything we do, and our aim is to transform our clients' operations. A business that has been described as 'home' by existing employees, we drive a culture that is founded on positive change and development.

Summary of the role:
As a Data Engineering Manager, you will be responsible for the engineering teams in the Data delivery and Data management scopes. These are key positions within the Data office as they provide technical leadership, mentorship and team development, as well as contributing to the development and implementation of our Data strategy. You will be responsible for organising the team in an efficient and effective manner and prioritising the work to align with organisational priorities. This includes providing technical leadership during development phases, Agile-based delivery and engineering best practices. As in any engineering management position, there will be a certain amount of hands-on work, and so a technical background is required, but the main focus will be the management of the team, workstreams and software lifecycles.

Benefits:
  • 33 days' leave including Bank Holidays
  • Eligibility for an annual discretionary bonus scheme
  • Personal and career development opportunities to progress your aspirations within the company as well as through our global parent company (Tata Consultancy Services)
  • Access to Perks at Work (an online discounted shopping platform) saving you money on a wide range of goods and services, including your weekly food shop, holidays and electrical goods
  • Cycle to Work Scheme & interest-free Season Ticket loans
  • A companywide Wellbeing programme, including an Employee Assistance Programme and other benefits/resources to support your mental, physical and financial wellbeing
  • A comprehensive set of Moments that Matter policies, such as Carer's Leave, Foster Leave and Retirement Leave
  • A contributory company pension scheme where we match your contributions up to 6%, Group Life Assurance ('Death in Service') & Group Income Protection

What you'll be doing:
  • Contribute to, create and implement best practices for coding standards, naming conventions and design briefs.
  • Oversee coding standards and repository management.
  • Ensure the correct documentation is created and maintained for all products.
  • Work with the PM and BA tier to estimate work, prioritise and maintain the backlog.
  • Provide team leadership by setting standards for behaviour and fostering a culture of collaboration and growth.
  • Manage team development plans, ensuring each team member has a meaningful technical and career development plan.
  • Work towards the creation, implementation and evolution of competency frameworks for all engineering positions.
  • Lead the engineering team in the design, building and maintenance of scalable and robust pipelines and supporting MS Fabric infrastructure.
  • Carry out strategic planning for team expansion.
  • Work with PM resources to schedule, complete and document post-sprint and project retrospectives, ensuring that identified actions are carried out as required.
  • Identify, qualify and implement tooling requirements.
  • Provide high-level technical overviews and insights to senior leadership.
  • Provide incident management as required and follow up with comprehensive RCAs.
  • Help create and implement a DataOps environment.

What we're looking for:
  • Prior experience in a similar role as an Engineering Manager, Technical Lead or Senior Engineer in either a software or data delivery environment.
  • Educated to at least A-level or an equivalent standard.
  • Solid understanding of data governance concepts.
  • Strong analytical skills.
  • Good stakeholder management skills.
  • Familiarity with data privacy regulations such as GDPR.
  • Knowledge or understanding of ETL/ELT solutions.
  • Knowledge of relational and non-relational data management systems.
  • Understanding of big data environments.
  • Knowledge of CI/CD tools and processes.
  • Experience with Git source control.
  • Experience of working in an Agile delivery environment.
  • Experience of coaching and developing teams.
  • Software or data product delivery.

If you need any help or adjustments during the recruitment process, please let us know. Ready to take the next step in your career? Apply today and become part of our innovative team!
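The ETL/ELT knowledge the listing asks for can be illustrated with a toy extract-transform-load flow using only the Python standard library. The field names (`policy_id`, `premium_gbp`) and the pounds-to-pence conversion are invented for illustration:

```python
import csv
import io

def extract(csv_text: str) -> list[dict]:
    """Extract: parse raw CSV text into dict records."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(records: list[dict]) -> list[dict]:
    """Transform: normalise identifiers and convert currency to integer pence."""
    return [{"policy_id": r["policy_id"].strip().upper(),
             "premium_pence": round(float(r["premium_gbp"]) * 100)}
            for r in records]

def load(records: list[dict], target: list) -> int:
    """Load: append records to the target store, returning the row count."""
    target.extend(records)
    return len(records)
```

In an ELT variant the raw extract would be loaded first and the transform step pushed down into the warehouse; the decomposition into small, separately testable steps is the point either way.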
02/10/2025
Full time
Sanderson
Data Engineering Manager
Sanderson Peterborough, Cambridgeshire
02/10/2025
Full time
Senior DevOps Engineer
GlaxoSmithKline Stevenage, Hertfordshire
Site Name: UK - Hertfordshire - Stevenage, USA - Connecticut - Hartford, USA - Delaware - Dover, USA - Maryland - Rockville, USA - Massachusetts - Cambridge, USA - Massachusetts - Waltham, USA - New Jersey - Trenton, USA - Pennsylvania - Upper Providence
Posted Date: Jun 6 2022

The mission of the Data Science and Data Engineering (DSDE) organization within GSK Pharmaceuticals R&D is to get the right data, to the right people, at the right time. The Data Framework and Ops organization ensures we can do this efficiently, reliably, transparently, and at scale through the creation of a leading-edge, cloud-native data services framework. We focus heavily on developer experience, on strong, semantic abstractions for the data ecosystem, on professional operations and aggressive automation, and on transparency of operations and cost. Designing and implementing data flows and data products that leverage internal and external data assets and tools to drive discovery and development is a key objective for the DSDE team within GSK's Pharmaceutical R&D organization.

There are five key drivers for this approach, closely aligned with GSK's corporate priorities of Innovation, Performance and Trust:
- Automation of end-to-end data flows: faster, more reliable ingestion of high-throughput data in genetics, genomics and multi-omics, to extract the value of investments in new technology (instrument to analysis-ready data).
- Enabling governance by design of external and internal data: with engineered, practical solutions for controlled use and monitoring.
- Innovative disease-specific and domain-expert-specific data products: to enable computational scientists and their research unit collaborators to get to key insights faster, leading to faster biopharmaceutical development cycles.
- Supporting end-to-end code traceability and data provenance: increasing assurance of data integrity through automation and integration.
- Improving engineering efficiency: extensible, reusable, scalable, updateable, maintainable, virtualized, traceable data and code, driven by data engineering innovation and better resource utilization.

We are looking for experienced Senior DevOps Engineers to join our growing DataOps team. A Senior DevOps Engineer is a highly technical individual contributor, building modern, cloud-native, DevOps-first systems for standardizing and templatizing biomedical and scientific data engineering, with demonstrable experience across the following areas:
- Delivering declarative components for common data ingestion, transformation and publishing techniques
- Defining and implementing data governance aligned to modern standards
- Establishing scalable, automated processes for data engineering teams across GSK
- Acting as a thought leader and partner to wider DSDE data engineering teams, advising on implementation and best practices
- Cloud Infrastructure-as-Code
- Defining service and flow orchestration
- Data as a configurable resource (including configuration-driven access to scientific data modelling tools)
- Observability (monitoring, alerting, logging, tracing, etc.)
- Enabling quality engineering through KPIs, code coverage and quality checks
- Standardising a GitOps/declarative software development lifecycle
- Audit as a service

Senior DevOps Engineers take full ownership of delivering high-performing, high-impact biomedical and scientific DataOps products and services, from a description of a pattern that customer Data Engineers are trying to use all the way through to final delivery (and ongoing monitoring and operations) of a templated project and all associated automation. They are standard-bearers for software engineering and quality coding practices within the team and are expected to mentor more junior engineers; they may even coordinate the work of more junior engineers on a large project. They devise useful metrics for ensuring their services are meeting customer demand and having an impact, and iterate to deliver and improve on those metrics in an agile fashion.

Successful Senior DevOps Engineers are developing expertise with the types of data and tools leveraged in the biomedical and scientific data engineering space, and have the following skills and experience (with significant depth in one or more of these areas):
- Demonstrable experience deploying robust, modularised/container-based solutions to production (ideally GCP) and leveraging the Cloud Native Computing Foundation (CNCF) ecosystem
- Significant depth in DevOps principles and tools (e.g. GitOps, Jenkins, CircleCI, Azure DevOps), and how to integrate these with productivity tools (e.g. Jira, Slack, Microsoft Teams) to build a comprehensive workflow
- Programming in Python, Scala or Go
- Embedding agile software engineering (task/issue management, testing, documentation, software development lifecycle, source control, etc.)
- Leveraging major cloud providers, both via Kubernetes and via vendor-specific services
- Authentication and authorization flows and associated technologies (e.g. OAuth2 + JWT)
- Common distributed data tools (e.g. Spark, Hive)

The DSDE team is built on the principles of ownership, accountability, continuous development, and collaboration. We hire for the long term, and we're motivated to make this a great place to work. Our leaders will be committed to your career and development from day one.

Why you?
Basic Qualifications: We are looking for professionals with these required skills to achieve our goals:
- A Master's in Computer Science with a focus in Data Engineering, DataOps, DevOps, MLOps, Software Engineering, etc., plus 5 years' job experience (or a PhD plus 3 years' job experience)
- Experience with DevOps tools and concepts (e.g. Jira, GitLab, Jenkins, CircleCI, Azure DevOps, etc.)
- Excellence with common distributed data tools in a production setting (Spark, Kafka, etc.)
- Experience with specialized data architecture (e.g. optimizing physical layout for access patterns, including Bloom filters; optimizing against self-describing formats such as ORC or Parquet)
- Experience with search/indexing systems (e.g. Elasticsearch)
- Expertise with agile development in Python, Scala, Go, and/or C++
- Experience building reusable components on top of the CNCF ecosystem, including Kubernetes
- A metrics-first mindset
- Experience mentoring junior engineers into deep technical expertise

Preferred Qualifications: If you have the following characteristics, it would be a plus:
- Experience with agile software development
- Experience building and designing a DevOps-first way of working
- Experience building reusable components on top of the CNCF ecosystem, including Kubernetes (or a similar ecosystem)

LI-GSK

Why GSK? Our values and expectations are at the heart of everything we do and form an important part of our culture. These include Patient focus, Transparency, Respect and Integrity, along with Courage, Accountability, Development and Teamwork. As GSK focuses on our values and expectations and a culture of innovation, performance and trust, the successful candidate will demonstrate the following capabilities:
- Operating at pace and agile decision-making: using evidence and applying judgement to balance pace, rigour and risk.
- Commitment to delivering high-quality results, overcoming challenges, focusing on what matters, and execution.
- Continuously looking for opportunities to learn, build skills and share learning.
- Sustaining energy and wellbeing.
- Building strong relationships and collaboration, with honest and open conversations.
- Budgeting and cost consciousness.

As a company driven by our values of Patient focus, Transparency, Respect and Integrity, we know inclusion and diversity are essential for us to succeed. We want all our colleagues to thrive at GSK, bringing their unique experiences, ensuring they feel good and continuing to grow their careers. As a candidate for a role, we want you to feel the same way.

As an Equal Opportunity Employer, we are open to all talent. In the US, we also adhere to Affirmative Action principles. This ensures that all qualified applicants will receive equal consideration for employment without regard to race/ethnicity, colour, national origin, religion, gender, pregnancy, marital status, sexual orientation, gender identity/expression, age, disability, genetic information, military service, covered/protected veteran status or any other federal, state or local protected class (US only).

We believe in an agile working culture for all our roles. If flexibility is important to you, we encourage you to explore with our hiring team what the opportunities are. Please don't hesitate to contact us if you'd like to discuss any adjustments to our process which might help you demonstrate your strengths and capabilities. You can either call us on , or send an email. As you apply, we will ask you to share some personal information, which is entirely voluntary... click apply for full job details
23/09/2022
Full time
Senior Data Operations Engineer
GlaxoSmithKline Stevenage, Hertfordshire
Site Name: UK - Hertfordshire - Stevenage, USA - Connecticut - Hartford, USA - Delaware - Dover, USA - Maryland - Rockville, USA - Massachusetts - Waltham, USA - Pennsylvania - Upper Providence, Warren NJ
Posted Date: Aug

The mission of the Data Science and Data Engineering (DSDE) organization within GSK Pharmaceuticals R&D is to get the right data, to the right people, at the right time. The Data Framework and Ops organization ensures we can do this efficiently, reliably, transparently, and at scale through the creation of a leading-edge, cloud-native data services framework. We focus heavily on developer experience, on strong, semantic abstractions for the data ecosystem, on professional operations and aggressive automation, and on transparency of operations and cost. Designing and implementing data flows and data products that leverage internal and external data assets and tools to drive discovery and development is a key objective for the DSDE team within GSK's Pharmaceutical R&D organisation.

There are five key drivers for this approach, closely aligned with GSK's corporate priorities of Innovation, Performance and Trust:
- Automation of end-to-end data flows: faster, more reliable ingestion of high-throughput data in genetics, genomics and multi-omics, to extract the value of investments in new technology (instrument to analysis-ready data).
- Enabling governance by design of external and internal data: with engineered, practical solutions for controlled use and monitoring.
- Innovative disease-specific and domain-expert-specific data products: to enable computational scientists and their research unit collaborators to get to key insights faster, leading to faster biopharmaceutical development cycles.
- Supporting end-to-end code traceability and data provenance: increasing assurance of data integrity through automation and integration.
- Improving engineering efficiency: extensible, reusable, scalable, updateable, maintainable, virtualized, traceable data and code, driven by data engineering innovation and better resource utilization.

We are looking for an experienced Sr. DataOps Engineer to join our growing DataOps team. A Sr. DataOps Engineer is a highly technical individual contributor, building modern, cloud-native, DevOps-first systems for standardizing and templatizing biomedical and scientific data engineering, with demonstrable experience across the following areas:
- Delivering declarative components for common data ingestion, transformation and publishing techniques
- Defining and implementing data governance aligned to modern standards
- Establishing scalable, automated processes for data engineering teams across GSK
- Acting as a thought leader and partner to wider DSDE data engineering teams, advising on implementation and best practices
- Cloud Infrastructure-as-Code
- Defining service and flow orchestration
- Data as a configurable resource (including configuration-driven access to scientific data modelling tools)
- Observability (monitoring, alerting, logging, tracing, etc.)
- Enabling quality engineering through KPIs, code coverage and quality checks
- Standardising a GitOps/declarative software development lifecycle
- Audit as a service

Sr. DataOps Engineers take full ownership of delivering high-performing, high-impact biomedical and scientific DataOps products and services, from a description of a pattern that customer Data Engineers are trying to use all the way through to final delivery (and ongoing monitoring and operations) of a templated project and all associated automation. They are standard-bearers for software engineering and quality coding practices within the team and are expected to mentor more junior engineers; they may even coordinate the work of more junior engineers on a large project. They devise useful metrics for ensuring their services are meeting customer demand and having an impact, and iterate to deliver and improve on those metrics in an agile fashion.

A successful Sr. DataOps Engineer is developing expertise with the types of data and tools leveraged in the biomedical and scientific data engineering space, and has the following skills and experience (with significant depth in one or more of these areas):
- Demonstrable experience deploying robust, modularised/container-based solutions to production (ideally GCP) and leveraging the Cloud Native Computing Foundation (CNCF) ecosystem
- Significant depth in DevOps principles and tools (e.g. GitOps, Jenkins, CircleCI, Azure DevOps), and how to integrate these with productivity tools (e.g. Jira, Slack, Microsoft Teams) to build a comprehensive workflow
- Programming in Python, Scala or Go
- Embedding agile software engineering (task/issue management, testing, documentation, software development lifecycle, source control, etc.)
- Leveraging major cloud providers, both via Kubernetes and via vendor-specific services
- Authentication and authorization flows and associated technologies (e.g. OAuth2 + JWT)
- Common distributed data tools (e.g. Spark, Hive)

The DSDE team is built on the principles of ownership, accountability, continuous development, and collaboration. We hire for the long term, and we're motivated to make this a great place to work. Our leaders will be committed to your career and development from day one.

Why you?
Basic Qualifications:
- A Bachelor's degree in Computer Science with a focus in Data Engineering, DataOps, DevOps, MLOps, Software Engineering, etc., plus 7 years' job experience, or a Master's degree with 5 years of experience (or a PhD plus 3 years' job experience)
- Deep experience with DevOps tools and concepts (e.g. Jira, GitLab, Jenkins, CircleCI, Azure DevOps)
- Excellence with common distributed data tools in a production setting (Spark, Kafka, etc.)
- Experience with specialized data architecture (e.g. optimizing physical layout for access patterns, including Bloom filters; optimizing against self-describing formats such as ORC or Parquet)
- Experience with search/indexing systems (e.g. Elasticsearch)
- Deep expertise with agile development in Python, Scala, Go, and/or C++
- Experience building reusable components on top of the CNCF ecosystem, including Kubernetes
- A metrics-first mindset
- Experience mentoring junior engineers into deep technical expertise

Preferred Qualifications: If you have the following characteristics, it would be a plus:
- Experience with agile software development
- Experience building and designing a DevOps-first way of working
- Demonstrated experience building reusable components on top of the CNCF ecosystem, including Kubernetes (or a similar ecosystem)

LI-GSK

Why GSK? Our values and expectations are at the heart of everything we do and form an important part of our culture. These include Patient focus, Transparency, Respect and Integrity, along with Courage, Accountability, Development and Teamwork. As GSK focuses on our values and expectations and a culture of innovation, performance and trust, the successful candidate will demonstrate the following capabilities:
- Operating at pace and agile decision-making: using evidence and applying judgement to balance pace, rigour and risk.
- Commitment to delivering high-quality results, overcoming challenges, focusing on what matters, and execution.
- Continuously looking for opportunities to learn, build skills and share learning.
- Sustaining energy and wellbeing.
- Building strong relationships and collaboration, with honest and open conversations.
- Budgeting and cost consciousness.

As a company driven by our values of Patient focus, Transparency, Respect and Integrity, we know inclusion and diversity are essential for us to succeed. We want all our colleagues to thrive at GSK, bringing their unique experiences, ensuring they feel good and continuing to grow their careers. As a candidate for a role, we want you to feel the same way.

As an Equal Opportunity Employer, we are open to all talent. In the US, we also adhere to Affirmative Action principles. This ensures that all qualified applicants will receive equal consideration for employment without regard to neurodiversity, race/ethnicity, colour, national origin, religion, gender, pregnancy, marital status, sexual orientation, gender identity/expression, age, disability, genetic information, military service, covered/protected veteran status or any other federal, state or local protected class (US only).

We believe in an agile working culture for all our roles. If flexibility is important to you, we encourage you to explore with our hiring team what the opportunities are. Please don't hesitate to contact us if you'd like to discuss any adjustments to our process which might help you demonstrate your strengths and capabilities. You can either call us on , or send an email. As you apply, we will ask you to share some personal information, which is entirely voluntary... click apply for full job details
23/09/2022
Full time
Site Name: UK - Hertfordshire - Stevenage, USA - Connecticut - Hartford, USA - Delaware - Dover, USA - Maryland - Rockville, USA - Massachusetts - Waltham, USA - Pennsylvania - Upper Providence, Warren NJ Posted Date: Aug The mission of the Data Science and Data Engineering (DSDE) organization within GSK Pharmaceuticals R&D is to get the right data, to the right people, at the right time. TheData Framework and Opsorganization ensures we can do this efficiently, reliably, transparently, and at scale through the creation of a leading-edge, cloud-native data services framework. We focus heavily on developer experience, on strong, semantic abstractions for the data ecosystem, on professional operations and aggressive automation, and on transparency of operations and cost. Achieving delivery of the right data to the right people at the right time needs design and implementation of data flows and data products which leverage internal and external data assets and tools to drive discovery and development is a key objective for the Data Science and Data Engineering (DS D E) team within GSK's Pharmaceutical R&D organisation . There are five key drivers for this approach, which are closely aligned with GSK's corporate priorities of Innovation, Performance and Trust: Automation of end-to-end data flows: Faster and reliable ingestion of high throughput data in genetics, genomics and multi-omics, to extract value of investments in new technology (instrument to analysis-ready data in Enabling governance by design of external and internal data: with engineered practical solutions for controlled use and monitoring Innovative disease-specific and domain-expert specific data products : to enable computational scientists and their research unit collaborators to get faster to key insights leading to faster biopharmaceutical development cycles. 
Supporting e2 e code traceability and data provenance: Increasing assurance of data integrity through automation, integration Improving engineering efficiency: Extensible, reusable, scalable,updateable,maintainable, virtualized traceable data and code would b e driven by data engineering innovation and better resource utilization. We are looking for an experienced Sr. Data Ops Engineer to join our growing Data Ops team. As a Sr. Data Ops Engineer is a highly technical individual contributor, building modern, cloud-native, DevOps-first systems for standardizing and templatizingbiomedical and scientificdata engineering, with demonstrable experience across the following areas : Deliver declarative components for common data ingestion, transformation and publishing techniques Define and implement data governance aligned to modern standards Establish scalable, automated processes for data engineering team s across GSK Thought leader and partner with wider DSDE data engineering teams to advise on implementation and best practices Cloud Infrastructure-as-Code D efine Service and Flow orchestration Data as a configurable resource(including configuration-driven access to scientific data modelling tools) Ob servabilty (monitoring, alerting, logging, tracing, ...) Enable quality engineering through KPIs and c ode coverage and quality checks Standardise GitOps /declarative software development lifecycle Audit as a service Sr. DataOpsEngineerstake full ownership of delivering high-performing, high-impactbiomedical and scientificdataopsproducts and services, froma description of apattern thatcustomer Data Engineers are trying touseall the way through tofinal delivery (and ongoing monitoring and operations)of a templated project and all associated automation. 
They arestandard-bearers for software engineering and quality coding practices within theteam andareexpected to mentor more junior engineers; they may even coordinate the work of more junior engineers on a large project.Theydevise useful metrics for ensuring their services are meeting customer demand and having animpact anditerate to deliver and improve on those metrics in an agile fashion. A successfulSr.DataOpsEngineeris developing expertise with the types of data and types of tools that are leveraged in the biomedical and scientific data engineering space, andhas the following skills and experience(withsignificant depth in one or more of these areas): Demonstrable experience deploying robust modularised/ container based solutions to production (ideally GCP) and leveraging the Cloud NativeComputing Foundation (CNCF) ecosystem Significant depth in DevOps principles and tools ( e.g. GitOps , Jenkins, CircleCI , Azure DevOps, ...), and how to integrate these tools with other productivity tools (e.g. Jira, Slack, Microsoft Teams) to build a comprehensive workflow P rogramming in Python. Scala or Go Embedding agile s oftware engineering ( task/issue management, testing, documentation, software development lifecycle, source control, ) Leveraging major cloud providers, both via Kubernetes or via vendor-specific services Authentication and Authorization flows and associated technologies ( e.g. OAuth2 + JWT) Common distributed data tools ( e.g. Spark, Hive) The DSDE team is built on the principles of ownership, accountability, continuous development, and collaboration. We hire for the long term, and we're motivated to make this a great place to work. Our leaders will be committed to your career and development from day one. Why you? 
Basic Qualifications: Bachelor's degree in Computer Science with a focus in Data Engineering, DataOps, DevOps, MLOps, Software Engineering, etc., plus 7 years of job experience; or Master's degree plus 5 years of experience; or PhD plus 3 years of job experience. Deep experience with DevOps tools and concepts (e.g. Jira, GitLab, Jenkins, CircleCI, Azure DevOps). Excellent with common distributed data tools in a production setting (Spark, Kafka, etc.). Experience with specialized data architecture (e.g. optimizing physical layout for access patterns, including Bloom filters, and optimizing against self-describing formats such as ORC or Parquet). Experience with search/indexing systems (e.g. Elasticsearch). Deep expertise with agile development in Python, Scala, Go, and/or C++. Experience building reusable components on top of the CNCF ecosystem, including Kubernetes. A metrics-first mindset. Experience mentoring junior engineers into deep technical expertise. Preferred Qualifications: If you have the following characteristics, it would be a plus: experience with agile software development; experience building and designing a DevOps-first way of working; demonstrated experience building reusable components on top of the CNCF ecosystem, including Kubernetes (or a similar ecosystem). Why GSK? Our values and expectations are at the heart of everything we do and form an important part of our culture. These include Patient focus, Transparency, Respect and Integrity, along with Courage, Accountability, Development, and Teamwork. As GSK focuses on our values and expectations and a culture of innovation, performance, and trust, the successful candidate will demonstrate the following capabilities: operating at pace and agile decision-making, using evidence and applying judgement to balance pace, rigour and risk; committed to delivering high-quality results, overcoming challenges, and focusing on what matters in execution.
Continuously looking for opportunities to learn, build skills and share learning. Sustaining energy and wellbeing. Building strong relationships and collaboration, with honest and open conversations. Budgeting and cost consciousness. As a company driven by our values of Patient focus, Transparency, Respect and Integrity, we know inclusion and diversity are essential for us to succeed. We want all our colleagues to thrive at GSK, bringing their unique experiences, feeling good, and continuing to grow their careers. As a candidate for a role, we want you to feel the same way. As an Equal Opportunity Employer, we are open to all talent. In the US, we also adhere to Affirmative Action principles. This ensures that all qualified applicants will receive equal consideration for employment without regard to neurodiversity, race/ethnicity, colour, national origin, religion, gender, pregnancy, marital status, sexual orientation, gender identity/expression, age, disability, genetic information, military service, covered/protected veteran status or any other federal, state or local protected class (US only). We believe in an agile working culture for all our roles. If flexibility is important to you, we encourage you to explore the opportunities with our hiring team. Please don't hesitate to contact us if you'd like to discuss any adjustments to our process which might help you demonstrate your strengths and capabilities. You can either call us on , or send an email. As you apply, we will ask you to share some personal information, which is entirely voluntary. Click apply for full job details.
Omni RMS
Senior Data Architect
Omni RMS Manchester, Lancashire
Team Overview Data is central to Ofcom's work. We use data from a wide range of sources to understand the dynamics of the sectors we regulate and to provide key insights to inform policy decisions. In addition to existing regulatory responsibilities, there will be exciting opportunities to contribute to Ofcom's new duties in relation to the regulation of Online Harms. It is expected that this new area of work will expand the scope of data-driven activities at Ofcom in terms of the variety and volume of data, as well as the range of analysis we do. To ensure that Ofcom has the appropriate data capabilities to undertake this analysis effectively, ICT has created a new role for a Data Solutions Architect. Purpose of the Role The Data Solutions Architect works closely with ICT and the Data Innovation Hub. You will architect, design, implement and oversee the operations of data solutions that empower data professionals to deliver their work efficiently and effectively. Candidates will exhibit critical thinking skills and the ability to synthesize complex problems, and will have the skills and experience to transform data into solutions that add value across a wide range of business requirements. You must have a deep understanding of the full data lifecycle and the role that high-quality data plays across applications, machine learning, business analytics, and reporting. You will lead the design and development of solution architectures in response to business requirements. This includes identifying and evaluating alternative architectures and their trade-offs in cost, performance and scalability, and ensuring that the relevant technical strategies, policies, standards and practices (including security) are applied correctly. The end-to-end solution will be fit for purpose, i.e. it will meet the needs of the business and the agreed requirements, and support the strategic direction of Ofcom.
Work with IT teams, business analysts and data analytics teams to understand data consumers' needs and develop solutions. By maintaining your knowledge of emerging trends in data usage, tools and analysis techniques, you will support our ongoing development activities and continually promote data innovation as a means to achieve the business outcomes of specific Groups and of Ofcom. You will need to be self-motivated, an effective communicator, and have a collaborative delivery approach. You will work in a collaborative cross-functional environment and interact with the full spectrum of colleagues (data engineers, data analysts, data scientists, operational support and policy makers), and you will need to inform and influence senior managers. Requirements of the Role Build strong relationships with colleagues across the business, understanding the motivations behind their projects, and own the technical activities that translate business requirements (both functional and non-functional) into a solution, ensuring the required business value is delivered. Foster a customer-centric approach to ensure delivery of business value, and an iterative approach that responds to feedback and changing needs. Perform deep dives into technical areas to solve specific solution or design challenges, using trials or POCs to prove or discount an approach and to critique your own design. You will be responsible for ensuring that the solutions you help deliver form an integral part of the ICT estate and align with the wider reference architecture and domain roadmaps. Manage stakeholder expectations and be flexible, working on many different projects and topics at the same time. Manage proactive and reactive communication. Facilitate difficult discussions within the team or with diverse senior stakeholders and external third parties as necessary. Provide documentation of solutions detailing the business, data, application and technology layers.
Work with Data Engineers to define data pipelines and data lakes, covering the ingestion, ETL or ELT, and cataloguing of data. Take overall responsibility for planning effective data storage, cost, security, quality, sharing, availability, retention and publishing within the organisation. Develop analytics policy, standards and guidelines. Ensure successful transitions of solutions into production, making sure production support have the necessary knowledge and documentation to support the service. Skills, knowledge and experience Robust data and technical/solutions architecture skills - sets direction for, and possesses a deep understanding of, architecture and strategies which integrate with industry trends, e.g. TOGAF. Hands-on experience with analytical tools and languages such as Python, R, SQL, Azure Data Factory (and SSIS), Databricks, Power BI, Git etc. Experience of infrastructure: cloud-based technologies used for storage, data lakes, data warehouses, data streaming, databases, and ETL/ELT/transformation. Experience of DevOps/DataOps methods in the development of data solutions, ensuring pipelines and processes can be automated and controlled. Experience with cloud-based data and analytical initiatives utilising Azure, AWS, Google Cloud or similar cloud services. Experience of working closely with data professionals (e.g. Data Scientists and Data Analysts) to understand their needs. Experience of implementing statistical, artificial intelligence, machine learning and deep learning applications. Experience with integrations (e.g. via APIs) with external vendors to share data between organisations. Experience of working with external technology suppliers and service providers to deliver business solutions. SFIA Skills: Enterprise and business architecture (STPL) - Level 5; Solution architecture (ARCH) - Level 5; Requirements definition and management (REQM) - Level 5; Database design (DBDS) - Level 5; Analytics (INAN) - Level 4; Emerging technology monitoring (EMRG) - Level 4; Relationship management (RLMT) - Level 5.
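The ETL/ELT pipeline work described above can be sketched in miniature. This is an illustrative example only, using Python and SQLite from the listing's own toolset; the table, columns and data are invented for the sketch and are not part of the role description:

```python
# Minimal extract-transform-load sketch: pull raw records, normalise them,
# and load them into a SQL store for downstream analytics.
import sqlite3

def extract() -> list[dict]:
    # Stand-in for reading from an API, file, or upstream database.
    return [{"provider": "ExampleCo", "complaints": "12"},
            {"provider": "OtherCo", "complaints": "7"}]

def transform(rows: list[dict]) -> list[tuple]:
    # Cast string fields to proper types and trim whitespace before loading.
    return [(r["provider"].strip(), int(r["complaints"])) for r in rows]

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    conn.execute("CREATE TABLE IF NOT EXISTS complaints (provider TEXT, n INTEGER)")
    conn.executemany("INSERT INTO complaints VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
print(conn.execute("SELECT SUM(n) FROM complaints").fetchone()[0])  # → 19
```

In an ELT variant, the raw rows would be loaded first and the type-casting step would run as SQL inside the warehouse instead.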
04/11/2021
Full time
Omni RMS
Senior Data Architect
Omni RMS
Team Overview Data is central to Ofcom's work. We use data from a wide range of sources to understand the dynamics of the sectors we regulate and to provide key insights to inform policy decisions. In addition to existing regulatory responsibilities, there will be exciting opportunities to contribute to Ofcom's new duties in relation to the regulation of Online Harms. It is expected that this new area of work will expand the scope of data-driven activities at Ofcom in terms of the variety and volume of data, as well as the range of analysis we do. To ensure that Ofcom has the appropriate data capabilities to undertake this analysis effectively, ICT has created a new role for a Data Solutions Architect. Purpose of the Role The Data Solutions Architect works closely with ICT and the Data Innovation Hub. You will architect, design, implement and oversee the operations of data solutions that empower data professionals to deliver their work efficiently and effectively. Candidates will exhibit critical thinking skills and the ability to synthesize complex problems, and will have the skills and experience to transform data into solutions that add value across a wide range of business requirements. You must have a deep understanding of the full data lifecycle and the role that high-quality data plays across applications, machine learning, business analytics, and reporting. You will lead the design and development of solution architectures in response to business requirements. This includes identifying and evaluating alternative architectures and their trade-offs in cost, performance and scalability, and ensuring that the relevant technical strategies, policies, standards and practices (including security) are applied correctly. The end-to-end solution will be fit for purpose, i.e. it will meet the needs of the business and the agreed requirements, and support the strategic direction of Ofcom.
Work with IT teams, business analysts and data analytics teams to understand data consumers' needs and develop solutions. By maintaining your knowledge of emerging trends in data usage, tools and analysis techniques, you will support our ongoing development activities and continually promote data innovation as a means to achieve the business outcomes of specific Groups and of Ofcom. You will need to be self-motivated, an effective communicator, and have a collaborative delivery approach. You will work in a collaborative cross-functional environment and interact with the full spectrum of colleagues (data engineers, data analysts, data scientists, operational support and policy makers), and you will need to inform and influence senior managers. Requirements of the Role Build strong relationships with colleagues across the business, understanding the motivations behind their projects, and own the technical activities that translate business requirements (both functional and non-functional) into a solution, ensuring the required business value is delivered. Foster a customer-centric approach to ensure delivery of business value, and an iterative approach that responds to feedback and changing needs. Perform deep dives into technical areas to solve specific solution or design challenges, using trials or POCs to prove or discount an approach and to critique your own design. You will be responsible for ensuring that the solutions you help deliver form an integral part of the ICT estate and align with the wider reference architecture and domain roadmaps. Manage stakeholder expectations and be flexible, working on many different projects and topics at the same time. Manage proactive and reactive communication. Facilitate difficult discussions within the team or with diverse senior stakeholders and external third parties as necessary. Provide documentation of solutions detailing the business, data, application and technology layers.
Work with Data Engineers to define data pipelines and data lakes, covering the ingestion, ETL or ELT, and cataloguing of data. Take overall responsibility for planning effective data storage, cost, security, quality, sharing, availability, retention and publishing within the organisation. Develop analytics policy, standards and guidelines. Ensure successful transitions of solutions into production, making sure production support have the necessary knowledge and documentation to support the service. Skills, knowledge and experience Robust data and technical/solutions architecture skills - sets direction for, and possesses a deep understanding of, architecture and strategies which integrate with industry trends, e.g. TOGAF. Hands-on experience with analytical tools and languages such as Python, R, SQL, Azure Data Factory (and SSIS), Databricks, Power BI, Git etc. Experience of infrastructure: cloud-based technologies used for storage, data lakes, data warehouses, data streaming, databases, and ETL/ELT/transformation. Experience of DevOps/DataOps methods in the development of data solutions, ensuring pipelines and processes can be automated and controlled. Experience with cloud-based data and analytical initiatives utilising Azure, AWS, Google Cloud or similar cloud services. Experience of working closely with data professionals (e.g. Data Scientists and Data Analysts) to understand their needs. Experience of implementing statistical, artificial intelligence, machine learning and deep learning applications. Experience with integrations (e.g. via APIs) with external vendors to share data between organisations. Experience of working with external technology suppliers and service providers to deliver business solutions. SFIA Skills: Enterprise and business architecture (STPL) - Level 5; Solution architecture (ARCH) - Level 5; Requirements definition and management (REQM) - Level 5; Database design (DBDS) - Level 5; Analytics (INAN) - Level 4; Emerging technology monitoring (EMRG) - Level 4; Relationship management (RLMT) - Level 5.
04/11/2021
Full time
Omni RMS
Senior Data Architect
Omni RMS Warrington, Cheshire
Team Overview Data is central to Ofcom's work. We use data from a wide range of sources to understand the dynamics of the sectors we regulate and to provide key insights to inform policy decisions. In addition to existing regulatory responsibilities, there will be exciting opportunities to contribute to Ofcom's new duties in relation to the regulation of Online Harms. It is expected that this new area of work will expand the scope of data-driven activities at Ofcom in terms of the variety and volume of data, as well as the range of analysis we do. To ensure that Ofcom has the appropriate data capabilities to undertake this analysis effectively, ICT has created a new role for a Data Solutions Architect. Purpose of the Role The Data Solutions Architect works closely with ICT and the Data Innovation Hub. You will architect, design, implement and oversee the operations of data solutions that empower data professionals to deliver their work efficiently and effectively. Candidates will exhibit critical thinking skills and the ability to synthesize complex problems, and will have the skills and experience to transform data into solutions that add value across a wide range of business requirements. You must have a deep understanding of the full data lifecycle and the role that high-quality data plays across applications, machine learning, business analytics, and reporting. You will lead the design and development of solution architectures in response to business requirements. This includes identifying and evaluating alternative architectures and their trade-offs in cost, performance and scalability, and ensuring that the relevant technical strategies, policies, standards and practices (including security) are applied correctly. The end-to-end solution will be fit for purpose, i.e. it will meet the needs of the business and the agreed requirements, and support the strategic direction of Ofcom.
Work with IT teams, business analysts and data analytics teams to understand data consumers' needs and develop solutions. By maintaining your knowledge of emerging trends in data usage, tools and analysis techniques, you will support our ongoing development activities and continually promote data innovation as a means to achieve the business outcomes of specific Groups and of Ofcom. You will need to be self-motivated, an effective communicator, and have a collaborative delivery approach. You will work in a collaborative cross-functional environment and interact with the full spectrum of colleagues (data engineers, data analysts, data scientists, operational support and policy makers), and you will need to inform and influence senior managers. Requirements of the Role Build strong relationships with colleagues across the business, understanding the motivations behind their projects, and own the technical activities that translate business requirements (both functional and non-functional) into a solution, ensuring the required business value is delivered. Foster a customer-centric approach to ensure delivery of business value, and an iterative approach that responds to feedback and changing needs. Perform deep dives into technical areas to solve specific solution or design challenges, using trials or POCs to prove or discount an approach and to critique your own design. You will be responsible for ensuring that the solutions you help deliver form an integral part of the ICT estate and align with the wider reference architecture and domain roadmaps. Manage stakeholder expectations and be flexible, working on many different projects and topics at the same time. Manage proactive and reactive communication. Facilitate difficult discussions within the team or with diverse senior stakeholders and external third parties as necessary. Provide documentation of solutions detailing the business, data, application and technology layers.
Work with Data Engineers to define data pipelines and data lakes, covering the ingestion, ETL or ELT, and cataloguing of data. Take overall responsibility for planning effective data storage, cost, security, quality, sharing, availability, retention and publishing within the organisation. Develop analytics policy, standards and guidelines. Ensure successful transitions of solutions into production, making sure production support have the necessary knowledge and documentation to support the service. Skills, knowledge and experience Robust data and technical/solutions architecture skills - sets direction for, and possesses a deep understanding of, architecture and strategies which integrate with industry trends, e.g. TOGAF. Hands-on experience with analytical tools and languages such as Python, R, SQL, Azure Data Factory (and SSIS), Databricks, Power BI, Git etc. Experience of infrastructure: cloud-based technologies used for storage, data lakes, data warehouses, data streaming, databases, and ETL/ELT/transformation. Experience of DevOps/DataOps methods in the development of data solutions, ensuring pipelines and processes can be automated and controlled. Experience with cloud-based data and analytical initiatives utilising Azure, AWS, Google Cloud or similar cloud services. Experience of working closely with data professionals (e.g. Data Scientists and Data Analysts) to understand their needs. Experience of implementing statistical, artificial intelligence, machine learning and deep learning applications. Experience with integrations (e.g. via APIs) with external vendors to share data between organisations. Experience of working with external technology suppliers and service providers to deliver business solutions. SFIA Skills: Enterprise and business architecture (STPL) - Level 5; Solution architecture (ARCH) - Level 5; Requirements definition and management (REQM) - Level 5; Database design (DBDS) - Level 5; Analytics (INAN) - Level 4; Emerging technology monitoring (EMRG) - Level 4; Relationship management (RLMT) - Level 5.
04/11/2021
Full time
Team Overview
Data is central to Ofcom's work. We use data from a wide range of sources to understand the dynamics of the sectors we regulate and to provide key insights that inform policy decisions. In addition to existing regulatory responsibilities, there will be exciting opportunities to contribute to Ofcom's new duties in relation to the regulation of Online Harms. This new area of work is expected to expand the scope of data-driven activities at Ofcom in terms of the variety and volume of data, as well as the range of analysis we do. To ensure that Ofcom has the appropriate data capabilities to undertake this analysis effectively, ICT has created a new role: Data Solutions Architect.

Purpose of the Role
The Data Solutions Architect works closely with ICT and the Data Innovation Hub. You will architect, design, implement and oversee the operation of data solutions that empower data professionals to deliver their work efficiently and effectively. Candidates will exhibit critical thinking skills, the ability to synthesise complex problems, and the skills and experience to transform data into solutions that add value across a wide range of business requirements. You must have a deep understanding of the full data lifecycle and the role that high-quality data plays across applications, machine learning, business analytics and reporting. You will lead the design and development of solution architectures in response to business requirements. This includes identifying and evaluating alternative architectures and their trade-offs in cost, performance and scalability, and ensuring that the relevant technical strategies, policies, standards and practices (including security) are applied correctly. The end-to-end solution will be fit for purpose - i.e. it will meet the needs of the business and the agreed requirements, and support the strategic direction of Ofcom. 
Work with IT teams, business analysts and data analytics teams to understand data consumers' needs and develop solutions. By maintaining your knowledge of emerging trends in data usage, tools and analysis techniques, you will support our ongoing development activities and continually promote data innovation as a means to achieve business outcomes for specific Groups and for Ofcom as a whole. You will need to be self-motivated, an effective communicator and collaborative in your delivery approach. You will work in a cross-functional environment, interacting with the full spectrum of colleagues (data engineers, data analysts, data scientists, operational support and policy makers), and you will need to inform and influence senior managers.

Requirements of the Role
  • Build strong relationships with colleagues across the business, understanding the motivations behind their projects, and own the technical activities that translate business requirements (both functional and non-functional) into a solution, ensuring the required business value is delivered.
  • Foster a customer-centric approach to ensure delivery of business value, and an iterative approach that responds to feedback and changing needs.
  • Perform deep dives into technical areas to solve specific solution or design challenges, using trials or proofs of concept (POCs) to prove or discount an approach and to critique your own design.
  • Ensure that the solutions you help deliver form an integral part of the ICT estate and align with the wider reference architecture and domain roadmaps.
  • Manage stakeholder expectations and be flexible, working on many different projects and topics at the same time. Manage proactive and reactive communication.
  • Facilitate difficult discussions within the team or with diverse senior stakeholders and external third parties as necessary.
  • Provide documentation of solutions detailing the business, data, application and technology layers. 
Work with Data Engineers to define data pipelines and data lakes, covering the ingestion, ETL or ELT, and cataloguing of data. Take overall responsibility for planning effective data storage, cost, security, quality, sharing, availability, retention and publishing within the organisation. Develop analytics policy, standards and guidelines. Ensure successful transitions of solutions into production, making sure production support has the necessary knowledge and documentation to support the service.

Skills, knowledge and experience
  • Robust Data and Technical/Solutions Architecture skills - sets direction for, and possesses a deep understanding of, architectures and strategies that align with industry frameworks, e.g. TOGAF.
  • Hands-on experience with analytical tools and languages such as Python, R, SQL, Azure Data Factory (and SSIS), Databricks, Power BI and Git.
  • Experience of infrastructure: cloud-based technologies used for storage, data lakes, data warehouses, data streaming, databases and ETL/ELT/transformation.
  • Experience of DevOps/DataOps methods in the development of data solutions, ensuring pipelines and processes can be automated and controlled.
  • Experience of cloud-based data and analytics initiatives using Azure, AWS, Google Cloud or similar cloud services.
  • Experience of working closely with data professionals (e.g. Data Scientists and Data Analysts) to understand their needs.
  • Experience of implementing statistical, Artificial Intelligence, Machine Learning and Deep Learning applications.
  • Experience with integrations (e.g. via APIs) with external vendors to share data between organisations.
  • Experience of working with external technology suppliers and service providers to deliver business solutions.

SFIA Skills
  • Enterprise and business architecture (STPL) - Level 5
  • Solution architecture (ARCH) - Level 5
  • Requirements definition and management (REQM) - Level 5
  • Database design (DBDS) - Level 5
  • Analytics (INAN) - Level 4
  • Emerging technology monitoring (EMRG) - Level 4
  • Relationship management (RLMT) - Level 5
Nutmeg
Senior Data Engineer
Nutmeg
Who we are: Nutmeg is Europe's leading Digital Wealth Manager, but we don't want to stop there. We're continuing to build our platform to help us achieve our mission of being the most trusted Digital Wealth Manager in the world. Since being founded in 2011 we've: Grown to 160+ employees Raised over £100M in funding Launched 4 amazing products, including JISA and Lifetime ISA Won multiple awards, including Best Online Stocks & Shares ISA Provider for the fifth year in a row! We hit the 130,000 investor milestone in early 2021 and now manage over £3 billion AUM. *We offer flexible working* Job in a nutshell: We run a pure AWS-based cloud environment and deliver features using a continuous delivery approach. Our data platform comprises a mix of services and open-source products running fully in Kubernetes and utilising AWS-native data solutions. Nutmeg's data solution is a mix of batch and streaming processes leveraging Airflow, Apache Kafka and AWS data tools. Our key characteristic is enabling a self-service experience for all data stakeholders. Nutmeg products are served by a polyglot mix of microservices designed following Domain-Driven Design principles and composing an Event-Driven Architecture powered by Apache Kafka. As a Senior Data Engineer, you will collaborate closely with technical and non-technical teams to deliver data solutions supporting Nutmeg's data strategy. We are looking for someone with previous experience as a senior engineer and a strong passion for data challenges. 
Requirements
Your skills:
  • Following data engineering industry best practice
  • Full ownership of end-to-end data pipelines
  • Designing, implementing, and maintaining data models
  • Writing automated tests around data models
  • Understanding of CI/CD principles
  • Experience with cloud platforms for data (ideally AWS)
  • Experience in converting business requirements into technical deliverables
  • Previous experience with two or more of the following: Airflow, dbt, Kafka Connect, Looker, Python, and Redshift
You might also have:
  • DataOps best practice
  • Experience in collaborating with BI and Data Science teams
  • Use of agile/lean methodologies for continuous delivery and improvement
  • Knowledge of monitoring, metrics or Site Reliability Engineering
  • Understanding of data governance and security standards
Benefits
  • 25 days' holiday
  • Birthday day off
  • 2 days' paid community leave
  • Competitive salary
  • Private healthcare with Vitality from day 1
  • Access to a digital GP and other healthcare resources
  • Season ticket and bike loans
  • Access to a wellbeing platform & regular knowledge sharing
  • Regular homeworking perks and rewards
  • Cycle storage and showers onsite
  • Discounted Nutmeg account for you and your family and friends
  • Part of an inclusive Nutmeg team
15/09/2021
Full time
Jonothan Bosworth
Lead Data Engineer
Jonothan Bosworth Gloucester, Gloucestershire
Lead Data Engineer | Remote working | Gloucester | £65,000 - £80,000 Jonothan Bosworth Recruitment Specialists are currently seeking a Lead Data Engineer to join a well-established company at the forefront of a new growth plan and underway with an ambitious programme of work. You will join a new data team as part of the emerging data strategy. This opportunity is for an experienced Data Engineer looking to progress into a lead role, innovating with the latest technologies to design solutions and lead technical teams building internal as well as client-facing solutions using Databricks, the Azure stack, and Power BI. As Lead Data Engineer you will help build high-performance data platforms from the ground up and establish and manage the Data Engineering team along the way, ensuring they develop, maintain and optimise data pipelines using best practice within a DataOps methodology.

THE BASICS: You will design and implement numerous complex data flows to connect operational systems, data for analytics and business intelligence (BI) systems. Specifically:
  • Design and implement data storage and processing solutions.
  • Ensure data security and compliance.
  • Monitor and optimise data solutions.
  • Build Data Engineering capacity through technical support and personal development of Data Engineers.
  • Inspire best practice for data products and services, and work with senior team members to identify, plan, develop and deliver data services.

KEY SKILLS:
  • Experience leading a team, along with Cloud architecture and distributed systems.
  • Experience on Big Data projects, using Big Data frameworks to create data pipelines with the latest stream-processing systems (e.g. Kafka, Storm, Spark Streaming).
  • Advanced programming/scripting (Java, Python, R etc.) 
  • Data strategy, architectures and governance; data management and security.
  • Data integrations using Azure Data Factory, Databricks and APIs.
  • Data repositories in SQL Server and Analysis Services.
  • Data modelling, SQL, Azure Data Warehouse and reporting solutions.
  • Able to work well under pressure; flexible, positive and focused during times of change.
Travel to Gloucester twice a week. For more information, please contact Claire at Jonothan Bosworth Recruitment Specialists. NC_20_LDE_CE We are an equal opportunities employer, committed to diversity and inclusion. We are active anti-slavery advocates and prohibit discrimination and harassment of any kind based on race, colour, sex, religion, sexual orientation, national origin, disability, genetic information, pregnancy, or any other protected characteristic.
14/09/2021
Full time
Nutmeg
Senior Data Engineer
Nutmeg
Who we are: Nutmeg is Europe's leading Digital Wealth Manager, but we don't want to stop there. We're continuing to build our platform to help us achieve our mission of being the most trusted Digital Wealth Manager in the world. Since being founded in 2011 we've: Grown to 160+ employees Raised over £100M in funding Launched 4 amazing products, including JISA and Lifetime ISA Won multiple awards, including Best Online Stocks & Shares ISA Provider for the fifth year in a row! We hit the 130,000 investor milestone in early 2021 and now manage over £3 billion AUM. *We offer flexible working* Job in a nutshell: We run a pure AWS-based cloud environment and deliver features using a continuous delivery approach. Our data platform comprises a mix of services and open-source products running fully in Kubernetes and utilising AWS-native data solutions. Nutmeg's data solution is a mix of batch and streaming processes leveraging Airflow, Apache Kafka and AWS data tools. Our key characteristic is enabling a self-service experience for all data stakeholders. Nutmeg products are served by a polyglot mix of microservices designed following Domain-Driven Design principles and composing an Event-Driven Architecture powered by Apache Kafka. As a Senior Data Engineer, you will collaborate closely with technical and non-technical teams to deliver data solutions supporting Nutmeg's data strategy. We are looking for someone with previous experience as a senior engineer and a strong passion for data challenges. 
Requirements
Your skills:
  • Following data engineering industry best practice
  • Full ownership of end-to-end data pipelines
  • Designing, implementing, and maintaining data models
  • Writing automated tests around data models
  • Understanding of CI/CD principles
  • Experience with cloud platforms for data (ideally AWS)
  • Experience in converting business requirements into technical deliverables
  • Previous experience with two or more of the following: Airflow, dbt, Kafka Connect, Looker, Python, and Redshift
You might also have:
  • DataOps best practice
  • Experience in collaborating with BI and Data Science teams
  • Use of agile/lean methodologies for continuous delivery and improvement
  • Knowledge of monitoring, metrics or Site Reliability Engineering
  • Understanding of data governance and security standards
Benefits
  • 25 days' holiday
  • Birthday day off
  • 2 days' paid community leave
  • Competitive salary
  • Private healthcare with Vitality from day 1
  • Access to a digital GP and other healthcare resources
  • Season ticket and bike loans
  • Access to a wellbeing platform & regular knowledge sharing
  • Regular homeworking perks and rewards
  • Cycle storage and showers onsite
  • Discounted Nutmeg account for you and your family and friends
  • Part of an inclusive Nutmeg team
14/09/2021
Full time


© 2008-2025 IT Job Board