  • Home
  • Find IT Jobs
  • Register CV
  • Career Advice
  • Contact us
  • Employers
    • Register as Employer
    • Pricing Plans
  • Recruiting? Post a job
  • Sign in
  • Sign up
Sorry, that job is no longer available. Here are some results that may be similar to the job you were looking for.

11 jobs found

Current Search: senior dataops engineer
Cambridge University Press & Assessment
Principal Data Scientist
Cambridge University Press & Assessment Cambridge, UK
Job Title: Principal Data Scientist
Salary: £74,200 - £99,250
Location: Cambridge/Hybrid, with 2 days per week at the office
Contract: Permanent
Hours: Full time, 35 hours per week

Are you excited by the challenge of applying data science and AI to problems that genuinely matter? At Cambridge Assessment, we are transforming how assessments are designed, delivered and marked worldwide. As a Principal Data Scientist, you will play a pivotal role at the heart of this transformation – leading our data science capability for AI-enabled assessment products used by millions of learners globally. We are Cambridge University Press & Assessment, a world-leading academic publisher and assessment organisation and a proud part of the University of Cambridge.

This is a senior, influential role where you will combine deep technical expertise with strategic leadership. You will shape our data strategy, lead and mentor a growing team, and work closely with researchers, engineers and product teams to turn complex data into insight, innovation and trusted solutions.

About the role

As Principal Data Scientist, you will lead the operational data science and analytics capability within our Assessment & Research Capabilities (ARC) function. You will be the data leader for automarking, representing ARC's data capability across Exam Technology and the wider organisation.
You will:
  • Set the direction for data science and analytics supporting automarking and AI-driven assessment
  • Lead and grow a small, high-impact team of data scientists and engineers
  • Curate high-quality data products used across research, machine learning and product teams
  • Act as a trusted partner to senior stakeholders, influencing product and research decisions with evidence and insight
  • Ensure sensitive exam and candidate data is handled responsibly and ethically

Additional responsibilities and accountabilities include:
  • Lead data science, data engineering and analytics activities within ARC
  • Define and own the data strategy for automarking and related AI capabilities
  • Design and oversee data warehouses, pipelines and integrations with the wider organisation
  • Translate complex business and research needs into robust data solutions
  • Provide expert input into product, research and architectural decisions, up to board level
  • Build strong relationships with internal teams and external research partners
  • Champion best practice in data quality, DataOps and analytics engineering

This position has been classified as a hybrid role, requiring the selected candidate to typically spend 40-60% of their time collaborating and connecting face-to-face at their dedicated location. Aside from our hybrid principles, other flexible working requests will be considered from the first day of employment, including other work arrangements should you require adjustments due to a disability or long-term health condition.
About You

To be successful in this role, you will bring:
  • Extensive experience in data science, analytics or analytics engineering in a complex environment
  • Advanced SQL skills, including writing, analysing and optimising large analytical queries
  • Strong experience with a data science programming language such as Python, R or Julia
  • Hands-on experience with data transformation tools such as dbt, Dataform or SQLMesh
  • Experience using BI and visualisation tools such as Metabase, Looker, Tableau or Power BI
  • A strong understanding of data warehousing principles (e.g. Kimball methodology)
  • Experience designing data models that enable self-service analytics
  • Proven ability to translate business or research questions into data-driven insights
  • Experience communicating complex technical concepts to non-technical and senior audiences
  • Leadership experience, including mentoring and guiding other data professionals

If you meet the above minimum requirements, we encourage you to apply. Your application will be even stronger if you can also demonstrate the following desirable criteria:
  • Machine learning or AI product experience
  • Exposure to automarking, assessment, or high-stakes data environments
  • Skills in experimentation and statistical analysis (A/B testing, forecasting)
  • Familiarity with DataOps (CI/CD, testing, orchestration, observability)

For a detailed job description, please refer to the link at the bottom of the advert on our careers site. We are a Disability Confident (DC) employer committed to equality and inclusion, ensuring our recruitment process is accessible to all. The DC scheme's Offer of an Interview commitment applies to applicants who opt in, disclose a disability or a long-term health condition, and best meet the minimum criteria for the role.
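The dimensional-modelling and analytical SQL skills listed above can be illustrated with a minimal sketch. All table and column names below are hypothetical examples of a Kimball-style star schema for marking data, not taken from the employer's systems:

```python
import sqlite3

# Hypothetical star schema: one dimension table, one fact table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_exam (exam_id INTEGER PRIMARY KEY, subject TEXT);
CREATE TABLE fact_marking (
    mark_id    INTEGER PRIMARY KEY,
    exam_id    INTEGER REFERENCES dim_exam(exam_id),
    auto_mark  REAL,   -- mark assigned by the automarking model
    human_mark REAL    -- mark assigned by a human examiner
);
INSERT INTO dim_exam VALUES (1, 'Maths'), (2, 'English');
INSERT INTO fact_marking VALUES
    (1, 1, 58.0, 60.0),
    (2, 1, 71.0, 70.0),
    (3, 2, 45.0, 44.0);
""")

# A typical analytical query: average auto/human mark gap per subject,
# joining the fact table to its dimension (the classic star-join pattern).
rows = cur.execute("""
SELECT d.subject,
       ROUND(AVG(f.auto_mark - f.human_mark), 2) AS avg_gap,
       COUNT(*) AS n_scripts
FROM fact_marking AS f
JOIN dim_exam AS d ON d.exam_id = f.exam_id
GROUP BY d.subject
ORDER BY d.subject
""").fetchall()
print(rows)  # → [('English', 1.0, 1), ('Maths', -0.5, 2)]
```

In practice a tool such as dbt would materialise models like these and test them in CI; the sqlite session here is only a self-contained stand-in.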
In instances where interviewing all qualifying candidates is not practicable, we prioritise those who best meet the minimum criteria, as we would for applicants who do not have a disability or long-term health condition. Cambridge University Press & Assessment is an approved UK employer for the sponsorship of eligible roles and applicants under the Skilled Worker visa route. Please refer to the gov.uk website for guidance to understand your own eligibility based on the role you are applying for.

Rewards and benefits

We will support you to be at your best in work and to live well outside of it. In addition to competitive salaries, we offer a world-class, flexible rewards package, featuring family-friendly and planet-friendly benefits including:
  • 28 days annual leave plus bank holidays
  • Private medical and Permanent Health Insurance
  • Discretionary annual bonus
  • Group personal pension scheme
  • Life assurance up to 4 x annual salary
  • Green travel schemes

Ready to pursue your potential? Apply now. We aim to support candidates by making our interview process clear and transparent. The closing date for all applications will be 13 March 2026. We will review applications on an ongoing basis, and shortlisted candidates can expect interviews to take place shortly after it closes. As part of the application process, you can expect:
  • At application stage: four technical questions to answer when submitting your CV.
  • Stage 1: a 30-minute screening call with the hiring manager.
  • Stage 2: a 60-minute session with questions about key skills, as well as a code review or whiteboard exercise.
  • Stage 3: a 90-minute system design exercise, with an assignment provided at least three days before the interview; the designs are explained and discussed during the interview itself.
  • Stage 4: a 45-minute leadership and cultural interview.
If you require any reasonable adjustments during the recruitment process due to a disability or a long-term health condition, there will be an opportunity for you to inform us via the online application form. We will do our best to accommodate your needs. Please note that successful applicants will be subject to satisfactory background checks, including DBS, due to working in a regulated industry. We are committed to an equitable recruitment process; as such, applications must be submitted via our official online application procedure. Please refrain from sending your CV directly to our recruiters. If you experience technical difficulties or require additional support with submitting your online application, contact the recruiter.

Why join us

Joining us is your opportunity to pursue potential. You will belong to a collaborative team that is exploring new and better ways to serve students, teachers and researchers across the globe – for the benefit of individuals, society and the world. Sharing our mission will inspire your own growth, development and progress, in an environment which embraces difference, change and aspiration. Cambridge University Press & Assessment is committed to being a place where anyone can enjoy a successful career, where it is safe to speak up, and where we learn continuously to improve together. We welcome applications from all candidates, regardless of demographic characteristics (age, disability, educational attainment, ethnicity, gender, marital status, neurodiversity, religion, sex, gender identity and sexual identity) or cultural or social background. We believe better outcomes come through diversity of thought, background and approach, and we actively seek to employ people from a wide range of different communities.
02/03/2026
Full time
Sanderson
Data Engineering Manager
Sanderson Edinburgh, Midlothian
Data Engineering Manager

Who are Diligenta? Diligenta's vision is to be acknowledged as a best-in-class, platform-based Life and Pensions administration service provider. Customer service is at the heart of everything we do, and our aim is to transform our clients' operations. A business that has been described as 'home' by existing employees, we drive a culture founded on positive change and development.

Summary of the role: As a Data Engineering Manager, you will be responsible for the engineering teams in the Data delivery and Data management scopes. These are key positions within the Data office, as they provide technical leadership, mentorship and team development, as well as contributing to the development and implementation of our Data strategy. You will be responsible for organising the team in an efficient and effective manner and prioritising the work to align with organisational priorities. This includes providing technical leadership during development phases, Agile-based delivery and engineering best practices. As in any engineering management position, there will be a certain amount of hands-on work, so a technical background is required, but the main focus will be the management of the team, workstreams and software lifecycles.

Benefits:
  • 33 days annual leave, including Bank Holidays
  • Eligibility for an annual discretionary bonus scheme
  • Personal and career development opportunities to progress your aspirations within the company, as well as through our global parent company (Tata Consultancy Services)
  • Access to Perks at Work (an online discounted shopping platform), saving you money on a wide range of goods and services, including your weekly food shop, holidays and electrical goods
  • Cycle to Work Scheme and interest-free Season Ticket loans
  • A companywide wellbeing programme, including an Employee Assistance Programme and other benefits/resources to support your mental, physical and financial wellbeing
  • A comprehensive set of Moments that Matter policies, such as Carer's Leave, Foster Leave and Retirement Leave
  • A contributory company pension scheme where we match your contributions up to 6%, Group Life Assurance ('Death in Service') and Group Income Protection

What you'll be doing:
  • Contribute to, create and implement best practices for coding standards, naming conventions and design briefs
  • Oversee coding standards and repository management
  • Ensure the correct documentation is created and maintained for all products
  • Work with the PM and BA tier to estimate work, prioritise and maintain the backlog
  • Provide team leadership by setting standards for behaviour and fostering a culture of collaboration and growth
  • Manage team development plans, ensuring each team member has a meaningful technical and career development plan
  • Work towards the creation, implementation and evolution of competency frameworks for all engineering positions
  • Lead the engineering team in the design, building and maintenance of scalable and robust pipelines and the supporting MS Fabric infrastructure
  • Carry out strategic planning for team expansion
  • Work with PM resources to schedule, complete and document post-sprint and project retrospectives, ensuring that identified actions are carried out as required
  • Identify, qualify and implement tooling requirements
  • Provide high-level technical overviews and insights to senior leadership
  • Provide incident management as required and follow up with comprehensive RCAs
  • Help create and implement a DataOps environment

What we're looking for:
  • Prior experience in a similar role, as an Engineering Manager, Technical Lead or Senior Engineer, in either a software or data delivery environment
  • Educated to at least A-level or an equivalent standard
  • Solid understanding of data governance concepts
  • Strong analytical skills
  • Good stakeholder management skills
  • Familiarity with data privacy regulations such as GDPR
  • Knowledge or understanding of ETL/ELT solutions
  • Knowledge of relational and non-relational data management systems
  • Understanding of big data environments
  • Knowledge of CI/CD tools and processes
  • Experience with Git source control
  • Experience of working in an Agile delivery environment
  • Experience of coaching and developing teams
  • Software or data product delivery

If you need any help or adjustments during the recruitment process, please let us know. Ready to take the next step in your career? Apply today and become part of our innovative team!
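The DataOps and CI/CD skills this role asks for can be sketched with a minimal automated data-quality gate of the kind a pipeline might run before loading a batch. The checks, field names and thresholds here are hypothetical illustrations, not Diligenta's actual pipeline:

```python
# Minimal sketch of a data-quality gate a DataOps CI step might run.
# Field names ("policy_id", "premium") are invented for illustration.
def validate_batch(rows):
    """Return a list of human-readable failures for one extracted batch."""
    failures = []
    if not rows:
        failures.append("batch is empty")
    for i, row in enumerate(rows):
        if row.get("policy_id") in (None, ""):
            failures.append(f"row {i}: missing policy_id")
        premium = row.get("premium")
        if not isinstance(premium, (int, float)) or premium < 0:
            failures.append(f"row {i}: invalid premium {premium!r}")
    return failures

batch = [
    {"policy_id": "P-100", "premium": 42.5},   # valid row
    {"policy_id": "", "premium": -1},          # two problems
]
problems = validate_batch(batch)
print(problems)  # → ['row 1: missing policy_id', 'row 1: invalid premium -1']
```

In a CI/CD setup, a step like this would fail the pipeline (non-zero exit) when `problems` is non-empty, so bad extracts never reach the warehouse.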
02/10/2025
Full time
Sanderson
Data Engineering Manager
Sanderson Peterborough, Cambridgeshire
02/10/2025
Full time
Data Engineering Manager Who are Diligenta? Diligenta's vision is to be acknowledged as Best-in-class Platform-based Life and Pensions Administration Service provider. Customer service is at the heart of everything we do, and our aim is to transform our clients' operations. A business that has been described as 'home' by existing employees, we drive a culture that is founded on positive change and development. Summary of the role: As a Data Engineering Manager, you will be responsible for the engineering teams in the Data delivery and Data management scopes. These are key positions within the Data office as they provide technical leadership, mentorship and team development, as well as contributing to the development and implementation of our Data strategy.You will be responsible for organising the team in an efficient and effective manner and prioritising the work to align with organisational priorities. This includes providing technical leadership during development phases, Agile based delivery and engineering best practices. As in any engineering management position, there will be a certain amount of hands-on work, and so a technical background is required, but the main focus will be the management of the team, workstreams and software lifecycles. Benefits: 33 days including Bank Holidays Eligibility for an annual discretionary bonus scheme Personal and career development opportunities to progress your aspirations within the company as well as through our global parent company (Tata Consultancy Services) Access to Perks at Work (an online discounted shopping platform) saving you money on a wide range of goods and services, including your weekly food shop, holidays and electrical goods. Cycle to Work Scheme & Interest-free Season Ticket loans. A companywide Wellbeing programme, including an Employee Assistance Programme and other benefits/resources to support your mental/physical and financial wellbeing. 
A comprehensive set of Moments that Matter policies, such as Carer's Leave, Foster Leave and Retirement Leave A contributory company pension scheme where we match your contributions up to 6%, Group Life Assurance ('Death in Service') & Group Income Protection What you'll be doing: Contributing to the creation and implementation of best practices for coding standards, naming conventions and design briefs. Overseeing coding standards and repository management. Ensuring the correct documentation is created and maintained for all products. Working with the PM and BA tier to estimate work, prioritise and maintain the backlog. Providing team leadership by setting standards for behaviour and fostering a culture of collaboration and growth. Managing team development plans, ensuring each team member has a meaningful technical and career development plan. Working towards the creation, implementation and evolution of competency frameworks for all engineering positions. Leading the engineering team in the design, building and maintenance of scalable and robust pipelines and supporting MS Fabric infrastructure. Carrying out strategic planning for team expansion. Working with PM resources to schedule, complete and document post-sprint and project retrospectives, ensuring that identified actions are carried out as required. Identifying, qualifying and implementing tooling requirements. Providing high-level technical overviews and insights to senior leadership. Providing incident management as required and following up with comprehensive RCAs. Helping create and implement a DataOps environment. What we're looking for: Prior experience in a similar role as an Engineering Manager, Technical Lead or Senior Engineer in either a software or data delivery environment. Educated to at least A-level or equivalent standard. Solid understanding of data governance concepts. Strong analytical skills. Good stakeholder management skills. Familiarity with data privacy regulations such as GDPR. 
Knowledge or understanding of ETL/ELT solutions. Knowledge of relational and non-relational data management systems. Understanding of big data environments. Knowledge of CI/CD tools and processes. Experience with Git source control. Experience of working in an Agile delivery environment. Experience of coaching and developing teams. Software or data product delivery experience. If you need any help or adjustments during the recruitment process, please let us know. Ready to take the next step in your career? Apply today and become part of our innovative team!
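The ETL/ELT and pipeline items above are requirements rather than explanations, but the basic extract-transform-load shape they refer to can be sketched in a few lines of Python. This is a minimal, hypothetical illustration only (stdlib only; the policy data, table and column names are invented, not Diligenta's actual schema):

```python
import csv
import io
import sqlite3

# Hypothetical raw extract; in a real pipeline this would come from a source system.
RAW = """policy_id,holder,premium
P001,Alice,120.50
P002,Bob,98.00
"""

def extract(text: str) -> list[dict]:
    """Extract: parse the raw CSV feed into rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: normalise holder names and cast premiums to pence."""
    return [(r["policy_id"], r["holder"].upper(), int(float(r["premium"]) * 100))
            for r in rows]

def load(rows: list[tuple]) -> sqlite3.Connection:
    """Load: write the cleaned rows into a relational target (in-memory here)."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE policies (policy_id TEXT, holder TEXT, premium_pence INT)")
    con.executemany("INSERT INTO policies VALUES (?, ?, ?)", rows)
    return con

con = load(transform(extract(RAW)))
total = con.execute("SELECT SUM(premium_pence) FROM policies").fetchone()[0]
print(total)  # 21850
```

In an ELT variant the raw rows would be loaded first and the transform pushed down into the warehouse; the separation of steps is what makes the pipeline testable and schedulable.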
Senior DevOps Engineer
GlaxoSmithKline Stevenage, Hertfordshire
Site Name: UK - Hertfordshire - Stevenage, USA - Connecticut - Hartford, USA - Delaware - Dover, USA - Maryland - Rockville, USA - Massachusetts - Cambridge, USA - Massachusetts - Waltham, USA - New Jersey - Trenton, USA - Pennsylvania - Upper Providence Posted Date: Jun 6 2022 The mission of the Data Science and Data Engineering (DSDE) organization within GSK Pharmaceuticals R&D is to get the right data, to the right people, at the right time. The Data Framework and Ops organization ensures we can do this efficiently, reliably, transparently, and at scale through the creation of a leading-edge, cloud-native data services framework. We focus heavily on developer experience, on strong, semantic abstractions for the data ecosystem, on professional operations and aggressive automation, and on transparency of operations and cost. Designing and implementing data flows and data products that leverage internal and external data assets and tools to drive discovery and development is a key objective for the DSDE team within GSK's Pharmaceutical R&D organization. There are five key drivers for this approach, which are closely aligned with GSK's corporate priorities of Innovation, Performance and Trust: Automation of end-to-end data flows: faster, reliable ingestion of high-throughput data in genetics, genomics and multi-omics, to extract value from investments in new technology (instrument to analysis-ready data in ...). Enabling governance by design of external and internal data: with engineered practical solutions for controlled use and monitoring. Innovative disease-specific and domain-expert-specific data products: to enable computational scientists and their research unit collaborators to get to key insights faster, leading to faster biopharmaceutical development cycles. 
Supporting e2e code traceability and data provenance: increasing assurance of data integrity through automation and integration. Improving engineering efficiency: extensible, reusable, scalable, updateable, maintainable, virtualized, traceable data and code, driven by data engineering innovation and better resource utilization. We are looking for experienced Senior DevOps Engineers to join our growing DataOps team. A Senior DevOps Engineer is a highly technical individual contributor, building modern, cloud-native, DevOps-first systems for standardizing and templatizing biomedical and scientific data engineering, with demonstrable experience across the following areas: Deliver declarative components for common data ingestion, transformation and publishing techniques Define and implement data governance aligned to modern standards Establish scalable, automated processes for data engineering teams across GSK Act as a thought leader and partner with wider DSDE data engineering teams to advise on implementation and best practices Cloud Infrastructure-as-Code Define service and flow orchestration Data as a configurable resource (including configuration-driven access to scientific data modelling tools) Observability (monitoring, alerting, logging, tracing, etc.) Enable quality engineering through KPIs, code coverage and quality checks Standardise a GitOps/declarative software development lifecycle Audit as a service Senior DevOps Engineers take full ownership of delivering high-performing, high-impact biomedical and scientific DataOps products and services, from a description of a pattern that customer Data Engineers are trying to use all the way through to final delivery (and ongoing monitoring and operations) of a templated project and all associated automation. 
They are standard-bearers for software engineering and quality coding practices within the team and are expected to mentor more junior engineers; they may even coordinate the work of more junior engineers on a large project. They devise useful metrics for ensuring their services are meeting customer demand and having an impact, and iterate to deliver and improve on those metrics in an agile fashion. Successful Senior DevOps Engineers are developing expertise with the types of data and tools that are leveraged in the biomedical and scientific data engineering space, and have the following skills and experience (with significant depth in one or more of these areas): Demonstrable experience deploying robust modularised/container-based solutions to production (ideally GCP) and leveraging the Cloud Native Computing Foundation (CNCF) ecosystem Significant depth in DevOps principles and tools (e.g. GitOps, Jenkins, CircleCI, Azure DevOps, etc.), and how to integrate these tools with other productivity tools (e.g. Jira, Slack, Microsoft Teams) to build a comprehensive workflow Programming in Python, Scala or Go Embedding agile software engineering (task/issue management, testing, documentation, software development lifecycle, source control, etc.) Leveraging major cloud providers, both via Kubernetes or via vendor-specific services Authentication and Authorization flows and associated technologies (e.g. OAuth2 + JWT) Common distributed data tools (e.g. Spark, Hive) The DSDE team is built on the principles of ownership, accountability, continuous development, and collaboration. We hire for the long term, and we're motivated to make this a great place to work. Our leaders will be committed to your career and development from day one. Why you? 
Basic Qualifications: We are looking for professionals with these required skills to achieve our goals: Master's in Computer Science with a focus in Data Engineering, DataOps, DevOps, MLOps, Software Engineering, etc., plus 5 years' job experience (or PhD plus 3 years' job experience) Experience with DevOps tools and concepts (e.g. Jira, GitLab / Jenkins / CircleCI / Azure DevOps / etc.) Excellent with common distributed data tools in a production setting (Spark, Kafka, etc.) Experience with specialized data architecture (e.g. optimizing physical layout for access patterns, including bloom filters, optimizing against self-describing formats such as ORC or Parquet, etc.) Experience with search/indexing systems (e.g. Elasticsearch) Expertise with agile development in Python, Scala, Go, and/or C++ Experience building reusable components on top of the CNCF ecosystem, including Kubernetes Metrics-first mindset Experience mentoring junior engineers into deep technical expertise Preferred Qualifications: If you have the following characteristics, it would be a plus: Experience with agile software development Experience with building and designing a DevOps-first way of working Experience with building reusable components on top of the CNCF ecosystem, including Kubernetes (or a similar ecosystem) Why GSK? Our values and expectations are at the heart of everything we do and form an important part of our culture. These include Patient focus, Transparency, Respect, Integrity along with Courage, Accountability, Development, and Teamwork. As GSK focuses on our values and expectations and a culture of innovation, performance, and trust, the successful candidate will demonstrate the following capabilities: Operating at pace and agile decision making - using evidence and applying judgement to balance pace, rigour and risk. Committed to delivering high-quality results, overcoming challenges, focusing on what matters, execution. 
Continuously looking for opportunities to learn, build skills and share learning. Sustaining energy and wellbeing. Building strong relationships and collaboration, honest and open conversations. Budgeting and cost consciousness. As a company driven by our values of Patient focus, Transparency, Respect and Integrity, we know inclusion and diversity are essential for us to be able to succeed. We want all our colleagues to thrive at GSK, bringing their unique experiences, ensuring they feel good and keep growing their careers. As a candidate for a role, we want you to feel the same way. As an Equal Opportunity Employer, we are open to all talent. In the US, we also adhere to Affirmative Action principles. This ensures that all qualified applicants will receive equal consideration for employment without regard to race/ethnicity, colour, national origin, religion, gender, pregnancy, marital status, sexual orientation, gender identity/expression, age, disability, genetic information, military service, covered/protected veteran status or any other federal, state or local protected class (US only). We believe in an agile working culture for all our roles. If flexibility is important to you, we encourage you to explore with our hiring team what the opportunities are. Please don't hesitate to contact us if you'd like to discuss any adjustments to our process which might help you demonstrate your strengths and capabilities. You can either call us on , or send an email. As you apply, we will ask you to share some personal information, which is entirely voluntary... click apply for full job details
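The posting above repeatedly emphasises declarative, configuration-driven data engineering ("declarative components", "data as a configurable resource"): the pipeline is described as data and a small engine interprets it, rather than each team hand-writing glue code. A rough, hypothetical sketch of that pattern in Python (all names and transform steps invented for illustration; this is not GSK's framework):

```python
from dataclasses import dataclass

@dataclass
class IngestSpec:
    """Declarative description of an ingestion job: what, not how."""
    source: str
    fmt: str
    transforms: list  # ordered names of reusable transform components

# Registry of reusable, shared transform components.
TRANSFORMS = {
    "strip": lambda rows: [r.strip() for r in rows],
    "upper": lambda rows: [r.upper() for r in rows],
    "dedupe": lambda rows: list(dict.fromkeys(rows)),  # order-preserving
}

def run(spec: IngestSpec, rows: list) -> list:
    """A tiny engine that interprets the spec against a batch of rows."""
    for name in spec.transforms:
        rows = TRANSFORMS[name](rows)
    return rows

# A customer team would submit only the spec, not pipeline code.
spec = IngestSpec(source="gs://bucket/omics/", fmt="csv",
                  transforms=["strip", "upper", "dedupe"])
print(run(spec, [" brca1 ", "tp53", " BRCA1 "]))  # ['BRCA1', 'TP53']
```

The point of the pattern is that the spec can be version-controlled, validated, and templated centrally, which is what makes GitOps-style delivery and "audit as a service" tractable.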
23/09/2022
Full time
Senior Data Operations Engineer
GlaxoSmithKline Stevenage, Hertfordshire
Site Name: UK - Hertfordshire - Stevenage, USA - Connecticut - Hartford, USA - Delaware - Dover, USA - Maryland - Rockville, USA - Massachusetts - Waltham, USA - Pennsylvania - Upper Providence, Warren NJ Posted Date: Aug The mission of the Data Science and Data Engineering (DSDE) organization within GSK Pharmaceuticals R&D is to get the right data, to the right people, at the right time. The Data Framework and Ops organization ensures we can do this efficiently, reliably, transparently, and at scale through the creation of a leading-edge, cloud-native data services framework. We focus heavily on developer experience, on strong, semantic abstractions for the data ecosystem, on professional operations and aggressive automation, and on transparency of operations and cost. Designing and implementing data flows and data products that leverage internal and external data assets and tools to drive discovery and development is a key objective for the DSDE team within GSK's Pharmaceutical R&D organisation. There are five key drivers for this approach, which are closely aligned with GSK's corporate priorities of Innovation, Performance and Trust: Automation of end-to-end data flows: faster, reliable ingestion of high-throughput data in genetics, genomics and multi-omics, to extract value from investments in new technology (instrument to analysis-ready data in ...). Enabling governance by design of external and internal data: with engineered practical solutions for controlled use and monitoring. Innovative disease-specific and domain-expert-specific data products: to enable computational scientists and their research unit collaborators to get to key insights faster, leading to faster biopharmaceutical development cycles. 
Supporting e2e code traceability and data provenance: increasing assurance of data integrity through automation and integration. Improving engineering efficiency: extensible, reusable, scalable, updateable, maintainable, virtualized, traceable data and code, driven by data engineering innovation and better resource utilization. We are looking for an experienced Sr. DataOps Engineer to join our growing DataOps team. A Sr. DataOps Engineer is a highly technical individual contributor, building modern, cloud-native, DevOps-first systems for standardizing and templatizing biomedical and scientific data engineering, with demonstrable experience across the following areas: Deliver declarative components for common data ingestion, transformation and publishing techniques Define and implement data governance aligned to modern standards Establish scalable, automated processes for data engineering teams across GSK Act as a thought leader and partner with wider DSDE data engineering teams to advise on implementation and best practices Cloud Infrastructure-as-Code Define service and flow orchestration Data as a configurable resource (including configuration-driven access to scientific data modelling tools) Observability (monitoring, alerting, logging, tracing, etc.) Enable quality engineering through KPIs, code coverage and quality checks Standardise a GitOps/declarative software development lifecycle Audit as a service Sr. DataOps Engineers take full ownership of delivering high-performing, high-impact biomedical and scientific DataOps products and services, from a description of a pattern that customer Data Engineers are trying to use all the way through to final delivery (and ongoing monitoring and operations) of a templated project and all associated automation. 
They are standard-bearers for software engineering and quality coding practices within the team and are expected to mentor more junior engineers; they may even coordinate the work of more junior engineers on a large project. They devise useful metrics for ensuring their services are meeting customer demand and having an impact, and iterate to deliver and improve on those metrics in an agile fashion. A successful Sr. DataOps Engineer is developing expertise with the types of data and tools that are leveraged in the biomedical and scientific data engineering space, and has the following skills and experience (with significant depth in one or more of these areas): Demonstrable experience deploying robust modularised/container-based solutions to production (ideally GCP) and leveraging the Cloud Native Computing Foundation (CNCF) ecosystem Significant depth in DevOps principles and tools (e.g. GitOps, Jenkins, CircleCI, Azure DevOps, etc.), and how to integrate these tools with other productivity tools (e.g. Jira, Slack, Microsoft Teams) to build a comprehensive workflow Programming in Python, Scala or Go Embedding agile software engineering (task/issue management, testing, documentation, software development lifecycle, source control, etc.) Leveraging major cloud providers, both via Kubernetes or via vendor-specific services Authentication and Authorization flows and associated technologies (e.g. OAuth2 + JWT) Common distributed data tools (e.g. Spark, Hive) The DSDE team is built on the principles of ownership, accountability, continuous development, and collaboration. We hire for the long term, and we're motivated to make this a great place to work. Our leaders will be committed to your career and development from day one. Why you? 
Basic Qualifications: Bachelor's degree in Computer Science with a focus in Data Engineering, DataOps, DevOps, MLOps, Software Engineering, etc., plus 7 years' job experience, or Master's degree with 5 years of experience (or PhD plus 3 years' job experience) Deep experience with DevOps tools and concepts (e.g. Jira, GitLab / Jenkins / CircleCI / Azure DevOps / etc.) Excellent with common distributed data tools in a production setting (Spark, Kafka, etc.) Experience with specialized data architecture (e.g. optimizing physical layout for access patterns, including bloom filters, optimizing against self-describing formats such as ORC or Parquet, etc.) Experience with search/indexing systems (e.g. Elasticsearch) Deep expertise with agile development in Python, Scala, Go, and/or C++ Experience building reusable components on top of the CNCF ecosystem, including Kubernetes Metrics-first mindset Experience mentoring junior engineers into deep technical expertise Preferred Qualifications: If you have the following characteristics, it would be a plus: Experience with agile software development Experience building and designing a DevOps-first way of working Demonstrated experience building reusable components on top of the CNCF ecosystem, including Kubernetes (or a similar ecosystem) Why GSK? Our values and expectations are at the heart of everything we do and form an important part of our culture. These include Patient focus, Transparency, Respect, Integrity along with Courage, Accountability, Development, and Teamwork. As GSK focuses on our values and expectations and a culture of innovation, performance, and trust, the successful candidate will demonstrate the following capabilities: Operating at pace and agile decision making - using evidence and applying judgement to balance pace, rigour and risk. Committed to delivering high-quality results, overcoming challenges, focusing on what matters, execution. 
Continuously looking for opportunities to learn, build skills and share learning. Sustaining energy and wellbeing. Building strong relationships and collaboration, honest and open conversations. Budgeting and cost consciousness. As a company driven by our values of Patient focus, Transparency, Respect and Integrity, we know inclusion and diversity are essential for us to be able to succeed. We want all our colleagues to thrive at GSK, bringing their unique experiences, ensuring they feel good and keep growing their careers. As a candidate for a role, we want you to feel the same way. As an Equal Opportunity Employer, we are open to all talent. In the US, we also adhere to Affirmative Action principles. This ensures that all qualified applicants will receive equal consideration for employment without regard to neurodiversity, race/ethnicity, colour, national origin, religion, gender, pregnancy, marital status, sexual orientation, gender identity/expression, age, disability, genetic information, military service, covered/protected veteran status or any other federal, state or local protected class (US only). We believe in an agile working culture for all our roles. If flexibility is important to you, we encourage you to explore with our hiring team what the opportunities are. Please don't hesitate to contact us if you'd like to discuss any adjustments to our process which might help you demonstrate your strengths and capabilities. You can either call us on , or send an email. As you apply, we will ask you to share some personal information, which is entirely voluntary... click apply for full job details
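Both GSK postings list "optimizing physical layout for access patterns, including bloom filters" under specialized data architecture. A bloom filter is a compact, probabilistic membership test that formats like Parquet and ORC use to let a query engine skip whole data blocks: it can return false positives but never false negatives. A toy, stdlib-only Python sketch of the idea (illustrative only, not any engine's actual implementation):

```python
import hashlib

class BloomFilter:
    """Toy bloom filter: may report false positives, never false negatives."""

    def __init__(self, size_bits: int = 1024, hashes: int = 3):
        self.size = size_bits
        self.hashes = hashes
        self.bits = 0  # a big int doubles as a bit array

    def _positions(self, item: str):
        # Derive k independent bit positions by salting a single hash.
        for i in range(self.hashes):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.size

    def add(self, item: str) -> None:
        for p in self._positions(item):
            self.bits |= 1 << p

    def might_contain(self, item: str) -> bool:
        # True only if every position is set; any clear bit proves absence.
        return all(self.bits >> p & 1 for p in self._positions(item))

bf = BloomFilter()
bf.add("sample-42")
print(bf.might_contain("sample-42"))   # True
print(bf.might_contain("sample-999"))  # almost certainly False
```

In a columnar file a filter like this is stored per row group, so a predicate such as `WHERE sample_id = 'sample-42'` can discard row groups whose filter returns False without reading them at all.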
23/09/2022
Full time
Site Name: UK - Hertfordshire - Stevenage, USA - Connecticut - Hartford, USA - Delaware - Dover, USA - Maryland - Rockville, USA - Massachusetts - Waltham, USA - Pennsylvania - Upper Providence, Warren NJ Posted Date: Aug The mission of the Data Science and Data Engineering (DSDE) organization within GSK Pharmaceuticals R&D is to get the right data, to the right people, at the right time. TheData Framework and Opsorganization ensures we can do this efficiently, reliably, transparently, and at scale through the creation of a leading-edge, cloud-native data services framework. We focus heavily on developer experience, on strong, semantic abstractions for the data ecosystem, on professional operations and aggressive automation, and on transparency of operations and cost. Achieving delivery of the right data to the right people at the right time needs design and implementation of data flows and data products which leverage internal and external data assets and tools to drive discovery and development is a key objective for the Data Science and Data Engineering (DS D E) team within GSK's Pharmaceutical R&D organisation . There are five key drivers for this approach, which are closely aligned with GSK's corporate priorities of Innovation, Performance and Trust: Automation of end-to-end data flows: Faster and reliable ingestion of high throughput data in genetics, genomics and multi-omics, to extract value of investments in new technology (instrument to analysis-ready data in Enabling governance by design of external and internal data: with engineered practical solutions for controlled use and monitoring Innovative disease-specific and domain-expert specific data products : to enable computational scientists and their research unit collaborators to get faster to key insights leading to faster biopharmaceutical development cycles. 
Supporting e2 e code traceability and data provenance: Increasing assurance of data integrity through automation, integration Improving engineering efficiency: Extensible, reusable, scalable,updateable,maintainable, virtualized traceable data and code would b e driven by data engineering innovation and better resource utilization. We are looking for an experienced Sr. Data Ops Engineer to join our growing Data Ops team. As a Sr. Data Ops Engineer is a highly technical individual contributor, building modern, cloud-native, DevOps-first systems for standardizing and templatizingbiomedical and scientificdata engineering, with demonstrable experience across the following areas : Deliver declarative components for common data ingestion, transformation and publishing techniques Define and implement data governance aligned to modern standards Establish scalable, automated processes for data engineering team s across GSK Thought leader and partner with wider DSDE data engineering teams to advise on implementation and best practices Cloud Infrastructure-as-Code D efine Service and Flow orchestration Data as a configurable resource(including configuration-driven access to scientific data modelling tools) Ob servabilty (monitoring, alerting, logging, tracing, ...) Enable quality engineering through KPIs and c ode coverage and quality checks Standardise GitOps /declarative software development lifecycle Audit as a service Sr. DataOpsEngineerstake full ownership of delivering high-performing, high-impactbiomedical and scientificdataopsproducts and services, froma description of apattern thatcustomer Data Engineers are trying touseall the way through tofinal delivery (and ongoing monitoring and operations)of a templated project and all associated automation. 
They are standard-bearers for software engineering and quality coding practices within the team and are expected to mentor more junior engineers; they may even coordinate the work of more junior engineers on a large project. They devise useful metrics for ensuring their services meet customer demand and have an impact, and iterate in an agile fashion to deliver and improve on those metrics. A successful Sr. DataOps Engineer is developing expertise with the types of data and types of tools leveraged in the biomedical and scientific data engineering space, and has the following skills and experience (with significant depth in one or more of these areas): Demonstrable experience deploying robust, modularised, container-based solutions to production (ideally GCP) and leveraging the Cloud Native Computing Foundation (CNCF) ecosystem. Significant depth in DevOps principles and tools (e.g. GitOps, Jenkins, CircleCI, Azure DevOps, ...), and how to integrate these tools with other productivity tools (e.g. Jira, Slack, Microsoft Teams) to build a comprehensive workflow. Programming in Python, Scala or Go. Embedding agile software engineering (task/issue management, testing, documentation, software development lifecycle, source control). Leveraging major cloud providers, both via Kubernetes and via vendor-specific services. Authentication and authorization flows and associated technologies (e.g. OAuth2 + JWT). Common distributed data tools (e.g. Spark, Hive). The DSDE team is built on the principles of ownership, accountability, continuous development, and collaboration. We hire for the long term, and we're motivated to make this a great place to work. Our leaders will be committed to your career and development from day one. Why you? 
Basic Qualifications: Bachelor's degree in Computer Science with a focus in Data Engineering, DataOps, DevOps, MLOps, Software Engineering, etc., plus 7 years of job experience; or Master's degree plus 5 years of experience; or PhD plus 3 years of job experience. Deep experience with DevOps tools and concepts (e.g. Jira, GitLab / Jenkins / CircleCI / Azure DevOps / ...). Excellent with common distributed data tools in a production setting (Spark, Kafka, etc.). Experience with specialized data architecture (e.g. optimizing physical layout for access patterns, including bloom filters; optimizing against self-describing formats such as ORC or Parquet). Experience with search/indexing systems (e.g. Elasticsearch). Deep expertise with agile development in Python, Scala, Go, and/or C++. Experience building reusable components on top of the CNCF ecosystem, including Kubernetes. Metrics-first mindset. Experience mentoring junior engineers into deep technical expertise. Preferred Qualifications: If you have the following characteristics, it would be a plus: Experience with agile software development. Experience building and designing a DevOps-first way of working. Demonstrated experience building reusable components on top of the CNCF ecosystem, including Kubernetes (or a similar ecosystem). LI-GSK Why GSK? Our values and expectations are at the heart of everything we do and form an important part of our culture. These include Patient focus, Transparency, Respect and Integrity, along with Courage, Accountability, Development and Teamwork. As GSK focuses on our values and expectations and a culture of innovation, performance, and trust, the successful candidate will demonstrate the following capabilities: Operating at pace and agile decision-making - using evidence and applying judgement to balance pace, rigour and risk. Committed to delivering high-quality results, overcoming challenges, focusing on what matters, and execution. 
Continuously looking for opportunities to learn, build skills and share learning. Sustaining energy and wellbeing. Building strong relationships and collaboration; honest and open conversations. Budgeting and cost consciousness. As a company driven by our values of Patient focus, Transparency, Respect and Integrity, we know inclusion and diversity are essential for us to be able to succeed. We want all our colleagues to thrive at GSK, bringing their unique experiences, feeling good, and continuing to grow their careers. As a candidate for a role, we want you to feel the same way. As an Equal Opportunity Employer, we are open to all talent. In the US, we also adhere to Affirmative Action principles. This ensures that all qualified applicants will receive equal consideration for employment without regard to neurodiversity, race/ethnicity, colour, national origin, religion, gender, pregnancy, marital status, sexual orientation, gender identity/expression, age, disability, genetic information, military service, covered/protected veteran status or any other federal, state or local protected class (US only). We believe in an agile working culture for all our roles. If flexibility is important to you, we encourage you to explore the opportunities with our hiring team. Please don't hesitate to contact us if you'd like to discuss any adjustments to our process which might help you demonstrate your strengths and capabilities. You can either call us on , or send an email. As you apply, we will ask you to share some personal information, which is entirely voluntary..... click apply for full job details
Omni RMS
Senior Data Architect
Omni RMS Manchester, Lancashire
Team Overview Data is central to Ofcom's work. We use data from a wide range of sources to understand the dynamics of the sectors we regulate and to provide key insights to inform policy decisions. In addition to existing regulatory responsibilities, there will be exciting opportunities to contribute to Ofcom's new duties in relation to the regulation of Online Harms. This new area of work is expected to expand the scope of data-driven activities at Ofcom in terms of the variety and volume of data, as well as the range of analysis we do. To ensure that Ofcom has the appropriate data capabilities to undertake this analysis effectively, ICT has created a new role: Data Solutions Architect. Purpose of the Role The Data Solutions Architect works closely with ICT and the Data Innovation Hub. You will architect, design, implement and oversee the operation of data solutions that empower data professionals to deliver their work efficiently and effectively. Candidates will exhibit critical thinking skills, the ability to synthesise complex problems, and the relevant skills and experience to transform data into solutions that add value to a wide range of business requirements. You must have a deep understanding of the full data lifecycle and the role that high-quality data plays across applications, machine learning, business analytics and reporting. You will lead the design and development of solution architectures in response to business requirements. This includes identifying and evaluating alternative architectures and their trade-offs in cost, performance and scalability, and ensuring that the relevant technical strategies, policies, standards and practices (including security) are applied correctly. The end-to-end solution will be fit for purpose - i.e. it will meet the needs of the business and the agreed requirements, and support the strategic direction of Ofcom. 
Work with IT teams, business analysts and data analytics teams to understand data consumers' needs and develop solutions. By maintaining your knowledge of emerging trends in data usage, tools and analysis techniques, you will support our ongoing development activities and continually promote data innovation as a means to achieve business outcomes for specific Groups and for Ofcom. You will need to be self-motivated, an effective communicator, and have a collaborative delivery approach. You will work in a collaborative cross-functional environment and interact with the full spectrum of colleagues (data engineers, data analysts, data scientists, operational support and policy makers), and you will need to inform and influence senior managers. Requirements of the Role Build strong relationships with colleagues across the business, understanding the motivations behind their projects, and own the technical activities that translate business requirements (both functional and non-functional) into a solution, ensuring the required business value is delivered. Foster a customer-centric approach to ensure delivery of business value, and an iterative approach that responds to feedback and changing needs. Perform deep dives into technical areas to solve specific solution or design challenges, using trials or POCs to prove or discount an approach and to critique your own design. You will be responsible for ensuring that the solutions you help deliver form an integral part of the ICT estate and align with the wider reference architecture and domain roadmaps. Manage stakeholder expectations and be flexible, working on many different projects and topics at the same time. Manage proactive and reactive communication. Facilitate difficult discussions within the team or with diverse senior stakeholders and external / 3rd parties as necessary. Provide documentation of solutions detailing the business, data, application and technology layers. 
Work with Data Engineers to define data pipelines and data lakes, covering the ingestion, ETL or ELT, and the cataloguing of data. Take overall responsibility for planning effective data storage, cost, security, quality, sharing, availability, retention and publishing within the organisation. Develop analytics policy, standards and guidelines. Ensure successful transitions of solutions into production, making sure production support have the necessary knowledge and documentation to support the service. Skills, knowledge and experience Robust Data and Technical/Solutions Architecture skills - sets direction for, and possesses a deep understanding of, architecture and strategies which integrate with industry trends, e.g. TOGAF. Hands-on experience with analytical tools and languages such as Python, R, SQL, Azure Data Factory (and SSIS), Databricks, Power BI, Git, etc. Experience of infrastructure: cloud-based technologies used for storage, data lakes, data warehouses, data streaming, databases, and ETL / ELT / transformation. Experience of DevOps/DataOps methods in the development of data solutions, ensuring pipelines and processes can be automated and controlled. Experience with cloud-based data and analytical initiatives utilising Azure, AWS, Google Cloud or similar cloud services. Experience of working closely with data professionals (e.g. Data Scientists and Data Analysts) to understand their needs. Experience of implementing statistical, Artificial Intelligence, Machine Learning and Deep Learning applications. Experience with integrations (e.g. 
via APIs) with external vendors to share data between organisations. Experience of working with external technology suppliers and service providers to deliver business solutions. SFIA Skills: Enterprise and business architecture (STPL) - Level 5; Solution architecture (ARCH) - Level 5; Requirements definition and management (REQM) - Level 5; Database design (DBDS) - Level 5; Analytics (INAN) - Level 4; Emerging Technology Monitoring (EMRG) - Level 4; Relationship Management (RLMT) - Level 5.
04/11/2021
Full time
Omni RMS
Senior Data Architect
Omni RMS
04/11/2021
Full time
Omni RMS
Senior Data Architect
Omni RMS Warrington, Cheshire
Team Overview Data is central to Ofcom's work. We use data from a wide range of sources to understand the dynamics of the sectors we regulate and to provide key insights to inform policy decisions. In addition to existing regulatory responsibilities, there will be exciting opportunities to contribute to Ofcom's new duties in relation to the regulation of Online Harms. It is expected that this new area of work will expand the scope of data-driven activities at Ofcom in terms of the variety and volume of data, as well as the range of analysis we do. To ensure that Ofcom has the appropriate data capabilities to undertake the analysis effectively, ICT has created a new role for an Data Solutions Architect Purpose of the Role The Data Solutions Architect works closely with ICT and the Data Innovation Hub. You will architect, design, implement and oversee the operations of data solutions that empower data professionals to efficiently and effectively deliver their work. Candidates will exhibit critical thinking skills, the ability to synthesize complex problems, and have relevant skills and experience for enabling the transformation of data to create solutions that add value to a myriad of business requirements. You must have a deep understanding of the full data lifecycle and the role that high-quality data plays across applications, machine learning, business analytics, and reporting. You will lead the design and development of solution architectures in response to business requirements. This includes identifying and evaluating alternative architectures, their trade-offs in cost, performance and scalability. And ensuring that the relevant technical strategies, policies, standards and practices (including security) are applied correctly. The end to end solution will be fit for purpose - i.e. meet the needs of business, the agreed requirements, and support the strategic direction of Ofcom. 
Work with IT teams, business analysts and data analytics teams to understand data consumers' needs and develop solutions By maintaining your knowledge of emerging trends in data usage, tools and analysis techniques you will support our on-going development activities and continually promote data innovation as a means to achieve the business outcomes for specific Groups, and Ofcom. You will need to be self-motivated, an effective communicator and have a collaborative delivery approach.You will work in a collaborative cross-functional environment and interact with the full spectrum of colleagues (data engineers, data analysts, data scientists, operational support and policy makers), and you will need to inform and influence senior managers. Requirements of the Role Build strong relationships with colleagues across the business, understanding their motivations behind projects and own technical activities to translate business requirements (both functional and non-functional) into a solution. Ensuring the required business value is delivered. Fostering a customer centric approach to ensure delivery of business value and an iterative approach that responds to feedback and changing needs. Perform deep dives into technical areas to solve a specific solution or design challenges. Using trials or POCs to prove or discount an approach, to critic your own design. You will be responsible for ensuring that the solutions you help deliver form an integral part of the ICT estate and align with the wider reference architecture and domain roadmaps. Manage stakeholder expectations and be flexible, working on many different projects and topics at the same time period. Manage proactive and reactive communication. Facilitate difficult discussions within the team or with diverse senior stakeholders and external / 3rd parties as necessary. Provide documentation of solutions detailing the business, data, application and technology layers. 
- Work with Data Engineers to define data pipelines and data lakes, covering ingestion, ETL or ELT, and the cataloguing of data.
- Take overall responsibility for planning effective data storage, cost, security, quality, sharing, availability, retention and publishing within the organisation.
- Develop analytics policy, standards and guidelines.
- Ensure successful transitions of solutions into production, making sure production support have the necessary knowledge and documentation to support the service.

Skills, knowledge and experience

- Robust Data and Technical/Solutions Architecture skills: sets direction for, and possesses a deep understanding of, architectures and strategies that integrate with industry frameworks, e.g. TOGAF.
- Hands-on experience with analytical tools and languages such as Python, R, SQL, Azure Data Factory (and SSIS), Databricks, Power BI and Git.
- Experience of infrastructure: cloud-based technologies used for storage, data lakes, data warehouses, data streaming, databases, and ETL/ELT transformation.
- Experience of DevOps/DataOps methods in the development of data solutions, so that pipelines and processes can be automated and controlled.
- Experience with cloud-based data and analytical initiatives utilising Azure, AWS, Google Cloud or similar cloud services.
- Experience of working closely with data professionals (e.g. Data Scientists and Data Analysts) to understand their needs.
- Experience of implementing statistical, Artificial Intelligence, Machine Learning and Deep Learning applications.
- Experience with integrations (e.g. via APIs) with external vendors to share data between organisations.
- Experience of working with external technology suppliers and service providers to deliver business solutions.

SFIA Skills

- Enterprise and business architecture (STPL): Level 5
- Solution architecture (ARCH): Level 5
- Requirements definition and management (REQM): Level 5
- Database design (DBDS): Level 5
- Analytics (INAN): Level 4
- Emerging Technology Monitoring (EMRG): Level 4
- Relationship Management (RLMT): Level 5
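The pipeline duties described above (ingestion, ETL/ELT, cataloguing, loading into a queryable store) share a common shape. A minimal batch ETL step can be sketched in Python; the CSV source, table name and figures below are purely illustrative assumptions, not anything from the role itself:

```python
import csv
import io
import sqlite3

# Minimal batch ETL sketch: ingest raw CSV, transform it, load it into a
# queryable store. Source data and table names are hypothetical.
RAW_CSV = """sector,complaints
broadband,120
mobile,85
"""

def extract(raw: str) -> list:
    """Ingest: parse the raw CSV into dict records."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list) -> list:
    """Transform: normalise keys and cast counts to integers."""
    return [(r["sector"].strip().lower(), int(r["complaints"])) for r in rows]

def load(rows: list, conn: sqlite3.Connection) -> None:
    """Load: write the cleaned records to the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS complaints (sector TEXT, total INTEGER)")
    conn.executemany("INSERT INTO complaints VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT SUM(total) FROM complaints").fetchone()[0]
```

In a production setting the same extract/transform/load stages would typically be orchestrated by a tool such as Azure Data Factory or Databricks rather than run inline.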
04/11/2021
Full time
Nutmeg
Senior Data Engineer
Nutmeg
Who we are: Nutmeg is Europe's leading Digital Wealth Manager, but we don't want to stop there. We're continuing to build our platform to help us achieve our mission of being the most trusted Digital Wealth Manager in the world. Since being founded in 2011 we've:

- Grown to 160+ employees
- Raised over £100M in funding
- Launched 4 amazing products including JISA and Lifetime ISA
- Won multiple awards including Best Online Stocks & Shares ISA Provider for the fifth year in a row!

We hit the 130,000 investor milestone in early 2021 and now manage over £3 billion AUM. *We offer flexible working*

Job in a nutshell: We run a pure AWS-based cloud environment and deliver features using a continuous delivery approach. Our data platform comprises a mix of services and open-source products running fully in Kubernetes and utilising AWS-native data solutions. Nutmeg's data solution is a mix of batch and streaming processes leveraging Airflow, Apache Kafka and AWS data tools. Our key characteristic is enabling a self-service experience for all data stakeholders. Nutmeg products are served by a polyglot mix of microservices designed following Domain-Driven Design principles and composing an Event-Driven Architecture powered by Apache Kafka. As a Senior Data Engineer, you will collaborate closely with technical and non-technical teams to deliver data solutions supporting Nutmeg's data strategy. We are looking for someone with previous experience as a senior engineer and a strong passion for data challenges.
Requirements

Your skills:
- Following data engineering industry best practice
- Full ownership of end-to-end data pipelines
- Designing, implementing and maintaining data models
- Writing automated tests around data models
- Understanding of CI/CD principles
- Experience with cloud platforms for data (ideally AWS)
- Experience in converting business requirements into technical deliverables
- Previous experience with two or more of the following: Airflow, dbt, Kafka Connect, Looker, Python and Redshift

You might also have:
- DataOps best practice
- Experience in collaborating with BI and Data Science teams
- Use of agile/lean methodologies for continuous delivery and improvement
- Knowledge of monitoring, metrics or Site Reliability Engineering
- Understanding of data governance and security standards

Benefits
- 25 days' holiday
- Birthday day off
- 2 days' paid community leave
- Competitive salary
- Private healthcare with Vitality from day 1
- Access to a digital GP and other healthcare resources
- Season ticket and bike loans
- Access to a wellbeing platform & regular knowledge sharing
- Regular homeworking perks and rewards
- Cycle storage and showers onsite
- Discounted Nutmeg account for you and your family and friends
- Part of an inclusive Nutmeg team
15/09/2021
Full time
Jonothan Bosworth
Lead Data Engineer
Jonothan Bosworth Gloucester, Gloucestershire
Lead Data Engineer | Remote working | Gloucester | £65,000 - £80,000

Jonothan Bosworth Recruitment Specialists are currently seeking a Lead Data Engineer to join a well-established company at the forefront of a new growth plan and underway with an ambitious programme of work. You will join a new data team as part of the emerging data strategy. This opportunity is for an experienced Data Engineer looking to progress into a lead role, innovating with the latest technologies to design, and lead technical teams in building, internal and client-facing solutions using Databricks, the Azure stack and Power BI. As Lead Data Engineer you will help build high-performance data platforms from the ground up and establish and manage the Data Engineering team along the way, ensuring they develop, maintain and optimise data pipelines using best practice within a DataOps methodology.

THE BASICS: You will design and implement numerous complex data flows to connect operational systems, data for analytics, and business intelligence (BI) systems. Specifically:
- Design and implement data storage and processing solutions.
- Data security and compliance.
- Monitor and optimise data solutions.
- Build Data Engineering capacity through technical support and personal development of Data Engineers.
- Inspire best practice for data products and services, and work with senior team members to identify, plan, develop and deliver data services.

KEY SKILLS:
- Experience leading a team, along with cloud architecture and distributed systems
- Experience on Big Data projects, using Big Data frameworks to create data pipelines with the latest stream processing systems (e.g. Kafka, Storm, Spark Streaming)
- Advanced programming/scripting (Java, Python, R, etc.)
- Data strategy, architectures and governance; data management and security
- Data integrations using Azure Data Factory, Databricks and APIs
- Data repositories in SQL Server and Analysis Services
- Data modelling, SQL, Azure Data Warehouse and reporting solutions
- Able to work well under pressure; flexible, positive and focused during times of change
- Travel to Gloucester twice a week

For more information, please contact Claire at Jonothan Bosworth Recruitment Specialists. NC_20_LDE_CE

We are an equal opportunities employer, committed to diversity and inclusion. We are active anti-slavery advocates and prohibit discrimination and harassment of any kind based on race, colour, sex, religion, sexual orientation, national origin, disability, genetic information, pregnancy, or any other protected characteristic.
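The stream-processing systems named in the key skills (Kafka, Storm, Spark Streaming) all build on the same core pattern: keyed aggregation over time windows. A tumbling-window sum can be sketched in plain Python; the events, keys and 60-second window size below are illustrative assumptions only:

```python
from collections import defaultdict

# Keyed tumbling-window aggregation: the core pattern behind stream
# processors such as Kafka Streams or Spark Structured Streaming.
# Events and the window size are hypothetical, for illustration.
events = [
    # (timestamp_seconds, key, value)
    (5,  "sensor-a", 10),
    (42, "sensor-a", 20),
    (61, "sensor-a", 5),
    (70, "sensor-b", 7),
]

WINDOW = 60  # tumbling window length in seconds

def window_sums(stream):
    """Sum values per (key, window-start) pair.

    Each event falls into exactly one window, whose start is the
    timestamp rounded down to a multiple of WINDOW.
    """
    out = defaultdict(int)
    for ts, key, value in stream:
        out[(key, (ts // WINDOW) * WINDOW)] += value
    return dict(out)

agg = window_sums(events)
```

Real engines add what this sketch omits: event-time vs processing-time handling, watermarks for late data, and fault-tolerant state, which is where frameworks like Spark Streaming earn their keep.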
14/09/2021
Full time
Nutmeg
Senior Data Engineer
Nutmeg
14/09/2021
Full time

© 2008-2026 IT Job Board