Sorry, that job is no longer available. Here are some results that may be similar to the job you were looking for.

20 jobs found

Current search: technical data analyst python sql databricks
CGI
Data Engineer (Databricks and AWS)
CGI
Data Engineer (Databricks and AWS)

Position Description
At CGI, we're helping to transform the future of healthcare through the power of data. As a Senior Data Engineer, you'll play a pivotal role in designing, building, and optimising data platforms that underpin critical national services. Working at the heart of our Healthcare team, you'll use your expertise in AWS, Databricks, and Python to deliver high-impact solutions that improve outcomes, enhance decision-making, and drive innovation across the sector. You'll collaborate with experts who share your passion for problem-solving, ownership, and technical excellence, empowered to shape the data foundations of tomorrow.

CGI was recognised in the Sunday Times Best Places to Work List 2025 and has been named a UK 'Best Employer' by the Financial Times. We offer a competitive salary, excellent pension, private healthcare, plus a share scheme (3.5% + 3.5% matching) which makes you a CGI Partner, not just an employee. We are committed to inclusivity, building a genuinely diverse community of tech talent and inspiring everyone to pursue careers in our sector, including our Armed Forces, and are proud to hold a Gold Award in recognition of our support of the Armed Forces Corporate Covenant. Join us and you'll be part of an open, friendly community of experts. We'll train and support you in taking your career wherever you want it to go.

Due to the secure nature of the programme, you will need to hold UK Security Clearance or be eligible to go through this clearance. This is a hybrid position based in Leeds.

Your future duties and responsibilities
In this role, you will design, build, and maintain data solutions that power some of the UK's most critical healthcare systems. You'll be part of a collaborative engineering team, transforming how data is captured, processed, and used to drive better patient and operational outcomes. Your work will combine technical innovation with practical delivery, enabling data accessibility, quality, and security at scale. You'll take ownership of complex data challenges, partner with architects and analysts to shape technical direction, and continuously refine processes to deliver efficient, sustainable data pipelines. Working within CGI's supportive environment, you'll be encouraged to explore new technologies, share knowledge, and contribute to a culture of excellence and innovation.

Key responsibilities include:
  • Design & Build: Develop and optimise data pipelines using Databricks, Apache Spark, and Python.
  • Develop & Deliver: Create scalable data solutions on AWS leveraging S3, Glue, Lambda, and related services.
  • Integrate & Automate: Implement ETL processes and data lake/lakehouse architectures that ensure accuracy and reliability.
  • Collaborate & Advise: Partner with technical and business stakeholders to translate requirements into effective data solutions.
  • Secure & Govern: Ensure compliance with data governance, NHS standards, and security frameworks.
  • Innovate & Improve: Drive continuous improvement across data engineering practices and technologies.

Required qualifications to be successful in this role
To excel in this role, you'll bring strong data engineering expertise and hands-on experience in cloud-based data solutions, ideally within regulated or complex environments such as healthcare. You'll be confident in both the technical and consultative aspects of data delivery.

You must have:
  • Hands-on commercial expertise with Databricks.

You should have:
  • Proven experience as a Data Engineer working with large, complex datasets.
  • Hands-on expertise with Apache Spark and SQL.
  • Strong proficiency in Python (PySpark preferred).
  • Experience with AWS cloud services including S3, Glue, Lambda, and IAM.
  • Familiarity with ETL design, data modelling, and data lake/lakehouse concepts.
  • Understanding of data governance and compliance frameworks.
  • Experience in the healthcare sector or knowledge of NHS data standards (advantageous).

Together, as owners, let's turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you'll reach your full potential because you are invited to be an owner from day 1 as we work together to bring our Dream to life. That's why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company's strategy and direction. Your work creates value. You'll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You'll shape your career by joining a company built to grow and last. You'll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team, one of the largest IT and business consulting services firms in the world.
01/04/2026
Full time
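The Design & Build and Integrate & Automate responsibilities above revolve around validating raw data as it moves through a lakehouse. As a rough illustration only (not CGI's actual code), the following plain-Python sketch shows the kind of record validation a Databricks bronze-to-silver step might apply; in production this would typically be PySpark over DataFrames, and all field names here are hypothetical:

```python
from datetime import datetime

# Hypothetical raw "bronze" records, as a pipeline might ingest from S3.
# Field names are illustrative, not taken from any real NHS feed.
bronze = [
    {"patient_ref": "P001", "admitted": "2025-03-01", "ward": "A"},
    {"patient_ref": "",     "admitted": "2025-03-02", "ward": "B"},  # missing key
    {"patient_ref": "P003", "admitted": "not-a-date", "ward": "A"},  # bad date
]

def to_silver(records):
    """Validate and normalise bronze records; return (clean, rejected)."""
    clean, rejected = [], []
    for rec in records:
        try:
            if not rec["patient_ref"]:
                raise ValueError("missing patient_ref")
            # Normalise the date string into a real date object.
            admitted = datetime.strptime(rec["admitted"], "%Y-%m-%d").date()
            clean.append({**rec, "admitted": admitted})
        except ValueError as exc:
            # Quarantine bad rows with a reason, rather than dropping silently.
            rejected.append({**rec, "error": str(exc)})
    return clean, rejected

clean, rejected = to_silver(bronze)
print(len(clean), len(rejected))  # 1 clean record, 2 quarantined
```

Quarantining rejects with an error reason, rather than discarding them, is what makes the "accuracy and reliability" requirement auditable downstream.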
Involved Solutions
Lead Data Analyst - up to £70,000 + Bonus + Benefits - Hybrid
Involved Solutions Esher, Surrey
Lead Data Analyst
Salary: Up to £70,000 + Benefits
Location: Esher - Hybrid Working
Hours: Full time - Permanent

A large, well-established firm has recently implemented Microsoft Fabric and is now seeking a Lead Data Analyst to take ownership of the organisation's data and analytics capability. This role will lead the development of the company's data platform, ensuring data is transformed into meaningful insights that support decision-making across the business. The Lead Data Analyst will work across the full data lifecycle, from ingestion and modelling through to reporting and visualisation, while also managing and mentoring another Data Analyst/Engineer. The position is ideal for someone who enjoys combining hands-on technical delivery with leadership responsibility, advising on data strategy while building scalable BI solutions.

Responsibilities for the Lead Data Analyst:
  • Own the organisation's data and BI capability following the implementation of Microsoft Fabric
  • Design and develop high-quality Power BI dashboards and reporting solutions
  • Develop and maintain data pipelines, integrations and data flows within Microsoft Fabric and Azure
  • Integrate data from third-party systems and internal platforms into the data lake environment
  • Build scalable data models and semantic layers for business reporting
  • Build and optimise SQL queries, data models and dimensional schemas for reporting
  • Support the continued growth of the organisation's data lake and analytics platform
  • Analyse and interpret data to identify trends and insights that support business decision-making
  • Work with business stakeholders to understand data needs and deliver actionable insights
  • Manage and prioritise the analytics backlog to ensure work aligns with business value
  • Lead and mentor a Data Analyst/Engineer while driving best practices across the data function

Essential Skills for the Lead Data Analyst:
  • Strong Power BI expertise including DAX
  • Experience working with Microsoft Fabric
  • Knowledge of Azure Synapse, Databricks and Spark
  • Strong SQL capability for querying, shaping and modelling data
  • Experience building ETL/ELT pipelines and integrating data from APIs, files and databases
  • Experience with cloud data services within Azure environments
  • Strong stakeholder engagement skills and the ability to translate data insights for business audiences

Desirable Skills for the Lead Data Analyst:
  • Experience with Python or another analytics-focused programming language
  • Experience working with Azure Data Lake, Azure Functions or Service Bus
  • Experience managing or mentoring analysts or engineers
  • Knowledge of data governance, security and BI deployment best practices

If you are a data professional looking to take ownership of a modern analytics platform and shape how data drives decision-making across a business, please apply for the Lead Data Analyst position immediately.

Related job titles: Senior Data Analyst, Senior Data & BI Analyst, Lead Data Analyst, Lead Data & BI Analyst
01/04/2026
Full time
IntaPeople
Data Architect
IntaPeople Nantgarw, Cardiff
Data Architect
Hybrid - RCT (South Wales)

IntaPeople are proud and excited to have been appointed to recruit an experienced Data Architect for a Welsh-based not-for-profit sector client on an exclusive growth project. This is an exciting opportunity to join their fast-growing Data function in a newly created position. You will join the data team as one of the first handful of team members in this area of the business, which will work with external partners to build out the organisation's data capability offering.

As a Data Architect, you will be responsible for designing, building, and maintaining robust, scalable, and secure data pipelines and platforms that enable the organisation to make data-driven decisions at an enterprise level. Working closely with the Head of Data Engineering, you will help grow this data function through the recruitment of further data engineering resources, while working closely with solutions architects and software engineers. There is also the opportunity to progress into a leadership role, should that suit your ambitions and capabilities. You will shape, govern and assure the organisation's data architecture, defining, designing and maintaining strategic data models, standards, flows and governance structures that support organisational goals, ensure compliance, foster collaboration across business areas, and enable the organisation to make data-driven decisions.

Essential Skills:
  • Proven experience as a Senior Data Engineer or Data Architect (or a similar/related role).
  • Experience with enterprise-level datasets.
  • Expertise and practical experience in designing and aligning data models across multiple subject areas, applying recognised patterns and industry standards.
  • Familiarity with structured architectural approaches found in TOGAF (data architecture) or equivalent.
  • Proven experience defining and evolving data governance, including data quality, metadata, lineage, and policy assurance across services.
  • Strong capability in data profiling, source system analysis and identifying links across problem domains to define common, reusable solutions.
  • Experience communicating technical information and data to a non-technical audience, and working collaboratively with analysts, architects, and product owners to deliver data solutions that meet user and organisational needs.
  • Ability to lead and mentor other team members.
  • Demonstrable knowledge of data modelling and data warehousing within platforms such as Azure or AWS.
  • Practical experience with Microsoft Azure services, including Azure Data Lake (Gen2), Synapse, Event Hubs, and Cosmos DB, within scalable cloud-based architectures.
  • Robust understanding of data governance, data quality, and metadata management.

Desirable skills:
  • Experience with Azure Data Factory, Databricks, or Apache Spark, following modern ETL/ELT principles.
  • Experience using Git, Azure DevOps, or GitHub Actions for version control, CI/CD, and collaborative data delivery.
  • Experience with Big Data.
  • Certification in data architecture or governance frameworks (e.g. TOGAF, DAMA, DCAM, EDMC).
  • Experience with programming languages such as Python, Scala and SQL.
  • Welsh language skills.

Key Responsibilities (at a glance):
  • Establish data strategies and data modelling internally within the data estate.
  • Lead the design and oversight of enterprise-aligned data models and supporting data architecture, ensuring that all modelling approaches follow organisational standards and recognised patterns, and enable scalable, high-quality data flows across services.
  • Provide expert architectural guidance to technical teams delivering cloud-based data platforms, ensuring that data integration, modelling, metadata and design decisions align with organisational and enterprise-wide standards.
  • Work closely with other business leaders to maintain governance and compliance within the data estate.
  • Work closely with data analysts, data engineers, enterprise and solution architects, DevOps, and business stakeholders through regular communication and collaborative planning to ensure data solutions are closely aligned with business objectives and effectively meet user needs.
  • Contribute to the development and execution of the Data Strategy by maintaining thorough documentation of data processes, architectures, and workflows, ensuring all technical and process information is systematically recorded and updated, and that data initiatives deliver business value and align with broader technology and organisational goals.
  • Research emerging technologies and upcoming trends.
  • Provide oversight to teams building data processing pipelines and integration patterns, ensuring their artefacts are consistent with data architecture principles and metadata strategies.
  • Lead the introduction of foundational data management capabilities, including governance, metadata standards, and quality controls, to improve trust, accessibility, and efficiency in an organisation that currently has limited data management capability and practices.
  • Design, implement, and optimise physical data models that align with the pipeline architecture, ensuring efficient query performance, scalable storage, and robust integration, and delivering adaptable, resource-efficient data processing that meets the organisation's evolving analytical and operational demands.
  • Manage the aspirations of a variety of stakeholders to enable successful project delivery; this can be challenging, especially when their priorities differ or even conflict and require reconciliation to meet business and project needs.

What you'll get in return (at a glance):
  • A salary of circa £62,500 - £67,500 (depending on experience)
  • 28 days annual leave + public bank holidays
  • Hybrid working - based in their brand new, modern offices 1-2 days per week
  • A flexible working environment
  • Competitive Legal and General pension scheme (8% employer contribution)
  • 4x death in service
  • The opportunity to work on modern and industry-changing projects
  • Progression and development opportunities
  • Free rail travel throughout Wales and discounted travel throughout the UK
  • Salary sacrifice schemes such as cycle to work and electric vehicle
  • A chance to truly contribute to large-scale digitalisation projects within Wales

For more information click APPLY now, or for a confidential chat call Nathan Handley on (phone number removed). This role is commutable from Swansea, Bridgend, Pontypridd, Cardiff, Newport and surrounding areas.
31/03/2026
Full time
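The "data profiling and source system analysis" skill this listing asks for usually starts with the same two questions of every column in an unfamiliar extract: how often is it null, and how many distinct values does it take? A minimal sketch, with entirely made-up data, of that first-pass profile:

```python
# First-pass data profiling: null rate and distinct-value count per column.
# The records below are illustrative only.
rows = [
    {"id": 1, "region": "Cardiff", "status": "active"},
    {"id": 2, "region": None,      "status": "active"},
    {"id": 3, "region": "Swansea", "status": None},
    {"id": 4, "region": "Cardiff", "status": "closed"},
]

def profile(records):
    """Return {column: {null_rate, distinct}} for a list of flat records."""
    report = {}
    for col in records[0].keys():
        values = [r[col] for r in records]
        non_null = [v for v in values if v is not None]
        report[col] = {
            "null_rate": 1 - len(non_null) / len(values),
            "distinct": len(set(non_null)),
        }
    return report

rep = profile(rows)
print(rep["region"])  # {'null_rate': 0.25, 'distinct': 2}
```

High null rates flag governance and quality work; low distinct counts suggest a candidate dimension or reference table, feeding directly into the data-modelling responsibilities above.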
Rullion Managed Services
Senior Data Analyst - Marketing
Rullion Managed Services City, London
Data Analyst
3 Month Rolling Contract
Farringdon - Onsite 1 day per month
Inside IR35

Are you a data-driven professional with a passion for helping customers? Do you have a deep understanding of CRM and the data that enables execution and decision-making in this space? If you're ready to make a difference by leveraging your experience in a fast-paced, impactful environment, we want you to join our team as a Senior Data Analyst.

Here's a taste of what you'll be doing:
  • Consultative Leadership: Spearhead initiatives with cross-functional stakeholders, employing a consultative approach to distil complex requirements into robust data and analytics approaches.
  • Data Mastery: Use your expertise in data to manage large, complex datasets while applying the best analytics techniques, from advanced segmentation to root cause analysis.
  • Impact-Driven Decision Making: Be passionate about impact, whether unpacking the "why", delivering optimal customer intelligence data products, or delivering powerful insights that empower the organisation to be data-driven.
  • Insightful Storytelling: Be comfortable with storytelling and visualisation, delivering insights and recommendations in a clear, relevant and action-oriented manner to senior members of the organisation.
  • Technical Project Leadership: Oversee complex projects from inception to completion, ensuring they are delivered on time and to the highest standards. Apply best practices to ensure accuracy and efficiency in your results.
  • Talent Development: Mentor and coach junior data analysts, fostering a culture of innovation, continuous improvement, and collaboration.

Are we the perfect match?
  • Experience working with marketing data
  • Extensive experience as a Senior Data Analyst, with advanced SQL and Python skills, along with expertise in advanced analytics techniques such as modelling, segmentation, and predictive analysis
  • Strong analytical skills with a passion for problem-solving
  • Excellent communication skills and the ability to present to non-technical audiences, turning complex data into actionable insights
  • Comfortable in fast-paced, ambiguous environments and collaborative team settings
  • Passionate about data impact

It would be great if you had:
  • Experience in the energy retail industry
  • Advanced tools knowledge: proficiency in Tableau, cloud platforms (ideally Databricks), Git, and other analytics tools that support collaborative development and efficient data pipelines
  • A relevant degree or equivalent (e.g. statistics, mathematics)

Rullion celebrates and supports diversity and is committed to ensuring equal opportunities for both employees and applicants.
31/03/2026
Contractor
Agilis Recruitment Ltd
Data Engineer
Agilis Recruitment Ltd
Agilis are currently working exclusively with a key client, a leading technology consultancy, in their search for a Data Engineer. This is a fantastic opportunity to join a fast-growing, forward-thinking company and help take their data engineering to the next level!

Job Description: We are seeking a highly skilled and motivated Data Engineer to join a dynamic team. The ideal candidate will have a strong background in SQL, Python, ETL processes, and data integration, ideally in Databricks. You will play a crucial role in continuing an exciting project: designing, developing, and maintaining data infrastructure to ensure the seamless inflow, sanitisation/consolidation, and automated report production of client data.

Key Responsibilities:
- Design and Development: Design, develop, and maintain scalable ETL pipelines to process and integrate data from various sources. Implement data validation routines to ensure data quality and integrity. Develop and optimise SQL queries for data extraction, transformation, and loading.
- Data Integration: Integrate data from multiple sources, including APIs and relational databases. Collaborate with cross-functional teams to gather and understand data requirements.
- Database Management: Design and maintain relational database schemas to support business needs. Ensure efficient storage, retrieval, and management of large datasets.
- API Management: Develop and maintain APIs for data access and integration, using tools like Postman for API testing and documentation. Ensure robust and efficient API integration and management.
- Databricks Management: Manage permissions and access controls within Databricks to ensure data security and compliance.
- Data Analytics and Reporting: Work with data analysts to provide clean, well-structured data for analysis. Develop and maintain documentation for data processes and workflows. Develop and maintain automated report production to ensure seamless delivery of critical data.
- Collaboration and Communication: Collaborate with colleagues to gather requirements and translate them into technical specifications. Communicate effectively with team members to ensure alignment on data initiatives.

Qualifications:
- Bachelor's degree or equivalent experience in Computer Science, Information Technology, or a related field.
- Proven experience as a Data Engineer or in a similar role.
- Strong proficiency in SQL or Python, ideally both.
- Experience with ETL processes and tools.
- Knowledge of data validation routines and data integration techniques.
- Familiarity with relational database design and management.
- Experience with API development and testing using tools like Postman.
- Experience with Databricks or similar data platforms desirable.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.

For more information, please apply using the link or get in touch with Recruitment.
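As a rough illustration of the "data validation routines" and ETL flow this role describes, a minimal extract-validate-load step might look like the sketch below. The record shapes, validation rules, and table name are assumptions for illustration, not the client's actual pipeline; SQLite stands in for the real relational target.

```python
import sqlite3

# Hypothetical raw records, e.g. pulled from an API or CSV source
raw = [
    {"order_id": "A1", "amount": "19.99"},
    {"order_id": "A2", "amount": "not-a-number"},  # fails: unparseable amount
    {"order_id": "",   "amount": "5.00"},          # fails: empty key
]

def validate(rec):
    """Basic quality gates: non-empty key and a parseable amount."""
    if not rec["order_id"]:
        return False
    try:
        float(rec["amount"])
        return True
    except ValueError:
        return False

clean = [r for r in raw if validate(r)]

# Load only the clean rows into a relational target (in-memory SQLite here)
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (order_id TEXT, amount REAL)")
db.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(r["order_id"], float(r["amount"])) for r in clean],
)
loaded = db.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(loaded)  # 1 valid row survives
```

Production pipelines would typically also quarantine the rejected rows and emit data-quality metrics rather than silently dropping them.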
31/03/2026
Full time
83Zero Ltd
Senior Data Engineer
83Zero Ltd
Company Overview We are working with an innovative organisation that recognises the increasing complexity of project delivery. Since 2013, our client has been helping companies of all sizes improve the way projects are delivered. Their mission is to become the number one provider of innovative project solutions, driven by a community of experienced, caring, and passionate professionals who are committed to improving project delivery. Why Join Our Client? Our client is currently in an exciting phase of growth, making this an excellent time to join their journey. They are building something special-scaling the business while maintaining a strong people-first approach. Investment in their teams is a key priority, creating an environment where development is encouraged and individuals are supported to grow with the organisation. Their culture sets them apart from other consulting practices, and they are looking to build a team that is equally ambitious. Position Overview Our client is seeking a Senior Data Engineer who thrives on building scalable, cloud-first data systems. In this role, you will design and manage data pipelines that support analytics, AI, and automation across complex infrastructure programmes. Your work will play a key part in enabling data-driven transformation across critical UK industries. 
Core Responsibilities:
- Design, build, and optimise data pipelines using Azure Data Factory, Synapse, and Databricks
- Develop and maintain ETL/ELT workflows to ensure high data quality and reliability
- Collaborate with analysts and AI engineers to deliver robust and reusable data products
- Manage data lakes and warehouses using formats such as Delta Lake and Parquet
- Implement best practices for data governance, performance, and security
- Continuously evaluate and adopt new technologies to evolve the organisation's data platform
- Provide technical guidance to junior engineers and contribute to team capability building

Technical Stack
Core: Azure Data Factory, Azure Synapse Analytics, Azure Data Lake Storage Gen2, SQL Server, Databricks
Enhancements: Python (PySpark, Pandas), CI/CD (Azure DevOps), Infrastructure as Code (Terraform, Bicep), REST APIs, GitHub Actions
Desirable: Microsoft Fabric, Delta Live Tables, Power BI dataset automation, DataOps practices

What You'll Bring:
- Professional experience in data engineering or cloud data development
- Strong understanding of data architecture, APIs, and modern data pipeline design
- Hands-on experience within Microsoft's Azure ecosystem, with an interest in emerging technologies such as Fabric, AI-enhanced ETL, and real-time data streaming
- Proven ability to lead technical workstreams and mentor junior team members
- A strong alignment with the organisation's IDEAL values: Integrity, Drive, Empathy, Adaptability, and Loyalty

Ready to Apply? This is a fantastic opportunity to join a forward-thinking organisation at a key stage of growth, working on impactful projects across critical industries. If you're looking to take the next step in your career within a collaborative and innovative environment, we'd love to hear from you.
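The "data lakes and warehouses using formats such as Delta Lake and Parquet" work above commonly follows a raw-to-refined (often called bronze-to-silver) pattern. As a hedged sketch in plain Python (layer names, fields, and rules are illustrative assumptions, not this organisation's design), the refinement step deduplicates and type-casts raw records:

```python
# Bronze layer: raw events as landed (hypothetical payloads)
bronze = [
    {"sensor": "s1", "reading": "21.5", "ts": "2024-01-01T00:00:00"},
    {"sensor": "s1", "reading": "21.5", "ts": "2024-01-01T00:00:00"},  # exact duplicate
    {"sensor": "s2", "reading": None,   "ts": "2024-01-01T00:05:00"},  # null reading
]

def to_silver(rows):
    """Deduplicate on (sensor, timestamp) and type-cast readings,
    dropping rows that fail basic quality checks."""
    seen, silver = set(), []
    for r in rows:
        key = (r["sensor"], r["ts"])
        if key in seen or r["reading"] is None:
            continue
        seen.add(key)
        silver.append(
            {"sensor": r["sensor"], "reading": float(r["reading"]), "ts": r["ts"]}
        )
    return silver

silver = to_silver(bronze)
print(len(silver))  # 1 clean row survives
```

On an actual Databricks platform the same logic would run as Spark transformations writing Delta tables, with the deduplication keys and quality rules driven by governance requirements.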
31/03/2026
Full time
Akkodis
Data Engineer
Akkodis
Data Engineer
Full Time / Permanent | £55,000 - £60,000 plus up to 20% bonus, private medical and other extensive benefits | Hybrid - 1-2 days a week in the North Oxfordshire head office

The Company: My client is an industry-leading and award-winning financial services organisation operating on a global scale, headquartered in North Oxfordshire, UK. This is a hybrid role requiring 1-2 days a week in the North Oxfordshire head office.

The Role: I am looking for a driven and experienced Data Engineer to help design, build and maintain a data lakehouse in Databricks, pulling data from core platforms and external sources and refining it into well-curated, analysis-ready datasets. As a Data Engineer you will operate within an Agile delivery environment, working closely with other Data Engineers, Data Analysts and a Data Architect to deliver against the backlog, providing vital insight from a wide-ranging dataset to support executive and operational decision-making that will underpin sustained growth of business units domestically and internationally.

The Person: The ideal candidate will possess a strong background in Data Engineering with a proven ability to design, build, and maintain scalable data pipelines and solutions. From a technical standpoint you will ideally possess:
- Proven experience with Databricks
- Proficiency in programming languages such as Python, Spark, and SQL
- Strong experience with SQL databases
- Expertise in data pipeline and workflow management tools (e.g. Apache Airflow, ADF)
- Experience with cloud platforms (Azure preferred) and related data services
- Knowledge of big data technologies (e.g. Hadoop, Spark, Kafka)
- Experience of Waterfall and Agile delivery methodologies

Contact: Please apply via the link or contact (url removed) for more information.

Modis International Ltd acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers in the UK. Modis Europe Ltd provide a variety of international solutions that connect clients to the best talent in the world. For all positions based in Switzerland, Modis Europe Ltd works with its licensed Swiss partner Accurity GmbH to ensure that candidate applications are handled in accordance with Swiss law. Both Modis International Ltd and Modis Europe Ltd are Equal Opportunities Employers. By applying for this role your details will be submitted to Modis International Ltd and/or Modis Europe Ltd. Our Candidate Privacy Information Statement, which explains how we will use your information, is available on the Modis website.
30/03/2026
Full time
Stott and May
Principal Data Engineer
Stott and May
Principal Data Engineer - Hybrid (London/Winchester)

We're seeking a hands-on Principal Data Engineer to design and deliver enterprise-scale, cloud-native data platforms that power analytics, reporting, and real-time decision-making. This is a strategic technical leadership role where you'll shape architecture, mentor engineers, and deliver end-to-end solutions across a modern AWS/Databricks stack.

What you'll do:
- Lead the design of scalable, secure data architectures on AWS.
- Build and optimise ETL/ELT pipelines for batch and streaming data.
- Deploy and manage Apache Spark jobs on Databricks and Delta Lake.
- Write production-grade Python and SQL for large-scale data transformations.
- Drive data quality, governance, and automation through CI/CD and IaC.
- Collaborate with data scientists, analysts, and business stakeholders.
- Mentor and guide data engineering teams.

What we're looking for:
- Proven experience in senior/principal data engineering roles.
- Expertise in AWS, Databricks, Apache Spark, Python, and SQL.
- Strong background in cloud-native data platforms, real-time processing, and data lakes.
- Hands-on experience with tools such as Airflow, Kafka, Docker, and GitLab CI/CD.
- Excellent stakeholder engagement and leadership skills.

What's on offer:
- £84,000 salary + 10% bonus
- 6% pension contribution
- Private medical & flexible benefits package
- 25 days annual leave (plus buy/sell options)
- Hybrid working - travel to London or Winchester once/twice per week

Join a company at the forefront of media, connectivity, and smart technology, where your work directly powers millions of daily connections across the UK.
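The batch-versus-streaming distinction in this role can be sketched in plain Python: a tumbling-window aggregation of the kind a Spark Structured Streaming job would perform, here reduced to a stdlib function. The event stream and window size are illustrative assumptions.

```python
from collections import defaultdict

# Hypothetical event stream: (epoch_seconds, value) pairs
events = [(0, 10), (12, 5), (61, 7), (75, 3), (130, 1)]

def tumbling_window_sums(stream, window_s=60):
    """Aggregate values into fixed, non-overlapping time windows.

    Each event is assigned to exactly one window by integer-dividing
    its timestamp by the window length.
    """
    sums = defaultdict(int)
    for ts, value in stream:
        sums[ts // window_s] += value  # window index for this event
    return dict(sums)

print(tumbling_window_sums(events))  # {0: 15, 1: 10, 2: 1}
```

A real streaming engine adds the hard parts this sketch omits: late/out-of-order events, watermarks, and incremental state management across micro-batches.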
06/10/2025
Full time
Greencore
Senior Data Engineer
Greencore Worksop, Nottinghamshire
Why Greencore? We're a leading manufacturer of convenience food in the UK and our purpose is to make everyday taste better! We're a vibrant, fast-paced leading food manufacturer. Employing 13,300 colleagues across 16 manufacturing units and 17 distribution depots across the UK. We supply all the UK's food retailers with everything from Sandwiches, soups and sushi to cooking sauces, pickles and ready meals, and in FY24, we generated revenues of £1.8bn. Our vast direct-to-store (DTS) distribution network, comprising of 17 depots nationwide, enables us to make over 10,500 daily deliveries of our own chilled and frozen produce and that of third parties. Why is this exciting for your career as a Senior Data Engineer? The MBE Programme presents a huge opportunity for colleagues across the technology function to play a central role in the design, shape, delivery and execution of an enterprise wide digital transformation programme. The complexity of the initiative, within a FTSE 250 business, will allow for large-scale problem solving, group wide impact assessment and supporting the delivery of an enablement project to future proof the business. Why we embarked on Making Business Easier? Over time processes have become increasingly complex, increasing both the risk and cost they pose, whilst restricting our agility. At the same time, our customers and the market expect more from us than ever before. Making Business Easier forms a fundamental foundation for our commercial and operational excellence agendas, whilst supporting managing our cost base effectively in the future. The MBE Programme will streamline and simplify core processes, provide easier access to quality business data and will invest in the right technology to enable these processes. What you'll be doing: As a Senior Data Engineer, you will play a key role in shaping and delivering enterprise-wide data solutions that translate complex business requirements into scalable, high-performance data platforms. 
In this role, you will help define and guide the structure of data systems, focusing on seamless integration, accessibility, and governance, while optimising data flows to support both analytics and operational needs. Collaborating closely with business stakeholders, data engineers, and analysts, you will ensure that data platforms are robust, efficient, and adaptable to evolving business priorities. You will also support the usage, alignment, and consistency of data models, and will therefore have a wide-ranging role across many business projects and deliverables.

- Shape and implement data solutions that align with business objectives and leverage both cloud and on-premise technologies
- Translate complex business needs into scalable, high-performing data solutions
- Support the development and application of best practices in data governance, security, and system design
- Collaborate closely with business stakeholders, product teams, and engineers to design and deliver effective, integrated data solutions
- Optimise data flows and pipelines to enable a wide range of analytical and operational use cases
- Promote data consistency across transactional and analytical systems through well-designed integration approaches
- Contribute to the design and ongoing improvement of data platforms - including data lakes, data warehouses, and other distributed storage environments - focused on efficiency, scalability, and ease of maintenance
- Mentor and support junior engineers and analysts in applying best practices in data engineering and solution design

What you'll need:
- 5+ years of Data Engineering experience, with expertise in Azure data services and/or Microsoft Fabric
- Strong expertise in designing scalable data platforms and managing cloud-based data ecosystems
- Proven track record in data integration, ETL processes, and optimising large-scale data systems
- Expertise in cloud-based data platforms (AWS, Azure, Google Cloud) and distributed storage solutions
- Proficiency in Python, PySpark, SQL, NoSQL, and data processing frameworks (Spark, Databricks)
- Expertise in ETL/ELT design and orchestration in Azure, as well as pipeline performance tuning and optimisation
- Competence in integrating relational, NoSQL, and streaming data sources
- Management of CI/CD pipelines and Git-based workflows
- Good knowledge of data governance, privacy regulations, and security best practices
- Experience with modern data architectures, including data lakes, data mesh, and event-driven data processing
- Strong problem-solving and analytical skills to translate complex business needs into scalable data solutions
- Excellent communication and stakeholder management to align business and technical goals
- High attention to detail and commitment to data quality, security, and governance
- Ability to mentor and guide teams, fostering a culture of best practices in data architecture
- Power BI and DAX for data visualisation (desirable)
- Knowledge of Azure Machine Learning and AI services (desirable)
- Experience with streaming platforms like Event Hub or Kafka (desirable)
- Familiarity with cloud cost optimisation techniques (desirable)

What you'll get:
- Competitive salary and job-related benefits
- 25 days holiday allowance plus bank holidays
- Car allowance
- Annual target bonus
- Pension up to 8% matched
- PMI cover; individual life insurance up to 4x salary
- Company share save scheme
- Greencore Qualifications
- Exclusive Greencore employee discount platform
- Access to a full Wellbeing Centre platform
03/10/2025
Full time
Why Greencore?
We're a leading manufacturer of convenience food in the UK and our purpose is to make everyday taste better! We're a vibrant, fast-paced leading food manufacturer, employing 13,300 colleagues across 16 manufacturing units and 17 distribution depots across the UK. We supply all the UK's food retailers with everything from sandwiches, soups and sushi to cooking sauces, pickles and ready meals, and in FY24 we generated revenues of £1.8bn. Our vast direct-to-store (DTS) distribution network, comprising 17 depots nationwide, enables us to make over 10,500 daily deliveries of our own chilled and frozen produce and that of third parties.

Why is this exciting for your career as a Senior Data Engineer?
The MBE Programme presents a huge opportunity for colleagues across the technology function to play a central role in the design, shape, delivery and execution of an enterprise-wide digital transformation programme. The complexity of the initiative, within a FTSE 250 business, will allow for large-scale problem solving, group-wide impact assessment and supporting the delivery of an enablement project to future-proof the business.

Why we embarked on Making Business Easier
Over time, processes have become increasingly complex, increasing both the risk and cost they pose whilst restricting our agility. At the same time, our customers and the market expect more from us than ever before. Making Business Easier forms a fundamental foundation for our commercial and operational excellence agendas, whilst helping us manage our cost base effectively in the future. The MBE Programme will streamline and simplify core processes, provide easier access to quality business data and invest in the right technology to enable these processes.

What you'll be doing:
As a Senior Data Engineer, you will play a key role in shaping and delivering enterprise-wide data solutions that translate complex business requirements into scalable, high-performance data platforms. In this role, you will help define and guide the structure of data systems, focusing on seamless integration, accessibility, and governance, while optimising data flows to support both analytics and operational needs. Collaborating closely with business stakeholders, data engineers, and analysts, you will ensure that data platforms are robust, efficient, and adaptable to evolving business priorities. You will also support the usage, alignment, and consistency of data models, and will therefore have a wide-ranging role across many business projects and deliverables.
• Shape and implement data solutions that align with business objectives and leverage both cloud and on-premise technologies
• Translate complex business needs into scalable, high-performing data solutions
• Support the development and application of best practices in data governance, security, and system design
• Collaborate closely with business stakeholders, product teams, and engineers to design and deliver effective, integrated data solutions
• Optimise data flows and pipelines to enable a wide range of analytical and operational use cases
• Promote data consistency across transactional and analytical systems through well-designed integration approaches
• Contribute to the design and ongoing improvement of data platforms - including data lakes, data warehouses, and other distributed storage environments - focused on efficiency, scalability, and ease of maintenance
• Mentor and support junior engineers and analysts in applying best practices in data engineering and solution design

What you'll need:
• 5+ years of data engineering experience, with expertise in Azure data services and/or Microsoft Fabric
• Strong expertise in designing scalable data platforms and managing cloud-based data ecosystems
• Proven track record in data integration, ETL processes, and optimising large-scale data systems
• Expertise in cloud-based data platforms (AWS, Azure, Google Cloud) and distributed storage solutions
• Proficiency in Python, PySpark, SQL, NoSQL, and data processing frameworks (Spark, Databricks)
• Expertise in ETL/ELT design and orchestration in Azure, as well as pipeline performance tuning and optimisation
• Competence in integrating relational, NoSQL, and streaming data sources
• Management of CI/CD pipelines and Git-based workflows
• Good knowledge of data governance, privacy regulations, and security best practices
• Experience with modern data architectures, including data lakes, data mesh, and event-driven data processing
• Strong problem-solving and analytical skills to translate complex business needs into scalable data solutions
• Excellent communication and stakeholder management to align business and technical goals
• High attention to detail and commitment to data quality, security, and governance
• Ability to mentor and guide teams, fostering a culture of best practices in data architecture
• Power BI and DAX for data visualisation (desirable)
• Knowledge of Azure Machine Learning and AI services (desirable)
• Experience with streaming platforms such as Event Hub or Kafka (desirable)
• Familiarity with cloud cost optimisation techniques (desirable)

What you'll get:
• Competitive salary and job-related benefits
• 25 days holiday allowance plus bank holidays
• Car allowance
• Annual target bonus
• Pension matched up to 8%
• PMI cover (individual)
• Life insurance up to 4x salary
• Company share save scheme
• Greencore Qualifications
• Exclusive Greencore employee discount platform
• Access to a full Wellbeing Centre platform
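The role above centres on ETL/ELT pipelines built with Python and PySpark on Databricks. As a rough, hypothetical sketch of the extract-transform-load shape such pipelines take (plain Python for brevity; on Databricks the same three steps would be expressed as PySpark DataFrame operations, and all names here are invented):

```python
# Hypothetical sketch of an extract-transform-load flow; plain Python only.
# A real Databricks pipeline would use PySpark DataFrames for each step.

def extract(raw_rows):
    """Extract: yield only rows that parse cleanly; malformed rows are skipped."""
    for row in raw_rows:
        try:
            yield {"depot": row["depot"], "deliveries": int(row["deliveries"])}
        except (KeyError, ValueError):
            continue  # a production pipeline would write these to a rejects table

def transform(rows):
    """Transform: normalise depot names and drop non-positive counts."""
    return [
        {**r, "depot": r["depot"].strip().lower()}
        for r in rows
        if r["deliveries"] > 0
    ]

def load(rows):
    """Load: aggregate deliveries per depot (stand-in for a warehouse merge)."""
    totals = {}
    for r in rows:
        totals[r["depot"]] = totals.get(r["depot"], 0) + r["deliveries"]
    return totals

raw = [
    {"depot": " Leeds ", "deliveries": "120"},
    {"depot": "leeds", "deliveries": "80"},
    {"depot": "york", "deliveries": "bad"},  # rejected during extract
]
totals = load(transform(extract(raw)))  # {"leeds": 200}
```

The same separation of concerns (quarantine bad records early, normalise, then merge into the serving store) is what pipeline tuning and data-quality requirements in roles like this usually revolve around.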
Greencore
Senior Data Engineer
Greencore Scofton, Nottinghamshire
02/10/2025
Full time
Vermelo RPO
Data Scientist
Vermelo RPO City, Manchester
Job Title: Data Scientist (Modelling & Insight)
Location: Manchester (hybrid working)

Role Overview
Markerstudy Group are looking for an experienced Data Scientist to join a fast-growing company in developing ambitious solutions across a range of insurance lines, by leveraging vast data assets and state-of-the-art processing capabilities. As a Data Scientist, you will use your advanced analytical skills to directly influence insurer panel performance, ensuring our broking arm maintains a competitive edge through data-driven strategies and advanced analytics.
• Deliver outstanding and actionable customer insights
• Take responsibility for providing insights and supporting the building of data products that help shape Markerstudy's strategic roadmaps and customer propositions
• Support the delivery, maintenance and ongoing support of the Data Insight and Enrichment integration strategy across the group
• Work collaboratively with other areas to increase overall company performance
Your ideas and solutions will enable improvements to products, prices and processes, giving Markerstudy a critical advantage in the increasingly competitive insurance market. As part of your Data Science career you will be expected to further advance a wide range of modern statistical, machine learning and data science methods. This knowledge will be applied to a wide range of business problems, adding demonstrable commercial value.

Key Responsibilities:
• Lead the delivery of high-impact analytics and modelling projects to support strategic decision-making.
• Proactively identify and deliver innovative, data-led opportunities that drive measurable business impact.
• Act as a subject matter expert in analytics and data science, providing technical guidance.
• Coach and mentor junior analysts, reviewing code and outputs to ensure quality and consistency.
• Maintain robust technical documentation and ensure compliance with data governance and regulatory standards.
• Support cross-functional initiatives such as the Trading Transformation Programme as a technical expert.
• Collaborate with stakeholders across pricing, marketing, and insurer relations to embed insights into business processes.
• Comply with all regulatory obligations with regard to customer data, competition law and other relevant guidance/legislation.

Key Skills and Experience:
• Previous demonstrable Data Science/Analytics experience, ideally within insurance or financial services.
• Strong academic background in a numerical discipline (e.g. BSc Mathematics, Computer Science, Data Science).
• Proficiency in statistical and machine learning techniques (e.g. logistic regression, clustering, GBMs) and the application of these in a business context.
• Advanced SQL and experience with Python and/or R.
• Strong communication and storytelling skills, with the ability to translate complex data into actionable insights.
• Experience reviewing the work of junior analysts.
• Ability to work independently, manage multiple priorities, and proactively share insights.
• Selfless when it comes to sharing findings, experience and advice. We work as a team, not separate individuals!
• Resilient; able to work independently to deliver projects.
• Proactively shares insights and results and identifies risks, without prompting.
• Proficient at communicating results concisely, both verbally and in writing.

Desirable:
• Postgraduate qualification in a relevant field (e.g. Computer Science, Data Science, Operational Research).
• Experience with modern data platforms (e.g. Databricks, Snowflake, MS Fabric).
• Familiarity with MLOps practices and version control tools (e.g. Git).
• Experience with deployment and maintenance of ML models in production environments.
• Experience mentoring junior analysts, sharing expertise and fostering a culture of continuous learning and innovation.
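Logistic regression is listed among the core techniques for this role. As a minimal, hypothetical sketch of how a fitted logistic model turns features into a probability (the sigmoid of a linear predictor; the coefficients and feature names below are invented for illustration, not drawn from any Markerstudy model):

```python
import math

def logistic_score(features, coefficients, intercept):
    """Apply a fitted logistic regression: sigmoid of the linear predictor."""
    z = intercept + sum(c * x for c, x in zip(coefficients, features))
    return 1.0 / (1.0 + math.exp(-z))

# Invented coefficients, e.g. for a toy lapse-propensity model:
coefs = [0.8, -0.5]   # [price_increase_pct, years_as_customer]
intercept = -1.0

p = logistic_score([0.1, 3.0], coefs, intercept)
```

In practice the coefficients would come from a library fit (e.g. scikit-learn or statsmodels in Python, or glm in R), but the scoring arithmetic is exactly this.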
01/09/2025
Full time
Lemongrass
Cloud Data Engineer
Lemongrass
Title: Cloud Data Engineer
Location: Remote

Lemongrass Consulting is the leading professional and managed service provider of SAP enterprise applications running on AWS hyperscale cloud infrastructure. Our objective is to delight our customers every day by reducing the cost and increasing the agility of their SAP systems. We do this with our continuous innovation, automation, migration and operation, delivered on the world's most comprehensive cloud platforms. Our team is what makes Lemongrass exceptional and why we have the excellent reputation in the market that we enjoy today. At Lemongrass, you will work with the smartest and most motivated people in the business. We take pride in our culture of innovation and collaboration that drives us to deliver exceptional benefits to our clients every day.

About the Role:
We are seeking an experienced Cloud Data Engineer with a strong background in AWS, Azure, and GCP. The ideal candidate will have extensive experience with cloud-native ETL tools such as AWS DMS, AWS Glue, Kafka, Azure Data Factory, and GCP Dataflow, as well as other ETL tools like Informatica and SAP Data Intelligence. You will be responsible for designing, implementing, and maintaining robust data pipelines and building scalable data lakes. Experience with data platforms such as Redshift, Snowflake, Databricks, and Synapse is essential. Familiarity with data extraction from SAP or ERP systems is a plus.

Key Responsibilities:
Design and Development:
• Design, develop, and maintain scalable ETL pipelines using cloud-native tools (AWS DMS, AWS Glue, Kafka, Azure Data Factory, GCP Dataflow, etc.).
• Architect and implement data lakes and data warehouses on cloud platforms (AWS, Azure, GCP).
• Develop and optimize data ingestion, transformation, and loading processes using Databricks, Snowflake, Redshift, BigQuery and Azure Synapse.
• Implement ETL processes using tools like Informatica, SAP Data Intelligence, and others.
• Develop and optimize data processing jobs using Spark (Scala).
Data Integration and Management:
• Integrate various data sources - including relational databases, APIs, unstructured data, and ERP systems - into the data lake.
• Ensure data quality and integrity through rigorous testing and validation.
• Perform data extraction from SAP or ERP systems when necessary.
Performance Optimization:
• Monitor and optimize the performance of data pipelines and ETL processes.
• Implement best practices for data management, including data governance, security, and compliance.
Collaboration and Communication:
• Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
• Collaborate with cross-functional teams to design and implement data solutions that meet business needs.
Documentation and Maintenance:
• Document technical solutions, processes, and workflows.
• Maintain and troubleshoot existing ETL pipelines and data integrations.

Qualifications:
Education:
• Bachelor's degree in Computer Science, Information Technology, or a related field. Advanced degrees are a plus.
Experience:
• 7+ years of experience as a Data Engineer or in a similar role.
• Proven experience with cloud platforms: AWS, Azure, and GCP.
• Hands-on experience with cloud-native ETL tools such as AWS DMS, AWS Glue, Kafka, Azure Data Factory, GCP Dataflow, etc.
• Experience with other ETL tools like Informatica, SAP Data Intelligence, etc.
• Experience in building and managing data lakes and data warehouses.
• Proficiency with data platforms like Redshift, Snowflake, BigQuery, Databricks, and Azure Synapse.
• Experience with data extraction from SAP or ERP systems is a plus.
• Strong experience with Spark and Scala for data processing.
Skills:
• Strong programming skills in Python, Java, or Scala.
• Proficient in SQL and query optimization techniques.
• Familiarity with data modelling, ETL/ELT processes, and data warehousing concepts.
• Knowledge of data governance, security, and compliance best practices.
• Excellent problem-solving and analytical skills.
• Strong communication and collaboration skills.
Preferred Qualifications:
• Experience with other data tools and technologies such as Apache Spark or Hadoop.
• Certifications in cloud platforms (AWS Certified Data Analytics - Specialty, Google Professional Data Engineer, Microsoft Certified: Azure Data Engineer Associate).
• Experience with CI/CD pipelines and DevOps practices for data engineering.

What we offer in return:
• Remote working: Lemongrass always has been and always will offer 100% remote work
• Flexibility: Work where and when you like most of the time
• Training: A subscription to A Cloud Guru and a generous budget for certifications and other resources you'll find helpful
• State-of-the-art tech: An opportunity to learn and run the latest industry-standard tools
• Team: Colleagues who will challenge you, giving you the chance to learn from them and them from you

Lemongrass Consulting is an Equal Opportunity/Affirmative Action employer. All qualified candidates will receive consideration for employment without regard to disability, protected veteran status, race, color, religious creed, national origin, citizenship, marital status, sex, sexual orientation/gender identity, age, or genetic information. The selected applicant will be subject to a background investigation.
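The responsibilities above stress ensuring data quality through rigorous testing and validation. One common shape for this is a set of declarative row-level checks run before load; a hedged sketch in plain Python follows (column names and rules are invented, and tools such as AWS Glue Data Quality or Great Expectations provide the production-grade equivalent):

```python
# Illustrative row-level data-quality gate of the kind the role describes.
# Column names and rules are invented; validation frameworks express the
# same idea declaratively at scale.

CHECKS = {
    "order_id": lambda v: isinstance(v, str) and len(v) > 0,
    "amount": lambda v: isinstance(v, (int, float)) and v >= 0,
    "currency": lambda v: v in {"GBP", "EUR", "USD"},
}

def validate(rows):
    """Split rows into (valid, rejected); each rejected row carries its failing columns."""
    valid, rejected = [], []
    for row in rows:
        failures = [col for col, check in CHECKS.items() if not check(row.get(col))]
        if failures:
            rejected.append((row, failures))
        else:
            valid.append(row)
    return valid, rejected

good, bad = validate([
    {"order_id": "A1", "amount": 10.0, "currency": "GBP"},
    {"order_id": "", "amount": -5, "currency": "JPY"},
])
```

Keeping the failing column names with each rejected row makes the quarantine table diagnosable, which is the practical difference between "testing" and merely dropping bad records.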
23/12/2024
Full time
CapGemini
GCP Data Engineer
CapGemini
Location Whilst you may have any of our UK offices as a base location, you must be fully flexible in terms of assignment location, as these roles may involve periods of time away from home during the week at short notice. Capgemini requires our employees to be geographically mobile and to be able to travel to customer site to perform our jobs. Who you'll be working with The Cloud Data Platforms team is part of the Insights and Data Global Practice and has seen strong growth and continued success across a variety of projects and sectors. Cloud Data Platforms is the home of the Data Engineers, Platform Engineers, Solutions Architects and Business Analysts who are focused on driving our customers digital and data transformation journey using the modern cloud platforms. We specialise on using the latest frameworks, reference architectures and technologies using AWS, Azure and GCP. We continue to grow and are looking for talented individuals who want to join our high performing team. If you would like to develop your career as part of a team of highly skilled professionals who are passionate about increasing the value of the data and analytics in organisations you have come to the right place. The focus of your role We are looking for strong GCP Data Engineers who are passionate about Cloud technology and who ideally have skills in many of the following areas: • Build and deliver GCP data engineering solutions as part of a larger project • Use Google Data Products tools (e.g. BigQuery, Dataflow, Dataproc, AI Building Blocks, Looker, Cloud Data Fusion, Dataprep, etc.) to build solutions for our customers • Experience in Spark (Scala/Python/Java) and Kafka. • Experience in MDM, Metadata Management, Data Quality and Data Lineage tools. • E2E Data Engineering and Lifecycle (including non-functional requirements and operations) management. • E2E Solution Design skills - Prototyping, Usability testing and data visualization literacy. 
• Experience with SQL and NoSQL modern data stores
• Build relationships with client stakeholders to establish a high level of rapport and confidence
• Work with clients, local teams and offshore resources to deliver modern data products
• Work effectively on client sites, in Capgemini offices and from home
• Use the GCP Data focused Reference Architecture
• Design and build data service APIs
• Analyse current business practices, processes and procedures and identify future opportunities for leveraging GCP services
• Design solutions and support the planning and implementation of data platform services, including sizing, configuration and needs assessment
• Implement effective metrics and monitoring processes

Skills Needed
• Minimum 3-4 years of experience with Google Data Products tools (e.g. BigQuery, Dataflow, Dataproc, AI Building Blocks, Looker, Cloud Data Fusion, Dataprep)
• Google Cloud Platform
• Java, Scala, Python, Spark, SQL
• Experience of developing enterprise-grade ETL/ELT data pipelines
• Deep understanding of data manipulation/wrangling techniques
• Demonstrable knowledge of applying Data Engineering best practices (coding practices, unit testing, version control, code review)
• Big Data ecosystems: Cloudera/Hortonworks, AWS EMR, GCP Dataproc or GCP Cloud Data Fusion
• NoSQL databases: DynamoDB, Neo4j, Elastic, Google Cloud Datastore
• Snowflake Data Warehouse/Platform
• Streaming technologies and processing engines: Kinesis, Kafka, Pub/Sub and Spark Streaming
• Experience working with CI/CD technologies: Git, Jenkins, Spinnaker, GCP Cloud Build, Ansible etc.
• Experience and knowledge of application containerisation: Docker, Kubernetes etc.
• Experience building and deploying solutions to cloud (AWS, Google Cloud), including cloud provisioning tools
• Strong interpersonal skills with the ability to work with clients to establish requirements in non-technical language
• Ability to translate business requirements into plausible technical solutions for articulation to other development staff
• Good understanding of Lambda architecture patterns
• Good understanding of Data Governance, including Master Data Management (MDM) and Data Quality tools and processes
• Influencing and supporting project delivery through involvement in project/sprint planning and QA
• Experience with Agile methodology
• Experience with collaboration tools such as JIRA, Kanban boards, Confluence etc.

Nice to have
• Knowledge of other cloud platforms, e.g. AWS (Athena, Redshift, Glue, EMR)
• Relevant certifications
• Python
• Snowflake
• Databricks

What we'll offer you
Professional development. Accelerated career progression. An environment that encourages entrepreneurial spirit. It's all on offer at Capgemini, and although collaboration is at the core of the way we work, we also recognise individual needs with a flexible benefits package you can tailor to suit you.

Why we're different
At Capgemini, we help organisations across the world become more agile, more competitive and more successful. Smart, tailored, often groundbreaking technical solutions to complex problems are the norm. But so, too, is a culture that's as collaborative as it is forward-thinking. Working closely with each other, and with our clients, we get under the skin of businesses and to the heart of their goals. You will too. Capgemini is proud to represent nearly 130 nationalities and its cultural diversity. Our holistic definition of diversity extends beyond gender, gender identity, sexual orientation, disability, ethnicity, race, age and religion. Capgemini views diversity as everything that makes us who we are as an organization, including our social background, our experiences in life and work, our communication styles and even our personality. These dimensions contribute to the type of diversity we value the most: diversity of thought.
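The Lambda architecture pattern mentioned in the skills above combines a batch layer (recomputed over the full historical dataset) with a speed layer (incremental over recent events), merged at query time by a serving layer. A minimal sketch of that idea, with entirely invented event data and names (not from any Capgemini project):

```python
from collections import defaultdict

# Hypothetical click events: (user_id, clicks). Purely illustrative.
batch_events = [("alice", 3), ("bob", 5), ("alice", 2)]  # historical, reprocessed in bulk
stream_events = [("alice", 1), ("carol", 4)]             # recent, not yet in the batch run

def batch_layer(events):
    """Batch view: recompute totals from the full historical dataset."""
    view = defaultdict(int)
    for user, clicks in events:
        view[user] += clicks
    return dict(view)

def speed_layer(events):
    """Real-time view: incremental counts over events the batch hasn't seen."""
    view = defaultdict(int)
    for user, clicks in events:
        view[user] += clicks
    return dict(view)

def serving_layer(batch_view, realtime_view):
    """Serving layer: answer queries by merging both views."""
    merged = dict(batch_view)
    for user, clicks in realtime_view.items():
        merged[user] = merged.get(user, 0) + clicks
    return merged

totals = serving_layer(batch_layer(batch_events), speed_layer(stream_events))
print(totals)  # {'alice': 6, 'bob': 5, 'carol': 4}
```

In production the batch layer would be a Spark or Dataflow job and the speed layer a Kafka/Pub/Sub consumer, but the merge logic follows the same shape.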
About Capgemini
Capgemini is a global leader in partnering with companies to transform and manage their business by harnessing the power of technology. The Group is guided every day by its purpose of unleashing human energy through technology for an inclusive and sustainable future. It is a responsible and diverse organization of 270,000 team members in nearly 50 countries. With its strong 50-year heritage and deep industry expertise, Capgemini is trusted by its clients to address the entire breadth of their business needs, from strategy and design to operations, fuelled by the fast-evolving and innovative world of cloud, data, AI, connectivity, software, digital engineering and platforms. The Group reported 2020 global revenues of €16 billion. Discover more about what Capgemini can offer you.
23/09/2022
Full time
CapGemini
Microsoft Azure Data engineer
CapGemini
Location
Whilst you may have any of our UK offices as a base location, you must be fully flexible in terms of assignment location, as these roles may involve periods of time away from home during the week at short notice. Capgemini requires our employees to be geographically mobile and able to travel to customer sites to perform our jobs.

Who you'll be working with
The Cloud Data Platforms team is part of the Insights and Data Global Practice and has seen strong growth and continued success across a variety of projects and sectors. Cloud Data Platforms is the home of the Data Engineers, Platform Engineers, Solutions Architects and Business Analysts who are focused on driving our customers' digital and data transformation journeys using modern cloud platforms. We specialise in using the latest frameworks, reference architectures and technologies on AWS, Azure and GCP. We continue to grow and are looking for talented individuals who want to join our high-performing team. If you would like to develop your career as part of a team of highly skilled professionals who are passionate about increasing the value of data and analytics in organisations, you have come to the right place.
The focus of your role
We are looking for strong Azure Data Engineers who are passionate about Microsoft technology and who ideally have skills in many of the following areas:
• Build and deliver Azure data engineering solutions as part of a larger project
• Build relationships with client stakeholders to establish a high level of rapport and confidence
• Work with clients, local teams and offshore resources to deliver modern data products
• Work effectively on client sites, in Capgemini offices and from home
• Design and build modern data pipelines and data streams using Microsoft tools
• Use the Microsoft Data focused Reference Architecture
• Design and build data service APIs
• Expose data to end users using Power BI, Azure API Apps or other modern visualisation platforms
• Analyse current business practices, processes and procedures and identify future opportunities for leveraging Microsoft Azure data & analytics PaaS services
• Design solutions and support the planning and implementation of data platform services, including sizing, configuration and needs assessment
• Implement effective metrics and monitoring processes

Skills needed
• Minimum 3-4 years of experience with ETL tools, Databricks, SQL, SSAS & T-SQL
• Working experience with Azure DevOps (Repos, CI/CD etc.)
• Good understanding of Lambda architecture patterns
• Good understanding of Data Governance, including Master Data Management (MDM) and Data Quality tools and processes
• Influencing and supporting project delivery through involvement in project/sprint planning and QA
• Experience with Agile methodology
• Experience with collaboration tools such as JIRA, Kanban boards, Confluence etc.

Nice to have
• Azure DevOps
• Python
• .NET
• Snowflake
• Databricks

What we'll offer you
Professional development. Accelerated career progression. An environment that encourages entrepreneurial spirit. It's all on offer at Capgemini.
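The SQL-centred ETL work the skills list describes (ETL tools, Databricks, T-SQL) typically follows an extract, cleanse, aggregate pattern. A minimal sketch using Python's built-in sqlite3 as a stand-in for the SQL engines named above; the table and column names are invented for illustration:

```python
import sqlite3

# In-memory database stands in for a warehouse such as Azure SQL or Databricks SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_sales (region TEXT, amount REAL)")

# Extract: pretend these rows arrived from a source system; one amount is NULL.
rows = [("north", 100.0), ("north", 50.0), ("south", None), ("south", 75.0)]
conn.executemany("INSERT INTO raw_sales VALUES (?, ?)", rows)

# Transform + load: cleanse NULL amounts and aggregate into a reporting table.
conn.execute("""
    CREATE TABLE sales_by_region AS
    SELECT region, SUM(COALESCE(amount, 0)) AS total
    FROM raw_sales
    GROUP BY region
""")

for region, total in conn.execute(
        "SELECT region, total FROM sales_by_region ORDER BY region"):
    print(region, total)
# north 150.0
# south 75.0
```

On a real engagement the same COALESCE/GROUP BY logic would run inside a Databricks notebook or a T-SQL stored procedure, orchestrated by an ETL tool rather than a script.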
And although collaboration is at the core of the way we work, we also recognise individual needs with a flexible benefits package you can tailor to suit you.

Why we're different
At Capgemini, we help organisations across the world become more agile, more competitive and more successful. Smart, tailored, often groundbreaking technical solutions to complex problems are the norm. But so, too, is a culture that's as collaborative as it is forward-thinking. Working closely with each other, and with our clients, we get under the skin of businesses and to the heart of their goals. You will too. Capgemini is proud to represent nearly 130 nationalities and its cultural diversity. Our holistic definition of diversity extends beyond gender, gender identity, sexual orientation, disability, ethnicity, race, age and religion. Capgemini views diversity as everything that makes us who we are as an organization, including our social background, our experiences in life and work, our communication styles and even our personality. These dimensions contribute to the type of diversity we value the most: diversity of thought.

About Capgemini
Capgemini is a global leader in consulting, digital transformation, technology and engineering services. The Group is at the forefront of innovation to address the entire breadth of clients' opportunities in the evolving world of cloud, digital and platforms. Building on its strong 50-year+ heritage and deep industry-specific expertise, Capgemini enables organizations to realize their business ambitions through an array of services from strategy to operations. Capgemini is driven by the conviction that the business value of technology comes from and through people. Today, it is a multicultural company of 270,000 team members in almost 50 countries. With Altran, the Group reported 2019 combined revenues of €17 billion. Discover more about what Capgemini can offer you.
23/09/2022
Full time
Omni RMS
Data Solutions Architect
Omni RMS Edinburgh, Midlothian
Ofcom is the regulator for the communications services that we depend upon in the modern world. We make sure people get the best from broadband, telephone and mobile services. We oversee the universal postal service, manage the radio spectrum used by wireless devices, and have regulatory powers over TV and radio. We use data from a wide range of sources to understand the dynamics of the sectors we regulate and to provide insights that inform policy decisions. In addition to existing responsibilities, there are opportunities to contribute to our new duties relating to the regulation of Online Harms. This new area of work will expand the scope of data-driven activities in terms of the variety and volume of data, as well as the range of analysis we do.

Purpose of the Role
You will architect, design, implement and oversee the operations of data solutions that empower data professionals.

What you are expected to deliver in this role
You will bring the critical thinking, skills and experience needed to transform data into solutions that add value against business requirements. You will have a deep understanding of the full data lifecycle and the role that data plays across applications, machine learning, business analytics and reporting. You will identify and evaluate alternative architectures to meet the needs of the business and the agreed requirements, and support the strategic direction of Ofcom. You will work with IT teams, business analysts and data analytics teams to understand data consumers' needs and develop solutions. You will support our ongoing development activities and continually promote data innovation to achieve business outcomes. You will be a self-motivated and effective communicator who informs and influences senior managers. You will understand the motivations behind projects and own the technical activities that translate business requirements into a business-value solution, taking an iterative approach that responds to feedback and changing needs.
You will perform deep dives to solve specific solution or design challenges, using trials or POCs to prove or discount an approach and to critique your own design. You will ensure that the solutions you help deliver form an integral part of the ICT estate, aligned with the wider architecture, and provide documentation of solutions detailing the business, data, application and technology layers. With colleagues, you will define data pipelines and data lakes, covering the ingestion, ETL or ELT, and the cataloguing of data. You will take overall responsibility for planning effective data storage, cost, security, quality, sharing, availability, retention and publishing within the organisation, and develop analytics policy, standards and guidelines. You will ensure successful transitions of solutions into production, making sure production support have the necessary knowledge and documentation to support the service.

Skills and Knowledge
• Robust Data and Technical/Solutions Architecture skills - sets direction for, and possesses a deep understanding of, architecture and strategies which integrate with industry trends, e.g. TOGAF
• Hands-on experience with analytical tools and languages such as Python, R, SQL, Azure Data Factory (and SSIS), Databricks, Power BI, Git etc.
• Experience of infrastructure: cloud-based technologies used for storage, data lakes, data warehouses, data streaming, databases, and ETL/ELT/transformation
• Experience of DevOps/DataOps methods in the development of data solutions, ensuring pipelines and processes can be automated and controlled
• Experience with cloud-based data and analytical initiatives utilising Azure, AWS, Google Cloud or similar cloud services
• Experience of working closely with Data Scientists/Analysts to understand their needs
• Experience of implementing statistical, AI, Machine Learning and Deep Learning applications
• Experience with integrations (e.g. via APIs) with external vendors to share data between organizations
• Experience of building and maintaining good working relationships with colleagues at all levels of an organization
• Experience of working with external technology suppliers and service providers to deliver business solutions
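The role's emphasis on data pipelines covering ingestion, transformation and cataloguing can be sketched as three small stages. The dataset, schema and catalogue format below are invented for the example (they are not Ofcom's), and the feed is a hard-coded string standing in for a vendor API response:

```python
import json
from datetime import datetime, timezone

# Hypothetical raw feed, as might be received from an external vendor's API.
raw = '[{"provider": "A", "speed_mbps": "48"}, {"provider": "B", "speed_mbps": "102"}]'

def ingest(payload: str) -> list:
    """Ingest: parse the raw feed into records."""
    return json.loads(payload)

def transform(records: list) -> list:
    """Transform: cast string fields to numbers so analysis gets clean columns."""
    return [{"provider": r["provider"], "speed_mbps": int(r["speed_mbps"])}
            for r in records]

def catalogue_entry(name: str, records: list) -> dict:
    """Catalogue: record metadata so data consumers can discover the dataset."""
    return {
        "dataset": name,
        "rows": len(records),
        "columns": sorted(records[0]) if records else [],
        "ingested_at": datetime.now(timezone.utc).isoformat(),
    }

clean = transform(ingest(raw))
entry = catalogue_entry("broadband_speeds", clean)
print(entry["rows"], entry["columns"])  # 2 ['provider', 'speed_mbps']
```

A production pipeline would replace each stage with managed services (e.g. Data Factory for ingestion, Databricks for transformation, a data catalogue for the metadata), but the hand-offs between stages have the same shape.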
04/11/2021
Full time
Omni RMS
Data Solutions Architect
Omni RMS Manchester, Lancashire
Ofcom is the regulator for the communications services that we depend upon in the modern world. We make sure people get the best from broadband, telephone and mobile services. We oversee the universal postal service, manage the radio spectrum used by wireless devices, and have regulatory powers over TV and radio. We use data from a wide range of sources to understand the dynamics of the sectors we regulate and to provide insights that inform policy decisions. In addition to existing responsibilities, there are opportunities to contribute to our new duties relating to the regulation of Online Harms. This new area of work will expand the scope of data-driven activities in terms of the variety and volume of data, as well as the range of analysis we do.

Purpose of the Role
You will architect, design, implement and oversee the operations of data solutions that empower data professionals.

What you are expected to deliver in this role
You will bring the critical thinking, skills and experience needed to transform data into solutions that add value against business requirements. You will have a deep understanding of the full data lifecycle and the role that data plays across applications, machine learning, business analytics and reporting. You will identify and evaluate alternative architectures to meet the needs of the business and the agreed requirements, and support the strategic direction of Ofcom. You will work with IT teams, business analysts and data analytics teams to understand data consumers' needs and develop solutions. You will support our ongoing development activities and continually promote data innovation to achieve business outcomes. You will be a self-motivated and effective communicator who informs and influences senior managers. You will understand the motivations behind projects and own the technical activities that translate business requirements into a business-value solution, taking an iterative approach that responds to feedback and changing needs.
Perform deep dives to solve a specific solution or design challenges. Using trials or POCs to prove or discount an approach, to critic your own design. You will ensure that the solutions you help deliver forms an integral part of the ICT estate; aligned with the wider architecture. Provide documentation of solutions detailing the business, data, application and technology layers. With colleagues, define data pipelines and data lakes, covering the ingression, ETL or ELT, and the cataloguing of data. Takes overall responsibility for planning effective data storage, cost, security, quality, sharing, availability, retention and publishing within the organisation. Develops analytics policy, standards and guidelines Ensuring successful transitions for solutions into production ensuring production support have the necessary knowledge and documentation to support the service. Skills and Knowledge Robust Data and Technical/Solutions Architecture skills - sets direction for and possesses a deep understanding of architecture and strategies which integrate with industry trends e.g. TOGAF. Hands-on experience with analytical tools and languages such as: Python, R, SQL, Azure Data Factory (and SSIS), Databricks, Power BI, Git etc. Experience of infrastructure: cloud-based technologies used for storage, data lakes, data warehouses, data streaming, Databases, ETL / ELT / Transformation. Experience of DevOps/DataOps methods in development of data solutions to ensure pipelines and processes can be automated and controlled. Experience with Cloud based data and analytical initiatives utilising Azure, AWS, Google Cloud or similar cloud services. Experience of working closely with Data Scientists/Analysts) to understanding their needs. Experience of implementation of statistical, AI, Machine Learning and Deep learning applications. Experience with integrations (e.g., via APIs) with external vendors to share data between organizations. 
Experience of building and maintaining good working relationships with colleagues at all levels of an organization. Experience of working with external technology supplier and service providers to deliver business solutions.
04/11/2021
Full time
Omni RMS
Data Solutions Architect
Omni RMS
Senior Data Architect
Omni RMS Manchester, Lancashire
Team Overview
Data is central to Ofcom's work. We use data from a wide range of sources to understand the dynamics of the sectors we regulate and to provide key insights that inform policy decisions. In addition to existing regulatory responsibilities, there will be exciting opportunities to contribute to Ofcom's new duties in relation to the regulation of Online Harms. It is expected that this new area of work will expand the scope of data-driven activities at Ofcom in terms of the variety and volume of data, as well as the range of analysis we do. To ensure that Ofcom has the data capabilities to undertake this analysis effectively, ICT has created a new Data Solutions Architect role.
Purpose of the Role
The Data Solutions Architect works closely with ICT and the Data Innovation Hub. You will architect, design, implement and oversee the operation of data solutions that empower data professionals to deliver their work efficiently and effectively. Candidates will exhibit critical thinking, the ability to synthesise complex problems, and relevant skills and experience for transforming data into solutions that add value across a wide range of business requirements. You must have a deep understanding of the full data lifecycle and the role that high-quality data plays across applications, machine learning, business analytics and reporting.
You will lead the design and development of solution architectures in response to business requirements. This includes identifying and evaluating alternative architectures and their trade-offs in cost, performance and scalability, and ensuring that the relevant technical strategies, policies, standards and practices (including security) are applied correctly. The end-to-end solution will be fit for purpose: it will meet the needs of the business and the agreed requirements, and support the strategic direction of Ofcom. You will work with IT teams, business analysts and data analytics teams to understand data consumers' needs and develop solutions. By maintaining your knowledge of emerging trends in data usage, tools and analysis techniques, you will support our ongoing development activities and continually promote data innovation as a means to achieve business outcomes for specific Groups and for Ofcom. You will need to be self-motivated, an effective communicator, and collaborative in your delivery approach. You will work in a cross-functional environment, interact with the full spectrum of colleagues (data engineers, data analysts, data scientists, operational support and policy makers), and inform and influence senior managers.
Requirements of the Role
- Build strong relationships with colleagues across the business, understand the motivations behind projects, and own the technical activities that translate business requirements (both functional and non-functional) into a solution, ensuring the required business value is delivered.
- Foster a customer-centric approach to ensure delivery of business value, and an iterative approach that responds to feedback and changing needs.
- Perform deep dives into technical areas to solve specific solution or design challenges, using trials or POCs to prove or discount an approach and to critique your own design.
- Ensure that the solutions you help deliver form an integral part of the ICT estate and align with the wider reference architecture and domain roadmaps.
- Manage stakeholder expectations and be flexible, working on many different projects and topics at the same time. Manage proactive and reactive communication, and facilitate difficult discussions within the team or with diverse senior stakeholders and external third parties as necessary.
- Provide documentation of solutions detailing the business, data, application and technology layers.
- Work with data engineers to define data pipelines and data lakes, covering ingestion, ETL or ELT, and the cataloguing of data.
- Take overall responsibility for planning effective data storage, cost, security, quality, sharing, availability, retention and publishing within the organisation.
- Develop analytics policy, standards and guidelines.
- Ensure successful transitions of solutions into production, so that production support has the knowledge and documentation needed to support the service.
Skills, knowledge and experience
- Robust data and technical/solutions architecture skills: sets direction for, and possesses a deep understanding of, architectures and strategies that integrate with industry frameworks such as TOGAF.
- Hands-on experience with analytical tools and languages such as Python, R, SQL, Azure Data Factory (and SSIS), Databricks, Power BI and Git.
- Experience of cloud-based infrastructure used for storage, data lakes, data warehouses, data streaming, databases and ETL/ELT transformation.
- Experience of DevOps/DataOps methods in the development of data solutions, so that pipelines and processes can be automated and controlled.
- Experience with cloud-based data and analytics initiatives using Azure, AWS, Google Cloud or similar cloud services.
- Experience of working closely with data professionals (e.g. data scientists and data analysts) to understand their needs.
- Experience of implementing statistical, artificial intelligence, machine learning and deep learning applications.
- Experience with integrations (e.g. via APIs) with external vendors to share data between organisations.
- Experience of working with external technology suppliers and service providers to deliver business solutions.
SFIA Skills
- Enterprise and business architecture (STPL) - Level 5
- Solution architecture (ARCH) - Level 5
- Requirements definition and management (REQM) - Level 5
- Database design (DBDS) - Level 5
- Analytics (INAN) - Level 4
- Emerging technology monitoring (EMRG) - Level 4
- Relationship management (RLMT) - Level 5
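The pipeline work the role describes (ingestion, ETL or ELT, and cataloguing of data) can be sketched in miniature. This is an illustrative sketch only, not part of the role description: the dataset, table name and catalogue entry are hypothetical, and the standard-library sqlite3 module stands in for a warehouse or Databricks table that a real pipeline would target.

```python
import csv
import io
import sqlite3

# Hypothetical raw extract. In production this would be read from a
# landing zone in a data lake, not an inline string.
raw = io.StringIO(
    "date,service,complaints\n"
    "2021-10-01,broadband,12\n"
    "2021-10-01,mobile,7\n"
)

# Load target: an in-memory SQLite table as a stand-in for a warehouse.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE complaints (date TEXT, service TEXT, complaints INTEGER)"
)

# Transform: parse and type-check each record before loading (the "T" in ETL).
rows = [
    (rec["date"], rec["service"], int(rec["complaints"]))
    for rec in csv.DictReader(raw)
]

# Load: bulk insert the validated rows.
conn.executemany("INSERT INTO complaints VALUES (?, ?, ?)", rows)

# Catalogue: record basic metadata about the loaded dataset so that
# downstream consumers can discover it.
catalog_entry = {"table": "complaints", "rows": len(rows), "source": "landing-zone-csv"}

total = conn.execute("SELECT SUM(complaints) FROM complaints").fetchone()[0]
print(total)                   # 19
print(catalog_entry["rows"])   # 2
```

In a real deployment the same three steps would typically be expressed in Azure Data Factory or Databricks notebooks, with the catalogue entry written to a data catalogue service rather than a Python dict.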
04/11/2021
Full time
Omni RMS
Senior Data Architect
Omni RMS
Team Overview Data is central to Ofcom's work. We use data from a wide range of sources to understand the dynamics of the sectors we regulate and to provide key insights to inform policy decisions. In addition to existing regulatory responsibilities, there will be exciting opportunities to contribute to Ofcom's new duties in relation to the regulation of Online Harms. It is expected that this new area of work will expand the scope of data-driven activities at Ofcom in terms of the variety and volume of data, as well as the range of analysis we do. To ensure that Ofcom has the appropriate data capabilities to undertake the analysis effectively, ICT has created a new role for an Data Solutions Architect Purpose of the Role The Data Solutions Architect works closely with ICT and the Data Innovation Hub. You will architect, design, implement and oversee the operations of data solutions that empower data professionals to efficiently and effectively deliver their work. Candidates will exhibit critical thinking skills, the ability to synthesize complex problems, and have relevant skills and experience for enabling the transformation of data to create solutions that add value to a myriad of business requirements. You must have a deep understanding of the full data lifecycle and the role that high-quality data plays across applications, machine learning, business analytics, and reporting. You will lead the design and development of solution architectures in response to business requirements. This includes identifying and evaluating alternative architectures, their trade-offs in cost, performance and scalability. And ensuring that the relevant technical strategies, policies, standards and practices (including security) are applied correctly. The end to end solution will be fit for purpose - i.e. meet the needs of business, the agreed requirements, and support the strategic direction of Ofcom. 
04/11/2021
Full time
Team Overview
Data is central to Ofcom's work. We use data from a wide range of sources to understand the dynamics of the sectors we regulate and to provide key insights that inform policy decisions. In addition to our existing regulatory responsibilities, there will be exciting opportunities to contribute to Ofcom's new duties in relation to the regulation of Online Harms. This new area of work is expected to expand the scope of data-driven activity at Ofcom, in terms of both the variety and volume of data and the range of analysis we do. To ensure that Ofcom has the data capabilities to undertake this analysis effectively, ICT has created a new role: Data Solutions Architect.

Purpose of the Role
The Data Solutions Architect works closely with ICT and the Data Innovation Hub. You will architect, design, implement and oversee the operation of data solutions that empower data professionals to deliver their work efficiently and effectively. Candidates will exhibit critical thinking, the ability to synthesise complex problems, and the skills and experience to transform data into solutions that add value across a wide range of business requirements. You must have a deep understanding of the full data lifecycle and the role that high-quality data plays across applications, machine learning, business analytics and reporting. You will lead the design and development of solution architectures in response to business requirements. This includes identifying and evaluating alternative architectures and their trade-offs in cost, performance and scalability, and ensuring that the relevant technical strategies, policies, standards and practices (including security) are applied correctly. The end-to-end solution must be fit for purpose - that is, it meets the needs of the business and the agreed requirements, and supports Ofcom's strategic direction.
You will work with IT teams, business analysts and data analytics teams to understand data consumers' needs and develop solutions. By maintaining your knowledge of emerging trends in data usage, tools and analysis techniques, you will support our ongoing development activities and continually promote data innovation as a means of achieving business outcomes for specific Groups and for Ofcom as a whole. You will need to be self-motivated, an effective communicator and collaborative in your delivery approach. You will work in a cross-functional environment, interacting with the full spectrum of colleagues (data engineers, data analysts, data scientists, operational support and policy makers), and you will need to inform and influence senior managers.

Requirements of the Role
  • Build strong relationships with colleagues across the business, understand the motivations behind their projects, and own the technical activities that translate business requirements (both functional and non-functional) into a solution, ensuring the required business value is delivered.
  • Foster a customer-centric approach to delivering business value, with an iterative approach that responds to feedback and changing needs.
  • Perform deep dives into technical areas to solve specific solution or design challenges, using trials or proofs of concept (POCs) to prove or discount an approach and to critique your own designs.
  • Ensure that the solutions you help deliver form an integral part of the ICT estate and align with the wider reference architecture and domain roadmaps.
  • Manage stakeholder expectations and be flexible, working on many different projects and topics at the same time. Manage proactive and reactive communication, and facilitate difficult discussions within the team or with senior stakeholders and external third parties as necessary.
  • Provide documentation of solutions detailing the business, data, application and technology layers.
  • Work with Data Engineers to define data pipelines and data lakes, covering the ingestion, ETL or ELT, and cataloguing of data.
  • Take overall responsibility for planning effective data storage, cost, security, quality, sharing, availability, retention and publishing within the organisation.
  • Develop analytics policy, standards and guidelines.
  • Ensure successful transition of solutions into production, so that production support has the knowledge and documentation needed to support the service.

Skills, knowledge and experience
  • Robust data and technical/solutions architecture skills: sets direction for, and possesses a deep understanding of, architectures and strategies that align with industry frameworks, e.g. TOGAF.
  • Hands-on experience with analytical tools and languages such as Python, R, SQL, Azure Data Factory (and SSIS), Databricks, Power BI and Git.
  • Experience of infrastructure: cloud-based technologies for storage, data lakes, data warehouses, data streaming, databases, and ETL/ELT/transformation.
  • Experience of DevOps/DataOps methods in the development of data solutions, ensuring pipelines and processes can be automated and controlled.
  • Experience of cloud-based data and analytics initiatives using Azure, AWS, Google Cloud or similar cloud services.
  • Experience of working closely with data professionals (e.g. data scientists and data analysts) to understand their needs.
  • Experience of implementing statistical, artificial intelligence, machine learning and deep learning applications.
  • Experience of integrations (e.g. via APIs) with external vendors to share data between organisations.
  • Experience of working with external technology suppliers and service providers to deliver business solutions.

SFIA skills
  • Enterprise and business architecture (STPL) - Level 5
  • Solution architecture (ARCH) - Level 5
  • Requirements definition and management (REQM) - Level 5
  • Database design (DBDS) - Level 5
  • Analytics (INAN) - Level 4
  • Emerging technology monitoring (EMRG) - Level 4
  • Relationship management (RLMT) - Level 5
Omni RMS
Senior Data Architect
Omni RMS Warrington, Cheshire
© 2008-2026 IT Job Board