Data Engineers

Position Description

As a Senior Data Engineer, you will design and lead the implementation of data flows that link operational systems with analytics and BI platforms. You will be part of the Data Services team, which ingests, stores, maintains and exposes a variety of datasets. These datasets are used by analysts and data scientists to generate insights and support decision-making. The team is growing, and your role will involve bringing in new datasets, maintaining existing ones, and ensuring data is clean, accessible and of high quality.

Your future duties and responsibilities
- Ingest new datasets as needed by the business.
- Ensure all analytics-ready datasets are formatted clearly and meet high-quality standards.
- Investigate and resolve any defects or discrepancies in the datasets.
- Maintain the dataset catalogue and data dictionary so analysts and data scientists can easily find and use data.
- Perform any other tasks that help ensure the datasets are coherent, well-maintained and available for end-users.

Required qualifications to be successful in this role

Communication
- Engage effectively with both technical and non-technical stakeholders.
- Lead discussions in multidisciplinary teams and handle differing viewpoints.
- Represent and advocate for the Data Services team externally.

Data Analysis & Synthesis
- Profile data and analyse source systems.
- Present clear insights to support how data is used downstream.

Data Development & Integration
- Design, build and test large or complex data products.
- Look for ways to improve data by providing "conformed" (standardised) datasets.
- Choose and implement technologies that deliver resilient, scalable, future-proof data solutions.

Data Modelling
- Produce data models across multiple subject areas.
- Explain the rationale behind choosing specific models.
- Understand industry-recognised modelling standards and apply them appropriately.

Metadata & Data Management
- Ensure datasets are accompanied by appropriate metadata.
- Know tools and practices for metadata storage and usage.
- Oversee integrity, accessibility and searchability of data and metadata, and recommend improvements.

Problem Resolution (Data)
- Respond to problems in databases, data processes or data products as they arise.
- Monitor services to identify trends and take preventative action.

Programming / Build (Data Engineering)
- Use agreed standards and tools to design, code, test, document and refactor moderate-to-complex programs and scripts.
- Collaborate with others on specifications and reviews.

Testing
- Review requirements, define test conditions, identify risks and test issues.
- Apply manual and automated testing as needed, analyse and report results.

Technical Understanding & Innovation
- Understand core technical concepts relevant to the role and apply them with guidance.
- Stay aware of emerging trends, tools and techniques in data, and their impact on the organisation.

Due to the secure nature of the programme, you will need to hold UK Security Clearance or be eligible to go through this clearance. This position is available in Gloucester.

Together, as owners, let's turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging. Here, you'll reach your full potential because you are invited to be an owner from day 1 as we work together to bring our Dream to life. That's why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company's strategy and direction. Your work creates value. You'll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You'll shape your career by joining a company built to grow and last.
You'll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team, one of the largest IT and business consulting services firms in the world.
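To make the "Data Analysis & Synthesis" and data-quality duties above concrete, here is a minimal, illustrative sketch of a data-profiling pass over an ingested dataset. The dataset, field names and key field are invented for illustration; they are not part of the role description.

```python
# Hypothetical sketch: per-field completeness and key-uniqueness profiling,
# the kind of check used when validating an analytics-ready dataset.
from collections import Counter

def profile(rows, key_field):
    """Report per-field completeness and duplicate keys for a list of records."""
    total = len(rows)
    fields = {f for row in rows for f in row}
    completeness = {
        f: sum(1 for row in rows if row.get(f) not in (None, "")) / total
        for f in fields
    }
    key_counts = Counter(row.get(key_field) for row in rows)
    duplicates = {k: c for k, c in key_counts.items() if c > 1}
    return {"rows": total, "completeness": completeness, "duplicate_keys": duplicates}

# Invented sample records with a missing name, a null region and a repeated id.
rows = [
    {"id": 1, "name": "alpha", "region": "UK"},
    {"id": 2, "name": "", "region": "UK"},
    {"id": 2, "name": "gamma", "region": None},
]
report = profile(rows, key_field="id")
```

A real pipeline would typically run checks like these per ingestion batch and publish the results alongside the dataset's catalogue entry.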
01/04/2026
Full time
ECB Data Analyst (Contract)

Duration: 6 months (possibility of extension)
Location: London/Hybrid (3 days per week on site)
Rate: A highly competitive Umbrella Day Rate is available for suitable candidates

Role Profile

The External Data Analyst will support the ECB Onboarding Programme by translating business and regulatory requirements into clear external data needs and mapping these to existing datasets, systems, and vendor sources. The role will work closely with business stakeholders, Technology, Data Management, Legal, and Procurement to identify where current external data provision does not meet requirements, assess vendor and dataset gaps, and drive the delivery of remediation actions or new data sourcing. The analyst will ensure that all required data is accurately identified, traceable, contractually compliant, and available for regulatory submissions, controls, and programme milestones.

Key Accountabilities:
- Translate business and regulatory requirements into external data specifications, mapping them to existing and new datasets, systems, and vendor sources to identify gaps.
- Work collaboratively with Data and Technology teams to define ingestion, integration, storage, and metadata needs for external data supporting ECB onboarding.
- Assess the suitability and coverage of current vendor datasets, identifying data availability, completeness, and quality gaps that may impact ECB reporting or controls.
- Work with Data Quality and Data Governance teams to validate data standards, lineage, definitions, and controls, ensuring alignment with ECB expectations.
- Ensure all ingestion flows and architectural designs comply with vendor licensing, including usage rights, redistribution restrictions, and entitlement rules.
- Coordinate with Legal and Procurement to validate licensing requirements, address contractual gaps, and support sourcing of additional datasets when needed.
- Document source-to-target mappings, lineage, licensing rules, and ingestion patterns to support ECB traceability, governance artefacts, and internal audit readiness.
- Track and deliver remediation actions for data, architectural, quality, or licensing gaps, providing clear reporting of risks, issues, and dependencies to ECB programme governance.

Skills & Experience:
- Experience translating regulatory or business requirements into clear data specifications and mappings.
- Strong understanding of external data from key financial vendors (Bloomberg, Refinitiv, S&P, Moody's, Fitch).
- Proven ability to work with Data and Technology teams to understand requirements for ingestion, integration, and system flows.
- Knowledge of data licensing, usage rights, entitlement models, and redistribution constraints.
- Experience running data sourcing exercises, identifying and evaluating vendors.
- Experience collaborating with Data Governance teams on lineage, metadata, controls, and standards.
- Strong documentation skills, including mapping, lineage, and technical requirement artefacts.
- Effective stakeholder management, working with Technology, Business SMEs, Legal, and Procurement.
- Experience with regulatory onboarding programmes or data remediation.
- Familiarity with data governance frameworks (e.g., BCBS 239, EDM Council standards).
- Exposure to vendor contract review, sourcing processes, and commercial/licensing negotiations.
- Awareness of cloud data architecture concepts and licensing implications (Azure, AWS).
- Experience with data quality tooling or profiling methods.
- Technical literacy, such as a basic ability to interpret vendor data schemas.
- Knowledge of ESG-specific external data sources.

Candidates will need to show evidence of the above in their CV in order to be considered. If you feel you have the skills and experience and want to hear more about this role, 'apply now' to declare your interest in this opportunity with our client. Your application will be reviewed by our dedicated team.
We will respond to all successful applicants as soon as possible; however, please be advised that we may also contact you after this time if we need further applicants or if other opportunities arise relevant to your skillset. Pontoon is an employment consultancy. We put expertise, energy, and enthusiasm into improving everyone's chance of being part of the workplace. We respect and appreciate people of all ethnicities, generations, religious beliefs, sexual orientations, gender identities, and more. We do this by showcasing their talents, skills, and unique experience in an inclusive environment that helps them thrive. As part of our standard hiring process to manage risk, please note that background screening checks will be conducted on all hires before commencing employment. We use generative AI tools to support our candidate screening process. This helps us ensure a fair, consistent, and efficient experience for all applicants. Rest assured, all final decisions are made by our hiring team, and your application will be reviewed with care and attention.
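As a rough illustration of the source-to-target mapping and licensing documentation described in the accountabilities above, the sketch below records field-level mappings with a licence note and renders them as a CSV governance artefact. All vendor, field and dataset names are invented; this is one possible shape for such an artefact, not the programme's actual format.

```python
# Hypothetical sketch: capturing source-to-target mappings with licensing
# notes so they can be exported for traceability and audit packs.
import csv
import io
from dataclasses import dataclass, asdict, fields

@dataclass
class FieldMapping:
    source_vendor: str    # external data vendor (invented name)
    source_field: str     # field as delivered by the vendor
    target_dataset: str   # internal dataset receiving the data
    target_field: str     # field name after ingestion
    transformation: str   # rule applied in flight
    licence_note: str     # usage/redistribution constraint to verify

def to_csv(mappings):
    """Render the mappings as a CSV artefact, one row per field mapping."""
    buf = io.StringIO()
    cols = [f.name for f in fields(FieldMapping)]
    writer = csv.DictWriter(buf, fieldnames=cols)
    writer.writeheader()
    for m in mappings:
        writer.writerow(asdict(m))
    return buf.getvalue()

mappings = [
    FieldMapping("VendorX", "ISSUER_RATING", "credit_risk.ratings",
                 "issuer_rating", "map vendor scale to internal codes",
                 "internal use only; no redistribution"),
]
artefact = to_csv(mappings)
```

In practice such a register would also carry lineage identifiers and sign-off status, and would feed whatever metadata tooling the programme uses.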
01/04/2026
Contractor
Job Title: Senior Data Manager
Location: Manchester, UK
Type: Contract

Job Description:

About the role

The Trips Data Governance team manages data for all aspects of trips, excluding stays, enabling teams to understand and improve business performance and customer experience. They are currently seeking an experienced Data Manager to join their impactful Data Governance team to support the critical SAP Rise migration program.

About the SAP Rise program:

The programme is a key business modernisation initiative to migrate the current Booking Transport (BTL) SAP instance to a standardised cloud-based ERP system. The programme will introduce an interim architecture between the current platform and the SAP S/4 RISE ecosystem to test processes, data quality and governance, data contracts, and new capabilities, while supporting the transition to the North Star architecture, where data flows directly from a modernised enterprise order platform.

Role Overview:

In this role, a Senior Data Manager combines technical knowledge, business insight, and expert communication to provide critical information about data systems. This position focuses on supporting business needs with high-quality data through monitoring, issue detection, impact quantification, end-to-end data corrections, standardization, and architectural optimization. A key aspect of the role is to advocate for a Data Quality mindset across the organization. You will report to a Senior Manager and collaborate with other Data Managers on strategic objectives for data quality, governance, metadata management and regulatory compliance. This is a hands-on role where you will work closely with Business Analysts, Data Engineers, Data Scientists, and Insights Analysts to build deliverables required for the SAP Rise program.
Roles & Responsibilities

As a Senior Data Manager I (Level G), your required competencies include:

Independence in:
- AI & Ethics: ethical data handling, responsible AI, compliance, knowledge & prompt engineering, and AI application in Data Management.
- Change & Project Mgt.: change management, planning, monitoring & delivery, stakeholder management, and DQ implementation.
- Critical Thinking: decision making, and DQ - investigate & resolve.
- Data & Info. Management: MDM integrations, MDM policies, document & content lifecycle, document & content classification, data risk identification, data risk decision, data risk mitigation, metadata management, and data lifecycle management.
- Effective Communication: communication basics, cross-cultural relationships, and tailored messaging & motivational communication.
- Privacy & Security: high-pressure communication, regulatory knowledge, vulnerability & mitigation, and compliance-by-design.
- Software & Analytics: coding, visualisation, and Data Mgt. Adoption.
- Stewardship: identification & training.
- Strategy & Policy: data management strategy, policies, standards & playbooks, and maturity model & assessment.
- Solution Design: solution requirements.

Expertise in:
- Critical Thinking: root cause analysis.
- Data & Info. Management: MDM requirements.
- Software & Analytics: analysis (both listed entries), Data Mgt. Integration, and DQ Dimensions & Rules.
- Solution Design: solution monitoring & iteration.
- Stewardship: advocacy & support.

Randstad Technologies is acting as an Employment Business in relation to this vacancy.
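The "DQ Dimensions & Rules" and impact-quantification competencies above can be illustrated with a small sketch that encodes data quality rules as executable checks and counts the records each rule fails. The dimensions follow common DQ usage (completeness, validity), but the field names, thresholds and sample records are entirely hypothetical.

```python
# Hypothetical sketch: data quality rules grouped by dimension, evaluated
# against a batch of records to quantify the impact of each issue.
RULES = [
    ("completeness", "booking_id present",
     lambda r: r.get("booking_id") is not None),
    ("validity", "fare is non-negative",
     lambda r: isinstance(r.get("fare"), (int, float)) and r["fare"] >= 0),
]

def evaluate(records):
    """Return, per (dimension, rule), the number of failing records."""
    results = {}
    for dimension, name, check in RULES:
        failures = sum(1 for r in records if not check(r))
        results[(dimension, name)] = failures
    return results

# Invented sample batch: one missing key, one negative fare.
records = [
    {"booking_id": "A1", "fare": 42.0},
    {"booking_id": None, "fare": 10.0},
    {"booking_id": "A3", "fare": -5.0},
]
impact = evaluate(records)
```

Expressing rules this way makes trending straightforward: running the same rule set per batch yields a time series of failure counts that monitoring can alert on.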
01/04/2026
Contractor
Data Architect
Hybrid - RCT (South Wales)

IntaPeople are proud and excited to have been appointed to recruit an experienced Data Architect for a Welsh-based not-for-profit sector client on an exclusive growth project. This is a very exciting opportunity to join their fast-growing Data function in this newly created position. You will be joining the data team as one of the first handful of team members in this area of the business, which will work with external partners to build out the organisation's data capability offering.

As a Data Architect, you will be responsible for designing, building, and maintaining robust, scalable, and secure data pipelines and platforms that enable them to make data-driven decisions at an enterprise level. Working closely with the Head of Data Engineering, you will help grow this data function through the recruitment of further data engineering resources, while working closely with solutions architects and Software Engineers. You will also have the opportunity to progress into a leadership role if this suits your ambitions and capabilities.

You will shape, govern and assure the organisation's data architecture, defining, designing and maintaining strategic data models, standards, flows and governance structures that support organisational goals, ensure compliance, foster collaboration across business areas, and enable the organisation to make data-driven decisions.

Essential Skills
- Proven experience as a Senior Data Engineer or Data Architect (or a similar/related role).
- Experience with enterprise-level datasets.
- Expertise and practical experience in designing and aligning data models across multiple subject areas, applying recognised patterns and industry standards.
- Familiarity with structured architectural approaches found in TOGAF (data architecture) or equivalent.
- Proven experience defining and evolving data governance, including data quality, metadata, lineage, and policy assurance across services.
- Strong capability in data profiling, source system analysis and identifying links across problem domains to define common, reusable solutions.
- Experience communicating technical information and data to a non-technical audience, and working collaboratively with analysts, architects, and product owners to deliver data solutions that meet user and organisational needs.
- Ability to lead and mentor other team members.
- Demonstrable knowledge of data modelling and data warehousing within platforms such as Azure or AWS.
- Practical experience with Microsoft Azure services, including Azure Data Lake (Gen2), Synapse, Event Hubs, and Cosmos DB, within scalable cloud-based architectures.
- Robust understanding of data governance, data quality, and metadata management.

Desirable skills
- Experience with Azure Data Factory, Databricks, or Apache Spark, following modern ETL/ELT principles.
- Experience using Git, Azure DevOps, or GitHub Actions for version control, CI/CD, and collaborative data delivery.
- Experience with Big Data.
- Certification in data architecture or governance frameworks (e.g., TOGAF, DAMA, DCAM, EDMC).
- Experience using programming languages such as Python, Scala and SQL.
- Welsh language skills.

Key Responsibilities (at a glance):
- Establish data strategies and data modelling internally within the data estate.
- Lead the design and oversight of enterprise-aligned data models and supporting data architecture, ensuring that all modelling approaches follow organisational standards and recognised patterns, and enable scalable, high-quality data flows across services.
- Provide expert architectural guidance to technical teams delivering cloud-based data platforms, ensuring that data integration, modelling, metadata and design decisions align with organisational and enterprise-wide standards.
- Work closely with other business leaders to maintain governance and compliance within their data estate.
- Work closely with data analysts, data engineers, enterprise and solution architects, DevOps, and business stakeholders through regular communication and collaborative planning to ensure data solutions are closely aligned with business objectives and effectively meet user needs.
- Contribute to the development and execution of the Data Strategy by maintaining thorough documentation of data processes, architectures, and workflows, ensuring all technical and process information is systematically recorded and updated, and that data initiatives deliver business value and align with broader technology and organisational goals.
- Research emerging technologies and upcoming trends.
- Provide oversight to teams building data processing pipelines and integration patterns, ensuring their artefacts are consistent with data architecture principles and metadata strategies.
- Lead the introduction of foundational data management capabilities to improve trust, accessibility, and efficiency in an organisation that currently has limited data management capability and lacks established practices for governance, metadata standards, and quality controls.
- Design, implement, and optimise physical data models that align with the pipeline architecture, using an approach that ensures efficient query performance, scalable storage, and robust integration, and that delivers adaptable, resource-efficient data processing to meet the organisation's evolving analytical and operational demands.
- Manage the aspirations of a variety of stakeholders to enable successful project delivery; this can be challenging, especially when their priorities differ or even conflict and require reconciliation to meet business and project needs.
What you'll get in return (at a glance)
- A salary of circa £62,500 - £67,500 (depending on experience)
- 28 days annual leave + public bank holidays
- Hybrid working, based in their brand new, modern offices 1-2 days per week
- A flexible working environment
- Competitive Legal and General pension scheme (8% employer contribution)
- 4 x death in service
- The opportunity to work on modern and industry-changing projects
- Progression and development opportunities
- Free rail travel throughout Wales and discounted travel throughout the UK
- Salary sacrifice schemes such as cycle to work and electric vehicle
- A chance to truly contribute to large-scale digitalisation projects within Wales

For more information click APPLY now, or for a confidential chat call Nathan Handley on (phone number removed). This role is commutable from Swansea, Bridgend, Pontypridd, Cardiff and Newport or surrounding areas.
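As a flavour of the physical data modelling responsibility described above, here is a toy sketch of a conformed dimensional (star-schema) model, using SQLite so it runs anywhere. It is purely illustrative: the table and column names are invented and bear no relation to the client's actual estate, and a production model would target the cloud warehouse in use rather than SQLite.

```python
# Hypothetical sketch: a minimal star schema with a dimension, a fact table,
# a referential constraint, and an index on the foreign key for join speed.
import sqlite3

ddl = """
CREATE TABLE dim_route (
    route_key INTEGER PRIMARY KEY,
    origin TEXT NOT NULL,
    destination TEXT NOT NULL
);
CREATE TABLE fact_journey (
    journey_key INTEGER PRIMARY KEY,
    route_key INTEGER NOT NULL REFERENCES dim_route(route_key),
    journey_date TEXT NOT NULL,
    passengers INTEGER NOT NULL CHECK (passengers >= 0)
);
-- Index the foreign key so fact-to-dimension joins stay efficient at scale.
CREATE INDEX ix_fact_journey_route ON fact_journey(route_key);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(ddl)
conn.execute("INSERT INTO dim_route VALUES (1, 'Cardiff', 'Swansea')")
conn.execute("INSERT INTO fact_journey VALUES (1, 1, '2025-06-01', 180)")

# A typical analytical query: aggregate the fact, filter by a dimension.
total, = conn.execute(
    "SELECT SUM(f.passengers) FROM fact_journey f "
    "JOIN dim_route d ON d.route_key = f.route_key "
    "WHERE d.origin = 'Cardiff'"
).fetchone()
```

The design choice this illustrates is the one the advert's modelling bullets point at: conformed dimensions shared across facts keep queries simple and data flows consistent across services.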
31/03/2026
Full time
Data Architect Hybrid RCT (South Wales) IntaPeople are proud and excited to be appointed to recruit an experienced Data Architect for a Welsh-based not-for-profit sector client on an exclusive growth project. This is a very exciting opportunity to join their fast-growing Data function in this newly created position. You will be joining the data team as one of the first handful of team members in this area of the business which will work with external partners to build out the organisations data capability offering. As a Data Architect, you will be responsible for designing, building, and maintaining robust, scalable, and secure data pipelines and platform that enable them to make data -driven decisions at a enterprise level. Working closely with the Head of Data Engineering you will help grow out this data function with the recruitment of further data engineering resources whilst working closely with solutions architects and Software Engineers. You will also get the opportunity to progress into a leadership role if this suited the individuals desires and capabilities. You will shape, govern and assure the organisation s data architecture, defining, designing and maintaining strategic data models, standards, flows and governance structures that support organisational goals, ensure compliance, foster collaboration across business areas, and enable the organisation to make data-driven decisions Essential Skills Proven experience as a Senior Data Engineer or Data Architect (or similar/related role). Experience with Enterprise level Data sets. Expertise and practical experience in designing and aligning data models across multiple subject areas, applying recognised patterns and industry standards. Familiarity with structured architectural approaches found in TOGAF (data architecture) or equivalent. Proven experience defining and evolving data governance, including data quality, metadata, lineage, and policy assurance across services. 
Strong capability in data profiling, source system analysis and identifying links across problem domains to define common, reusable solutions. Experience communicating technical information and data to a non-technical audience, and working collaboratively with analysts, architects, and product owners to deliver data solutions that meet user and organisational needs. Ability to lead and mentor other team members. Demonstrable knowledge of data modelling and data warehousing within platforms such as Azure or AWS. Practical experience with Microsoft Azure services, including Azure Data Lake (Gen2), Synapse, Event Hubs, and Cosmos DB, within scalable cloud-based architectures. Robust understanding of data governance, data quality, and metadata management. Desirable skills Experience with Azure Data Factory, Databricks, or Apache Spark, following modern ETL/ELT principles. Experience using Git, Azure DevOps, or GitHub Actions for version control, CI/CD, and collaborative data delivery. Experience with Big Data. Certification in data architecture or governance frameworks (e.g. TOGAF, DAMA, DCAM, EDMC). Experience with programming languages such as Python, Scala and SQL. Welsh language skills. Key Responsibilities (at a glance): Establish data strategies and data modelling internally within the data estate. Lead the design and oversight of enterprise-aligned data models and supporting data architecture, ensuring that all modelling approaches follow organisational standards and recognised patterns, and enable scalable, high-quality data flows across services. Provide expert architectural guidance to technical teams delivering cloud-based data platforms, ensuring that data integration, modelling, metadata and design decisions align with organisational and enterprise-wide standards. Work closely with other business leaders to maintain governance and compliance within the data estate. 
Work closely with data analysts, data engineers, enterprise and solution architects, DevOps, and business stakeholders through regular communication and collaborative planning to ensure data solutions are closely aligned with business objectives and effectively meet user needs. Contribute to the development and execution of the Data Strategy by maintaining thorough documentation of data processes, architectures, and workflows, ensuring all technical and process information is systematically recorded and updated, and that data initiatives deliver business value and align with broader technology and organisational goals. Research emerging technologies and upcoming trends. Provide oversight to teams building data processing pipelines and integration patterns, ensuring their artefacts are consistent with data architecture principles and metadata strategies. Lead the introduction of foundational data management capabilities, including governance, metadata standards, and quality controls, to improve trust, accessibility, and efficiency in an organisation that currently has limited data management practice. Design, implement, and optimise physical data models that align with the pipeline architecture, ensuring efficient query performance, scalable storage, and robust integration, and delivering adaptable, resource-efficient data processing that meets the organisation's evolving analytical and operational demands. Manage the aspirations of a variety of stakeholders to enable successful project delivery, which can be challenging when their priorities differ or even conflict and must be reconciled to meet business and project needs. 
What you'll get in return (at a glance) A salary of circa £62,500 - £67,500 (depending on experience). 28 days annual leave plus public bank holidays. Hybrid working - based in their brand new, modern offices 1-2 days per week. A flexible working environment. Competitive Legal & General pension scheme (8% employer contribution). 4x death in service cover. The opportunity to work on modern, industry-changing projects. Progression and development opportunities. Free rail travel throughout Wales and discounted rail travel throughout the UK. Salary sacrifice schemes such as cycle-to-work and electric vehicle. A chance to truly contribute to large-scale digitalisation projects within Wales. For more information click APPLY now or for a confidential chat call Nathan Handley on (phone number removed). This role is commutable from Swansea, Bridgend, Pontypridd, Cardiff, Newport and surrounding areas.
SC Cleared Data Analyst Based at client locations, working remotely, or based in our Godalming or Milton Keynes offices. Salary up to £55k depending on experience, plus company benefits. Given the nature of the work and timescales, candidates must hold an active SC clearance. About Us Triad Group Plc is an award-winning digital, data, and solutions consultancy with over 35 years' experience primarily serving the UK public sector and central government. We deliver high-quality solutions that make a real difference to users, citizens and consumers. At Triad, collaboration thrives, knowledge is shared, and every voice matters. Our close-knit, supportive culture ensures you're valued from day one. Whether working with cutting-edge tech or shaping strategy for national-scale projects, you'll be trusted, challenged, and empowered to grow. We nurture learning through communities of practice and encourage creativity, autonomy, and innovation. If you're passionate about solving meaningful problems with smart and passionate people, Triad could be the place for you. Glassdoor score of 4.7. 96% of our staff would recommend Triad to a friend. 100% CEO approval. See for yourself some of the work that makes us all so proud: Helping law enforcement with secure intelligence systems that keep the UK safe. Supporting the UK's national meteorological service in leveraging supercomputers for next-level weather forecasting. Assisting the British government department responsible for the safety of consumer products with systems to track unsafe products. Powering systems that help the government monitor and reduce greenhouse gas emissions from commercial transport. Role Summary Triad is seeking a Data Analyst to support client engagements involving complex data environments and high volumes of data requests. 
In this role, you will analyse, document, and manage data assets, mappings, and requests while creating artefacts that enable effective understanding and dissemination of information across teams and stakeholders. Working collaboratively with technical specialists, delivery teams, and client stakeholders, you will help transform complex data structures into clear, structured outputs such as mapping documentation, data dictionaries, and reporting artefacts. Your work will support improved data transparency, accessibility, and governance across client systems and services. Key Responsibilities Manage and track large volumes of incoming data requests, ensuring they are logged, prioritised, and resolved efficiently. Analyse and maintain mappings between systems and datasets, ensuring accuracy, traceability, and alignment with business requirements. Produce clear and structured artefacts including data dictionaries, mapping documents, metadata documentation, and data flow diagrams. Translate complex technical data structures into accessible documentation for both technical and non-technical stakeholders. Work closely with delivery teams, engineers, and client stakeholders to understand data requirements and support informed decision-making. Create reports, visualisations, and supporting materials that enable the effective sharing and interpretation of data across teams. Support data governance initiatives by ensuring documentation and data artefacts remain accurate, current, and aligned with system changes. Skills and Experience Experience analysing and working with complex datasets within enterprise or government environments. Strong analytical and problem-solving skills with the ability to interpret and structure large volumes of data. Experience producing data documentation such as data dictionaries, mapping documents, or metadata artefacts. Strong stakeholder engagement skills with the ability to communicate complex data concepts clearly. 
Experience using data analysis and visualisation tools such as SQL, Excel, Power BI, Tableau, or similar technologies. Understanding of data management principles including data lineage, metadata, and data governance. Experience supporting delivery teams within Agile or digital service environments. Qualifications & Certifications A degree or equivalent qualification related to the area you work in - Desirable. Due to the nature of this position, you must be willing and eligible to achieve a minimum of SC clearance. To be eligible, you must have been a resident in the UK for a minimum of 5 years and have the right to work in the UK. Triad's Commitment To You As a growing and ambitious company, Triad prioritises your development and well-being: Continuous Training & Development: Access to top-rated Udemy Business courses. Work Environment: Collaborative, creative, and free from discrimination. Benefits: 25 days of annual leave, plus bank holidays. Matched pension contributions (5%). Private healthcare with Bupa. Gym membership support or Lakeshore Fitness access. Perkbox membership. Cycle-to-work scheme. What Our Colleagues Have to Say Please see for yourself on Glassdoor and our "Day in the Life" videos at the bottom of our Careers Page. Our Selection Process After applying for the role, our in-house talent team will contact you to discuss Triad and the position. If shortlisted, you will be invited for: 1. An interview with our Data team, including a career review and cultural fit assessment. 2. An interview with our management team. We aim to complete interviews and progress candidates to offer stage within 2-3 weeks of the initial conversation. Other Information If this role is of interest to you or you would like further information, please submit your application now! 
Triad is an equal opportunities employer and welcomes applications from all suitably qualified people regardless of sex, race, disability, age, sexual orientation, gender reassignment, religion, or belief. We are proud that our recruitment process has been recognised as inclusive and accessible to disabled people who meet the minimum criteria for any role. We are a signatory of the Tech Talent Charter, which aims to bring industries and organisations together to drive greater inclusion and diversity in technology roles, and we are a Disability Confident Leader.
31/03/2026
Full time
Randstad Technologies Recruitment
City, Manchester
Senior Data Manager - SAP Rise Program (Contract, Manchester, hybrid) Are you a technical data expert with a passion for high-stakes business transformation? We are seeking an experienced Senior Data Manager to join a critical global initiative focused on modernizing enterprise architecture through the SAP Rise migration program. In this role, you will play a pivotal part in transitioning from legacy systems to a standardized cloud-based ERP ecosystem. You will work at the intersection of technical systems and business insight to ensure data remains high-quality, compliant, and architecturally optimized during this large-scale migration. The Role Data Governance & Quality: Drive a "Data Quality mindset" by monitoring systems, detecting issues, and executing end-to-end data corrections and standardizations. Migration Strategy: Support the introduction of interim architectures to test data contracts, quality governance, and new capabilities as we move toward a modern "North Star" data flow. Collaboration: Partner with Data Engineers, Scientists, and Business Analysts to build essential deliverables for the SAP Rise program. Strategic Oversight: Manage metadata, Master Data Management (MDM) integrations, and regulatory compliance while contributing to data management strategy and policies. Key Competencies Technical Expertise: Advanced skills in root cause analysis, MDM requirements, DQ Dimensions & Rules, and solution monitoring. Data Management: Proven ability in data lifecycle management, risk identification, and metadata management. Software & Analytics: Independent proficiency in coding, data visualization, and the adoption of data management tools. Project Leadership: Experienced in change management, stakeholder engagement, and delivering complex data quality implementations. AI & Ethics: Knowledgeable in ethical data handling, responsible AI, and compliance. Ready to help steer one of the most significant data migrations in the travel tech industry? 
Please apply here or share your CV to (url removed) Randstad Technologies is acting as an Employment Business in relation to this vacancy.
29/03/2026
Contractor
Randstad Technologies Recruitment
City, Manchester
Job Title - Senior Data Manager Location - Manchester, UK Type - Contract Job Description: About the role The Trips Data Governance team manages data for all aspects of trips, excluding stays, enabling teams to understand and improve business performance and customer experience. They are currently seeking an experienced Data Manager to join their impactful Data Governance team to support the critical SAP Rise migration program. About the SAP Rise program: The programme is a key business modernisation initiative to migrate the current Booking Transport (BTL) SAP instance to a standardised cloud-based ERP system. The programme will introduce an interim architecture between the current platform and the SAP S/4 RISE ecosystem to test processes, data quality and governance, data contracts, and new capabilities, while supporting the transition to the North Star architecture where data flows directly from a modernised enterprise order platform. Role Overview: In this role, a Senior Data Manager combines technical knowledge, business insight, and expert communication to provide critical information about data systems. This position focuses on supporting business needs with high-quality data through monitoring, issue detection, impact quantification, end-to-end data corrections, standardization, and architectural optimization. A key aspect of the role is to advocate for a Data Quality mindset across the organization. You will report to a Senior Manager and collaborate with other Data Managers on strategic objectives for data quality, governance, metadata management and regulatory compliance. This is a hands-on role where you will work closely with Business Analysts, Data Engineers, Data Scientists, and Insights Analysts to build deliverables required for the SAP Rise program. 
Roles & Responsibilities As a Senior Data Manager I (Level G), your required competencies include: Independence in: AI & Ethics: Independent in ethical data handling, responsible AI, compliance, knowledge & prompt engineering, and AI application in Data Management. Change & Project Mgt.: Independent in change management, planning, monitoring & delivery, stakeholder management, and DQ implementation. Critical Thinking: Independent in decision making, and DQ - investigate & resolve. Data & Info. Management: Independent in MDM integrations, MDM policies, document & content lifecycle, document & content classification, data risk identification, data risk decision, data risk mitigation, metadata management, and data lifecycle management. Effective Communication: Independent in communication basics, cross-cultural relationships, and tailored messaging & motivational communication. Privacy & Security: Independent in high-pressure communication, regulatory knowledge, vulnerability & mitigation, and compliance-by-design. Software & Analytics: Independent in coding, visualisation, and Data Mgt. Adoption. Stewardship: Independent in identification & training. Strategy & Policy: Independent in data management strategy, policies, standards & playbooks, and maturity model & assessment. Solution Design: Independent in solution requirements. Expertise Critical Thinking: Expertise in root cause analysis. Data & Info. Management: Expertise in MDM requirements. Software & Analytics: Expertise in analysis (both listed entries), Data Mgt. Integration, and DQ Dimensions & Rules. Solution Design: Expertise in solution monitoring & iteration. Stewardship: Expertise in advocacy & support. Randstad Technologies is acting as an Employment Business in relation to this vacancy.
27/03/2026
Contractor
Data Architect Location: Hybrid working (minimum 2 days a week in Staffordshire Head Office) Type: Full-time Salary: Up to £70,000 Are you a visionary Data Architect ready to shape the future of enterprise data and AI? I have an amazing opportunity to join my national market-leading client to lead the design and delivery of scalable, secure, and high-performing data solutions that power transformation across a complex, data-rich organisation. What you'll be doing As a Data Architect, you'll be at the heart of enterprise-wide change programmes, working closely with the Lead Data Architect to: Design and deliver enterprise Data & AI architecture aligned with strategic goals. Develop conceptual, logical, and physical data models for both operational and analytical use cases. Define data architecture for solutions involving ETL, data integration, and migration. Establish and maintain data architecture assets, including standards, policies, and integration patterns. Collaborate with Data Stewards, Analysts, and SMEs to define and ratify common reference data and hierarchies. Ensure alignment with enterprise data models and governance frameworks. Support physical schema implementation in data platforms, ensuring consistency and performance. Document solutions including data models, configurations, and architecture decisions. Provide quality assurance across development activity and contribute to architectural standards. Stay ahead of the curve by researching emerging trends in Data & AI and identifying opportunities to drive innovation. What you'll bring Proven experience in data architecture, data modelling, and enterprise data strategy. Strong understanding of data governance, data quality, and metadata management. Experience designing data solutions across operational and analytical systems. Excellent stakeholder engagement and communication skills. A passion for innovation and continuous improvement in the data space. 
Tech you'll work with Cloud & Data Platforms: Azure Synapse, Azure Data Lake, Azure Data Factory. Data Modelling & Integration: SQL, ETL tools, data pipelines. Architecture & Governance: Enterprise data models, data catalogues, metadata management. Collaboration & Documentation: Agile delivery, architecture documentation, stakeholder workshops. Why This Role? You'll be joining a forward-thinking team where data is central to decision-making and innovation. This is your chance to influence enterprise architecture, work on high-impact projects, and help shape a data-driven future. If you feel you are a great match, then please apply with an up-to-date CV. Studies suggest that women tend not to apply for a job if their CV isn't a perfect fit. Here, talent takes precedence over experience. So, if you like the role and think you could be awesome at it in time, go ahead and apply. My client is unable to sponsor, so please only apply if you can work full time without any restrictions.
01/10/2025
Full time
Data Architect Location: Hybrid working (minimum 2 days a week in Staffordshire Head Office) Type: Full-time Salary: Up to £70,000 Are you a visionary Data Architect ready to shape the future of enterprise data and AI ? I have an amazing opportunity to join my national market leading client to lead the design and delivery of scalable, secure, and high-performing data solutions that power transformation across a complex, data-rich organisation. What you'll be doing As a Data Architect, you'll be at the heart of enterprise-wide change programmes, working closely with the Lead Data Architect to: Design and deliver enterprise Data & AI architecture aligned with strategic goals. Develop conceptual, logical, and physical data models for both operational and analytical use cases. Define data architecture for solutions involving ETL , data integration, and migration. Establish and maintain data architecture assets, including standards, policies, and integration patterns. Collaborate with Data Stewards, Analysts, and SMEs to define and ratify common reference data and hierarchies. Ensure alignment with enterprise data models and governance frameworks. Support physical schema implementation in data platforms, ensuring consistency and performance. Document solutions including data models, configurations, and architecture decisions. Provide quality assurance across development activity and contribute to architectural standards. Stay ahead of the curve by researching emerging trends in Data & AI and identifying opportunities to drive innovation. What you'll bring Proven experience in data architecture, data modelling, and enterprise data strategy. Strong understanding of data governance, data quality, and metadata management. Experience designing data solutions across operational and analytical systems. Excellent stakeholder engagement and communication skills. A passion for innovation and continuous improvement in the data space. 
Tech you'll work with
Cloud & Data Platforms: Azure Synapse, Azure Data Lake, Azure Data Factory
Data Modelling & Integration: SQL, ETL tools, data pipelines
Architecture & Governance: Enterprise data models, data catalogues, metadata management
Collaboration & Documentation: Agile delivery, architecture documentation, stakeholder workshops
Why This Role?
You'll be joining a forward-thinking team where data is central to decision-making and innovation. This is your chance to influence enterprise architecture, work on high-impact projects, and help shape a data-driven future.
If you feel you are a great match, then please apply with an up-to-date CV. Studies suggest that women tend not to apply for a job if their CV isn't a perfect fit. Here, talent takes precedence over experience. So, if you like the role and think you could be awesome at it in time, go ahead and apply.
My client is unable to sponsor, so please only apply if you can work full time without any restrictions.
Locations: Boston, London, Atlanta
Who We Are
Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation: inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures, and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive.
What You'll Do
Join the Data Layer Team, a global portfolio transforming our organization into a data-driven enterprise! The Data Layer Team is a portfolio of 30 people who build essential data platforms, products, and capabilities to empower our clients and colleagues with high-quality, actionable insights. Our focus is on creating scalable data solutions and advancing our data infrastructure to drive informed decision-making across the company.
As a Use Case Enablement Product Analyst within BCG's Data Layer Team, you will collaborate with the Use Case Enablement Product Owner and cross-functional teams to gather and analyze business and data requirements. Your role is critical to bridging the gap between business stakeholders and technical teams, ensuring that new GenAI use cases are well-scoped, feasible, and aligned with user needs. You will work with various GenAI use cases and applications, including:
Consultant Journey - internal GenAI assistants that change the way consultants work to provide value to our clients.
Practice Area GenAI applications - developed by functional practice areas to support various capabilities (e.g., outside-in rapid cost diagnostics or Accelerated Cost Analysis).
Data catalog - a centralized library that provides consulting teams with access to critical tools and data assets across BCG.
These tools require ingestion of multiple data sources, and your role will be to support the selection of eligible datasets and identify the best sources for each GenAI use case. You will ensure that these use cases and applications are equipped with the necessary data pipelines to maximize their impact on business and users. You will play a key role in use case discovery and requirements refinement, while also managing the continuous maintenance and enhancement of data asset quality, accuracy, and stability to support evolving use cases. Detailed responsibilities include:
Deliver business results and customer value
Support the development of GenAI-enabled data products by helping translate business needs into actionable data requirements
Help to define requirements for user stories and structure the backlog with a focus on measurable outcomes
Help shape GenAI-enabled use cases that contribute to real business impact, through thoughtful prioritization and attention to detail
Participate in evaluating use case success metrics and learn from what works (and what doesn't)
Serve as the voice of the customer or end-user
Translate business needs into user stories, engaging end users for continuous feedback
Engage in continuous data discovery exercises to understand the most valuable data assets that satisfy customer needs
Balance customer value, technical feasibility, and business impact when making prioritization decisions
Work with product teams to integrate GenAI-enhanced offerings into BCG systems and workflows
Deliver high-quality outcomes
Collaborate with engineers, architects, and product teammates to test and validate data pipelines - ensuring solutions are robust,
accurate, and useful
Contribute to documentation that helps others understand the "why" and "how" behind what's been built, supporting long-term scalability and reuse
Work with stakeholders across BCG (e.g., Practice Areas, Knowledge Teams) to ensure data products are grounded in real needs and enable meaningful use
Share observations, risks, or open questions early - your input helps the team avoid missteps and refine solutions before they reach users
YOU'RE GOOD AT
Being user-focused - Deeply understanding and translating business needs into GenAI-enabled solutions, ensuring offerings address real user challenges
Communicating with transparency - Clearly and openly engaging with stakeholders at all levels, ensuring alignment, visibility, and trust across teams
Bringing a data-driven approach to decision-making - Leveraging qualitative and quantitative insights to prioritize initiatives, measure impact, and refine solutions
Facilitating data discovery sessions - Engaging business stakeholders to capture business context, user intent, and data solution objectives
Breaking down complex challenges - Applying critical reasoning and creative problem-solving to analyze problem statements and design effective, scalable solutions
Collaborating with product and technical teams - Working closely with POs, engineers, and data stewards to ensure solutions meet expectations and constraints
Collaborating with development teams - Ensuring prioritized data sources align with GenAI solution requirements, business objectives, and technical feasibility
Defining and tracking KPIs - Establishing measurable success metrics to drive squad performance and ensure data products align with OKRs
Documenting thoughtfully - Creating simple, clear artifacts (e.g., data definitions, flow diagrams, test plans) that others can build from
Contributing to continuous improvement - Bringing curiosity and a mindset of learning, always looking for ways to improve how the team works or delivers
What You'll
Bring
4-6+ years of experience in a product analyst, business analyst, or data analyst role, ideally supporting data or AI-related projects
Project management skills, with the ability to build project plans, track progress, drive alignment and manage risks
Proven experience in AI, GenAI, or data product development, preferably with a focus on GenAI-powered user-facing applications
Experience in enterprise software development, data engineering, or AI-driven transformation initiatives
Experience working with structured and unstructured data, and familiarity with modern data platforms (e.g., Snowflake, AWS, SharePoint)
A working knowledge of agile ways of working, and openness to learning through iteration and feedback
Understanding of enterprise data governance, AI model integration, and scalable data architecture
Familiarity with AI/ML technologies, including GenAI models (e.g., OpenAI GPT, RAG, fine-tuning models, or machine learning frameworks)
Good communication skills, especially when collaborating across different functions or surfacing potential risks or questions
Familiarity with tools like JIRA, Confluence, Excel, or lightweight data catalog platforms is a plus
Experience in a consulting or client-service environment is helpful, but not required
Who You'll Work With
BCG Global Consulting Practice Areas (Functional & Industry) and Data Teams - Partnering with business leaders to transform prioritized offerings into GenAI-enabled solutions, collaborating with teams such as the Data Product Portfolio, Data Governance CoE, Master Data Management, Enterprise Architecture, and Data Product Development
Data Layer Offer Enablement Product Owner Lead (PO) - Aligning on strategic priorities, roadmap development, and execution
Data Layer Offer Enablement Team - Collaborating amongst Data Product Analysts, and working alongside data engineers, lead architects, data stewards, and QA engineers
Data Layer Data Governance Team - Partnering to ensure that data assets meet
quality, metadata, and compliance standards, and are appropriately catalogued for reuse
Product Teams - Collaborating with BCG's product team members to integrate required data sources into GenAI-enhanced offerings
Agile Coaches - Embedding agile principles into daily work, leveraging coaching support to drive an iterative and user-focused approach to GenAI use case development
Data Product Consumers (Internal Customers) - Translating their voice and needs into user stories, ensuring their requirements are reflected in the backlog, and actively engaging them for feedback and validation
Additional info
In the US, we have a compensation transparency approach. Total compensation for this role includes base salary, annual discretionary performance bonus, retirement contribution, and a market-leading benefits package described below. The base salary range for this role in Boston is $102,000.00 - $137,333.33. This is an estimated range; however, specific base salaries within the range depend on various factors such as experience and skill set. It is not common for new BCG employees to be hired at the high end of the salary range. BCG regularly reviews its ranges to ensure market competitiveness. In addition to your base salary, your total compensation will include a bonus of up to 12% and a generous retirement contribution that starts at 5% and moves to 10% after 2 years. Click apply for full job details.
01/10/2025
Full time
Data Migration Developers x 2
Salary: £45,000 - £50,000 + company benefits
Location - Hybrid (2-3 days per week in one of the following offices Bristol, Leicester or Plymouth)
Full time/permanent vacancy
You must be a sole British National and be able to obtain BPSS/SC and NPPV clearance.
JOB PURPOSE
Working with the SAP BusinessObjects Data Services (BODS) Extract, Transform, Load (ETL) toolset to design, develop and support end-to-end data migration processes associated with both SAP and non-SAP based programs.
Design and implement data quality and data cleansing requirements and methodologies.
Manage support, tools, and maintenance of integration processes.
Interact and collaborate with other data team members and business analysts.
Maintain and Support Corporate ETL & Data Migration Solutions in both SAP & Microsoft platforms.
Understand SAP data structures & tables, and load data to SAP using LSMW and BODS directly.
KEY TASKS
Design and develop SAP BODS jobs for data conversion and data integration to and from SAP and other sources.
Maintain and enhance the existing toolset to ensure delivery of maximum value as the project evolves.
Prepare Excel spreadsheets (Data Collection Workbooks) using SAP load formats (iDoc/BAPI/ABAP Programs) as load templates for data analysts to populate.
Identify and suggest existing or new emerging standards and best practices.
Manage and monitor job schedules and provide fixes for any failed schedules/jobs.
Perform data cleansing, modeling (physical and logical), profiling, enterprise data architecting, data quality and data governance.
Collaborate with the technical team to maintain BODS server architecture, data governance and end-to-end processes.
Performance tuning of BODS ETL and data models.
Use the local repository metadata to generate reports for input into program and management reporting cycles.
Move the projects from DEV to SIT, SIT to UAT and UAT to PRD.
Schedule transformation jobs in Management Console to produce data load files in text format for loading via LSMW or directly into SAP tables and structures.
Use Business Objects Data Integrator (BODI) to create projects, batch jobs, workflows and dataflows.
Essential
Data modeling and data architecture skills using BODS ETL toolset
Expertise in the SAP data extraction process and BODS job development.
Evidence of working in a challenging and complex organisation and demonstrable experience of contributing to a technical change.
Good experience of understanding and writing DataStage ETL specifications and delivering technical solutions to an agreed standard.
Experience delivering solutions to an agreed standard using industry standard methodologies.
Proven experience working in a small team delivering technical solutions to project requirements.
Bachelor's Degree in an Information Systems field, or preferably at least 3+ years' experience designing and/or delivering BODS ETL solutions/programs.
Desirable
Business Objects Data Services certifications (highly desirable).
Experience in one or more SAP full lifecycle implementations using SAP BODS ETL toolset.
Knowledge of Master Data Management and SAP MDM/MDG products.
Experience in SAP functional areas such as Finance, Cost Controlling, Supply Chain, Contract Management and Plant Maintenance.
---
Fusion People are committed to promoting equal opportunities to people regardless of age, gender, religion, belief, race, sexuality or disability. We operate as an employment agency and employment business. You'll find a wide selection of vacancies on our website.
01/06/2025
Data Migration Developers x 2
Salary: £45,000 - £50,000 + company benefits
Location - Hybrid (2-3 days per week in one of the following offices Bristol, Leicester or Plymouth)
Full time/permanent vacancy
You must be a sole British Nationals and be able to obtain BPSS/SC and NPPV clearance.
JOB PURPOSE
Working with the Business Object Data Services (BODS) Extract, Transform, Load (ETL) toolset to design, develop and support end-to-end data migration processes associated with both SAP and non SAP based programs.
Design and implement data quality and data cleansing requirements and methodologies.
Manage support, tools, and maintenance of integration processes.
Interact and collaborate with other data team members and business analysts.
Maintain and Support Corporate ETL & Data Migration Solutions in both SAP & Microsoft platforms.
Understanding of SAP data structures & tables and loading data to SAP using LSMW and BODS directly.
KEY TASKS
Design and develop SAP BODS jobs for data conversion and data integration to and from SAP and other sources.
Maintain and enhance the existing toolset to ensure delivery of maximum value as the project evolves.
Prepare Excel spreadsheets (Data Collection Workbooks) using SAP load formats (iDoc/BAPI/ABAP Programs) as load templates for data analysts to populate.
Identify and suggest existing or new emerging standards and best practices.
Manage and monitor job schedules and provide fix for any failed schedules/jobs.
Data cleansing, modeling (physical and logical), profiling, enterprise data architecting, data quality and data governance
Collaborate with technical team to maintain BODS server architecture, data governance and end-end processes.
Performance tuning of BODS ETL and data models.
Use the local repository metadata to generate reports for input into program and management reporting cycles.
Move the projects from DEV to SIT, SIT to UAT and UAT to PRD.
Schedule transformation jobs in Management Console to produce data load files in text format for loading via LSMW or directly into SAP tables and structures.
Use Business Objects Data Integrator (BODI) to create projects, batch jobs, workflows and dataflows.
Essential
Data modeling and data architecture skills using BODS ETL toolset
Expertise in SAP Data Extraction process BODS Jobs development.
Evidence of working in a challenging and complex organisation and demonstrable experience of contributing to a technical change.
Good experience of understanding and writing Data Stage ETL specifications and delivery of technical solutions to an agreed standard.
Experience delivering solutions to an agreed standard using industry standard methodologies.
Proven experience working in a small team delivering technical solutions to project requirements.
Bachelor's Degree in an Information Systems Field or preferably at least 3 years plus experience designing and/or delivering BODS ETL solutions/programs.Desirable
Business Objects Data Services certifications (highly desirable).
Experience in one or more SAP full lifecycle implementations using SAP BODS ETL toolset.
Knowledge of Master Data Management and SAP MDM/MDG products.
Experience in SAP functional areas such as Finance, Cost Controlling, Supply Chain, Contract Management and Plant Maintenance.--- Fusion People are committed to promoting equal opportunities to people regardless of age, gender, religion, belief, race, sexuality or disability. We operate as an employment agency and employment business. You'll find a wide selection of vacancies on our website
Data Migration Developers x 2
Salary: £45,000 - £50,000 + company benefits
Location - Hybrid (2-3 days per week in one of the following offices Bristol, Leicester or Plymouth)
Full time/permanent vacancy
You must be a sole British Nationals and be able to obtain BPSS/SC and NPPV clearance.
JOB PURPOSE
Working with the Business Object Data Services (BODS) Extract, Transform, Load (ETL) toolset to design, develop and support end-to-end data migration processes associated with both SAP and non SAP based programs.
Design and implement data quality and data cleansing requirements and methodologies.
Manage support, tools, and maintenance of integration processes.
Interact and collaborate with other data team members and business analysts.
Maintain and Support Corporate ETL & Data Migration Solutions in both SAP & Microsoft platforms.
Understanding of SAP data structures & tables and loading data to SAP using LSMW and BODS directly.
KEY TASKS
Design and develop SAP BODS jobs for data conversion and data integration to and from SAP and other sources.
Maintain and enhance the existing toolset to ensure delivery of maximum value as the project evolves.
Prepare Excel spreadsheets (Data Collection Workbooks) using SAP load formats (iDoc/BAPI/ABAP Programs) as load templates for data analysts to populate.
Identify and suggest existing or new emerging standards and best practices.
Manage and monitor job schedules and provide fix for any failed schedules/jobs.
Data cleansing, modeling (physical and logical), profiling, enterprise data architecting, data quality and data governance
Collaborate with technical team to maintain BODS server architecture, data governance and end-end processes.
Performance tuning of BODS ETL and data models.
Use the local repository metadata to generate reports for input into program and management reporting cycles.
Move the projects from DEV to SIT, SIT to UAT and UAT to PRD.
Schedule transformation jobs in Management Console to produce data load files in text format for loading via LSMW or directly into SAP tables and structures.
Use Business Objects Data Integrator (BODI) to create projects, batch jobs, workflows and dataflows.
Essential
Data modeling and data architecture skills using BODS ETL toolset
Expertise in SAP Data Extraction process BODS Jobs development.
Evidence of working in a challenging and complex organisation and demonstrable experience of contributing to a technical change.
Good experience of understanding and writing Data Stage ETL specifications and delivery of technical solutions to an agreed standard.
Experience delivering solutions to an agreed standard using industry standard methodologies.
Proven experience working in a small team delivering technical solutions to project requirements.
Bachelor's Degree in an Information Systems Field or preferably at least 3 years plus experience designing and/or delivering BODS ETL solutions/programs.Desirable
Business Objects Data Services certifications (highly desirable).
Experience in one or more SAP full lifecycle implementations using SAP BODS ETL toolset.
Knowledge of Master Data Management and SAP MDM/MDG products.
Experience in SAP functional areas such as Finance, Cost Controlling, Supply Chain, Contract Management and Plant Maintenance.--- Fusion People are committed to promoting equal opportunities to people regardless of age, gender, religion, belief, race, sexuality or disability. We operate as an employment agency and employment business. You'll find a wide selection of vacancies on our website
01/06/2025
Data Migration Developers x 2
Salary: £45,000 - £50,000 + company benefits
Location - Hybrid (2-3 days per week in one of the following offices Bristol, Leicester or Plymouth)
Full time/permanent vacancy
You must be a sole British Nationals and be able to obtain BPSS/SC and NPPV clearance.
JOB PURPOSE
Working with the Business Object Data Services (BODS) Extract, Transform, Load (ETL) toolset to design, develop and support end-to-end data migration processes associated with both SAP and non SAP based programs.
Design and implement data quality and data cleansing requirements and methodologies.
Manage support, tools, and maintenance of integration processes.
Interact and collaborate with other data team members and business analysts.
Maintain and Support Corporate ETL & Data Migration Solutions in both SAP & Microsoft platforms.
Understanding of SAP data structures & tables and loading data to SAP using LSMW and BODS directly.
KEY TASKS
Design and develop SAP BODS jobs for data conversion and data integration to and from SAP and other sources.
Maintain and enhance the existing toolset to ensure delivery of maximum value as the project evolves.
Prepare Excel spreadsheets (Data Collection Workbooks) using SAP load formats (iDoc/BAPI/ABAP Programs) as load templates for data analysts to populate.
Identify and suggest existing or new emerging standards and best practices.
Manage and monitor job schedules and provide fixes for any failed schedules/jobs.
Perform data cleansing, modelling (physical and logical), profiling, enterprise data architecture, data quality and data governance activities.
Collaborate with the technical team to maintain BODS server architecture, data governance and end-to-end processes.
Performance tuning of BODS ETL and data models.
Use the local repository metadata to generate reports for input into program and management reporting cycles.
Move the projects from DEV to SIT, SIT to UAT and UAT to PRD.
Schedule transformation jobs in Management Console to produce data load files in text format for loading via LSMW or directly into SAP tables and structures.
Use Business Objects Data Integrator (BODI) to create projects, batch jobs, workflows and dataflows.
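The conversion tasks above are normally built in the BODS Designer GUI rather than written as code, but their basic shape (extract source records, standardise fields, emit a text-format load file for LSMW) can be sketched in plain Python. The field names, header row and length cap below are illustrative assumptions, not an actual SAP load template.

```python
import csv
import io

# Illustrative only: field names and layout are assumed, not a real SAP template.
def build_load_file(source_rows, out):
    """Extract -> transform -> write a tab-delimited load file (LSMW-style)."""
    writer = csv.writer(out, delimiter="\t", lineterminator="\n")
    writer.writerow(["MATNR", "MAKTX", "MEINS"])  # assumed header fields
    for row in source_rows:
        # Transform: trim whitespace, upper-case keys, default a missing unit.
        writer.writerow([
            row["material"].strip().upper(),
            row["description"].strip()[:40],   # assumed SAP-style length cap
            row.get("uom", "EA") or "EA",
        ])

rows = [
    {"material": " mat-001 ", "description": "Hex bolt M8 ", "uom": "EA"},
    {"material": "mat-002", "description": "Washer", "uom": ""},
]
buf = io.StringIO()
build_load_file(rows, buf)
print(buf.getvalue())
```

In a real migration, the cleansing rules and target structures would come from the Data Collection Workbooks and SAP load formats described above; this sketch only shows where such rules sit in the flow.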
Essential
Data modeling and data architecture skills using BODS ETL toolset
Expertise in SAP data extraction processes and BODS job development.
Evidence of working in a challenging and complex organisation and demonstrable experience of contributing to a technical change.
Good experience of understanding and writing Data Stage ETL specifications and delivery of technical solutions to an agreed standard.
Experience delivering solutions to an agreed standard using industry standard methodologies.
Proven experience working in a small team delivering technical solutions to project requirements.
Bachelor's Degree in an Information Systems field, or preferably at least 3 years' experience designing and/or delivering BODS ETL solutions/programs.
Desirable
Business Objects Data Services certifications (highly desirable).
Experience in one or more SAP full lifecycle implementations using SAP BODS ETL toolset.
Knowledge of Master Data Management and SAP MDM/MDG products.
Experience in SAP functional areas such as Finance, Cost Controlling, Supply Chain, Contract Management and Plant Maintenance.
---
Fusion People are committed to promoting equal opportunities to people regardless of age, gender, religion, belief, race, sexuality or disability. We operate as an employment agency and employment business. You'll find a wide selection of vacancies on our website.
Location
Whilst you may have any of our UK offices as a base location, you must be fully flexible in terms of assignment location, as these roles may involve periods of time away from home during the week at short notice. Capgemini requires our employees to be geographically mobile and able to travel to customer sites to perform our jobs.
Who you'll be working with
The Cloud Data Platforms team is part of the Insights and Data Global Practice and has seen strong growth and continued success across a variety of projects and sectors. Cloud Data Platforms is the home of the Data Engineers, Platform Engineers, Solutions Architects and Business Analysts who are focused on driving our customers' digital and data transformation journeys using modern cloud platforms. We specialise in using the latest frameworks, reference architectures and technologies on AWS, Azure and GCP. We continue to grow and are looking for talented individuals who want to join our high-performing team. If you would like to develop your career as part of a team of highly skilled professionals who are passionate about increasing the value of data and analytics in organisations, you have come to the right place.
The focus of your role
We are looking for strong GCP Data Engineers who are passionate about Cloud technology and who ideally have skills in many of the following areas:
• Build and deliver GCP data engineering solutions as part of a larger project
• Use Google Data Products tools (e.g. BigQuery, Dataflow, Dataproc, AI Building Blocks, Looker, Cloud Data Fusion, Dataprep, etc.) to build solutions for our customers
• Experience in Spark (Scala/Python/Java) and Kafka
• Experience in MDM, Metadata Management, Data Quality and Data Lineage tools
• E2E Data Engineering and Lifecycle management (including non-functional requirements and operations)
• E2E Solution Design skills - prototyping, usability testing and data visualisation literacy
• Experience with SQL and NoSQL modern data stores
• Build relationships with client stakeholders to establish a high level of rapport and confidence
• Work with clients, local teams and offshore resources to deliver modern data products
• Work effectively on client sites, in Capgemini offices and from home
• Use the GCP Data-focused Reference Architecture
• Design and build data service APIs
• Analyse current business practices, processes and procedures and identify future opportunities for leveraging GCP services
• Design solutions and support the planning and implementation of data platform services, including sizing, configuration and needs assessment
• Implement effective metrics and monitoring processes
Skills Needed
• Minimum 3-4 years of experience with Google Data Products tools (e.g. BigQuery, Dataflow, Dataproc, AI Building Blocks, Looker, Cloud Data Fusion, Dataprep, etc.)
• Google Cloud Platform
• Java, Scala, Python, Spark, SQL
• Experience of developing enterprise-grade ETL/ELT data pipelines
• Deep understanding of data manipulation/wrangling techniques
• Demonstrable knowledge of applying Data Engineering best practices (coding practices to DS, unit testing, version control, code review)
• Big Data ecosystems: Cloudera/Hortonworks, AWS EMR, GCP Dataproc or GCP Cloud Data Fusion
• NoSQL databases: DynamoDB/Neo4j/Elastic, Google Cloud Datastore
• Snowflake Data Warehouse/Platform
• Streaming technologies and processing engines: Kinesis, Kafka, Pub/Sub and Spark Streaming
• Experience of working with CI/CD technologies: Git, Jenkins, Spinnaker, GCP Cloud Build, Ansible etc.
• Experience and knowledge of application containerisation: Docker, Kubernetes etc.
• Experience building and deploying solutions to Cloud (AWS, Google Cloud), including Cloud provisioning tools
• Strong interpersonal skills with the ability to work with clients to establish requirements in non-technical language
• Ability to translate business requirements into plausible technical solutions for articulation to other development staff
• Good understanding of Lambda architecture patterns
• Good understanding of Data Governance, including Master Data Management (MDM) and Data Quality tools and processes
• Influencing and supporting project delivery through involvement in project/sprint planning and QA
• Experience with Agile methodology
• Experience with collaboration tools such as JIRA, Kanban boards, Confluence etc.
Nice to Haves:
• Knowledge of other cloud platforms
• AWS (e.g. Athena, Redshift, Glue, EMR)
• Relevant certifications
• Python
• Snowflake
• Databricks
What we'll offer you
Professional development. Accelerated career progression. An environment that encourages entrepreneurial spirit. It's all on offer at Capgemini, and although collaboration is at the core of the way we work, we also recognise individual needs with a flexible benefits package you can tailor to suit you.
Why we're different
At Capgemini, we help organisations across the world become more agile, more competitive and more successful. Smart, tailored, often groundbreaking technical solutions to complex problems are the norm. But so, too, is a culture that's as collaborative as it is forward-thinking. Working closely with each other, and with our clients, we get under the skin of businesses and to the heart of their goals. You will too. Capgemini is proud to represent nearly 130 nationalities and its cultural diversity. Our holistic definition of diversity extends beyond gender, gender identity, sexual orientation, disability, ethnicity, race, age and religion. Capgemini views diversity as everything that makes us who we are as an organisation, including our social background, our experiences in life and work, our communication styles and even our personality. These dimensions contribute to the type of diversity we value the most: diversity of thought.
About Capgemini
Capgemini is a global leader in partnering with companies to transform and manage their business by harnessing the power of technology. The Group is guided every day by its purpose of unleashing human energy through technology for an inclusive and sustainable future. It is a responsible and diverse organisation of 270,000 team members in nearly 50 countries. With its strong 50-year heritage and deep industry expertise, Capgemini is trusted by its clients to address the entire breadth of their business needs, from strategy and design to operations, fuelled by the fast-evolving and innovative world of cloud, data, AI, connectivity, software, digital engineering and platforms. The Group reported 2020 global revenues of €16 billion. Discover more about what Capgemini can offer you.
23/09/2022
Full time
Company Description
Netcompany is one of the fastest-growing and most successful IT services companies in Europe. We are true leaders in digitalisation and are proud to build, implement and support innovative IT solutions for some of the most exciting and prestigious organisations in the world. Our vision is to be the leading digital challenger in Europe, pioneering the next generation of IT consulting, with a strong focus on providing customers true value and delivery certainty. We are experts in the management of complex change and build robust and scalable IT solutions utilising agile delivery methods, thereby improving our clients' speed to market and allowing them to rapidly adapt to evolving business requirements.
Job Description
Netcompany is currently looking for an experienced Senior Data Architect to act as client architecture lead for various programmes of work. The Data Architect is a multi-disciplinary role, requiring collaboration with a wide range of stakeholders, from developers to C-level executives. You will be responsible for working with customers to influence and shape end-to-end data management and analytics workstreams within fast-paced and complex programmes, engaging in a wide variety of data management and analytics activities. Although we are looking for specialists, it is expected that a Senior Data Architect will have a broad understanding across the full spectrum of Data Engineering, Data Science and Data Governance.
Key Responsibilities
- Support and influence Data Strategy, and Data Governance Policies and Principles
- Promote Data Management standards and best practices
- Support business and data requirements gathering
- Provide input and guidance to the business on Data Catalog, Master Data and Metadata Management
- Lead data solution designs and the execution of data models for solutions such as Data Warehouse, Data Lake and Data Lakehouse
- Work with Data Engineers and Analysts to architect scalable and secure solutions across Data Integration, Data Orchestration, Data Processing, Data Storage and Data Visualisation
- Work with cross-functional teams to support delivery of the data solutions
- Engage with customers and end-users to understand solution impact and develop technology operation plans
- Work with customers or partners to promote the Netcompany brand and develop healthy relationships
- Coach and mentor upcoming Data Architects
Qualifications
- Demonstrable experience in Data Architecture in the last 3 years
- Experience in architecting data solutions which meet high data security and compliance requirements
- Experience working with various open-source, on-prem, COTS and cloud (AWS, Azure, GCP) tools and technologies
- Advanced Data Modelling skills and experience in relational, dimensional and NoSQL databases
- Demonstrable experience in advanced SQL/T-SQL
- Knowledge and experience working with a variety of frameworks and platforms for data management and analytics
- Data Engineering experience, and familiarity with Git, Python and R
- Data Analysis, Data Profiling and Data Visualisation experience
- Knowledge, and desirably experience, of Big Data
Additional Information
Netcompany has existed in the UK since the acquisition of the very successful IT company Hunter Macdonald in October 2017. Netcompany is one of Northern Europe's most successful IT companies, with offices in Denmark, Norway, Poland, Holland, the UK and Vietnam.
We are an entrepreneurial company and we're looking for people who are excited by the challenge of doing things differently. Our culture is built on low bureaucracy with a strong focus on high agility and flexibility. At Netcompany we believe that a diverse and inclusive workplace is central to our success, which is why all qualified candidates are invited to apply regardless of gender, sexual orientation, disability, age, religion and belief, ethnic background, nationality, gender identity or culture. We are committed to living out a culture where we provide equal opportunities for all.
23/09/2022
Full time
Devonshire Hayes have been engaged by a client to source a Data Governance Analyst to be part of their Data Strategy, Governance and Advisory team. Please note this role is fully remote with occasional travel to client site. Alongside excellent communication skills and a proven track record of managing stakeholders, you will be responsible for the following:
- Defining an organisation's data strategy
- Translating business strategy into data strategy
- Designing and implementing a target operating model for data management
- Establishing the office of the CDO
- Implementation and adoption of data governance processes and tools
- Advice and execution on monetisation of organisational data
- Identification and prioritisation of key data domains and related business cases
- Business change and user adoption execution for data initiatives
Basic Qualifications
- Experience in implementing Data Governance and Quality
- Experience identifying and mapping Data Flows
- Experience in Data Management
- Experience in Data Requirements Capture and Documentation
- Experience in using Data Governance tools, including one or more of the following: Collibra, Informatica Axon and/or EDC, Talend, Alation
- Knowledge of industry-leading data quality and data protection management practices
- Knowledge of data governance practices, business and technology issues related to management of enterprise information assets, and approaches related to data protection
- Knowledge of risk data architecture and technology solutions
Preferred Skills
- Knowledge of data governance, data quality analysis and/or master data management
- Direct experience in metadata management projects or implementation efforts
- Knowledge of data-related government regulatory requirements and emerging trends and issues
- Experience with Agile Scrum or working in a Scrum Development Team
05/11/2021
Full time
Data Analyst / SQL / Quality / Looker / Google BigQuery / Cleansing / Central London £40,000 - £45,000 + benefits Permanent
Our client, a leading Media company, are currently looking for a Data Analyst to join their expanding team. They are looking for a data analyst skilled in areas such as data analysis, data validation, data cleansing and managing a data catalogue. The tech stack includes SQL, Looker and Google BigQuery. The role is currently fully remote but could move to an office/remote split further down the line. Please apply for further info.
Responsibilities / Essential Skills
* Ensuring standards and policies are reflected through the enterprise data catalogue.
* Ensuring data is of high quality and fit for purpose in accordance with the business use.
* Managing metadata and data catalogue processes.
* Working closely with data owners (senior stakeholders) and subject matter experts to ensure the data catalogue is complete.
* Creating business data definitions.
* Working with data owners to establish business data quality procedures.
* Creating and maintaining the data quality scorecard.
* Taking an active role in the data excellence framework to give feedback on existing practice and recommend improvements.
* Detailed understanding of relevant statutory frameworks applying to data excellence, such as the Data Protection Act and GDPR.
* Proven experience of working in a Data Excellence Framework and a Data Quality Management service.
* Excellent written and verbal communication abilities.
* Advanced spreadsheet skills (to macro level), a primary programming language such as SQL, and the ability to read and understand database schemas and non-relational systems.
* Comfortable presenting complex data flows and relationships to non-data peers and colleagues.
* Working across finance, web analytics, CRM, marketing automation and content analytics.
* Adept at interacting with stakeholders; highly personable.
* Use of data analytics tools, platforms and coding languages such as Looker, Google BigQuery, DataRobot, SQL, R and Python.
* Advanced Excel and PowerPoint.
15/09/2021
Full time
OUTSIDE IR35! We are recruiting for a Data Architect for an initial 3-month contract. The successful candidate will complement an existing team of Solution Architects and Data Analysts and will report to a Head of Architecture. This is a high-profile Data Architect position in which you'll work on both strategic and enterprise-level architecture for key programmes of change.
Responsibilities:
* Work extensively with our Core Business Transformation Programme Team to lead stakeholders in preparing data in our Procurement, HR, Payroll and Finance functions ahead of the migration of data to modern ERP systems.
* Work with our Office of Data Analytics to act as a bridge between the Data Architecture, the analysis teams and the business.
* Own the conceptual, logical and physical data models; create data models that conform to existing and future standards and conventions, as well as entity/business process mappings.
* Work with a range of stakeholders to promote principles, standards and patterns.
* Lead the transition toward the introduction of MDM (Master Data Management) and MDG (Master Data Governance) best practices.
* Identify relevant opportunities for innovation and change.
Essential Experience:
* Experience leading data architecture and data management activity
* Excellent understanding of generic/abstract data modelling approaches
* Excellent understanding of metadata and reference data management concepts across our key transformation areas (Procurement, HR, Payroll and Finance)
* Knowledge of data modelling, metadata and master data management, and reference data management
* Experience analysing data from a quality and integrity perspective
* Good knowledge of MDM and MDG methodologies
* Demonstrable experience of managing data preparation and migration projects for ERP or quadrant-leading products for Finance, HR, Source-to-Pay or Payroll
* Knowledge and experience of data/information technologies and products including Azure, Azure Data Factory/integration, SQL, data warehouses, data marts and data processing/ETL
If you are available for a new role and interested in getting more information, please apply with an up-to-date CV.
15/09/2021
Contractor
This is a great opportunity for a multi-skilled IT Business Analyst to join the IT team of a public sector body embarking on a three-year transformation of learning and development supporting the construction engineering sector and the career paths of hundreds of thousands of people. Reporting to the CIO, you will play a central role in defining the technical, data and service platforms on which that transformation can be delivered. Remote working with occasional travel to Herts. Up to £50k basic plus £7,965 car allowance and benefits. It's an exciting opportunity to work innovatively at pace on a range of diverse challenges to design the future platform for construction engineering learning.
What you will do:
* Analyse the current data model, and design a data platform model that supports today's operations and tomorrow's blended learning vision.
* Redesign one of the key online services from a user-centric perspective and work with vendors to develop a user-tested proof of concept.
* Gather requirements and evaluate multiple platforms to enable the selection of platforms for delivering the Blended Learning and IT Operations vision.
* Conduct other diverse elements of discovery, to mobilise for implementation from 2022.
What you will bring:
* Passion for doing something transformational that contributes to an important sector of the UK economy and the career paths of hundreds of thousands of people.
* Expert knowledge and experience of data modelling, data normalisation, metadata management, and related tools.
* High level of communication, facilitation, interpersonal and influencing skills.
* Experience of UX definition, including persona development, journey mapping and task flows.
* Experience of defining product backlogs and requirements.
* Good understanding of Agile principles and practices.
In return you can expect to receive a competitive salary in the region of £50,000 basic plus the following:
* Car allowance of £7,965 pa, paid pro rata monthly, taxable, non-pensionable.
* 25 days holiday per year, rising to 30 days during the first 5 years' service.
* Contributory Defined Contribution pension scheme. Employee contributions range from 4.5% to 8.5% of salary. Company contributions are matched.
* Home office set-up fund available in the first 3 months of service - up to £250.
* Appraisal and development scheme.
* Corporate bonus and individual bonus (non-consolidated) normally available.
If you are an experienced Business Analyst looking to develop your career in an interesting transformation that will have a positive effect on the careers of over 400,000 UK learners, then apply today or contact Andrew Medhurst at Inspire People for more information. Inspire People is acting as an employment agency providing services to employers and individuals.
10/09/2021
Full time
Lead Cloud Data Engineer: The role will be responsible for building new data pipelines and optimising data flows using the Azure cloud stack. The ideal candidate will be an experienced data pipeline builder and data wrangler who enjoys building data products from scratch. The data engineer will need to support Business Analysts and Data Architects with discovery and best practices, and must be self-directed and comfortable supporting the data needs of multiple teams, systems and products.
Skills Needed:
* Advanced Azure knowledge and experience working with and migrating data products from on-prem to Azure.
* Experience building and optimising 'big data' pipelines, architectures and data sets using PySpark.
* Experience building real-time data pipelines using Event Hub, storage queues and Azure Stream Analytics.
* Strong analytic skills related to working with unstructured datasets.
* Building processes supporting data transformation, data structures, metadata, dependency and workload management.
* Cloud big data analytics in Azure Synapse Analytics and Azure Analysis Services.
* Data ingestion and storage, including Azure Data Factory, Azure Databricks, Azure Data Lake, Kafka and Spark Streaming, Azure Event Hub/IoT Hub, and Azure Stream Analytics.
* Experience with big data tools: Hadoop, Spark, Kafka.
* Experience with stream processing systems: Spark Streaming, Kafka.
* Experience with object-oriented/functional scripting languages: Python preferred.
* Understanding of Hadoop HDFS/Hive, Power BI and basic Unix scripting will be a bonus.
10/09/2021
Park Lane Recruitment are looking to recruit an Ab Initio Developer to work for a global leader in their field.
We have both contract and permanent positions available, based out of Edinburgh.
We are looking for applicants from mid-level upwards. Duties will involve, but are not limited to:
* Maintaining a daily checks list and any additional monitoring required to ensure that the BI production systems are running without issue and that any failures are dealt with in a timely manner.
* Provide support to resolve application incidents and manage application enhancement requests within agreed service levels.
* Maintain and support the data warehouse and associated applications
* Liaise with users to specify, design and implement minor enhancements to the data warehouse as required by the business, ensuring that changes are consistent with the existing BI architecture.
* Provide assistance with the resolution of user problems and queries
* Liaise with the IT department on changes to upstream and downstream systems to ensure that any impacts on the BI systems are identified and dealt with accordingly
* Database, ETL, Report and OLAP development work
The primary focus of the position is technical design, process flows, code development, unit/system/UAT testing, and post-implementation support.
This role works closely with Business Analysts, Database Administrators, QA, BI Developers, and Architects.
You will also utilize Ab Initio Graphical Development Environment (GDE) to provide technical specifications to programmers to implement ETL changes and maintain high throughput and performance to meet regulatory demands, including frequency of filed regulatory reports and high level of retail transaction volumes.
Requirements
Suitable applicants will have the following background:
* Minimum of 3 years of Ab Initio development experience
* Experience of working on complex data migration programmes
* Strong technical skills in Ab Initio
* Experience of UNIX Shell Scripting and SQL will be required
* Exposure to data warehouse management and metadata management concepts is highly desirable.
Benefits
With both contract and permanent positions available, your remuneration would be:
Contract - between £450 - £600 per day for approximately 6 months
Permanent - up to £72K plus excellent benefits including career growth - working for the fastest growing company of their sector, working with some of the most recognised brands, work life balance and a rewards system
Contact now for more information
29/10/2018