Data & Analytics Consultant

HD-Tech is proud to be partnering with a leading data management and analytics consultancy that specialises in delivering transformative business insights using Microsoft technologies. With over 10 years of industry presence, our client is recognised for helping organisations optimise their use of data to drive smarter decisions and long-term value. As they continue to scale their professional and managed services offerings, we're now seeking a Data & Analytics Consultant to join their expanding consulting team.

About the Company
This organisation is deeply committed to helping businesses "predict their future" through better use of data, technology, people, and process. Their services include:
- Managed Data Services
- Database and Infrastructure Consultancy
- Data Engineering & Analytics
- Adoption and Change Management

Their reputation is built on strong consulting skills, deep expertise in Microsoft Data Services, and a highly collaborative culture. They place a premium on client engagement, commercial awareness, and internal collaboration to drive results.

Role Overview
As a Consultant, you will play a hands-on role in delivering client-facing analytics solutions. You'll help define project scopes, design and implement data architectures, and deliver compelling data narratives. You'll also be expected to identify additional client needs and work closely with internal teams to drive business opportunities. This is a hybrid role, with occasional travel to client sites across the UK and Europe.

Key Responsibilities
- Design, develop, and deliver data and analytics solutions using Microsoft technologies
- Shape and scope projects to ensure timely and on-budget delivery
- Support sales and project teams by identifying new client opportunities
- Engage with stakeholders to gather requirements and define technical solutions
- Recommend and design optimal analytics architectures
- Develop robust data models using methodologies such as Kimball
- Deliver impactful visualisations and insights using tools like Power BI
- Promote user adoption, training, and change management initiatives
- Ensure high standards of documentation and data security compliance

Technical Skills (desirable):
- Microsoft Azure Data Services (e.g., Azure Data Factory, Synapse, Databricks, Fabric)
- Data warehousing and lakehouse design
- ETL/ELT pipelines
- SQL and Python for data manipulation and machine learning
- Big Data frameworks (e.g., Hadoop, Spark)
- Data visualisation (e.g., Power BI)
- Understanding of statistical analysis and predictive modelling

Experience:
- 5+ years working with Microsoft data platforms
- 5+ years in a customer-facing consulting or professional services role
- Strong understanding of data warehousing and BI principles
- Skilled in data modelling, database design, and transformation processes
- Confident communicator with strong stakeholder engagement skills
- Agile delivery experience preferred

Benefits:
- Hybrid working model with monthly onsite collaboration at a modern office in Thames Valley Park
- Comprehensive benefits including healthcare, pension, life assurance, and income protection
10/07/2025
Full time
Platform Engineer - Apache Airflow - Contract - Hybrid/London - £460PD

A leading organisation is seeking an Apache Airflow Subject Matter Expert to fine-tune and optimise their data orchestration platform. This is not a Data Engineer role: my client is after someone who lives and breathes Airflow at the platform and infrastructure level. The primary objective of this role is to fine-tune and optimise their existing Airflow environment, ensuring high reliability, performance, and scalability.

THIS ROLE IS INSIDE IR35

The Role:
- Optimise and scale an enterprise-grade Airflow environment
- Design and build reusable, modular DAGs
- Integrate Airflow with Azure services (ADF, Databricks, Synapse)
- Develop CI/CD pipelines (Azure DevOps)
- Set up monitoring/logging standards for operational excellence
- Provide architectural guidance and mentor engineers

Key Skills:
- Deep Airflow expertise (scheduler, Celery, Kubernetes executor, KEDA)
- Strong Python and Azure (ADF, Databricks, Storage) experience
- CI/CD with Azure DevOps (YAML pipelines)
- Familiarity with Docker, Kubernetes, and observability tools (Grafana, Prometheus, ELK)

If you are an Airflow optimisation specialist looking for a new opportunity, please do get in touch as soon as you can for a confidential discussion. Start ASAP - apply now to learn more.

Platform Engineer - Apache Airflow - Contract - Hybrid/London - £460PD

Randstad Digital is acting as an Employment Business in relation to this vacancy.
10/07/2025
Contractor
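For context on the reusable, modular DAG design the Airflow listing above calls for, here is a minimal illustrative sketch, assuming Airflow 2.4+ (earlier 2.x versions name the schedule parameter `schedule_interval`). The DAG ids, source names, and task bodies are hypothetical placeholders, not anything specified by the client.

```python
# Illustrative sketch only: a factory pattern for reusable, modular DAGs.
# DAG ids, source names, and the task bodies are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def build_ingest_dag(dag_id: str, source: str, schedule: str) -> DAG:
    """Stamp out one ingest DAG per source system from a single template."""
    with DAG(
        dag_id=dag_id,
        start_date=datetime(2025, 1, 1),
        schedule=schedule,  # schedule_interval on Airflow < 2.4
        catchup=False,
        tags=["ingest", source],
    ) as dag:
        extract = PythonOperator(
            task_id="extract",
            python_callable=lambda: print(f"extracting {source}"),
        )
        load = PythonOperator(
            task_id="load",
            python_callable=lambda: print(f"loading {source}"),
        )
        extract >> load  # simple linear dependency
    return dag


# One definition, many DAGs: keeps orchestration code DRY and testable.
for name in ("sales", "finance"):
    globals()[f"ingest_{name}"] = build_ingest_dag(
        dag_id=f"ingest_{name}", source=name, schedule="@daily"
    )
```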
Senior Data Engineer
Permanent | Up to £75k | Middlesbrough (circa 3 times per week)

Overview
A leading organisation in the industrial services sector is seeking a talented and experienced Data Engineer to support and develop cutting-edge data solutions. This role focuses on data integration, modelling, transformation, and master data management, with a strong emphasis on Microsoft and Azure technologies. The company operates in a fast-paced environment using Agile methodologies and CI/CD processes to deliver data-driven insights across the business.

Key Responsibilities
- Design, develop, and support scalable data solutions that integrate multiple systems and sources.
- Implement and manage master data solutions using modern integration techniques.
- Transform raw data into trusted, curated datasets for enterprise reporting.
- Lead data projects, mentor team members, and contribute to data architecture evolution.
- Evaluate and improve business processes by designing effective data solutions.
- Collaborate with stakeholders and vendors to deliver interfaces and resolve integration issues.
- Address data quality, integrity, and consistency challenges, particularly with legacy systems.
- Ensure compliance with data governance standards and maintain clear documentation.

Technical Skills & Qualifications
- Proven experience in a Data Engineer role.
- Expertise in Microsoft and Azure technologies: SQL Server, ADF, Logic Apps, Data Lake, Power Query, Fabric, Function Apps, Power Automate, Spark, Databricks, SSIS.
- Strong understanding of data modelling techniques including normalisation and Kimball methodologies.
- Proficiency in SQL, and experience with programming languages such as Python or C#.
- Familiarity with low-code platforms (e.g., SharePoint, Power Apps, Power Automate, Nintex).
- Experience or knowledge of DBA functions, performance tuning, and system administration is desirable.
- Relevant qualifications in data analytics or related disciplines.

(Rullion is a recruitment agency.) Rullion celebrates and supports diversity and is committed to ensuring equal opportunities for both employees and applicants.
09/07/2025
Full time
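The Kimball modelling the listing above asks for has a concrete shape worth illustrating. Below is a minimal, hedged sketch of a star-schema load in PySpark; the paths, column names, and surrogate-key approach are hypothetical, and a production load would also handle slowly changing dimensions.

```python
# Minimal, illustrative Kimball-style star-schema load in PySpark.
# Table paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("star_schema_sketch").getOrCreate()

orders = spark.read.parquet("/lake/raw/orders")        # assumed raw feed
customers = spark.read.parquet("/lake/raw/customers")  # assumed raw feed

# Dimension: one row per customer, with a surrogate key.
dim_customer = (
    customers
    .dropDuplicates(["customer_id"])
    .withColumn("customer_sk", F.monotonically_increasing_id())
)

# Fact: measures at order grain, carrying the dimension's surrogate key.
fact_orders = (
    orders
    .join(dim_customer.select("customer_id", "customer_sk"), "customer_id")
    .select("order_id", "customer_sk", "order_date", "amount")
)

dim_customer.write.mode("overwrite").parquet("/lake/curated/dim_customer")
fact_orders.write.mode("overwrite").parquet("/lake/curated/fact_orders")
```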
SQL Database Developer
Derby | Permanent | £40,000 - £50,000 (DOE) + Benefits

SQL Database Developer needed for a permanent career opportunity based in Derby. A chance to join an established and growing software business that develops data solutions for large retail brands. Must be willing to work from the Derby office full-time. Start ideally in Summer 2025.

Key experience + responsibilities:
- Designing, developing, and maintaining SQL Server database solutions that power core business applications.
- Strong hands-on SQL Server database development skills, including complex stored procedures, T-SQL, indexing, relational/dimensional modelling, and data dashboards.
- Building and optimising data pipelines and integrations across cloud platforms.
- Any cloud platform experience would be helpful, such as Snowflake, Databricks, BigQuery, or Azure SQL.
- Working closely with key stakeholders including data architects, analysts, testers, and managers.
- Working across the full SQL development life-cycle including design, development, documentation, and testing.

Advantageous skills: Power BI, NoSQL, Azure Synapse, data visualisation tools, Data Lakes, streaming technologies, MDM, Git, DevOps pipelines.

Benefits include: £40k-50k base (DOE) + 25 days holiday (plus BHs) + pension + company bonus + more.
09/07/2025
Full time
Data Pipeline Development:
- Design and implement end-to-end data pipelines in Azure Databricks, handling ingestion from various data sources, performing complex transformations, and publishing data to Azure Data Lake or other storage services.
- Write efficient and standardized Spark SQL and PySpark code for data transformations, ensuring data integrity and accuracy across the pipeline.
- Automate pipeline orchestration using Databricks Workflows or integration with external tools (e.g., Apache Airflow, Azure Data Factory).

Data Ingestion & Transformation:
- Build scalable data ingestion processes to handle structured, semi-structured, and unstructured data from various sources (APIs, databases, file systems).
- Implement data transformation logic using Spark, ensuring data is cleaned, transformed, and enriched according to business requirements.
- Leverage Databricks features such as Delta Lake to manage and track changes to data, enabling better versioning and performance for incremental data loads.

Data Publishing & Integration:
- Publish clean, transformed data to Azure Data Lake or other cloud storage solutions for consumption by analytics and reporting tools.
- Define and document best practices for managing and maintaining robust, scalable data pipelines.

Data Governance & Security:
- Implement and maintain data governance policies using Unity Catalog, ensuring proper organization, access control, and metadata management across data assets.
- Ensure data security best practices, such as encryption at rest and in transit, and role-based access control (RBAC) within Azure Databricks and Azure services.

Performance Tuning & Optimization:
- Optimize Spark jobs for performance by tuning configurations, partitioning data, and caching intermediate results to minimize processing time and resource consumption.
- Continuously monitor and improve pipeline performance, addressing bottlenecks and optimizing for cost efficiency in Azure.

Automation & Monitoring:
- Automate data pipeline deployment and management using tools like Terraform, ensuring consistency across environments.
- Set up monitoring and alerting mechanisms for pipelines using Databricks built-in features and Azure Monitor to detect and resolve issues proactively.

Requirements
- Data Pipeline Expertise: Extensive experience in designing and implementing scalable ETL/ELT data pipelines in Azure Databricks, transforming raw data into usable datasets for analysis.
- Azure Databricks Proficiency: Strong knowledge of Spark (SQL, PySpark) for data transformation and processing within Databricks, along with experience building workflows and automation using Databricks Workflows.
- Azure Data Services: Hands-on experience with Azure services like Azure Data Lake, Azure Blob Storage, and Azure Synapse for data storage, processing, and publication.
- Data Governance & Security: Familiarity with managing data governance and security using Databricks Unity Catalog, ensuring data is appropriately organized, secured, and accessible to authorized users.
- Optimization & Performance Tuning: Proven experience in optimizing data pipelines for performance, cost-efficiency, and scalability, including partitioning, caching, and tuning Spark jobs.
- Cloud Architecture & Automation: Strong understanding of Azure cloud architecture, including best practices for infrastructure-as-code, automation, and monitoring in data environments.
08/07/2025
Contractor
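As a concrete illustration of the Delta Lake incremental-load pattern the listing above describes, here is a minimal sketch, assuming a Databricks or delta-spark runtime; the paths and merge key are hypothetical placeholders.

```python
# Illustrative sketch of an incremental Delta Lake upsert.
# Assumes a Databricks or delta-spark runtime; paths, column names,
# and the merge key are hypothetical placeholders.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta_merge_sketch").getOrCreate()

# New batch of records landed by the ingestion layer (assumed location).
updates = spark.read.format("json").load("/lake/landing/customers/")

target = DeltaTable.forPath(spark, "/lake/curated/customers")

# Upsert: update rows that already exist, insert the rest. Delta's
# transaction log keeps the table queryable and versioned throughout.
(
    target.alias("t")
    .merge(updates.alias("u"), "t.customer_id = u.customer_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```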
Senior Python Developer
Permanent | £80,000 - £90,000 (DOE) | London Hybrid

Seeking an experienced Senior Python Developer for a permanent role based in central London. Hybrid working with the expectation of 3 days/week on site in London. Immediate start, ideally Summer 2025. A chance to work with a leading digital transformation business on large-scale IT modernisation programmes for government clients.

Key skills, experience + tasks will include:
- Extensive hands-on experience with Python is essential, including a strong grasp of core Python concepts.
- Proven expertise in deploying Python applications using CI/CD, along with proficiency in designing and implementing CI/CD pipelines in cloud environments.
- Excellent practical expertise in performance tuning and system optimisation.
- Experience with PySpark and Azure Databricks for distributed data processing and large-scale data analysis.
- Proven experience with web frameworks, including knowledge of Django and experience with Flask, along with a solid understanding of APIs.
- Develop and maintain data interfaces to connect various systems within the Bank.
- Solid architecture and design knowledge to ensure scalable and efficient systems.
- Lead the development of MVPs, ensuring timely delivery and alignment with business goals.
- Collaborate with cross-functional teams to build and enhance banking applications.
- Work closely with UI/UX Designers to seamlessly integrate visualisations into web applications or other platforms.

Banking domain experience is highly desirable.
08/07/2025
Full time
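To illustrate the kind of Flask data interface the role above mentions, here is a minimal hedged sketch, assuming Flask 2+ (for the `app.get`/`app.post` decorators); the endpoints, payload shape, and in-memory store are hypothetical stand-ins for a real system of record.

```python
# Minimal, illustrative Flask data interface. Endpoints, payloads,
# and the in-memory store are hypothetical placeholders.
from flask import Flask, jsonify, request

app = Flask(__name__)

# Stand-in for a real system-of-record connection.
ACCOUNTS = {"12345": {"owner": "A. Example", "balance_gbp": 1000}}


@app.get("/accounts/<account_id>")
def get_account(account_id: str):
    account = ACCOUNTS.get(account_id)
    if account is None:
        return jsonify(error="not found"), 404
    return jsonify(account)


@app.post("/accounts/<account_id>/transactions")
def post_transaction(account_id: str):
    payload = request.get_json(force=True)
    ACCOUNTS[account_id]["balance_gbp"] += payload.get("amount_gbp", 0)
    return jsonify(ACCOUNTS[account_id]), 201


if __name__ == "__main__":
    app.run(debug=True)
```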
Contract Role: Data Architect - £600/day (Inside IR35)
Location: Hybrid - 2 days per week onsite (West Midlands HQ)
Duration: 6 months initially, with high potential for extension

A well-established regulatory organisation is looking for an experienced Data Architect to lead the design, development, and maintenance of enterprise-wide data models. This role will be pivotal in supporting both regulatory and operational reporting needs and will work closely with enterprise architecture, BI, and data engineering teams. This is a hands-on architecture role requiring strong stakeholder engagement, a strategic mindset, and the ability to lead technical data initiatives from concept to implementation.

Key Responsibilities:
- Design and maintain enterprise data models across key platforms including Dynamics 365, internal CRM, finance systems, and Azure-based environments.
- Collaborate with the Enterprise Architect to define and lead the data strategy and architecture across the estate.
- Work closely with BI and business improvement teams to ensure models meet analytical and reporting needs.
- Lead and coach a small team of Data Engineers in delivering structured, secure, and scalable data solutions.
- Own the end-to-end data lifecycle including ingestion, transformation (ELT), retention, integration, and governance.
- Support development and maintenance of Common and Semantic Data Models (CDM/SDM) for enterprise reporting layers.
- Manage large-scale data migrations and integrations with multiple source systems.
- Ensure compliance with data security, integrity, and audit requirements.

Required Experience:
- Proven experience in data architecture roles within complex organisations.
- Strong knowledge and hands-on experience with Microsoft Dynamics 365, Power BI, Data Factory, and PowerApps.
- Direct experience querying, extracting, and analysing Dynamics data via APIs is essential.
- Strong understanding of Azure-based data solutions and modern data warehousing principles.
- Experience leading or heavily contributing to enterprise data programmes and transformations.
- Confident translating business requirements into scalable, secure technical designs.
- Strong T-SQL and ELT pipeline development experience.

Desirable:
- Familiarity with Microsoft Fabric, Purview, and Databricks.
- Experience in regulated industries, especially legal or public sector.
- Understanding of GDPR, FOI, and data retention/destruction best practices.
- Exposure to Agile delivery environments.
08/07/2025
Contractor
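Querying Dynamics data via APIs, which the listing above calls essential, typically goes through the Dynamics 365 OData Web API. The sketch below is illustrative only: the organisation URL is hypothetical and the bearer token is a placeholder for a real Azure AD (MSAL) authentication flow.

```python
# Hedged sketch of reading Dynamics 365 data via its OData Web API.
# The org URL is hypothetical and the token is a placeholder; real
# code would acquire it through Azure AD (e.g., with MSAL).
import requests

ORG = "https://yourorg.crm11.dynamics.com"   # hypothetical environment
TOKEN = "<bearer-token-from-azure-ad>"       # placeholder

resp = requests.get(
    f"{ORG}/api/data/v9.2/accounts",
    params={"$select": "name,accountid", "$top": 10},  # OData query options
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "OData-MaxVersion": "4.0",
        "OData-Version": "4.0",
        "Accept": "application/json",
    },
    timeout=30,
)
resp.raise_for_status()
for account in resp.json()["value"]:
    print(account["accountid"], account["name"])
```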
Lead Data Engineer - Manchester - Hybrid - £75k - £80k

This is an excellent opportunity for a highly experienced data engineer to join a growing tech consultancy and contribute to the leadership of their data engineering practice. If you are an experienced and capable data engineer with a history of using Microsoft products and working in consultancy, this is the role for you!

Salary & Benefits
- Competitive salary of £75k - £80k
- Performance-related bonus of up to 10%
- 27 days annual holiday, plus public holidays and your birthday off
- Hybrid working model (1-2 days in office)
- Pension contribution
- Great opportunities for career progression
- And many more

Role & Responsibilities
- Design and deliver solutions using MS Fabric, ADF, ADL, Synapse, Databricks, SQL, and Python.
- Work closely with a variety of clients to gather requirements and deliver solutions.
- Be willing to engage and assist in pre-sales activities, bids, proposals, etc.
- Use key techniques such as Governance, Architecture, Data Modelling, ETL/ELT, Data Lakes, Data Warehousing, Master Data, and BI.
- Consistently utilise key processes in the engineering delivery cycle including Agile and DevOps, Git, APIs, Containers, Microservices, and Data Pipelines.
- Assist in building out, developing, and training the data engineering function.

What do I need to apply?
- Strong MS data engineering expertise (Fabric, ADF, ADL, Synapse, SQL)
- Expert use of Databricks
- Strong Python experience
- Consultancy experience
- Leadership experience

My client is looking to book in first-stage interviews for next week and slots are already filling up fast. I have limited slots for first-stage interviews next week, so if you're interested, get in touch ASAP with a copy of your most recent and up-to-date CV and email me at (url removed) or call me on (phone number removed).

Please Note: This is a permanent role for UK residents only. This role does not offer sponsorship. You must have the right to work in the UK with no restrictions. Some of our roles may be subject to successful background checks including a DBS and credit check.

TRG are the go-to recruiter for Power BI and Azure Data Platform roles in the UK, offering more opportunities across the country than any other. We're the proud sponsor and supporter of SQLBits, Power Platform World Tour, the London Power BI User Group, Newcastle Power BI User Group and Newcastle Data Platform and Cloud User Group. To find out more and speak confidentially about your job search or hiring needs, please contact me directly at (url removed)
08/07/2025
Full time
Enterprise Data Architect - Hybrid - Newcastle - £85k - £100k + Bonus + Excellent Benefits

I'm seeking an experienced Enterprise Data Architect for a forward-thinking, regulated organisation with a strong national footprint. They are looking for a talented Enterprise Data Architect to shape the future of their cloud-first data strategy. You'll be responsible for designing, implementing, and optimising a scalable and secure data architecture that empowers advanced analytics, supports regulatory reporting, and drives long-term business value. You will lead the design of a high-performance data estate, built primarily on Microsoft Azure and Databricks.

Key Responsibilities:
- Design and implement scalable, secure cloud-based data architecture (Azure & Databricks)
- Develop optimised data models and pipelines using Delta Lake and Azure services
- Define data standards, policies, and governance practices aligned with compliance
- Enable real-time analytics and machine learning use cases across business functions
- Ensure data integration across structured/unstructured sources with high availability
- Lead innovation and adopt best practices in data engineering and architecture
- Collaborate with internal stakeholders and third-party vendors

Key Skills & Experience:
- Proven background designing and delivering enterprise-scale data platforms
- Strong knowledge of Microsoft Azure, Databricks, Delta Lake, and data warehousing
- Advanced data modelling and ETL/ELT optimisation experience
- Familiarity with regulatory frameworks such as IFRS 17, BCBS 239, and UK Data Protection
- Excellent stakeholder communication and cross-functional collaboration skills
- Prior experience in a regulated industry (e.g. insurance, financial services, healthcare) is desirable
- Microsoft Certified: Azure Solutions Architect Expert
- Microsoft Certified: Azure Data Engineer Associate
- Databricks Certified Data Engineer or Machine Learning Associate

Benefits for the Enterprise Data Architect:
- Discretionary annual bonus
- Hybrid working (2 days office / 3 days home)
- 25 days annual leave + bank holidays (increasing over time)
- Enhanced pension
- Flexitime & flexible working options
- Electric car salary sacrifice scheme
- Healthcare cash plan
- Wellbeing app & fitness platform
- Enhanced parental leave and IVF appointment time
- Life assurance (4x salary)
- Discounts on insurance products and retailers
- Cycle to Work Scheme, community support days & social events

If you are interested in this role or looking for something similar, please contact Alex MacDermott directly. If you are interested in this position please click 'apply'.

Hunter Selection Limited is a recruitment consultancy with offices UK wide, specialising in permanent & contract roles within Engineering & Manufacturing, IT & Digital, Science & Technology and Service & Sales sectors. Please note, as we receive a high level of applications we can only respond to applicants whose skills & qualifications are suitable for this position. No terminology in this advert is intended to discriminate against any of the protected characteristics that fall under the Equality Act 2010. For the purposes of the Conduct Regulations 2003, when advertising permanent vacancies we are acting as an Employment Agency, and when advertising temporary/contract vacancies we are acting as an Employment Business.
08/07/2025
Full time
Data Engineering Manager
Location: Leeds (Mostly Remote, 1 Day On-Site)
Salary: Up to £75,000 per annum

Are you an experienced Senior Data Engineer ready to step into a leadership role? We're looking for a Lead Data Engineer to join our team based in Leeds, working mostly remotely with just one day on-site per week. You'll lead the design and delivery of scalable, cloud-based data solutions using Databricks, Python, and SQL, while mentoring a team and driving engineering best practices.

About You
You might currently be a Senior Data Engineer ready to grow your leadership skills. You're passionate about building robust, efficient data pipelines and shaping cloud data architecture in an agile environment.

Key Responsibilities
- Lead development of data pipelines and solutions using Databricks, Python, and SQL
- Design and maintain data models supporting analytics and business intelligence
- Build and optimise ELT/ETL processes on AWS or Azure
- Collaborate closely with analysts, architects, and stakeholders to deliver high-quality data products
- Champion best practices in testing, CI/CD, version control, and infrastructure as code
- Mentor and support your team, taking ownership of technical delivery and decisions
- Drive continuous improvements in platform performance, cost, and reliability

Key Requirements
- Hands-on experience with Databricks or similar data engineering platforms
- Strong Python and SQL skills in data engineering contexts
- Expertise in data modelling and building analytics-ready datasets
- Experience with AWS or Azure cloud data services
- Proven leadership or mentorship experience
- Excellent communication and stakeholder management
- Agile delivery and DevOps tooling knowledge

Desirable
- Experience with infrastructure-as-code (Terraform, CloudFormation)
- Familiarity with CI/CD pipelines and orchestration tools
- Knowledge of data governance and quality controls
- Experience in regulated or large-scale environments
07/07/2025
Full time
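One concrete form of the testing best practice the listing above champions is keeping transformation logic in plain functions that pytest can exercise directly. A minimal sketch follows; `clean_amount` and its parsing rules are hypothetical.

```python
# Hedged sketch: unit-testable transformation logic with pytest.
# The clean_amount function and its rules are hypothetical examples.
import pytest


def clean_amount(raw: str) -> float:
    """Parse a currency string like '£1,250.50' into a float."""
    value = raw.strip().lstrip("£").replace(",", "")
    return float(value)


@pytest.mark.parametrize(
    "raw, expected",
    [
        ("£1,250.50", 1250.50),
        ("  99 ", 99.0),
        ("0", 0.0),
    ],
)
def test_clean_amount(raw, expected):
    assert clean_amount(raw) == expected


def test_clean_amount_rejects_garbage():
    # float() raises ValueError on unparseable input; the test pins that.
    with pytest.raises(ValueError):
        clean_amount("not-a-number")
```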
Senior Azure Data Engineer
Remote Working - UK home-based with occasional travel into the office
£39,784 - £49,477 (National) | £45,456 - £55,149 (London within the M25)
Additional allowance for exceptional candidates - max salary £62,000 (National) and £67,000 (London within the M25)
Homeworking allowance of £581 per annum
Job Ref: J12964

A Senior Azure Data Engineer is required to design, build, test, and maintain data on the enterprise data platform, ensuring accessibility of the data that meets business and end-user needs. The successful individual will be responsible for maximising the automation, scalability, reliability, and security of data services, focusing on opportunities for re-use, adaptation, and efficient engineering. Additionally, you will support the team through knowledge sharing and mentoring more junior members of the team.

Accountabilities:
- Design, build, and test data pipelines and services, based on feeds from multiple systems, using a range of different storage technologies and/or access methods provided by the Enterprise Data Platform, with a focus on creating repeatable and reusable components and products.
- Design, write, and iterate code from prototype to production-ready. Understand security, accessibility, and version control. Use a range of coding tools and languages as required.
- Work closely with colleagues across the Data & Insight Unit to effectively translate requirements into solutions, and accurately communicate across technical and non-technical stakeholders as well as facilitating discussions within a multidisciplinary team.
- Deliver robust, supportable, and sustainable data solutions in accordance with agreed organisational standards that ensure services are resilient, scalable, and future-proof.
- Understand the concepts and principles of data modelling and produce, maintain, and update relevant physical data models for specific business needs, aligning to the enterprise data architecture standards.
- Design and implement data solutions for the ingest, storage, and use of sensitive data within the organisation, including designing and implementing row- and field-level controls as needed to appropriately control, protect, and audit such data.
- Clearly, accurately, and informatively document and annotate code, routines, and other components to enable support, maintenance, and future development.
- Work with QA Engineers to execute testing where required, automating processes as much as possible.
- Learn from what has worked as well as what has not, being open to change and improvement and working in smarter, more effective ways.
- Work collaboratively, sharing information appropriately and building supportive, trusting, and professional relationships with colleagues and a wide range of people within and outside of the organisation.
- Provide oversight and assurance of suppliers and team members, coaching and mentoring colleagues to create a highly performant and effective team.
- Design and undertake appropriate quality control and assurance for delivery of output.
- Provide direction and guidance to peers and junior colleagues, including line management and development of teams, where required.

Essential Skills and Experience:
- Educated to degree level or equivalent professional experience.
- Experience of MS Azure Databricks.
- Experience working with database technologies such as SQL Server and Data Warehouse architecture, with knowledge of big data, data lakes, and NoSQL.
- Experience following product/solution development lifecycles using frameworks/methodologies such as Agile, SAFe, and DevOps, and use of associated tooling (e.g., version control, task tracking).
- Demonstrable experience writing ETL scripts and code to make sure the ETL processes perform optimally.
- Experience in other programming languages for data manipulation (e.g., Python, Scala).
- Extensive experience of data engineering and the development of data ingest and transformation routines and services using modern, cloud-based approaches and technologies.
- Understanding of the principles of data modelling and data flows, with the ability to apply this to the design of data solutions.
- Experience of supporting and enabling AI technologies.
- Experience implementing data flows to connect operational systems and data for analytics and BI systems.
- Experience documenting source-to-target mappings.
- Experience translating business requirements into solution design and implementation.
- Experience in assessing and analysing technical issues or problems in order to identify and implement the appropriate solution.
- Knowledge and experience of data security and data protection (e.g., GDPR) practices and the application of these through technology.
- Strong decision-making, leadership, and mentoring skills.
- Proven ability to understand stakeholder needs, manage their expectations, and influence at all levels on the use of data and insight.

Additional Requirements:
Candidates must have an existing and future right to live and work in the UK. Sponsorship at any point is not available.

If this sounds like the role for you then please apply today! Alternatively, you can refer a friend or colleague by taking part in our fantastic referral schemes! If you have a friend or colleague who would be interested in this role, please refer them to us. For each relevant candidate that you introduce to us (there is no limit) and we place, you will be entitled to our general gift/voucher scheme.

Datatech is one of the UK's leading recruitment agencies in the field of analytics and host of the critically acclaimed event, Women in Data. For more information, visit our website: (url removed)
07/07/2025
Full time
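The row- and field-level controls the Senior Azure Data Engineer role above describes can be sketched in PySpark as below. This is illustrative only: the dataset, columns, and the caller's entitlements are hypothetical, and a real platform would typically enforce such controls through governed views or catalog policies rather than ad-hoc code.

```python
# Hedged sketch of row- and field-level controls over sensitive data.
# Dataset path, columns, and entitlement flags are hypothetical.
from pyspark.sql import DataFrame, SparkSession, functions as F

spark = SparkSession.builder.appName("sensitive_view_sketch").getOrCreate()

people = spark.read.parquet("/lake/curated/people")  # assumed dataset


def sensitive_view(df: DataFrame, caller_region: str, can_see_pii: bool) -> DataFrame:
    """Apply a row-level filter plus field-level masking for one caller."""
    # Row-level control: callers only see rows for their own region.
    scoped = df.where(F.col("region") == caller_region)
    if can_see_pii:
        return scoped
    # Field-level control: mask the identifier, keeping the last 2 chars
    # (negative position in substring counts from the end of the string).
    masked = F.concat(F.lit("*******"), F.substring(F.col("ni_number"), -2, 2))
    return scoped.withColumn("ni_number", masked)


view = sensitive_view(people, caller_region="north", can_see_pii=False)
view.show()
```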
Fabric PowerBI Data Analytics Specialist with MS Fabric (inc. Data Factory, Synapse, OneLake), SQL and familiarity with Databricks is required by a leading commercial interior design company in the heart of the City, a short walk from Farringdon Station, paying up to £70k plus all commuting costs paid; it is an office-based role. You'll leverage Microsoft Fabric to develop insightful dashboards, mentor a growing team, and drive data-driven decisions. As the PowerBI Data Analytics Specialist you will spearhead their data journey using Microsoft Fabric and Power BI: develop insightful dashboards, mentor a growing team, and shape a data-centric future. Key Skills required: Microsoft Fabric Expertise: Including Data Factory, Synapse, and OneLake for efficient data workflows. Advanced Power BI Skills: Dashboard development, data modelling, and DAX for strategic insights. Data Warehousing Knowledge: Understanding of principles and practices for effective data management. Databricks Familiarity: Experience with Databricks for data processing and analytics. Stakeholder Engagement: Ability to proactively engage with internal teams and translate business needs. Mentoring & Team Development: Skills to guide junior team members and build a cohesive analytics unit. Data Quality & Governance: Ensuring data accuracy and implementing robust processes. Technical Troubleshooting: Ability to swiftly resolve technical issues. Documentation: Comprehensive documentation of data architecture and reporting processes. Analytical & Problem-Solving: Strong abilities to analyse data and solve complex issues. Experience with Microsoft SQL Server, including writing efficient SQL queries and stored procedures. Familiarity with Supabase/Postgres. Exposure to low-code tools like Microsoft Power Apps, Microsoft Power Automate, Workato, and Make.
07/07/2025
Full time
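For illustration, the SQL Server work this listing mentions (efficient queries and stored procedures) might be driven from Python roughly as below. The server, database and usp_SalesByRegion procedure are placeholders, not part of the advert; only standard pyodbc calls are used.

```python
# Illustrative only: calling a hypothetical SQL Server stored procedure
# with a parameterised query via pyodbc.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.example.com;DATABASE=Reporting;"  # placeholder server/database
    "Trusted_Connection=yes;"
)
cursor = conn.cursor()

# Parameterised call avoids string concatenation and lets the query plan be reused.
cursor.execute("EXEC dbo.usp_SalesByRegion @Region = ?, @Year = ?", ("EMEA", 2024))
for row in cursor.fetchall():
    print(row.Region, row.TotalSales)  # column names are hypothetical

conn.close()
```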
Job Description We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Software Engineer III at JPMorgan Chase within the Global Banking, Data, Analytics & Technology division you serve as a seasoned member of an agile team to design and deliver trusted market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives. Job responsibilities Executes software solutions, design, development, and technical troubleshooting with ability to think beyond routine or conventional approaches to build solutions or break down technical problems Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture Contributes to software engineering communities of practice and events that explore new and emerging technologies Adds to team culture of diversity, equity, inclusion, and respect Required qualifications, capabilities, and skills Formal training or certification in Databricks and proficient advanced experience Hands-on practical experience delivering system design, application development, testing, and operational stability Advanced in one or more programming languages; Python is required Advanced knowledge of software applications and technical processes with considerable in-depth knowledge in one or more technical disciplines (e.g., cloud, artificial intelligence, machine learning, mobile, etc.) Ability to tackle design and functionality problems independently with little to no oversight Experience performing data analytics on AWS platforms Experience writing efficient SQL and implementing complex ETL transformations on big data platforms Experience with Big Data technologies (Spark, Impala, Hive, Redshift, Kafka, etc.) Experience in data quality testing; adept at writing test cases and scripts, presenting and resolving data issues Experience with Databricks, Snowflake, and Iceberg is required Preferred qualifications, capabilities, and skills Experience in application and data design disciplines with an emphasis on real-time processing and delivery (e.g., Kafka) is preferable Understanding of the Commercial & Investment Bank business will be useful. Proficiency across the full range of database and business intelligence tools; publishing and presenting information in an engaging way is a plus Financial Services and Commercial and Investment Banking experience is a plus Familiarity with NoSQL database platforms (DynamoDB, Cassandra) is a plus Familiarity with relational database environments (Oracle, SQL Server, etc.), leveraging databases, tables/views, stored procedures, agent jobs, etc. About Us J.P. Morgan is a global leader in financial services, providing strategic advice and products to the world's most prominent corporations, governments, wealthy individuals and institutional investors.
Our "first-class business in a first-class way" approach to serving clients drives everything we do. We strive to build trusted, long-term partnerships to help our clients achieve their business objectives. We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants' and employees' religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation. About The Team J.P. Morgan's Commercial & Investment Bank is a global leader across banking, markets, securities services and payments. Corporations, governments and institutions throughout the world entrust us with their business in more than 100 countries. The Commercial & Investment Bank provides strategic advice, raises capital, manages risk and extends liquidity in markets around the world.
05/07/2025
Full time
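As a rough sketch of the data-quality testing the listing above calls for, a completeness assertion on a PySpark DataFrame might look like the following. The bronze.trades table and the 0.1% threshold are assumptions made for illustration.

```python
# Minimal data-quality check: fail fast if too many rows lack a primary identifier.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()
trades = spark.table("bronze.trades")  # placeholder source table

total = trades.count()
null_ids = trades.filter(F.col("trade_id").isNull()).count()

# Tolerate at most 0.1% missing trade_id values (illustrative threshold).
assert total == 0 or null_ids / total <= 0.001, (
    f"trade_id null rate {null_ids}/{total} exceeds threshold"
)
```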
Join our client in embarking on an ambitious data transformation journey using Databricks, guided by best-practice data governance and architectural principles. To support this, we are recruiting for talented data engineers. As a major UK energy provider, our client is committed to 100% renewable energy and sustainability, focusing on delivering exceptional customer experiences. It is initially a 3-month contract with potential to be extended. The role is hybrid, with one day a week based in their Nottingham office; this is negotiable. It is a full-time role, 37 hours per week. Accountabilities: Develop and maintain scalable, efficient data pipelines within Databricks, continuously evolving them as requirements and technologies change. Build and manage an enterprise data model within Databricks. Integrate new data sources into the platform using batch and streaming processes, adhering to SLAs. Create and maintain documentation for data pipelines and associated systems, following security and monitoring protocols. Ensure data quality and reliability processes are effective, maintaining trust in the data. Be comfortable with taking ownership of complex data engineering projects and developing appropriate solutions in accordance with business requirements. Able to work closely with stakeholders and manage their requirements. Actively coach and mentor others in the team and foster a culture of innovation and peer review within the team to ensure best practice. Knowledge and Skills: Extensive experience of Python preferred, including advanced concepts like decorators, protocols, functools, context managers, and comprehensions. Strong understanding of SQL, database design, and data architecture. Experience with Databricks and/or Spark. Knowledgeable in data governance, data cataloguing, data quality principles, and related tools. Skilled in data extraction, joining, and aggregation tasks, especially with big data and real-time data using Spark. Capable of performing data cleansing operations to prepare data for analysis, including transforming data into useful formats. Understand data storage concepts and logical data structures, such as data warehousing. Able to write repeatable, production-quality code for data pipelines, utilising templating and parameterisation where needed. Can make data pipeline design recommendations based on business requirements. Experience with data migration is a plus. Open to new ways of working and new technologies. Self-motivated with the ability to set goals and take initiative. Driven to troubleshoot, deconstruct problems, and build effective solutions. Experience of Git / version control. Experience working with larger, legacy codebases. Understanding of unit and integration testing. Understanding and experience with CI/CD and general software development best practices. A strong attention to detail and a curiosity about the data you will be working with. A strong understanding of Linux-based tooling and concepts. Knowledge and experience of Amazon Web Services is essential. Please note: Should your application be successful, and you are offered the role, a number of pre-employment checks need to be carried out before your appointment can be confirmed. Any assignment offer with our client will be subject to a satisfactory checking report from the Disclosure and Barring Service. This vacancy is being advertised by Rullion Ltd acting as an employment business.
Since 1978, Rullion has been securing exceptional candidates for a range of clients, from large well-known brands to SMEs and start-ups. As a family-owned business, Rullion's approach is credible and honest, focused on building long-lasting relationships with both clients and candidates. Rullion celebrates and supports diversity and is committed to ensuring equal opportunities for both employees and applicants.
03/07/2025
Contractor
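The "advanced Python" the listing above names (decorators, functools, context managers, comprehensions) could show up in a pipeline roughly as in this toy sketch; the stubbed extract() source is illustrative only.

```python
# Toy pipeline illustrating a functools-based retry decorator, a timing
# context manager, and a comprehension-based transform.
import functools
import time
from contextlib import contextmanager

def retry(times: int = 3, delay: float = 1.0):
    """Retry a flaky step a fixed number of times before giving up."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(1, times + 1):
                try:
                    return fn(*args, **kwargs)
                except Exception:
                    if attempt == times:
                        raise
                    time.sleep(delay)
        return wrapper
    return decorator

@contextmanager
def timed(label: str):
    """Print how long the wrapped block took."""
    start = time.perf_counter()
    try:
        yield
    finally:
        print(f"{label}: {time.perf_counter() - start:.2f}s")

@retry(times=3)
def extract() -> list[dict]:
    return [{"id": 1, "value": " 42 "}, {"id": 2, "value": "17"}]  # stub source

with timed("pipeline"):
    # Comprehension-based clean-up: trim and cast the raw values.
    cleaned = [{**row, "value": int(row["value"].strip())} for row in extract()]
    print(cleaned)
```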
HUNTER SELECTION
Newcastle Upon Tyne, Tyne And Wear
Enterprise Data Architect - Hybrid - Newcastle - £85k - £100k + Bonus + Excellent Benefits I'm seeking an experienced Enterprise Data Architect for a forward-thinking, regulated organisation with a strong national footprint. They are looking for a talented Enterprise Data Architect to shape the future of their cloud-first data strategy. You'll be responsible for designing, implementing, and optimising a scalable and secure data architecture that empowers advanced analytics, supports regulatory reporting, and drives long-term business value. You will lead the design of a high-performance data estate, built primarily on Microsoft Azure and Databricks. Key Responsibilities: Design and implement scalable, secure cloud-based data architecture (Azure & Databricks) Develop optimised data models and pipelines using Delta Lake and Azure services Define data standards, policies, and governance practices aligned with compliance Enable real-time analytics and machine learning use cases across business functions Ensure data integration across structured/unstructured sources with high availability Lead innovation and adopt best practices in data engineering and architecture Collaborate with internal stakeholders and third-party vendors Key Skills & Experience: Proven background designing and delivering enterprise-scale data platforms Strong knowledge of Microsoft Azure, Databricks, Delta Lake, and data warehousing Advanced data modelling and ETL/ELT optimisation experience Familiarity with regulatory frameworks such as IFRS 17, BCBS 239, and UK Data Protection Excellent stakeholder communication and cross-functional collaboration skills Prior experience in a regulated industry (e.g. insurance, financial services, healthcare) is desirable Desirable certifications: Microsoft Certified: Azure Solutions Architect Expert; Microsoft Certified: Azure Data Engineer Associate; Databricks Certified Data Engineer or Machine Learning Associate Benefits for the Enterprise Data Architect: Discretionary annual bonus Hybrid working (2 days office / 3 days home) 25 days annual leave + bank holidays (increasing over time) Enhanced pension Flexitime & flexible working options Electric car salary sacrifice scheme Healthcare cash plan Wellbeing app & fitness platform Enhanced parental leave and IVF appointment time Life assurance (4x salary) Discounts on insurance products and retailers Cycle to Work Scheme, community support days & social events If you are interested in this role or looking for something similar, please contact Alex MacDermott directly. If you are interested in this position please click 'apply'. Hunter Selection Limited is a recruitment consultancy with offices UK-wide, specialising in permanent & contract roles within Engineering & Manufacturing, IT & Digital, Science & Technology and Service & Sales sectors. Please note, as we receive a high level of applications we can only respond to applicants whose skills & qualifications are suitable for this position. No terminology in this advert is intended to discriminate against any of the protected characteristics that fall under the Equality Act 2010. For the purposes of the Conduct Regulations 2003, when advertising permanent vacancies we are acting as an Employment Agency, and when advertising temporary/contract vacancies we are acting as an Employment Business.
03/07/2025
Full time
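By way of illustration, the Delta Lake pipelines the role above describes often centre on a MERGE-based upsert. A minimal sketch, assuming Databricks with the delta Python package; the landing path and curated.policies table are hypothetical.

```python
# Minimal Delta Lake upsert: merge incoming changes into a curated table.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta-merge").getOrCreate()

# Placeholder landing path; in practice this might be an Auto Loader or batch feed.
updates = spark.read.format("json").load("/landing/policies/")

# Upsert into a hypothetical curated Delta table keyed on policy_id.
target = DeltaTable.forName(spark, "curated.policies")
(
    target.alias("t")
    .merge(updates.alias("s"), "t.policy_id = s.policy_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```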
My FTSE 100 client is urgently recruiting for a Data Solutions Architect. The chosen candidate will be at the forefront of shaping end-to-end data architectures that support analytics, AI, and enterprise reporting across the organisation. You will lead solution design, champion best practices, and work closely with business and technical stakeholders to deliver robust, scalable, and secure data platforms. Key Responsibilities Design and implement modern data architectures using Azure Synapse, Data Lake, Data Factory, Databricks, and Microsoft Fabric. Lead the development of integrated data solutions supporting BI, advanced analytics, and real-time data processing. Define data governance, security, and compliance standards across the data lifecycle. Collaborate with cross-functional teams to translate business requirements into technical blueprints. Stay ahead of emerging trends in cloud, data, and AI, bringing innovative thinking into the organisation. About You Proven experience as a Data Architect or similar role in cloud data solutions. Expertise in Microsoft Azure data services; hands-on experience with Microsoft Fabric (OneLake, Lakehouse, DirectLake, Power BI integration, etc.) would be a distinct advantage. Strong understanding of data modelling, ETL/ELT pipelines, and data warehousing principles. Skilled in designing scalable and secure solutions using best practices and industry frameworks. Excellent communication and stakeholder engagement skills. Bonus Points For Certifications in Azure Data Engineering or Azure Solutions Architecture. Experience with Power BI, AI/ML integration, or real-time streaming data. Knowledge of data governance tools like Purview. Please send an up-to-date CV for an immediate response and more information on a fantastic opportunity with a truly great client.
03/07/2025
Full time
We're looking for an experienced Enterprise Data Architect to lead the design and evolution of our client's data architecture across their key business domains. You'll work on major transformation programmes, shaping the data strategy and enabling unified, scalable, and secure solutions.
We are looking for candidates who have very strong experience across Retail, Hospitality, Hotels or customer-focused service industries, working in an end-client environment; we are not looking for candidates from a Consultancy background.
PLEASE NOTE: The client requires candidates to be on site in London 3 days per week - this is not negotiable. Please do not apply if you cannot be in London 3 days pw.
Key Responsibilities
Design conceptual, logical, and physical data models for customer, product, HR, and finance data.
Collaborate across business and tech teams to deliver cohesive data architecture.
Lead architecture reviews and guide cloud-based data engineering initiatives.
Champion governance, quality frameworks, and GenAI integration.
Thought Leadership:
Provide architectural leadership across key business areas, advocating for best practices in data strategy and governance.
Lead the definition and execution of enterprise data strategy in partnership with other architects and data leaders.
Define and drive improvements in data governance, ingestion optimisation, and exploration processes.
Design and implement real-time data quality frameworks, including proactive anomaly detection and alerting mechanisms (see the sketch after this listing).
Tech Stack & Tools
ETL: Microsoft Fabric, Databricks
Analytics: Power BI, SAP Analytics Cloud, Azure Synapse
Governance: Azure Purview
AI Tools: GenAI platforms
What We're Looking For
Strong experience in enterprise data architecture and transformation programmes.
Expertise in distributed data domains, migrations (e.g. SAP Marketing Cloud to Emarsys), and GDPR compliance.
Familiarity with SAP, Oracle, Workday, Microsoft Dynamics, and E-Commerce platforms.
Strategic thinker with experience creating data strategies and capability-aligned roadmaps.
If you're ready to drive enterprise-level data transformation and make a tangible impact, we'd love to hear from you.
01/06/2025
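The real-time data quality framework mentioned in the listing above might, at its simplest, flag metrics that drift outside their recent distribution. A minimal rolling z-score sketch; the window size and 3-sigma threshold are assumptions.

```python
# Rolling z-score anomaly check over a stream of metric values.
from collections import deque
from statistics import mean, stdev

window: deque[float] = deque(maxlen=50)  # recent history (illustrative size)

def check(value: float, threshold: float = 3.0) -> bool:
    """Return True (and alert) if value is an outlier vs the rolling window."""
    is_anomaly = False
    if len(window) >= 10:  # need some history before judging
        mu, sigma = mean(window), stdev(window)
        if sigma > 0 and abs(value - mu) / sigma > threshold:
            is_anomaly = True
            print(f"ALERT: {value} deviates {abs(value - mu) / sigma:.1f} sigma")
    window.append(value)
    return is_anomaly

# Steady readings, then a spike that should trip the alert.
for v in [100, 101, 99, 102, 98, 100, 101, 99, 100, 102, 500]:
    check(v)
```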
Head of Development
Location: Hybrid working - home and at our office near London Bridge
Working pattern: Full Time, 37.5 hours per week
Contract Type: Permanent
Number of roles: 1
Grade: IT6
Salary: Circa £85,000 - £89,000 per annum
We are looking for a Head of Development to join our team.
This is a leadership role within IT with managerial and commercial responsibilities, requiring experience and knowledge in modern software development technologies and practices to ensure that we can develop and release software change rapidly and to a high standard.
You will ensure that we have the development capability and capacity aligned with our technologies, demand and strategy. Currently, this is delivered using predominantly outsourced suppliers. Thus, you will be responsible for managing the commercial relationships and their performance, ensuring they work seamlessly with our internal teams. You will also be responsible for establishing and maintaining effective testing frameworks and practices, ensuring strong collaboration between IT and business teams to align testing objectives and deliver high-quality software.
You will assist the CTO in defining, driving, and delivering technology strategies while also serving as a source of innovation, ensuring that Kaplan maintains and strengthens its leadership position in our chosen markets.
What you’ll bring to the role
Skills & Experience
* Agile software development frameworks such as Scrum and Kanban
* Secure software development lifecycle (SSDLC) and DevOps delivery model
* Developing and maintaining high-performing, highly available applications based on a variety of architectures (e.g. microservice, distributed, monolithic)
* Cloud technologies, platforms and services (AWS and Azure)
* AI technologies for enhancing productivity and application capabilities (Copilot)
* Establishing and refining test strategies and methods across the development lifecycle
Development tooling:
* Development work management (Azure DevOps Boards)
* Source control management (Azure DevOps, TFS, Git)
* Deployment (Azure DevOps Pipelines, Octopus Deploy)
* Code quality and vulnerability management (SonarQube, Snyk, Qualys)
* Containerisation (Docker, Kubernetes)
* Infrastructure as code (Terraform)
Development languages, frameworks and platforms:
* Web content management systems (Sitefinity, WordPress)
* C# / .NET Framework / .NET Core
* JavaScript & JavaScript frameworks
* Structured Query Language (SQL)
* PowerShell
* Web protocols and internet-based technologies – HTTP, XML, JSON, REST, JavaScript, LTI, TLS, API management
* Testing tools such as Selenium and JMeter (see the sketch after this skills list)
Experience developing or working on the following applications:
* Ecommerce and portal websites
* Enterprise resource planning (Dynamics NAV, Business Central)
* Learning management systems (Brightspace, Moodle)
* Enterprise data and reporting systems (SQL Server, Power BI, Databricks)
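As a hedged illustration of the testing tools named above, a minimal browser smoke test using Selenium's Python bindings might look like this; the URL and element name are placeholders.

```python
# Minimal UI smoke test with Selenium's Python bindings.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()  # assumes a local Chrome installation
try:
    driver.get("https://example.com/login")  # placeholder URL
    assert "Login" in driver.title, "login page did not load as expected"
    driver.find_element(By.NAME, "username").send_keys("smoke-test-user")
finally:
    driver.quit()
```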
What we do
Kaplan Professional is a leading provider of apprenticeships, accountancy / tax / finance & banking courses, and professional assessments. For almost 80 years, we've helped shape the development and careers of finance professionals.
We are part of the Kaplan group, one of the world’s largest and most diverse education and assessment providers. We operate in over 30 countries and maintain relationships and partnerships with more than 1,000 school districts, colleges and universities, professional bodies and over 10,000 businesses. Our vast breadth and scope in terms of both capabilities and assets sets us apart.
What we believe in
Our goal is for Kaplan to be a great place to work where everybody can succeed, so we truly live and breathe our values: act with integrity, grow knowledge, empower & support, create opportunity, drive results together.
Kaplan is a global organisation whose mission is rooted in providing equal access to education and opportunities for advancement to people of all backgrounds. We believe diversity, equity and inclusion - of culture, experiences, perspectives - are paramount to creating success and opportunity in an ever-changing world. As an educator, partner and employer, Kaplan is committed to promoting an equitable world in which diverse talent can develop, advance and thrive.
To view our candidate privacy notice click here.
Our Values
We live, breathe and celebrate our values - they drive what we believe in, how we behave and guide us in creating a culture of success.
• Act with integrity
• Empower and support
• Create opportunity
• Grow knowledge
• Drive results together
What we offer
As well as a competitive salary, transparent pay structures, hybrid/home working where possible, and paths for career progression, we offer a comprehensive benefits package that includes:
• 28 days annual leave + option to purchase more
• Season ticket loan and cycle to work scheme
• Big discounts on Kaplan courses for you and your family
• Private medical, income protection, and life insurance
• 24/7 confidential helpline providing counselling and other support services
• Company pension contributions
• Maternity, Adoption, Shared Parental and Paternity/Partner pay which is well above statutory levels
01/06/2025
Title - Cloud Data Engineer. Location - Remote. Lemongrass Consulting is the leading professional and managed service provider of SAP enterprise applications running on AWS hyperscale cloud infrastructure. Our objective is to delight our customers every day by reducing the cost and increasing the agility of their SAP systems. We do this with our continuous innovation, automation, migration and operation, delivered on the world's most comprehensive cloud platforms. Our team is what makes Lemongrass exceptional and why we have the excellent reputation in the market that we enjoy today. At Lemongrass, you will work with the smartest and most motivated people in the business. We take pride in our culture of innovation and collaboration that drives us to deliver exceptional benefits to our clients every day. About the Role: We are seeking an experienced Cloud Data Engineer with a strong background in AWS, Azure, and GCP. The ideal candidate will have extensive experience with cloud-native ETL tools such as AWS DMS, AWS Glue, Kafka, Azure Data Factory, GCP Dataflow, and other ETL tools like Informatica, SAP Data Intelligence, etc. You will be responsible for designing, implementing, and maintaining robust data pipelines and building scalable data lakes. Experience with various data platforms like Redshift, Snowflake, Databricks, Synapse, and others is essential. Familiarity with data extraction from SAP or ERP systems is a plus. Key Responsibilities: Design and Development: Design, develop, and maintain scalable ETL pipelines using cloud-native tools (AWS DMS, AWS Glue, Kafka, Azure Data Factory, GCP Dataflow, etc.). Architect and implement data lakes and data warehouses on cloud platforms (AWS, Azure, GCP). Develop and optimize data ingestion, transformation, and loading processes using Databricks, Snowflake, Redshift, BigQuery and Azure Synapse. Implement ETL processes using tools like Informatica, SAP Data Intelligence, and others. Develop and optimize data processing jobs using Spark with Scala. Data Integration and Management: Integrate various data sources, including relational databases, APIs, unstructured data, and ERP systems into the data lake. Ensure data quality and integrity through rigorous testing and validation. Perform data extraction from SAP or ERP systems when necessary. Performance Optimization: Monitor and optimize the performance of data pipelines and ETL processes. Implement best practices for data management, including data governance, security, and compliance. Collaboration and Communication: Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions. Collaborate with cross-functional teams to design and implement data solutions that meet business needs. Documentation and Maintenance: Document technical solutions, processes, and workflows. Maintain and troubleshoot existing ETL pipelines and data integrations. Qualifications: Education: Bachelor's degree in Computer Science, Information Technology, or a related field. Advanced degrees are a plus. Experience: 7+ years of experience as a Data Engineer or in a similar role. Proven experience with cloud platforms: AWS, Azure, and GCP. Hands-on experience with cloud-native ETL tools such as AWS DMS, AWS Glue, Kafka, Azure Data Factory, GCP Dataflow, etc. Experience with other ETL tools like Informatica, SAP Data Intelligence, etc. Experience in building and managing data lakes and data warehouses.
Proficiency with data platforms like Redshift, Snowflake, BigQuery, Databricks, and Azure Synapse. Experience with data extraction from SAP or ERP systems is a plus. Strong experience with Spark and Scala for data processing. Skills: Strong programming skills in Python, Java, or Scala. Proficient in SQL and query optimization techniques. Familiarity with data modelling, ETL/ELT processes, and data warehousing concepts. Knowledge of data governance, security, and compliance best practices. Excellent problem-solving and analytical skills. Strong communication and collaboration skills. Preferred Qualifications: Experience with other data tools and technologies such as Apache Spark or Hadoop. Certifications in cloud platforms (AWS Certified Data Analytics - Specialty, Google Professional Data Engineer, Microsoft Certified: Azure Data Engineer Associate). Experience with CI/CD pipelines and DevOps practices for data engineering. What we offer in return: Remote Working: Lemongrass always has offered and always will offer 100% remote work. Flexibility: Work where and when you like most of the time. Training: A subscription to A Cloud Guru and a generous budget for taking certifications and other resources you'll find helpful. State-of-the-art tech: An opportunity to learn and run the latest industry-standard tools. Team: Colleagues who will challenge you, giving you the chance to learn from them and them from you. Lemongrass Consulting is an Equal Opportunity/Affirmative Action employer. All qualified candidates will receive consideration for employment without regard to disability, protected veteran status, race, color, religious creed, national origin, citizenship, marital status, sex, sexual orientation/gender identity, age, or genetic information. The selected applicant will be subject to a background investigation.
23/12/2024
Full time
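As an illustrative sketch of the cloud-native ETL work described above: read raw CSV, standardise types, and write partitioned Parquet with PySpark. The S3 paths and schema are assumptions, and connector credentials are presumed configured.

```python
# Minimal batch ETL step: ingest, clean, and land curated Parquet.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Placeholder bucket/paths; S3 connector configuration is assumed.
raw = spark.read.option("header", True).csv("s3://raw-bucket/orders/")

orders = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .dropDuplicates(["order_id"])
)

# Partitioned Parquet keeps downstream lakehouse queries cheap to prune.
orders.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://lake-bucket/curated/orders/"
)
```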
Job Title: Big Data Architect
Salary: £100,000 to £110,000 + Bonus + Benefits
Location: East London - 3 days onsite/2 days remote

Are you looking to work as a Big Data Architect for a global organisation? I am currently working with a leading organisation in the telecommunications industry who are looking to expand their Data team, and we are looking to hire a Big Data Architect. As the Big Data Architect you will be part of a very established team, working on exciting projects across the UK, Europe and Asia with the latest technologies as the organisation goes through an exciting digital transformation. You will provide technical leadership in data architecture and design end-to-end solutions for the Azure Synapse Data Lake, ensuring that the implementation of the big data solutions is in line with the industry's best practices and standards. Within this role, you will collaborate with a wide range of stakeholders to identify customer needs and ensure that the right resource is selected. This is a permanent salaried position paying up to £110,000, with a hybrid working model of three days per week in the office in East London.

Experience required - Big Data Architect:
- Strong previous experience as a Big Data Architect
- Azure - Azure Data Lake/Azure Databricks/Azure Synapse
- Experience with Hadoop/Python/PySpark/Databricks
- Experience with data migration in the Azure cloud
- Exceptional communication skills
- Experience with real-time data analytics or Data Mesh would be beneficial

This is a great opportunity for Big Data Architect professionals to join an organisation that can provide further exposure to new and exciting technologies. Our client is looking to hire ASAP, so please do not hesitate to apply with a copy of your most up-to-date CV!

Data Architect/Data/Azure - Azure Data Lake/Azure Databricks/Azure Synapse/Hadoop/Python/PySpark/Databricks/Telco/Telecomms/Retail/E-commerce
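For context on the Azure stack listed above, here is a minimal PySpark sketch of the kind of work a Databricks notebook against an Azure Data Lake might contain; the storage account, container, column names, and table name are hypothetical, and on Databricks the spark session is provided by the runtime rather than created in the notebook.

    # Hypothetical Databricks/PySpark job: read raw events from Azure
    # Data Lake Storage Gen2 over the abfss:// scheme, aggregate daily
    # activity, and persist the result as a Delta table. All names are
    # illustrative only.
    from pyspark.sql import functions as F

    events = spark.read.parquet(
        "abfss://raw@examplelake.dfs.core.windows.net/events/")

    daily = (events
             .withColumn("event_date", F.to_date("event_ts"))
             .groupBy("event_date", "customer_id")
             .agg(F.count("*").alias("events")))

    (daily.write
          .format("delta")
          .mode("overwrite")
          .saveAsTable("analytics.daily_customer_events"))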
17/08/2023
Full time
Jobs - Frequently Asked Questions
Q: How do I find IT jobs in a particular city?
A: Use the location filter to find IT jobs in cities like London, Manchester, Birmingham, and across the UK.

Q: What entry-level IT roles are available?
A: Entry-level roles include IT support technician, junior developer, QA tester, and helpdesk analyst.

Q: How often are new jobs posted?
A: New jobs are posted daily. Set up alerts to be notified as soon as new roles match your preferences.

Q: Which skills do employers look for in IT candidates?
A: Key skills include problem-solving, coding, cloud computing, networking, and familiarity with tools like AWS or SQL.

Q: Can I get an IT job without prior experience?
A: Yes, many employers offer training or junior roles. Focus on building a strong CV with relevant coursework or personal projects.