Sorry, that job is no longer available. Here are some results that may be similar to the job you were looking for.

1986 jobs found

Current search: sql data engineer
Full Stack Developer - Manchester or London hybrid
National Residential Landlords Association London, UK
Location: Based from either our Manchester or London office, with some home working possible.

The NRLA: Who we are
The NRLA is the UK’s largest organisation representing private residential landlords, a community of over 110,000 members across England and Wales. We’re on a mission to transform how landlords manage their properties, stay compliant, and operate day to day. That means building the next generation of digital tools that will define the private rented sector for years to come. We’re not here to be average. We’re carving out a bold digital path and building the foundations of what will become the operating system for the Private Rented Sector (PRS). To do that, we need talented, curious, forward-thinking developers who want to stretch themselves, learn new stacks, and help shape genuinely meaningful technology. Recognised with Investors in People Gold and named by The Sunday Times as one of the UK’s best companies to work for, we offer a supportive, ambitious environment where innovation isn’t just welcomed, it’s expected. With hybrid working from our Manchester and London hubs, you’ll join a team that values professional growth, collaboration, and the desire to push boundaries. If you want to build purposeful tech, upskill across multiple modern stacks, and play a key role in shaping the NRLA’s digital future, we’d love to hear from you.

The NRLA package:
- 25 days annual leave, increasing to 26 days with three years’ service, 27 days with five years’ service and 28 days with seven years’ service
- Additional annual day off for your birthday
- Salary exchange pension scheme
- Life assurance
- Cash plan health and wellbeing benefit, including Employee Assistance Programme and counselling service
- Sick pay
- Cycle purchase loans, season ticket loans and interest-free staff loans
- Complimentary NRLA membership
- Non-contractual annual performance-related bonus scheme
- Enhanced maternity and paternity pay

MAIN PURPOSE AND SCOPE OF JOB:
The Full Stack Developer will be responsible for developing, maintaining, and enhancing the NRLA’s property management platform within a modern monorepo architecture. This role requires expertise in both frontend (Angular) and backend (Node.js/Firebase) development, with a focus on delivering robust, scalable solutions that serve landlords and tenants. The successful candidate will work collaboratively within an agile team environment, contributing to all phases of the software development lifecycle while maintaining high standards of code quality and security.

RESPONSIBLE FOR:
- Developing high-quality, scalable code across both the frontend and backend for landlords and property management professionals
- Working collaboratively with Product, Design, and Delivery teams to gather requirements, investigate solutions and translate them into technical ...
- Building and maintaining an Angular-based web application and Firebase Cloud Functions
- Working within a monorepo architecture using modern development tools and practices
- Ensuring security, data protection, and compliance best practices
- Supporting code reviews, documentation, and continuous improvement

Duties And Key Responsibilities

Key Technologies:
- Frontend: Angular 16, TypeScript, RxJS, Angular Material
- Backend: Node.js 20, Firebase Cloud Functions, Express.js
- Database: Firebase Firestore, BigQuery
- Cloud: Google Cloud Platform, Firebase
- Testing: Jest, Storybook
- Build Tools: pnpm, Turborepo, Angular CLI
- DevOps: Google Cloud Build, Infisical, Sentry, shell scripting
- Version Control: Git, GitHub

Technical Development and Implementation:
- Develop and maintain Angular 16+ applications using TypeScript, RxJS, and Angular Material
- Build and maintain Firebase Cloud Functions using Node.js 20 and TypeScript
- Implement responsive, accessible UI components following modern design patterns
- Develop RESTful APIs and integrate third-party services (Stripe, SendGrid, Algolia, Moneyhub, etc.)
- Work with Google Cloud Platform services including BigQuery, Cloud Storage, Cloud Tasks, and Pub/Sub

Software Development and Delivery:
- Write clean, maintainable, and well-documented code following established coding standards
- Participate in code reviews and provide constructive feedback to team members
- Develop and maintain comprehensive unit tests using Jest
- Work within a monorepo structure using pnpm workspaces
- Implement CI/CD pipelines using Google Cloud Build and Firebase deployment tools
- Manage application state and data flow using reactive programming patterns
- Optimize application performance

Professional Development and Collaboration:
- Participate in agile ceremonies including sprint planning, daily standups, and retrospectives
- Collaborate with product owners, designers, and stakeholders to refine requirements
- Stay current with emerging technologies and best practices in web development
- Contribute to technical documentation and knowledge sharing within the team
- Participate in technical discussions and architectural decision-making

Technical Operations and Quality Assurance:
- Monitor application performance using Sentry error tracking and analytics
- Implement security best practices
- Debug and resolve production issues in a timely manner
- Maintain test coverage and ensure comprehensive testing strategies
- Perform database migrations and manage Firestore data structures

Stakeholder Collaboration:
- Communicate technical concepts effectively to non-technical stakeholders
- Gather and analyse requirements from business stakeholders
- Provide technical estimates and delivery timelines
- Present demos and progress updates to stakeholders
- Collaborate with external partners and third-party service providers
- Support customer-facing teams with technical expertise when needed

Person Specification

Qualifications:
- Bachelor’s degree in Computer Science, Software Engineering, or related field (or equivalent practical experience)
- Relevant certifications in Angular, Google Cloud Platform, or Firebase (desirable)
- Evidence of ongoing professional development or contributions to technical communities (desirable)

Skills And Abilities:
- Frontend Development: knowledge of Angular (v16+), TypeScript, ES6+, RxJS, HTML5, CSS3/SCSS
- Backend Development: strong proficiency in Node.js, Express.js, and serverless architectures
- Database: experience with NoSQL databases, particularly Firebase Firestore
- Cloud Platforms: hands-on experience with Google Cloud Platform and Firebase services
- Version Control: proficient with Git, GitHub workflows, and collaborative development practices
- Testing: experience with Jest and test-driven development approaches
- API Integration: ability to integrate and work with third-party APIs and services
- Problem Solving: strong analytical and debugging skills
- Communication: excellent written and verbal communication skills

Experience:
- Minimum 3-5 years of professional software development experience
- Proven experience building production-grade Angular applications
- Experience with Firebase Cloud Functions and serverless architectures
- Track record of working in monorepo environments (desirable)
- Experience with payment processing systems (Stripe) and financial integrations
- Familiarity with property management or real estate technology (desirable)
- Experience with CI/CD pipelines and DevOps practices
- Background in agile/scrum development methodologies

Knowledge:
- Deep understanding of JavaScript/TypeScript and modern ES6+ features
- Knowledge of reactive programming patterns and state management
- Understanding of RESTful API design principles
- Familiarity with authentication and authorization patterns (JWT, OAuth)
- Knowledge of web security best practices and OWASP guidelines
- Understanding of responsive design and mobile-first development
- Awareness of accessibility standards (WCAG 2.1)
- Knowledge of performance optimization techniques
- Understanding of microservices and event-driven architectures

Personal Attributes and other requirements:
- Self-motivated with strong initiative and ability to work independently
- Detail-oriented with commitment to code quality and best practices
- A keen eye for detail when working with UI
- Adaptable and comfortable working in a fast-paced, evolving environment
- Collaborative team player with strong interpersonal skills
- Proactive approach to learning new technologies and methodologies
- Strong time management and organizational skills
- Passion for creating excellent user experiences
- Commitment to continuous improvement and professional development
- Ability to work hybrid from the Manchester or London office
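The listing above asks for an understanding of event-driven architectures and experience with GCP Pub/Sub. As a rough, language-agnostic illustration of what that pattern buys you, here is a toy in-memory publish/subscribe bus in Python (this is not the google-cloud-pubsub API, and the topic and payload names are made up; the NRLA stack itself is TypeScript/Node.js):

```python
from collections import defaultdict
from typing import Callable

class InMemoryBus:
    """Toy publish/subscribe bus illustrating event-driven decoupling."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # Every handler registered on the topic sees the event; the publisher
        # knows nothing about the subscribers, which is the point of the pattern.
        for handler in self._subscribers[topic]:
            handler(event)

bus = InMemoryBus()
received: list[dict] = []
bus.subscribe("tenancy.created", received.append)
bus.publish("tenancy.created", {"tenancy_id": 42})
print(received)  # [{'tenancy_id': 42}]
```

In a real GCP deployment the bus is a managed Pub/Sub topic and the handlers are Cloud Functions or Cloud Tasks workers, but the decoupling shown here is the same.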
02/12/2025
Full time
Web Developer (.NET)
Pense Ltd
Role and Responsibilities
This is an excellent opportunity for a capable web developer who enjoys owning features end-to-end in a modern, forward-thinking financial services business. We have our own proprietary applications, both back-office solutions and client-facing. You will work on multiple projects, including continuing to enhance our own CRM solution and migrating legacy WebForms functionality to clean services and a modern UI. You’ll work in a fast-paced, dynamic environment; you will not be a “cog in the machine.” You’ll be involved in every aspect of delivery, from requirements and design to development, testing, deployment, documentation, and support, with a strong focus on scalability and maintainability. We are an extremely fast-growing business, so you will need a willingness to learn and to adapt quickly to changing business requirements.

The role will consist of the following responsibilities:

Development
- CRM (WebForms/.NET/T-SQL/BPM): configure and extend our proprietary CRM using T-SQL, BPM/workflow tools and C#
- Modernisation: incremental modernisation of legacy applications using the strangler pattern
- Client-facing applications: develop our customer self-service portal and other front-end applications using React/TypeScript and .NET 9 Web API
- Azure implementation: assist in administering our cloud infrastructure (App Service/Functions, Azure SQL, Storage, Key Vault with Managed Identity, App Configuration, Private Endpoints, Application Insights)
- Quality/pipelines: contribute to and set up CI/CD pipelines and unit/integration tests where required
- Testing: conduct thorough testing and peer review of work items

Data Analysis (SQL Server/T-SQL)
- Write performance-conscious, SARGable queries and implement appropriate indexing
- Monitor and apply performance fixes and other DB management tasks in an Azure-hosted SQL Server database

Technical Documentation
- Produce documentation for requirement gathering, code architecture and training guides
- Contribute to due diligence and penetration test readiness packs (architecture diagrams, data flows and controls)

Support
- Triage and resolve support tickets from employees relating to in-house applications
- Support customer-facing applications and address incidents, using Azure Application Insights and application logs to identify and resolve issues

Requirement Gathering
- Communicate with stakeholders to establish clear problem statements and requirements
- Translate requirements into technical solutions with a focus on extensibility and minimising technical debt, considering future changes

Accountability and Working Relationships
- Part of a small, highly motivated and productive development team
- Partners closely with the (hands-on) CTO
- Collaborates and communicates directly with operations, advice, compliance and administration teams to deliver appropriate solutions
- Accountable for delivering projects on time and to specification

Working Environment and Hours
- Office based in Doncaster (remote working can be available on occasion where required, but office-based is preferred)
- Monday to Friday, 09:00-17:30; hours can be flexible within reason and to ensure delivery of key milestones

Benefits
- Competitive salary (dependent on experience)
- 24 days annual leave (plus bank holidays)
- Pension scheme
- Flexible working hours, dependent on requirements and ensuring delivery of key milestones
- Opportunity to own meaningful projects end-to-end in a growing, entrepreneurial business
- Big opportunity to learn emerging technologies (particularly AI) while at work

Skills – Essential
- C#/.NET (6+): dependency injection, Web API
- Working knowledge and experience of legacy .NET Framework 4.x and WebForms
- Microsoft SQL Server: strong T-SQL, SARGability and indexing, performance-tuning mindset
- TypeScript/React: component reusability, state management
- Auth and security: securing SPAs/APIs using flows such as OAuth2/OIDC
- Microsoft Azure (hands-on): VMs, App Service, Functions/Logic Apps, Azure SQL, Storage, Key Vault (Managed Identity), App Configuration, Application Insights; basic networking/private endpoints/DNS concepts
- CI/CD: GitHub Actions/Azure DevOps; environment-aware configuration and secret management

Skills – Desirable
- Firebase: authentication, hosting, functions, Firestore
- React Native and building mobile apps
- Exposure to Azure AI Foundry or other AI orchestration (prompting/evaluation/workflows)
- Experience migrating legacy codebases to modern architecture
- Using Entra ID for authentication (OAuth2 flow)
- Azure infrastructure set-up, including VNet/private endpoint and DNS management, and setting up secure landing zones
- Twilio API
- Industry knowledge of financial services, particularly retirement and custodial platforms
- Knowledge and experience working with the Seccl API

Development Path
Candidates slightly lighter in one area (like React or Azure) but strong in WebForms + C# + SQL and motivated to learn will be considered, provided they can show understanding of fundamentals and are willing to learn quickly. You should be comfortable operating across legacy and modern stacks in the same day.

Job Type: Full-time
Benefits: company events, company pension, free parking, on-site parking
Ability to commute/relocate: Doncaster DN4 5NL: reliably commute or plan to relocate before starting work (preferred)
Work Location: In person
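The listing’s emphasis on SARGable T-SQL is worth unpacking: a predicate is SARGable (Search ARGument-able) when the engine can seek an index instead of evaluating a function against every row. A minimal sketch of the idea using Python’s stdlib SQLite driver (the job targets SQL Server, but the principle is identical; the table and index names here are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, order_date TEXT);
    CREATE INDEX idx_orders_date ON orders(order_date);
    INSERT INTO orders (order_date)
    VALUES ('2023-12-31'), ('2024-03-15'), ('2024-11-02'), ('2025-01-01');
""")

# Non-SARGable: wrapping the column in a function hides it from the index.
non_sargable = "SELECT id FROM orders WHERE strftime('%Y', order_date) = '2024'"
# SARGable: a half-open range on the bare column lets the engine seek the index.
sargable = ("SELECT id FROM orders "
            "WHERE order_date >= '2024-01-01' AND order_date < '2025-01-01'")

def plan(sql: str) -> str:
    """Return the query plan as one string (last column is the plan detail)."""
    return " ".join(row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

# Both predicates select the same rows...
assert sorted(conn.execute(non_sargable)) == sorted(conn.execute(sargable))
print(plan(non_sargable))  # a full scan
print(plan(sargable))      # an index search
```

The same rewrite applies verbatim in T-SQL, where `WHERE YEAR(order_date) = 2024` forces a scan and the equivalent range predicate allows an index seek.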
12/11/2025
Full time
Data Engineer
Involved Productions Ltd London
We’re looking for a Data Engineer to work across the Involved Group, the collective behind globally renowned dance and electronic music labels including Anjunabeats and Anjunadeep, spanning label services and distribution, music publishing, events promotion and artist management. This is a key role within our Technology Department, responsible for developing and managing data pipelines, automating data collection processes, and creating analytics dashboards that provide actionable insights across the company and directly inform strategy. The role involves working closely with a variety of departments to understand their data needs and developing solutions that streamline data analysis and reporting processes. Reporting to the Head of Technology, our Data Engineer ensures that data analytics initiatives are strategically aligned, efficiently executed, and contribute to the company’s overall objectives.

Location: Bermondsey, London
Working pattern: Part-time (3 days/week), either in person at our lively Bermondsey office, hybrid, or home-working.

Who we are:
Based in Bermondsey, the Involved group of companies includes:
- Involved Productions, home of globally renowned independent dance and electronic music labels Anjunabeats, Anjunadeep and Anjunachill, as well as our label and distribution services.
- Involved Live, the touring and events company responsible for a portfolio of international events, festivals, and all-night-long showcases, creating unforgettable experiences for fans globally.
- Involved Publishing, a progressive independent music publisher representing cutting-edge producers, writers and artists from around the world.
- Involved Management, a boutique artist management company responsible for steering the careers of Above & Beyond, Lane 8, Le Youth and Dusky.

We offer careers, not just jobs, and our team embrace the entrepreneurial spirit, independent mindset and respectful culture we have created, building community and connection through music.

Our Data Engineer is responsible for:
- Analytics dashboard creation: developing and optimising Tableau dashboards that provide clear, actionable insights to various teams, including Streaming & Promotions, Label Directors, and Publishing.
- Data pipeline development: designing, building, and maintaining efficient and scalable data pipelines to automate the collection, transformation, and delivery of data to and from various sources, including DSPs, FUGA Analytics, Google Analytics, Chartmetric, Curve, etc.
- Database management: developing and maintaining the company’s database structure, ensuring data accuracy, security, and accessibility for analytics purposes.
- Teaching: providing support and training to ensure teams are making effective use of analytics tools and dashboards.
- Tailoring: collaborating with different departments to understand their data needs, and working creatively to provide tailored analytics solutions.
- Building: supporting the Head of Technology in building and maintaining cross-platform automations.
- Innovation and research: staying up to date with the latest trends and technologies in data engineering and analytics, and exploring new tools and methodologies that can enhance our data capabilities.

This list is not exhaustive; we may ask you to go beyond your job description on occasion, and we hope the role will change and develop with you.

About you:
The ideal candidate for this role will likely have:
- a solid foundation in Python and JavaScript, ideally with proficiency in other programming languages
- experience designing and implementing ETL pipelines, specifically using Apache Airflow (Astronomer)
- hands-on experience with ETL frameworks, particularly dbt (data build tool)
- SQL skills and experience with a range of database management systems
- a good understanding of different database types, designs, and data modelling systems
- experience with cloud platforms like AWS and GCP, including services such as BigQuery, RDS, and Athena
- familiarity with Tableau and project management tools like monday.com and Notion
- knowledge of APIs from music digital service providers (e.g., Spotify, Apple Music)
- previous experience at a record label, music distributor, or music publisher
- an understanding of the music industry
- excellent analytical, problem-solving, and communication skills
- a proactive approach to learning, excitement about problem-solving, and an open mind when approaching new projects
- strong accuracy and attention to detail
- good written and verbal communication skills, and the ability to explain complex ideas in non-technical language
- the ability to prioritise and manage their time independently

What we offer:
- A competitive salary (£50-60k pro rata)
- Participation in our Profit Share Scheme
- 20 days annual leave
- A benefits package to support your wellbeing, including access to local gyms and fitness classes, and subscriptions to health apps including Calm, Headspace and Strava
- A collection of enhanced family policies to support your family life
- The opportunity to attend a variety of live events
- Cycle to work scheme
- Season ticket loans
- A lively, collaborative office environment, and a flexible hybrid working policy
- Paid time off to volunteer with our local charitable initiatives

Applications
The closing date for applications is 21 November 2025, although we may close applications earlier. If you need more information before applying, email us at people@anjunabeats.com. We are committed to inclusion and encourage applications from anyone with relevant experience and skills. If you require any adjustments throughout the application process to meet your needs and help you perform at your best, please let us know.
28/10/2025
Part time
Reigate and Banstead Borough Council
Systems Support Analyst
Reigate and Banstead Borough Council Hybrid, Town Hall Reigate
Systems Support Analyst
Location: Town Hall, Reigate
Salary: £39,183 to £41,925
Contract: Permanent
Working Hours: Full time, 36 hours per week

Can you help us improve the services delivered to our residents and customers? Do you have software development skills, technical ICT experience and enjoy variety? Reigate and Banstead Borough Council are seeking an enthusiastic and experienced Systems Support Analyst, and this could be the position you are looking for.

Joining the small and friendly Business Improvement Team in ICT, you will help deliver the Council’s ICT and Digital Strategy. You will be able to demonstrate that you are an innovative, supportive, positive and flexible person while working on a wide variety of interesting business change and application projects.

You will be using your knowledge and experience of digital service platforms, APIs, SQL and SQL Server, ETL software, automation tools, FTP, PowerShell scripting, web services and Power BI. This could include:

- working with the Granicus govService digital platform, creating self-service online forms and processes for our residents and customers, or generating efficiencies and business value for our internal service unit colleagues
- integrating diverse back office systems using APIs and web services
- creating and managing SQL databases
- writing and maintaining Power BI reports
- creating and supporting batch work using scheduled tasks, FTP and PowerShell scripting tools
- working with third party software suppliers on upgrade and migration projects
- troubleshooting third line support calls

You will be a self-starter with excellent analytical and problem-solving skills, along with strong organisational and interpersonal skills.

The Council is also embarking on the journey to become a larger unitary authority in Surrey, which will present opportunities for further career development.

Staff Benefits
In exchange for your expertise, experience and enthusiasm, we will offer support in continuing your personal and career development, in addition to providing a wide range of employment-linked benefits. We provide generous annual leave, flexible working and a 15% contribution towards the LGPS pension scheme. You will also have access to a range of discounts including local and high street stores, salary sacrifice schemes including a cycle lease scheme, and discounted ‘Better’ leisure centre membership.

Additional Information
For an informal discussion about the role, please call Kenton Reader, Technology Services Manager, on 01737 276764.

We are proud to be an equal opportunities employer, supporting the guaranteed interview scheme for disabled and ex-armed forces candidates who meet the essential criteria for the role.

Closing date: 16 June 2025

Values and Behaviours
Our great working environment and the values and behaviours of every individual and team in the Council help to evolve the culture of our organisation to become more commercial, innovative and embracing of change. Successful applicants to our career opportunities will be able to demonstrate they share the values and behaviours we seek in our organisation.

See 'Who we are' as a council to find out more about us. Click here to view a Job Summary, Person Specification and Employment Pack.
30/05/2025
Full time
Inuvi
DevOps and Infrastructure Engineer
Inuvi Wokingham, UK
Why work with Inuvi
At Inuvi, our mission is to foster a work environment where innovation, collaboration, and personal growth are at the forefront. We believe in the power of diverse perspectives and strive to create a culture where every team member feels valued and empowered to contribute their unique ideas. Working with us means being part of a dynamic team that is passionate about making a positive impact on our customers and the industry.

Introducing Inuvi’s IT team
At Inuvi, our IT team is dedicated to ensuring the seamless operation and continuous improvement of our systems. The team is responsible for the support, deployment, automation, and maintenance of our infrastructure, ensuring its availability, performance, scalability, and security. Our IT professionals bring a wealth of experience from various backgrounds and work collaboratively with other departments to tackle complex challenges. With a strong focus on innovation and efficiency, the IT team plays a crucial role in driving Inuvi's success and delivering exceptional value to our customers.

What we are looking for
We are looking for an IT Infrastructure Analyst to be responsible for the support, deployment, automation and maintenance of our systems, while ensuring their availability, performance, scalability and security. You must have proven hands-on experience in a similar role and a good understanding of IT infrastructure both on-premises and in the cloud, including a good understanding of DevOps engineering practices. As a small team, our roles are broad, which will enable your involvement in a range of technologies and projects and foster opportunities for your personal growth.

You must be able to demonstrate real-world experience of IT infrastructure both on-premises and in the AWS cloud, which includes:

- Good working knowledge of on-premises server infrastructure including Hyper-V servers, storage arrays, switches and firewalls
- An excellent understanding of AWS, with a wealth of practical hands-on experience using the core services (EC2, CloudWatch, IAM, RDS, S3, etc. at a minimum)
- An excellent understanding of networking principles and technologies (subnets, VLANs and routing)
- A good understanding of development pipelines and the technologies used to implement them in AWS and/or Azure
- Experience using Docker containers
- Good working knowledge of Linux and Windows server administration and support
- Experience with JIRA and Bitbucket/Git
- Proficiency at scripting (especially PowerShell)
- A good understanding of database technologies (MySQL in particular) and experience administering them
- Strong interpersonal and communication skills, with an ability to produce documentation to a high standard
- Ability to troubleshoot in a logical manner with a proactive approach, spot potential problems, and escalate and react when necessary
- Ability to deal competently with pressure and prioritise workload
- Effective collaboration with other members of the IT team and the wider business

In return, we will ensure you have:

- A very competitive salary with annual salary reviews
- 25 days holiday a year (plus bank holidays) for some well-deserved time off
- The opportunity to purchase an additional holiday each year
- Pension contributions of 5%
- Annual wellbeing health check
- Health Shield cash plan
- Death in service benefit
- Cycle to work scheme

What happens next
After receiving your application, our team will review it and inform you of the next steps. If you are selected for the next stage, we will schedule an introductory Teams call to provide more information about our company, learn more about you, and understand your expectations. This will be an opportunity for you to ask any questions you may have. Please choose a time when you can be in a quiet place without distractions.

Depending on the outcome of the call, the following stage will involve a face-to-face interview at our Wokingham office, which will also include practical technical questions. We understand the effort required to apply for a new job and value your time. We look forward to reviewing your application.

This is an office-based role in Wokingham, Berkshire (RG41), and therefore candidates need to be located within commutable distance of the office. We will not consider candidates who need to relocate to be nearer the office.
14/02/2025
Full time
Tenth Revolution Group
Databricks Architect
Tenth Revolution Group Edinburgh, Midlothian
Data Architect - Databricks (Hybrid, UK)
Locations: London, Manchester, or Edinburgh
Hybrid: 2-3 days per week on-site
Salary: Competitive (Manager & Senior Manager grades available)

About the Role
We are seeking an experienced Data Architect with deep expertise in Databricks to help our clients design, build, and scale modern data platforms. You will play a pivotal role in shaping Lakehouse architectures that enable advanced analytics, AI/ML, and enterprise-wide data-driven decision-making. Working closely with clients early in their data journey, you will assess business needs, define architectural direction, and guide the implementation of robust, secure, and scalable solutions. This is a hands-on architecture role suited to someone who has spent the last 2-3 years working directly with Databricks at an architectural level and is ready to progress towards programmes such as the Databricks DPP.

Key Responsibilities
* Architect and implement Databricks Lakehouse solutions across ingestion, processing, storage, and analytics layers.
* Recommend best practices and innovative approaches for modern data platforms.
* Build strong client relationships and confidently present architectural decisions to senior stakeholders.
* Shape client data strategies and promote governance, quality, and security standards.
* Lead architectural engagements and ensure delivery within scope, budget, and timelines.
* Optimise Databricks workloads for performance, scalability, and cost efficiency.
* Implement governance and compliance frameworks using Unity Catalog, Purview, and cloud-native controls.
* Develop CI/CD pipelines using Databricks Repos, GitHub Actions, or Azure DevOps.
* Contribute to RFI/RFP responses and deliver innovative Proofs of Concept.
* Support the internal Architecture Practice by developing reusable patterns and accelerators.

Skills & Experience
* Proven experience delivering enterprise-scale Databricks solutions end-to-end.
* Strong background in Lakehouse architecture, including structured and unstructured data.
* Expertise in Spark, PySpark, Delta Lake, and Databricks workflows.
* Experience building scalable ETL/ELT pipelines, including Delta Live Tables.
* Strong programming skills in Python, Scala, or SQL.
* Solid understanding of data modelling (3NF, Kimball, Data Vault).
* Experience integrating Lakehouse architectures with BI tools such as Power BI and Tableau.
* Hands-on experience with at least one major cloud platform (Azure, AWS, or GCP) and an understanding of the Databricks implications of each.
* Knowledge of Databricks security best practices (RBAC, IAM, encryption).
* Excellent communication, stakeholder engagement, and problem-solving skills.

Highly Valued Certifications
* Databricks Certified Data Engineer (Associate/Professional)
* Databricks Certified Machine Learning (Associate/Professional)
* Databricks Generative AI Fundamentals
* Databricks Lakehouse Fundamentals

Why Join Us?
* Generous annual leave and private medical insurance.
* Strong focus on wellbeing and personal development.
* A culture that rewards high performance and nurtures talent.
* Opportunities to work on impactful client projects and drive meaningful change.
* Supportive environment with investment in certifications and career progression.

Additional Information
* This role is fully signed off and part of a growing Databricks capability.
* Candidates must be willing to travel between UK offices when required.
* Suitable for individuals with strong architectural experience rather than purely engineering backgrounds.

Please can you send me a copy of your CV if you're interested.
18/03/2026
Full time
Proactive Appointments
Data Engineer
Proactive Appointments Exeter, Devon
Data Engineer
Exeter | Hybrid (Flexible) | Up to £50k per annum | Permanent

Our leading client is currently seeking an experienced Data Engineer to join their team, a growing organisation investing in its data and analytics capabilities. This role will be key in designing, building, and maintaining scalable data solutions to support Business Intelligence and reporting needs.

Key Responsibilities
- Design, develop, and maintain robust ETL processes
- Build and optimise data pipelines for efficient data integration
- Write and optimise complex SQL queries, stored procedures, and database objects
- Develop and manage SSIS packages
- Create and maintain reports using SSRS
- Support the development and optimisation of data warehouse solutions
- Ensure data quality, integrity, and performance across systems
- Collaborate with stakeholders to deliver data-driven solutions
- Troubleshoot and resolve data-related issues

Required Experience
- 3-5+ years' experience in a Data Engineering or similar role
- Strong proficiency in SQL, including performance tuning and optimisation
- Hands-on experience with ETL development and data integration
- Proven experience with SSIS (SQL Server Integration Services)
- Experience developing reports using SSRS (SQL Server Reporting Services)
- Solid understanding of data warehousing
- Experience working with large, complex datasets
- Strong analytical and problem-solving skills

Due to the volume of applications received, it will not be possible to respond to all applications, and only applicants considered suitable for interview will be contacted. Proactive Appointments Limited operates as an employment agency and employment business and is an equal opportunities organisation. We take our obligations to protect your personal data very seriously. Any information provided to us will be processed as detailed in our Privacy Notice, a copy of which can be found on our website.
18/03/2026
Full time
Tenth Revolution Group
Databricks Architect
Tenth Revolution Group
Data Architect - Databricks (Hybrid, UK)

Locations: London, Manchester, or Edinburgh
Hybrid: 2-3 days per week on-site
Salary: Competitive (Manager & Senior Manager grades available)

About the Role
We are seeking an experienced Data Architect with deep expertise in Databricks to help our clients design, build, and scale modern data platforms. You will play a pivotal role in shaping Lakehouse architectures that enable advanced analytics, AI/ML, and enterprise-wide data-driven decision-making. Working closely with clients early in their data journey, you will assess business needs, define architectural direction, and guide the implementation of robust, secure, and scalable solutions. This is a hands-on architecture role suited to someone who has spent the last 2-3 years working directly with Databricks at an architectural level and is ready to progress towards programmes such as the Databricks DPP.

Key Responsibilities
* Architect and implement Databricks Lakehouse solutions across ingestion, processing, storage, and analytics layers.
* Recommend best practices and innovative approaches for modern data platforms.
* Build strong client relationships and confidently present architectural decisions to senior stakeholders.
* Shape client data strategies and promote governance, quality, and security standards.
* Lead architectural engagements and ensure delivery within scope, budget, and timelines.
* Optimise Databricks workloads for performance, scalability, and cost efficiency.
* Implement governance and compliance frameworks using Unity Catalog, Purview, and cloud-native controls.
* Develop CI/CD pipelines using Databricks Repos, GitHub Actions, or Azure DevOps.
* Contribute to RFI/RFP responses and deliver innovative Proofs of Concept.
* Support the internal Architecture Practice by developing reusable patterns and accelerators.

Skills & Experience
* Proven experience delivering enterprise-scale Databricks solutions end-to-end.
* Strong background in Lakehouse architecture, including structured and unstructured data.
* Expertise in Spark, PySpark, Delta Lake, and Databricks workflows.
* Experience building scalable ETL/ELT pipelines, including Delta Live Tables.
* Strong programming skills in Python, Scala, or SQL.
* Solid understanding of data modelling (3NF, Kimball, Data Vault).
* Experience integrating Lakehouse architectures with BI tools such as Power BI and Tableau.
* Hands-on experience with at least one major cloud platform (Azure, AWS, or GCP) and an understanding of the implications for Databricks on each.
* Knowledge of Databricks security best practices (RBAC, IAM, encryption).
* Excellent communication, stakeholder engagement, and problem-solving skills.

Highly Valued Certifications
* Databricks Certified Data Engineer (Associate/Professional)
* Databricks Certified Machine Learning (Associate/Professional)
* Databricks Generative AI Fundamentals
* Databricks Lakehouse Fundamentals

Why Join Us?
* Generous annual leave and private medical insurance.
* Strong focus on wellbeing and personal development.
* A culture that rewards high performance and nurtures talent.
* Opportunities to work on impactful client projects and drive meaningful change.
* Supportive environment with investment in certifications and career progression.

Additional Information
* This role is fully signed off and part of a growing Databricks capability.
* Candidates must be willing to travel between UK offices when required.
* Suitable for individuals with strong architectural experience rather than purely engineering backgrounds.

Please send me a copy of your CV if you're interested.
18/03/2026
Full time
Scope AT Limited
Hedge Fund - Python Developer (Equities) - Trade life cycle - PnL - Kafka - Contract
Scope AT Limited
Hedge Fund - Python Developer (Equities) - Trade life cycle - PnL - Kafka - Contract

Our hedge fund client is looking for a Python Developer/Engineer for a contract role. This team is responsible for the firm's equity transaction data platform, including trade life cycle event processing, enrichment, and PnL calculations. The role is ideal for an engineer who enjoys building robust, high-throughput services and data pipelines in a fast-paced, delivery-focused environment.

Principal Responsibilities
* Design and develop solutions for trade life cycle event processing, including corporate actions, expiries, and other post-trade events.
* Build and operate Python-based services that perform large-scale data transformations and calculations.
* Publish and distribute transaction and PnL data using Kafka, including Avro-based schemas and streaming patterns.

Required Skills
* Minimum of 6+ years of professional Python development experience, ideally in capital markets or a fintech firm.
* Experience in finance: understanding of common financial asset classes; knowledge of equities corporate action processing, trade life cycle concepts, and/or P&L calculations is a strong plus.
* Experience with Kafka (or equivalent streaming/messaging platforms) and schema-based event publishing (e.g. Avro).
* Strong experience performing large-scale data calculations in Python using libraries like pandas, Polars, and NumPy.
* Experience building REST services using frameworks such as FastAPI and/or Flask.
* Strong SQL skills and experience working with relational databases in production environments.
* Hands-on experience with containerised deployments and modern infrastructure tooling (Docker, Kubernetes), and familiarity with cloud platforms.
* Understanding of modern SDLC practices (testing strategy, CI/CD, release management, observability, and operational ownership).

Office based, 5 days per week. Based in London.
Contract role inside IR35. By applying to this job you are sending us your CV, which may contain personal information. Please refer to our Privacy Notice to understand how we process this information. In short, in order to supply you with work-finding services, we will hold and process your personal data, and only with your express permission will we share this personal data with a client (or a third party working on behalf of the client) by email or by upload to the client's (or third party's) vendor management system. By giving us permission to send your CV to a client, you give us permission to share the personal data that would be necessary to consider your application, interview you (phone/video/face to face) and, if successful, hire you. Scope AT acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers. By applying for this job you accept the Terms and Conditions, Data Protection Policy, Privacy Notice and Disclaimers, which can be found on our website.
18/03/2026
Contractor
Tenth Revolution Group
Lead Data Analyst
Tenth Revolution Group Newcastle Upon Tyne, Tyne And Wear
Lead Data Analyst/Data Product Lead - Managing Consultant

The Opportunity
You'll lead the delivery of analytical outcomes that enable organisations to realise their strategic vision. Acting as the bridge between business goals, data requirements, and technical implementation, you'll guide multidisciplinary teams and help clients modernise their data platforms, analytical capabilities, and decision-making processes. This role is ideal for someone who thrives in complex environments, enjoys solving ambiguous problems, and is passionate about modern cloud, big data, and analytics technologies.

What You'll Do
* Own and lead analytical delivery within broader data platform or transformation programmes.
* Guide teams of analysts, data engineers and analytics engineers to deliver end-to-end outcomes, from data workflows to analytical services and reporting assets.
* Define and uphold standards for requirements, documentation, code quality, version control, and release management.
* Partner with stakeholders across business and technology to prioritise work, manage expectations, and drive adoption.
* Run workshops to clarify requirements, map processes, and align teams on analytical definitions and success criteria.
* Shape and maintain analytical services, ensuring clear "definition of done" for outputs and user stories.
* Promote best practices in cloud, big data, analytics engineering, and AI-accelerated frameworks.
* Contribute to proposals, shaping analytics workstreams, estimating effort, and defining delivery approaches.
* Support the creation of reusable assets such as analytics frameworks, reconciliation packs, and migration playbooks.
* Act as a role model for consulting behaviours: curiosity, clarity, pragmatism, integrity, and client empathy.

About You
You bring a blend of analytical depth, technical understanding, and strong consulting skills. You can see the bigger picture, navigate ambiguity, and lead teams to deliver high-quality analytical products.

Experience & capabilities include:
* Significant experience leading analytical product delivery in complex, multi-team environments.
* Proven track record delivering analytical and technical outcomes on modern cloud platforms (e.g. AWS, Azure, Snowflake, Databricks).
* Strong experience with data migration validation, reconciliation, data controls, and go-live readiness.
* Ability to mentor analysts and collaborate effectively with engineers and architects.
* Strong stakeholder engagement skills across business and technical teams.
* Advanced SQL and Python skills.
* Solid understanding of data modelling (dimensional; Data Vault familiarity a plus).
* Strong BI and analytics experience (dashboarding, semantic modelling, storytelling).
* Familiarity with modern data warehousing, distributed processing, streaming, and DataOps.
* Comfortable leading iterative delivery using agile principles.

Qualifications & Tools
Experience with some of the following is beneficial:
* SQL/Python, Power BI, Tableau, Qlik, Dataiku, Alteryx
* AWS, Azure, GCP, Snowflake, Databricks certifications
* SAFe, Scrum Master or similar agile qualifications
* Modern data warehousing tools (Fabric, Lake Formation, Snowflake, Databricks)
* dbt or equivalent transformation tooling
* Airflow/ADF/Dagster
* Data governance, cataloguing, lineage tools
* Agile toolsets such as JIRA, Confluence, DevOps

Working Environment
* Permanent role with flexible working options.
* Hybrid model: typically 3 days per week in office (Newcastle).
* Some UK and international travel may be required.
* Eligibility for security clearance is essential.

What's in It for You
* Competitive salary with bonus potential.
* Highly collaborative culture with strong values and a people-first mindset.
* Flexible benefits focused on wellbeing and lifestyle.
* 25 days' holiday, with the option to flex to 30.
* Two CSR volunteering days.
* Award-winning learning and development, including dedicated training time.
* Personal tech budget for devices and accessories.
* Rapid progression opportunities in a high-growth environment.

Please send me a copy of your CV if you're interested.
18/03/2026
Full time
Integration Architect
Infoplus Technologies UK Ltd Norwich, Norfolk
The Role
The Integration Architect is responsible for designing, governing, and delivering enterprise-scale integration solutions across distributed systems. This role requires deep expertise in event-driven architecture (EDA), real-time streaming, and cloud-native integration patterns using Kafka and AWS messaging/streaming services such as EventBridge, SQS, SNS, and Kinesis. The Integration Architect partners with engineering, product, and cloud teams to create scalable, secure, and resilient integration landscapes.

Your responsibilities:
* Define enterprise integration architecture using event-driven, microservices, and real-time streaming patterns.
* Architect solutions using Kafka, AWS EventBridge, SQS/SNS, Kinesis Streams/Firehose, and Kafka Connect.
* Establish integration standards, best practices, reusable frameworks, and governance models.
* Design solutions that ensure high availability, scalability, observability, and security.
* Evaluate system integration options and recommend optimal patterns (pub/sub, CQRS, event sourcing, streaming analytics, request-response APIs, batch).
* Lead the end-to-end delivery of integration platforms and streaming pipelines.
* Define event schemas, streaming topologies, routing logic, partitions, consumer groups, and throughput targets.
* Guide development teams in building producers, consumers, connectors, and stream processing applications.
* Review designs/code to ensure alignment with architecture guidelines.
* Architect integration workloads using AWS Kinesis Streams and Kinesis Firehose, the AWS EventBridge event bus, SQS/SNS for messaging patterns, and Kafka clusters (Confluent, MSK, or open source).
* Work closely with cloud engineering teams on infrastructure design, IaC (Terraform/CloudFormation), performance tuning, and cost optimization.
* Implement monitoring using CloudWatch, Grafana, Prometheus, or OpenTelemetry.
* Enforce integration security practices: authentication/authorization, IAM policies, encryption at rest/in transit, and data governance & lineage.
* Ensure solutions meet RPO, RTO, resiliency, disaster recovery, and failover requirements.
* Establish observability using tracing, logging, alerting, and dashboards.
* Collaborate with product owners, domain architects, delivery managers, and business stakeholders.
* Translate business requirements into scalable integration architectures.
* Provide technical leadership across teams and mentor integration engineers.

Essential skills/knowledge/experience:
* Strong hands-on experience with Kafka: topics, partitions, consumer groups; Kafka Streams and ksqlDB; Schema Registry with Avro/JSON/Protobuf; Kafka Connect connectors.
* Deep expertise in AWS streaming & messaging: Amazon Kinesis Data Streams and Firehose; EventBridge (rules, event buses, routing); SQS/SNS with dead-letter queues; AWS Lambda event-based integrations.
* Experience designing event-driven and microservices architectures.
* Strong knowledge of API integration patterns (REST, GraphQL), ETL/ELT and data pipelines, distributed system design, and high-throughput, low-latency data streaming.
* Good understanding of Java/Python/Node.js for integration logic.
* Familiarity with containerisation & orchestration (Docker, Kubernetes).
* Working knowledge of CI/CD, DevOps, and IaC.
* Architectural thinking with strong problem-solving abilities.
* Ability to lead teams, mentor developers, and influence decisions.
* Strong communication and stakeholder engagement skills.
* Experience working in Agile environments.
18/03/2026
Contractor
Experis IT
SC Cleared Embedded Software Engineer
Experis IT Malvern, Worcestershire
Job Title: SC Cleared Embedded Software Engineer
Location: Malvern, UK
Duration: 6 months
Rate: Up to £80 per hour via an approved umbrella company
Must be willing and eligible to go through the SC Clearance process

Are you an experienced Embedded Software Engineer with SC clearance and a passion for innovative technology? Our client, a leading organisation in the defence and aerospace sector, is hiring to support a critical project involving the re-architecture of a legacy electro-optical development board.

What you'll be doing:
* Baseline and re-architect a legacy electro-optical development board to enhance performance and capabilities.
* Write, implement, and test software at both sub-system and system levels, ensuring seamless integration with application and GUI software and working closely with FPGA colleagues.
* Collaborate closely with cross-disciplinary teams to deliver high-quality solutions aligned with project goals.

What you'll bring:
* Proven experience with Xilinx PetaLinux, SDK/XSDK/Vitis, and embedded Linux environments.
* Strong skills in C, C++, and Python programming.
* Familiarity with version control tools such as Git, and build systems like Make.
* Knowledge of FPGA development, particularly with Xilinx SoC/MPSoC/RFSoC, and experience with development boards like the ZCU111.
* Experience with FPGA design tools such as Xilinx Vivado, VHDL, Verilog, SystemVerilog, and Tcl scripting.
* Desirable skills include Docker, Bamboo, Confluence, and extended database knowledge with PostgreSQL and PostGIS.

What you'll need:
* A background in embedded software development within a defence or aerospace environment.
* Experience working on complex hardware-software integration projects.
* A proactive approach to problem-solving and collaboration.

This is a fantastic opportunity to contribute to cutting-edge projects within a supportive and innovative environment. If you hold SC clearance and are ready to make an impact, we'd love to hear from you!

Apply now to join a team committed to technological excellence and impactful solutions.
18/03/2026
Contractor
TJX Europe
Sr Engineer - Supply Chain
TJX Europe Watford, Hertfordshire
TJX Companies
At TJX Companies, every day brings new opportunities for growth, exploration, and achievement. You'll be part of our vibrant team that embraces diversity, fosters collaboration, and prioritizes your development. Whether you're working in our four global Home Offices, Distribution Centers or Retail Stores (TJ Maxx, Marshalls, HomeGoods, Homesense, Sierra, Winners, and TK Maxx), you'll find abundant opportunities to learn, thrive, and make an impact. Come join our TJX family, a Fortune 100 company and the world's leading off-price retailer.

Job Description: Senior Engineer - Supply Chain

What you'll discover
- Inclusive culture and career growth opportunities
- Global IT Organization which collaborates across U.S., Canada, Europe and Australia
- Challenging, collaborative, and team-based environment

What you'll do
The Global Supply Chain - Retail Distribution Team is responsible for managing various Warehouse Management solutions within TJX IT. The organization delivers capabilities that enrich the customer experience and provide business value. We seek a motivated, talented Senior Engineer with a good understanding of WMS and Labor Management. As a Senior Engineer in Retail Warehouse Management Systems, you will be instrumental in all technical aspects of managing and configuring the Manhattan WMS/LMS to complement our warehouse operations. The role also entails designing solutions that integrate Manhattan WMS with external systems, triaging technical issues (from Manhattan WMS logs through to tracing upstream and downstream systems), facilitating meetings with cross-functional teams to resolve issues, and providing input on Manhattan extension designs to ensure they are scalable.

What you'll need
The Global Supply Chain - Retail Distribution Team thrives on strong relationships with our business partners and works diligently to address their needs, which supports TJX growth and operational stability. On this tightly knit and fast-paced solution delivery team you will be constantly challenged to stretch and think outside the box. You will work with product teams, architecture and business partners to strategically plan and deliver product features by connecting the technical and business worlds. You will need to break down complex problems into steps that drive product development while keeping product quality and security as the priority. You will be responsible for most architecture, design and technical decisions within the assigned scope.

Minimum Qualifications
- Bachelor's Degree or equivalent engineering skillset
- 3-5 years of development experience with medium to large bespoke software solutions
- Agile practitioner in a Scrum/Kanban/SAFe environment
- Hands-on experience with warehouse management operations and configuration of COTS products
- Knowledge of Labor Management and engineering standards
- Ability to work in distributed teams and develop multi-level relationships
- Ability to lead and develop medium to large scale features/capabilities working with the product/platform/infrastructure/security teams
- Responsibility for achieving operational excellence as part of delivering features
- Excellent communication skills and the ability to influence those around you
- Ability to understand the work environment and competing priorities in conjunction with developing/meeting project goals
- A positive, open-minded and can-do attitude
- Strong engineering mindset, preferably with full stack experience

Preferred Qualifications
- Knowledge of and experience with on-premise (Manhattan WMS/LMS 2018 On Prem) or cloud-based (Manhattan Active WMS/LMS) packaged software solutions and custom development
- Personal drive for contributing to inclusion & diversity initiatives

Preferred Technical Skills
- Integration Configuration: setting up listeners, batch jobs, and message formats
- Network and Printer Setup: configuring networks and printers
- Performance Testing: designing and executing performance tests
- Configuration Management: managing and migrating configuration profiles
- Defect Analysis: analyzing and triaging defects
- Extension Development: developing and understanding extensions
- Warehouse Management System (WMS): configuring various WMS components (company details, user roles, system codes, etc.)
- Task Management: configuring tasks and task paths
- Putaway and Allocation: configuring putaway types, zones, and allocation rules
- Replenishment and Picking: setting up replenishment triggers and picking parameters
- Inventory Management: managing inventory and cycle counts
- System Integration: integrating WMS with other systems and automation
- SQL Skills: writing and optimizing SQL queries
- Purge and Archive: managing data archiving and purging processes
- Security and Access Control: defining roles, permissions, and user provisioning
- Dashboard Creation: creating and managing dashboards
- Printer Troubleshooting: setting up and troubleshooting printers
- Technical Configuration: setting up servers, alerts, and monitoring

Benefits include: Associate discount; 401(k) match; medical/dental/vision; HSA; health care FSA; life insurance; short/long-term disability; paid holidays/vacation/sick/bereavement/parental leave; EAP; incentive programs for management; auto/home insurance discounts; tuition reimbursement; scholarship program; adoption/surrogacy assistance; smoking cessation; child care/cell phone discounts; pet/legal insurance; credit union; referral bonuses. All benefits are subject to applicable plan or program terms (including eligibility terms) and may change from time to time. Contact your TJX representative for more information. In addition to our open door policy and supportive work environment, we also strive to provide a competitive salary and benefits package.
TJX considers all applicants for employment without regard to race, color, religion, gender, sexual orientation, national origin, age, disability, gender identity and expression, marital or military status, or based on any individual's status in any group or class protected by applicable federal, state, or local law. TJX also provides reasonable accommodations to qualified individuals with disabilities in accordance with the Americans with Disabilities Act and applicable state and local law. Address: 770 Cochituate Rd Location: USA Home Office Framingham MA 770 Cochituate Rd This position has a starting salary range of $91,200.00 to $(phone number removed) per year. Actual starting pay is determined by a number of factors, including relevant skills, qualifications, and experience. This position is eligible for an annual incentive.
18/03/2026
Full time
VIQU Ltd
Lead QA Engineer
VIQU Ltd Ludlow, Shropshire
Role: Lead QA Engineer
Location: Ludlow (2 days on site)
Salary: Up to £55,000 per annum

VIQU are supporting a growing UK-based software provider seeking a Lead QA Engineer to strengthen their product engineering capability. This will initially be a standalone role, developing and leading the strategy whilst building a global team. The organisation is heavily focused on automation and AI, so you will be expected to introduce related efficiencies.

The Role:
- Lead the building, enhancement and maintenance of test automation frameworks using Playwright
- Establish and maintain the QA strategy, encompassing functional, regression, integration, and performance testing
- Implement and utilise AI tools to speed up test processes
- Lead a global QA team
- Work closely with developers to embed quality within the SDLC
- Create and execute SQL queries for back-end validation and data-driven testing
- Integrate automation into CI/CD pipelines

Key Skills & Experience:
- Strong experience leading a QA function and QA teams
- Excellent hands-on skills with Playwright (experience with Selenium would also be nice to have)
- Proven experience building or significantly enhancing automation frameworks
- Strong SQL skills for data validation and back-end testing
- Experience integrating automated tests within CI/CD environments
- Comfortable operating directly within development teams
- Exposure to AI-assisted QA/testing tools

Job role: Lead QA Engineer
Job type: Permanent
Salary: £45,000-£55,000 per annum
Location: Ludlow (2 days on site)

This is an opportunity for a Lead Automation QA Engineer to join a growing product business during a key phase of expansion, contributing directly to delivery success within a collaborative engineering team. Apply now to speak with VIQU IT in confidence, or reach out via the VIQU IT website. Do you know someone great? We'll thank you with up to £1,000 if your referral is successful (terms apply).
18/03/2026
Full time
Gails
Full Stack Data Scientist / Engineer
Gails
Role Overview:
We are looking for a highly analytical, hands-on Full Stack Data Scientist / Engineer to design, build and deploy data-driven solutions that solve real operational and commercial problems. This role is ideal for someone who enjoys combining data science, software development and data engineering to create robust, scalable solutions that deliver measurable business value. You will work across the full lifecycle of analytics and AI delivery: from understanding business problems, designing data pipelines and developing models, through to deployment, optimisation and ongoing improvement. You will play a key role in shaping solutions across forecasting, site selection, ordering, production, rota scheduling, logistics and online services optimisation, while also helping to extend our Bread GPT large language model insight synthesis capability. This is a hands-on role for someone who can code well in Python, solve data engineering challenges, and work closely with colleagues and partners to turn ideas into production-ready solutions.

Business Overview
We are a fast-growing, fast-paced and highly successful artisan food manufacturing and hospitality group delivering high-quality baked goods to our customers. We aim to feed better people better by focusing on people, technology, innovation and sustainability. We are looking for a talented Full Stack Data Scientist / Data Engineer to join our team and drive the development and management of our enterprise-grade applications across our bakeries.

Responsibilities:
- Develop advanced analytics / data science solutions to solve problems focused on forecasting, new site selection, ordering, production, rota scheduling, logistics and online services optimisation
- Extend the functionality of our Bread GPT service (Large Language Model insight synthesis engine)
- Data engineering: build and develop ETL processes in Microsoft Fabric to support reporting, insight and applied AI models; a hands-on role working with other staff and partners
- Utilise data science and analytics to enhance application functionality and performance
- Work with the data team to create and deploy machine learning models and AI-driven solutions for real-world applications
- Ensure the continuous development and delivery of solutions
- Monitor and evolve solutions
- Mentor and guide junior team members, fostering a culture of continuous learning and improvement
- Develop effective working relationships with colleagues within and beyond the Technology team to ensure that a consistent, high-quality service is delivered

ARE YOU THE MISSING INGREDIENT?
- Ideally a bachelor's degree in Computer Science, Analytics, Engineering, or a related field
- Minimum of 3 years' experience, with excellent knowledge of Python and preferably R
- Knowledge of ETL processes; ideally a basic understanding of the Microsoft ETL stack (Data Factory / Synapse / Fabric)
- Knowledge of databases (SQL & NoSQL) and API development/integration
- Understanding of software development and application design
- Proven experience in building data science solutions and developing customised LLM applications
- Strong interest in technology
- Excellent problem-solving skills and attention to detail
- Knowledge of effective business analysis: the ability to gather, document, and analyse business requirements effectively, and experience creating user stories, process flows, and wireframes
- Ability to work effectively in a fast-paced, dynamic environment
- Strong communication and collaboration skills
- "Can do" outlook and approach to work
- Ability to think around issues and look at the bigger picture to provide solutions through a variety of problem-solving techniques
- Ability to prioritise issues according to business needs, to escalate when necessary/appropriate, and to problem solve

Preferred Qualifications:
- Experience in manufacturing, retail or hospitality industries
- Knowledge of additional programming languages and frameworks

BENEFITS BAKED IN
- Free food and drink when working
- 50% off food and drink when not working
- 33 days holiday
- Pension Scheme
- Discounts and savings from high-street retailers and restaurants
- 24 hour GP service
- Cycle to work scheme
- Enhanced Maternity package
- Development programmes for you to RISE with GAIL's
18/03/2026
Full time
VIQU IT
Lead QA Engineer
VIQU IT Ludford, Shropshire
Role: Lead QA Engineer
Location: Ludlow (2 days on site)
Salary: Up to £55,000 per annum

VIQU are supporting a growing UK-based software provider seeking a Lead QA Engineer to strengthen their product engineering capability. This will initially be a standalone role, developing and leading the strategy whilst building a global team. The organisation is heavily focused on automation and AI, so you will be expected to introduce related efficiencies.

The Role:
- Lead the building, enhancement and maintenance of test automation frameworks using Playwright
- Establish and maintain the QA strategy, encompassing functional, regression, integration, and performance testing
- Implement and utilise AI tools to speed up test processes
- Lead a global QA team
- Work closely with developers to embed quality within the SDLC
- Create and execute SQL queries for backend validation and data-driven testing
- Integrate automation into CI/CD pipelines

Key Skills & Experience:
- Strong experience leading a QA function and QA teams
- Excellent hands-on skills with Playwright (experience with Selenium would also be nice to have)
- Proven experience building or significantly enhancing automation frameworks
- Strong SQL skills for data validation and backend testing
- Experience integrating automated tests within CI/CD environments
- Comfortable operating directly within development teams
- Exposure to AI-assisted QA/testing tools

Job role: Lead QA Engineer
Job type: Permanent
Salary: £45,000-£55,000 per annum
Location: Ludlow (2 days on site)

This is an opportunity for a Lead Automation QA Engineer to join a growing product business during a key phase of expansion, contributing directly to delivery success within a collaborative engineering team. Apply now to speak with VIQU IT in confidence, or reach out via the VIQU IT website. Do you know someone great? We'll thank you with up to £1,000 if your referral is successful (terms apply).
For more exciting roles and opportunities like this, please follow us on IT Recruitment.
17/03/2026
Full time
Role: Lead QA Engineer Location: Ludlow (2 days on site) Salary: Up to £55,000 per annum VIQU are supporting a growing UK-based software provider within who are seeking a Lead QA Engineer to strengthen their product engineering capability. This will initially be a standalone role, developing and leading the strategy whilst building a global team. The organisation heavily focused on automation and AI, so you will be expected to introduce related efficiencies. The Role: Lead the building, enhancement and maintenance of test automation frameworks using Playwright. Establish and maintain the QA strategy, encompassing functional, regression, integration, and performance testing. Implement and utilise AI tools to speed up test processes. Lead a global QA team. Work closely with developers to embed quality within the SDLC Create and execute SQL queries for backend validation and data-driven testing Integrate automation into CI/CD pipelines Key Skills & Experience: Strong experience leading a QA function Excellent hands-on skills with Playwright (Experience with Selenium would also be nice to have). Experience leading QA teams. Proven experience building or significantly enhancing automation frameworks Strong SQL skills for data validation and backend testing Experience integrating automated tests within CI/CD environments Comfortable operating directly within development teams Exposure to AI-assisted QA/testing tool Job role: Lead QA Engineer Job type: Permanent Salary: £45,000-£55,000 per annum Location: Ludlow (2 days on site) This is an opportunity for a Lead Automation QA Engineer to join a growing product business during a key phase of expansion, contributing directly to delivery success within a collaborative engineering team. Apply now to speak with VIQU IT in confidence. Or reach out via the VIQU IT website. Do you know someone great? We ll thank you with up to £1,000 if your referral is successful (terms apply). 
Verelogic
Head of Data & Analytics (Reporting & BI)
Head of Data & Analytics (Reporting & BI)
Hybrid - Covering South West with site visits
Permanent

Overview
An exciting opportunity with our client for a hands-on Head of Data & Analytics to lead reporting and business intelligence across a group of manufacturing businesses. This role combines strategic leadership with technical delivery, driving a modern data capability while developing high-quality reporting and insights to support operational and strategic decision-making.

Key Responsibilities
Define and deliver the BI & reporting strategy across the organisation
Design and develop Power BI dashboards, reports, and visualisations
Work with SQL and raw datasets to build scalable data models
Establish data governance, KPI frameworks, and reporting standards
Develop and optimise data pipelines, ETL processes, and reporting performance
Provide actionable insights to improve operational efficiency and performance
Automate reporting processes and improve data accessibility
Partner with senior stakeholders across operations, finance, and leadership

Essential Skills & Experience
Expert-level Power BI (dashboard design, visualisation)
Advanced SQL (query writing, optimisation)
Strong data modelling and analytical dataset design
Experience working with large, complex datasets
Background in manufacturing, engineering, or operational environments
Proven ability to work both strategically and hands-on
Excellent stakeholder engagement and communication skills

Desirable
Experience with ERP systems and data integration
Knowledge of ETL and data warehousing
Advanced DAX
Exposure to Azure (Synapse, Azure SQL, Data Lakes)
Experience with Python or Microsoft BI stack tools
Track record of improving dashboard performance and data quality

What's on Offer
Competitive salary
Opportunity to lead and shape a growing data function
Collaborative, forward-thinking environment
Chance to build a modern analytics capability from the ground up
17/03/2026
Full time
Tilt Recruitment
Data Engineer
Data Engineer
Remote (UK), 1-2 days per month in the South Manchester area
Up to £46,000 + flexibility for standout candidates

If you're a Data Engineer who wants more than just maintaining pipelines, this is a chance to shape something from the ground up. This role sits at the heart of a business investing heavily in its data capability. They're moving away from reactive reporting towards a modern, insight-led platform, and they need someone who wants to be part of that journey, not just observe it.

The Opportunity
You'll join at a pivotal moment. The current data platform is being rebuilt, transitioning from legacy and mixed approaches into a modern, cloud-first architecture built on Azure Synapse and a medallion (bronze-silver-gold) design. This isn't a "keep the lights on" role. You'll be:
Designing and building scalable data pipelines from scratch
Shaping how data is structured, governed, and used across the business
Influencing technical direction and bringing in better ways of working
Helping the organisation move towards predictive and insight-driven decision making
You'll have genuine ownership and the space to challenge existing approaches and introduce best practice.

What You'll Be Doing
Building and optimising data pipelines using Azure Synapse (pipelines and notebooks)
Writing and maintaining robust SQL, including complex stored procedures
Designing and implementing modern data warehouse architecture (medallion model)
Ensuring data quality, validation, and reliability across the platform
Integrating data from APIs and multiple internal sources
Collaborating with stakeholders to turn data into something genuinely useful

What We're Looking For
You don't need to tick every box, but there are a few things that really matter:
Must-have:
Strong SQL skills, including writing advanced stored procedures
Hands-on experience with Azure Synapse (pipelines and notebooks)
Experience building or working within modern data warehouse architectures
Ability to get up and running quickly without heavy onboarding
Nice to have:
Python, PySpark or Spark experience
Exposure to APIs and external data integration
Experience working in evolving or transforming data environments
Most importantly, you'll be someone who:
Wants to improve and evolve things, not just maintain them
Is comfortable challenging existing approaches constructively
Enjoys solving problems and taking ownership

Why This Role?
There are plenty of Data Engineer roles out there. This one stands out because of the impact and trajectory.
Build, not maintain: a genuine opportunity to shape a modern data platform
Influence: your ideas and approach will matter from day one
Growth: a strong focus on learning, with access to training and new technologies
Variety: a mix of engineering, problem-solving, and collaboration
Flexibility: remote-first, with occasional in-person team time
You'll also be joining a team that values curiosity, improvement, and doing things properly rather than just quickly.

Location & Flexibility
Remote-first role across the UK
1-2 days per month in the Manchester area
Open to candidates across the North West, and beyond

A Note on Fit
This role won't suit someone looking for routine, maintenance-only work. It will suit someone who wants to build, improve, challenge, and leave things better than they found them. If you're looking for a role where you can genuinely influence a data platform and grow with it, this is well worth a conversation.

Tilt Recruitment are specialists in IT Recruitment. We work hard to find our candidates their perfect roles within fantastic organisations across the UK. If this role isn't right for you, please still get in touch with us as we may have other roles which suit you better. We also offer up to £500 for every successful referral, so if you know someone who matches this skill set please let us know. Tilt Recruitment is acting as an Employment Agency in relation to this vacancy.
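The medallion (bronze-silver-gold) design mentioned in this role is easy to sketch: raw records land untouched in bronze, silver applies validation, typing and deduplication, and gold holds business-ready aggregates. The example below mimics the flow with Python's stdlib sqlite3 and invented tables; on the actual platform the same SQL would live in Synapse pipelines or notebooks.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Bronze: raw ingested records, kept as-received (duplicates, nulls and all).
conn.executescript("""
    CREATE TABLE bronze_readings (device_id TEXT, reading TEXT, taken_at TEXT);
    INSERT INTO bronze_readings VALUES
        ('dev-1', '21.5', '2026-03-01'),
        ('dev-1', '21.5', '2026-03-01'),   -- duplicate ingest
        ('dev-2', NULL,   '2026-03-01'),   -- failed reading
        ('dev-2', '19.0', '2026-03-02');
""")

# Silver: validated and deduplicated, with types enforced.
conn.executescript("""
    CREATE TABLE silver_readings (device_id TEXT, reading REAL, taken_at TEXT);
    INSERT INTO silver_readings
    SELECT DISTINCT device_id, CAST(reading AS REAL), taken_at
    FROM bronze_readings
    WHERE reading IS NOT NULL;
""")

# Gold: a business-level aggregate ready for reporting.
gold = conn.execute("""
    SELECT device_id, AVG(reading) AS avg_reading
    FROM silver_readings
    GROUP BY device_id
    ORDER BY device_id
""").fetchall()
print(gold)
```

The point of the layering is that each stage is rebuildable from the one beneath it, so a bug in silver logic never requires re-ingesting source data.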
17/03/2026
Full time
Experis
SC Cleared Embedded Software Engineer
Job Title: SC Cleared Embedded Software Engineer
Location: Malvern, UK
Duration: 6 months
Rate: Up to £80 per hour via an approved umbrella company
Must be willing and eligible to go through the SC Clearance process

Are you an experienced Embedded Software Engineer with SC clearance and a passion for innovative technology? Our client, a leading organisation in the defence and aerospace sector, is hiring to support a critical project involving the re-architecture of a legacy electro-optical development board.

What you'll be doing:
Baseline and re-architect a legacy electro-optical development board to enhance performance and capabilities.
Write, implement, and test software at both sub-system and system levels, ensuring seamless integration with application and GUI software and working closely with FPGA colleagues.
Collaborate closely with cross-disciplinary teams to deliver high-quality solutions aligned with project goals.

What you'll bring:
Proven experience with Xilinx PetaLinux, SDK/XSDK/Vitis, and embedded Linux environments.
Strong skills in C, C++, and Python programming.
Familiarity with version control tools such as Git, and build systems like Make.
Knowledge of FPGA development, particularly with Xilinx SoC/MPSoC/RFSoC, and experience with development boards like the ZCU111.
Experience with FPGA design tools such as Xilinx Vivado, VHDL, Verilog, SystemVerilog, and TCL scripting.
Desirable skills include Docker, Bamboo, Confluence, and extended database knowledge with PostgreSQL and PostGIS.

What you'll need:
A background in embedded software development within a defence or aerospace environment.
Experience working on complex hardware-software integration projects.
A proactive approach to problem-solving and collaboration.

This is a fantastic opportunity to contribute to cutting-edge projects within a supportive and innovative environment. If you hold SC clearance and are ready to make an impact, we'd love to hear from you!
Apply now to join a team committed to technological excellence and impactful solutions.
17/03/2026
Contractor
Agilis Recruitment Ltd
Data Engineer
Agilis are currently working exclusively with a key client, a leading technology consultancy, in their search for a Data Engineer. This is a fantastic opportunity to join a fast-growing, forward-thinking company and help them take their data engineering to the next level!

Job Description:
We are seeking a highly skilled and motivated Data Engineer to join a dynamic team. The ideal candidate will have a strong background in SQL, Python, ETL processes, and data integration, ideally in Databricks. You will play a crucial role in continuing an exciting project: designing, developing, and maintaining data infrastructure to ensure the seamless inflow, sanitisation/consolidation, and automated report production of client data.

Key Responsibilities:
Design and Development: Design, develop, and maintain scalable ETL pipelines to process and integrate data from various sources. Implement data validation routines to ensure data quality and integrity. Develop and optimise SQL queries for data extraction, transformation, and loading.
Data Integration: Integrate data from multiple sources, including APIs and relational databases. Collaborate with cross-functional teams to gather and understand data requirements.
Database Management: Design and maintain relational database schemas to support business needs. Ensure efficient storage, retrieval, and management of large datasets.
API Management: Develop and maintain APIs for data access and integration. Utilise tools like Postman for API testing and documentation. Ensure robust and efficient API integration and management.
Databricks Management: Manage permissions and access controls within Databricks to ensure data security and compliance.
Data Analytics and Reporting: Work with data analysts to provide clean and well-structured data for analysis. Develop and maintain documentation for data processes and workflows. Develop and maintain automated report production to ensure seamless delivery of critical data.
Collaboration and Communication: Collaborate with colleagues to gather requirements and translate them into technical specifications. Communicate effectively with team members to ensure alignment on data initiatives.

Qualifications:
Bachelor's degree or equivalent experience in Computer Science, Information Technology, or a related field.
Proven experience as a Data Engineer or in a similar role.
Strong proficiency in SQL or Python, or ideally both.
Experience with ETL processes and tools.
Knowledge of data validation routines and data integration techniques.
Familiarity with relational database design and management.
Experience with API development and testing using tools like Postman.
Experience of Databricks or similar data platforms desirable.
Excellent problem-solving skills and attention to detail.
Strong communication and collaboration skills.

For more information please apply using the link or get in touch with Recruitment.
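As a rough illustration of the "ETL pipelines with data validation routines" responsibility, here is a minimal extract-validate-transform-load pass in plain Python. The record shape and validation rules are invented for the example; a production pipeline would target Databricks or a real warehouse rather than in-memory lists, but the quarantine-bad-rows pattern is the same.

```python
# Raw extracted rows; fields and values are hypothetical.
RAW_ROWS = [
    {"client": "Acme", "revenue": "1200.50"},
    {"client": "",     "revenue": "900.00"},    # fails validation: no client
    {"client": "Beta", "revenue": "oops"},      # fails validation: bad number
    {"client": "Cora", "revenue": "310.25"},
]

def validate(row):
    """Return True only for rows safe to load downstream."""
    if not row["client"]:
        return False
    try:
        float(row["revenue"])
    except ValueError:
        return False
    return True

def transform(row):
    """Normalise types before loading."""
    return {"client": row["client"], "revenue": round(float(row["revenue"]), 2)}

def run_pipeline(rows):
    """Split rows into loadable and rejected, transforming only the former."""
    loaded, rejected = [], []
    for row in rows:
        (loaded if validate(row) else rejected).append(row)
    return [transform(r) for r in loaded], rejected

clean, bad = run_pipeline(RAW_ROWS)
print(len(clean), len(bad))  # 2 clean rows, 2 rejected
```

Keeping rejected rows rather than silently dropping them is what makes the validation auditable.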
17/03/2026
Full time

© 2008-2026 IT Job Board