Digital Research Infrastructure Engineer - Linux Specialist
PML operations grade 4 £30000 - £45000 DOE
Full Time
Open Ended Appointment
The Role
We have an exciting opportunity at PML for an individual with skills in Linux system administration to join the PML’s Digital Innovation and Marine Autonomy (DIMA) group. The role provides a business critical link between scientists, PML Applications (commercial work) and our IT Group to support the Linux computing infrastructure as it continues to evolve, underpinning PML science in multiple areas and across all levels. This ranges from data generation, (storage technologies and data management), processing and analysis (high performance computing and technologies such as JupyterHub), to making visual outputs for end users (web technologies and virtualisation) to increase the reach and impact of PML science.
About You
You will enjoy working with others to help deliver a modern and reliable digital infrastructure to underpin the world leading research carried out at PML. You will understand the importance of stability from existing infrastructure but will also be keen to learn and try new technologies. You will have experience of administering Linux systems, ideally using Ubuntu, and will be able to make use of scripts and common tools such as ansible to manage this. You will understand the importance of taking a proactive approach to identify and resolve and problems and will be able to make use of monitoring software (e.g., Nagios, Grafana) to accomplish this. You will understand best practices in cybersecurity and be able to apply these.
Skills Required
Linux systems administration and monitoring
Linux scripting (e.g., bash and Python)
Experience in management of data at the Terrabyte to Petabyte scale and storage technologies such as NFS and S3.
Cybersecurity (Understand and apply best practices)
Container technologies (Docker and Kubernetes)
High performance Computing (Slurm)
Virtualisation (VMWare)
Key Deliverables
Maintain our storage infrastructure to ensure data is distributed across servers based on existing capacity and projected changes in data volumes. This includes regular data moves and liaising with stakeholders to ensure data is backed up and archiving projects are completes as needed.
Monitor high performance computing infrastructure to identify and resolve problems either on their own or by working with IT (depending on the nature of the problem).
Act of a point of contact between scientists and IT to answer questions, help identify solutions and provide training.
Work with the data architect to maintain and develop web infrastructure used to provide existing and planned data search and visualisation services.
Manage the NEODAAS GPU cluster (MAGEO), including liaising with IT, vendors and system users.
About PML
As a marine-focused charity we develop and apply innovative science with a view to ensuring ocean sustainability. With over 40 years of experience, we offer evidence-based solutions to societal challenges. Our impact spans from research publications to informing policies and training future scientists. The science undertaken at PML contributes to UN Sustainable Development Goals by promoting healthy, productive and resilient oceans and seas.
To support PML’s science it operates in house Linux infrastructure used for processing satellite data, running models and making outputs accessible through web visualisation tools. This infrastructure includes a large amount of storage (6 PB), a High-Performance Computing cluster with over 1500 cores, a 40 GPU cluster (the MAssive GPU cluster for Earth Observation; MAGEO) and a virtual machine cluster. The role will be part of the Digital Innovation and Marine Autonomy (DIMA) group within PML. DIMA is a pioneering digital science group dedicated to advancing PML’s world-class and cutting-edge environmental research through the utilisation of state-of-the-art digital and autonomous technologies. The team comprises research software engineers, research infrastructure engineers, marine technologists and scientists who work on a variety of projects using autonomous vessels, satellite data, drones, Artificial Intelligence, High Performance Computing and data visualisation tools to help deliver PML’s goals. The team have an enthusiasm for solving problems through collaboration and shared learning.
11/04/2024
Full time
Digital Research Infrastructure Engineer - Linux Specialist
PML operations grade 4 £30000 - £45000 DOE
Full Time
Open Ended Appointment
The Role
We have an exciting opportunity at PML for an individual with skills in Linux system administration to join the PML’s Digital Innovation and Marine Autonomy (DIMA) group. The role provides a business critical link between scientists, PML Applications (commercial work) and our IT Group to support the Linux computing infrastructure as it continues to evolve, underpinning PML science in multiple areas and across all levels. This ranges from data generation, (storage technologies and data management), processing and analysis (high performance computing and technologies such as JupyterHub), to making visual outputs for end users (web technologies and virtualisation) to increase the reach and impact of PML science.
About You
You will enjoy working with others to help deliver a modern and reliable digital infrastructure to underpin the world leading research carried out at PML. You will understand the importance of stability from existing infrastructure but will also be keen to learn and try new technologies. You will have experience of administering Linux systems, ideally using Ubuntu, and will be able to make use of scripts and common tools such as ansible to manage this. You will understand the importance of taking a proactive approach to identify and resolve and problems and will be able to make use of monitoring software (e.g., Nagios, Grafana) to accomplish this. You will understand best practices in cybersecurity and be able to apply these.
Skills Required
Linux systems administration and monitoring
Linux scripting (e.g., bash and Python)
Experience in management of data at the Terrabyte to Petabyte scale and storage technologies such as NFS and S3.
Cybersecurity (Understand and apply best practices)
Container technologies (Docker and Kubernetes)
High performance Computing (Slurm)
Virtualisation (VMWare)
Key Deliverables
Maintain our storage infrastructure to ensure data is distributed across servers based on existing capacity and projected changes in data volumes. This includes regular data moves and liaising with stakeholders to ensure data is backed up and archiving projects are completes as needed.
Monitor high performance computing infrastructure to identify and resolve problems either on their own or by working with IT (depending on the nature of the problem).
Act of a point of contact between scientists and IT to answer questions, help identify solutions and provide training.
Work with the data architect to maintain and develop web infrastructure used to provide existing and planned data search and visualisation services.
Manage the NEODAAS GPU cluster (MAGEO), including liaising with IT, vendors and system users.
About PML
As a marine-focused charity we develop and apply innovative science with a view to ensuring ocean sustainability. With over 40 years of experience, we offer evidence-based solutions to societal challenges. Our impact spans from research publications to informing policies and training future scientists. The science undertaken at PML contributes to UN Sustainable Development Goals by promoting healthy, productive and resilient oceans and seas.
To support PML’s science it operates in house Linux infrastructure used for processing satellite data, running models and making outputs accessible through web visualisation tools. This infrastructure includes a large amount of storage (6 PB), a High-Performance Computing cluster with over 1500 cores, a 40 GPU cluster (the MAssive GPU cluster for Earth Observation; MAGEO) and a virtual machine cluster. The role will be part of the Digital Innovation and Marine Autonomy (DIMA) group within PML. DIMA is a pioneering digital science group dedicated to advancing PML’s world-class and cutting-edge environmental research through the utilisation of state-of-the-art digital and autonomous technologies. The team comprises research software engineers, research infrastructure engineers, marine technologists and scientists who work on a variety of projects using autonomous vessels, satellite data, drones, Artificial Intelligence, High Performance Computing and data visualisation tools to help deliver PML’s goals. The team have an enthusiasm for solving problems through collaboration and shared learning.
Location
Dstl Porton Down, Salisbury, Wiltshire, SP4 0JQ or Dstl Portsdown West, Fareham, Hampshire, PO17 6AD
About the job
Job summary
Dstl is the science and technology arm of the Ministry of Defence. We improve the front-line capability of the UK Armed Forces helping keep our country safe.
The Cyber Security and Safety Group has never been more important. Many military platforms such as fast jets, unmanned air vehicles, helicopters, naval vessels, and land vehicles are becoming increasingly reliant on Software, Artificial Intelligence (AI) and Autonomous functions to control all aspects of their behaviour.
We’re looking for mathematically strong data scientists to help make AI reliant military systems robust and trustworthy in complex operations to help save lives.
An example of our world class inspiring work is designing and trialling a variety of autonomous air and ground vehicles out in Salisbury plain with the US and Australia. AI models were retrained in flight to meet changing mission situations to enhance commanders’ decision-making.
You could be involved in:
Assessing and improving AI content in Defence and Security safety critical systems in the Air, Sea and Land domains, to ensure that they are safe, secure and protected.
Applying the latest thinking in verification and validation of artificial intelligence and autonomous functions for defence and security purposes.
Innovating to support the delivery of the UK Cyber Strategy by researching algorithms for Cyber defence.
Dstl recognises the importance of diversity and inclusion as people from diverse backgrounds bring fresh ideas. We are committed to building an inclusive working environment in which each employee fulfils their potential and maximises their contribution.
We particularly welcome female and ethnic minority applicants and those from the LGBTQI community, as they are under-represented within Dstl at these levels.
Job description
In this role you will:
Have a drive for keeping abreast of the latest developments in cyber security and emerging trends in artificial intelligence. We give our people the opportunity to think and innovate. We offer loads of opportunities for training and scholarships, attending and presenting at conferences, and collaborating with internal research and industry and academia.
Work in a team consisting of highly professional Autonomy and Mathematical experts with enviable national and international reputations to take part in cutting edge research. Use your critical thinking and creative problem solving skills to implement state of art methods and tools.
Develop a knowledge of undertaking verification, validation and vulnerability assessments on Systems of interest.
Appreciate the importance of safety, security requirements to have a positive impact on defence and security of the UK.
Deliver technical reports and recommendations to leadership, senior officials across government and military and other non-technical audiences through clear data storytelling and well-crafted verbal presentations
Person specification
We are looking for someone who has:
A keen interest in algorithms, AI, ML or statistical analysis along with a willingness to develop additional capabilities in cyber security and safety.
Experience contributing to Software or AI / ML intensive projects.
Is looking for a career with a difference, doing a job that provides the latest and most effective tools to defend our nation and uphold the principle of freedom.
Important Information:
Our work in defence, security and intelligence requires our employees to be UK Nationals who are able to gain a high level of security clearance to undertake the projects we are involved in to protect us from security threats. For this reason, only UK Nationals will be able to apply for this role. If you are an international or dual-national candidate, and you think you have the skills we need, please consider applying to any of our government, security or defence partners.
This role will require full UK security clearance and you should have resided in the UK for the past 5 years. For some roles Developed Vetting will also be required, in this case you should have resided in the UK for the past 10 years.
Behaviours
We'll assess you against these behaviours during the selection process:
Changing and Improving
Communicating and Influencing
Seeing the Big Picture
Working Together
Benefits
Benefits
Dstl’s full range of great benefits can be found in the information pack which includes:
Financial : An excellent pension scheme starting from 26% employer contribution ( find out more here ). In Year Rewarding Achievement bonuses and thank you vouchers. Rental deposit scheme and cycle to work scheme.
Flexible working : Options include alternative working patterns such as; compressed hours (e.g. working a 4 day week/ 9 day fortnight), job shares and annualised hours (agreed number of hours per annum paid monthly i.e. working term-time only).
Working hours: Flexibility around your working day (e.g. start time, finish time). Ability to bank hours in a 12 month reference period including the ability to accrue and use 3 days per calendar month.
Where you work: Depending on your role, blended working may be available including remote working to suit you and your team. This can be discussed at interview.
Annual leave: 25 days pro rata (rising to 30 after 5 years) plus 8 public holidays with the ability to buy/sell 5 additional days per annum.
Family: Maternity, adoption or shared parental leave of up to 26 weeks with full pay, an additional 13 weeks statutory pay and a further 13 weeks unpaid
Learning and Development: Dstl encourages and supports charterships, accreditations and provides employees access to fully funded apprenticeships up to level 7 (Masters Degree). Dstl will pay for 2 memberships with relevant bodies/institutions. Employees also have access to Civil Service Learning.
Facilities: Onsite parking, EV Charging points, restaurants, cafés and gyms.
Things you need to know
Selection process details
This vacancy is using Success Profiles (opens in a new window) , and will assess your Behaviours and Experience.
We want you to have your best chance of success in our recruitment process, so If at any stage of the application process you would like help or assistance please contact the Dstl Recruitment Team dstlrecruitment@dstl.gov.uk and we will do all we can to support you.
Sifting will be taking place bi-weekly throughout the campaign, successful applicants will be invited to attend an online interview via MS Teams.
Feedback will only be provided if you attend an interview or assessment.
Security
Successful candidates must undergo a criminal record check. Successful candidates must meet the security requirements before they can be appointed. The level of security needed is security check (opens in a new window) . See our vetting charter (opens in a new window) . People working with government assets must complete baseline personnel security standard (opens in new window) checks.
Nationality requirements
Open to UK nationals only. This job is not open to candidates who hold a dual nationality.
Working for the Civil Service
The Civil Service Code (opens in a new window) sets out the standards of behaviour expected of civil servants. We recruit by merit on the basis of fair and open competition, as outlined in the Civil Service Commission's recruitment principles (opens in a new window) . The Civil Service embraces diversity and promotes equal opportunities. As such, we run a Disability Confident Scheme (DCS) for candidates with disabilities who meet the minimum selection criteria. The Civil Service also offers a Redeployment Interview Scheme to civil servants who are at risk of redundancy, and who meet the minimum requirements for the advertised vacancy.
Apply and further information
This vacancy is part of the Great Place to Work for Veterans (opens in a new window) initiative. Once this job has closed, the job advert will no longer be available. You may want to save a copy for your records.
Contact point for applicants
Job contact :
Name : Dstl Recruitment
Email : dstlrecruitment@dstl.gov.uk
Recruitment team
Email : dstlrecruitment@dstl.gov.uk
Further information
Should you wish to raise a formal complaint about the Dstl recruitment process you should email dstlrecruitment@dstl.gov.uk stating the nature of the issue. We will respond within 5 working days.
Attachments
20230626_CSAS_Data_Scientist_Autonomy_Dependability_L5 Opens in new window (docx, 66kB) Candidate_info_pack_CIS - 20220824 Opens in new window (pdf, 1378kB)
03/07/2023
Full time
Location
Dstl Porton Down, Salisbury, Wiltshire, SP4 0JQ or Dstl Portsdown West, Fareham, Hampshire, PO17 6AD
About the job
Job summary
Dstl is the science and technology arm of the Ministry of Defence. We improve the front-line capability of the UK Armed Forces helping keep our country safe.
The Cyber Security and Safety Group has never been more important. Many military platforms such as fast jets, unmanned air vehicles, helicopters, naval vessels, and land vehicles are becoming increasingly reliant on Software, Artificial Intelligence (AI) and Autonomous functions to control all aspects of their behaviour.
We’re looking for mathematically strong data scientists to help make AI reliant military systems robust and trustworthy in complex operations to help save lives.
An example of our world class inspiring work is designing and trialling a variety of autonomous air and ground vehicles out in Salisbury plain with the US and Australia. AI models were retrained in flight to meet changing mission situations to enhance commanders’ decision-making.
You could be involved in:
Assessing and improving AI content in Defence and Security safety critical systems in the Air, Sea and Land domains, to ensure that they are safe, secure and protected.
Applying the latest thinking in verification and validation of artificial intelligence and autonomous functions for defence and security purposes.
Innovating to support the delivery of the UK Cyber Strategy by researching algorithms for Cyber defence.
Dstl recognises the importance of diversity and inclusion as people from diverse backgrounds bring fresh ideas. We are committed to building an inclusive working environment in which each employee fulfils their potential and maximises their contribution.
We particularly welcome female and ethnic minority applicants and those from the LGBTQI community, as they are under-represented within Dstl at these levels.
Job description
In this role you will:
Have a drive for keeping abreast of the latest developments in cyber security and emerging trends in artificial intelligence. We give our people the opportunity to think and innovate. We offer loads of opportunities for training and scholarships, attending and presenting at conferences, and collaborating with internal research and industry and academia.
Work in a team consisting of highly professional Autonomy and Mathematical experts with enviable national and international reputations to take part in cutting edge research. Use your critical thinking and creative problem solving skills to implement state of art methods and tools.
Develop a knowledge of undertaking verification, validation and vulnerability assessments on Systems of interest.
Appreciate the importance of safety, security requirements to have a positive impact on defence and security of the UK.
Deliver technical reports and recommendations to leadership, senior officials across government and military and other non-technical audiences through clear data storytelling and well-crafted verbal presentations
Person specification
We are looking for someone who has:
A keen interest in algorithms, AI, ML or statistical analysis along with a willingness to develop additional capabilities in cyber security and safety.
Experience contributing to Software or AI / ML intensive projects.
Is looking for a career with a difference, doing a job that provides the latest and most effective tools to defend our nation and uphold the principle of freedom.
Important Information:
Our work in defence, security and intelligence requires our employees to be UK Nationals who are able to gain a high level of security clearance to undertake the projects we are involved in to protect us from security threats. For this reason, only UK Nationals will be able to apply for this role. If you are an international or dual-national candidate, and you think you have the skills we need, please consider applying to any of our government, security or defence partners.
This role will require full UK security clearance and you should have resided in the UK for the past 5 years. For some roles Developed Vetting will also be required, in this case you should have resided in the UK for the past 10 years.
Behaviours
We'll assess you against these behaviours during the selection process:
Changing and Improving
Communicating and Influencing
Seeing the Big Picture
Working Together
Benefits
Benefits
Dstl’s full range of great benefits can be found in the information pack which includes:
Financial : An excellent pension scheme starting from 26% employer contribution ( find out more here ). In Year Rewarding Achievement bonuses and thank you vouchers. Rental deposit scheme and cycle to work scheme.
Flexible working : Options include alternative working patterns such as; compressed hours (e.g. working a 4 day week/ 9 day fortnight), job shares and annualised hours (agreed number of hours per annum paid monthly i.e. working term-time only).
Working hours: Flexibility around your working day (e.g. start time, finish time). Ability to bank hours in a 12 month reference period including the ability to accrue and use 3 days per calendar month.
Where you work: Depending on your role, blended working may be available including remote working to suit you and your team. This can be discussed at interview.
Annual leave: 25 days pro rata (rising to 30 after 5 years) plus 8 public holidays with the ability to buy/sell 5 additional days per annum.
Family: Maternity, adoption or shared parental leave of up to 26 weeks with full pay, an additional 13 weeks statutory pay and a further 13 weeks unpaid
Learning and Development: Dstl encourages and supports charterships, accreditations and provides employees access to fully funded apprenticeships up to level 7 (Masters Degree). Dstl will pay for 2 memberships with relevant bodies/institutions. Employees also have access to Civil Service Learning.
Facilities: Onsite parking, EV Charging points, restaurants, cafés and gyms.
Things you need to know
Selection process details
This vacancy is using Success Profiles (opens in a new window) , and will assess your Behaviours and Experience.
We want you to have your best chance of success in our recruitment process, so If at any stage of the application process you would like help or assistance please contact the Dstl Recruitment Team dstlrecruitment@dstl.gov.uk and we will do all we can to support you.
Sifting will be taking place bi-weekly throughout the campaign, successful applicants will be invited to attend an online interview via MS Teams.
Feedback will only be provided if you attend an interview or assessment.
Security
Successful candidates must undergo a criminal record check. Successful candidates must meet the security requirements before they can be appointed. The level of security needed is security check (opens in a new window) . See our vetting charter (opens in a new window) . People working with government assets must complete baseline personnel security standard (opens in new window) checks.
Nationality requirements
Open to UK nationals only. This job is not open to candidates who hold a dual nationality.
Working for the Civil Service
The Civil Service Code (opens in a new window) sets out the standards of behaviour expected of civil servants. We recruit by merit on the basis of fair and open competition, as outlined in the Civil Service Commission's recruitment principles (opens in a new window) . The Civil Service embraces diversity and promotes equal opportunities. As such, we run a Disability Confident Scheme (DCS) for candidates with disabilities who meet the minimum selection criteria. The Civil Service also offers a Redeployment Interview Scheme to civil servants who are at risk of redundancy, and who meet the minimum requirements for the advertised vacancy.
Apply and further information
This vacancy is part of the Great Place to Work for Veterans (opens in a new window) initiative. Once this job has closed, the job advert will no longer be available. You may want to save a copy for your records.
Contact point for applicants
Job contact :
Name : Dstl Recruitment
Email : dstlrecruitment@dstl.gov.uk
Recruitment team
Email : dstlrecruitment@dstl.gov.uk
Further information
Should you wish to raise a formal complaint about the Dstl recruitment process you should email dstlrecruitment@dstl.gov.uk stating the nature of the issue. We will respond within 5 working days.
Attachments
20230626_CSAS_Data_Scientist_Autonomy_Dependability_L5 Opens in new window (docx, 66kB) Candidate_info_pack_CIS - 20220824 Opens in new window (pdf, 1378kB)
About us:
Gower College Swansea is one of the largest colleges in Wales with a strong reputation for high quality teaching and learning. We have six campuses across the city with over 4,500 full time learners and 10,000 part time learners. We currently have a turnover of over £50 million making us a major employer in the region with approximately 1,000 staff.
At Gower College Swansea we are passionate about investing in our staff and looking after their wellbeing to ensure they feel supported in work and also at home.
The role:
You will be responsible for the delivery of high-quality Data Analytics apprenticeship training and assessment on our work-based learning contract.
37 hours per week
Permanent
£31,498 - £36,642 per annum
Jubilee Court, Swansea, SA5
Key Responsibilities:
Plan and deliver training and assessments across a range of courses ensuring schemes/records of work, assignment schedules, are appropriate to the syllabus content and awarding body standards
Work with customers in industry to maintain and build relationships and to secure contracts with existing and new organisations
Meet regularly with learners as determined by the programme delivery to establish and maintain monitoring and review arrangements for students undertaking training.
About you:
Level 4 qualification or equivalent in Data Analytics
Commercial knowledge, experience and understanding of industry, including training needs
Creative, innovative and enthusiastic
Benefits for you:
28 days annual leave, plus bank holidays, and the college is closed for two weeks over the Christmas period
Free Parking
A Local Government Pension Scheme with an average employer contribution of 21% (2023)
Free annual subscription to the Headspace Mindfulness app
Discounted study opportunities on College programmes
Hybrid Working
View more benefits here: https://www.gcs.ac.uk/recruitment/benefits-and-wellbeing
We welcome applications from individuals regardless of age, gender, ethnicity, disability, sexual orientation, gender identity, socio-economic background, religion and/or belief. We particularly welcome applications from groups currently underrepresented within our organisation.
If you wish to continue your application journey using Welsh Language, please visit our Cymraeg site. We encourage Welsh Language applications as we recognise the importance of delivering services in Welsh, and the need to grow our bilingual workforce.
Gower College Swansea is committed to safeguarding and promoting the welfare of young people and expects all staff to share this commitment. Appointments are subject to an enhanced DBS check and require registration with the Education Workforce Council for Wales.
Appointments will normally be made to the bottom of the salary scale with annual increments on 1st August each year (subject to a start date before 1st February).
15/05/2023
Full time
About us:
Gower College Swansea is one of the largest colleges in Wales with a strong reputation for high quality teaching and learning. We have six campuses across the city with over 4,500 full time learners and 10,000 part time learners. We currently have a turnover of over £50 million making us a major employer in the region with approximately 1,000 staff.
At Gower College Swansea we are passionate about investing in our staff and looking after their wellbeing to ensure they feel supported in work and also at home.
The role:
You will be responsible for the delivery of high-quality Data Analytics apprenticeship training and assessment on our work-based learning contract.
37 hours per week
Permanent
£31,498 - £36,642 per annum
Jubilee Court, Swansea, SA5
Key Responsibilities:
Plan and deliver training and assessments across a range of courses ensuring schemes/records of work, assignment schedules, are appropriate to the syllabus content and awarding body standards
Work with customers in industry to maintain and build relationships and to secure contracts with existing and new organisations
Meet regularly with learners as determined by the programme delivery to establish and maintain monitoring and review arrangements for students undertaking training.
About you:
Level 4 qualification or equivalent in Data Analytics
Commercial knowledge, experience and understanding of industry, including training needs
Creative, innovative and enthusiastic
Benefits for you:
28 days annual leave, plus bank holidays, and the college is closed for two weeks over the Christmas period
Free Parking
A Local Government Pension Scheme with an average employer contribution of 21% (2023)
Free annual subscription to the Headspace Mindfulness app
Discounted study opportunities on College programmes
Hybrid Working
View more benefits here: https://www.gcs.ac.uk/recruitment/benefits-and-wellbeing
We welcome applications from individuals regardless of age, gender, ethnicity, disability, sexual orientation, gender identity, socio-economic background, religion and/or belief. We particularly welcome applications from groups currently underrepresented within our organisation.
If you wish to continue your application journey using Welsh Language, please visit our Cymraeg site. We encourage Welsh Language applications as we recognise the importance of delivering services in Welsh, and the need to grow our bilingual workforce.
Gower College Swansea is committed to safeguarding and promoting the welfare of young people and expects all staff to share this commitment. Appointments are subject to an enhanced DBS check and require registration with the Education Workforce Council for Wales.
Appointments will normally be made to the bottom of the salary scale with annual increments on 1st August each year (subject to a start date before 1st February).
Royal Society of Biology
Remote mostly, occasional visits to London office
1 day/week, for indefinite duration. £400-£500/day incl., depending on demonstrated expertise. Reports to the Director of Technology.
Key tasks in this role:
(1) Developing and maintaining in-depth knowledge of the ins and outs of our core business systems and services as part of our business continuity plan – being there if the Director of Technology is unavailable.
(2) Providing some general application and user support for our day-to-day operations, for our extensive in-house developed suite of cloud bases applications and platforms. This involves helping to support users and respond to some support queries.
The main focus of this role is really on the business continuity aspect . As such, it may at first sight be a bit strange that we specifically also want this role to involve some user support tasks. We understand our prime target candidates probably don’t see themselves primarily as a user support person. However, we feel that, realistically, it would be impossible to step in at short notice without being properly familiar with the organisation, its people and its day-to-day concerns and workings.
Just for clarity, in this role the candidate will not be writing code for us or designing our solutions and they will therefore be expected to work within the operational and strategic vision as set out by the Director and with the systems and services we have. We want to be clear about these constraints to avoid any future disappointment or disagreement.
In this role, the candidate will most of the time be able to work from home/remotely, but will be required to attend our London office as required to carry out in-person or on-site duties and meetings occasionally.
Our systems and services make heavy (and increasing) use of the AWS cloud, including services such as EC2, SES, IAM, Lambda, API Gateway, CloudFront.
As stated above, a key part of the role is to become intimately familiar with the ins and outs of the core business systems and services as part of our business continuity planning.
In terms of the skills, abilities and experience that we see as important for this role:
Substantial and proven experience in managing AWS resources, including EC2, S3, IAM, SES, Lambda, DynamoDB,…
Detailed understanding of the Windows environment, including desktop and server OS, Active Directory with Group Policy and Windows Server management.
In-depth understanding of core network technologies such as DHCP, DNS, RADIUS, …
Decent experience in managing Office365 email services.
A good grasp of good security and data protection practice.
Exposure to languages such as PHP, Javascript and node.js and decent expertise with MySQL databases is beneficial.
Some real-world experience with programming in the area of web/cloud applications would be an added benefit, but not an absolute requirement.
With regard to the sort of personality we are am looking for:
Someone friendly and good at engaging with people at all levels and in all functions.
Someone who communicates effectively and in a constructive manner.
Someone who operates at all times with the highest degree of integrity and honesty.
Someone who is organised and methodical.
The Royal Society of Biology is a single unified voice for biology: advising Government and influencing policy; advancing education and professional development; supporting our members, and engaging and encouraging public interest in the life sciences. The Society represents a diverse membership of individuals, learned societies and other organisations.
Individual members include practising scientists, students at all levels, professionals in academia, industry and education, and non-professionals with an interest in biology.
Our vision is of a world that understands the true value of biology and how it can contribute to improving life for all.
Our mission is to be the unifying voice for biology, to facilitate the promotion of new discoveries in biological science for national and international benefit, and to engage the wider public with our work.
05/10/2022
Contractor
1 day/week, for indefinite duration. £400-£500/day incl., depending on demonstrated expertise. Reports to the Director of Technology.
Key tasks in this role:
(1) Developing and maintaining in-depth knowledge of the ins and outs of our core business systems and services as part of our business continuity plan – being there if the Director of Technology is unavailable.
(2) Providing some general application and user support for our day-to-day operations, for our extensive in-house developed suite of cloud bases applications and platforms. This involves helping to support users and respond to some support queries.
The main focus of this role is really on the business continuity aspect . As such, it may at first sight be a bit strange that we specifically also want this role to involve some user support tasks. We understand our prime target candidates probably don’t see themselves primarily as a user support person. However, we feel that, realistically, it would be impossible to step in at short notice without being properly familiar with the organisation, its people and its day-to-day concerns and workings.
Just for clarity, in this role the candidate will not be writing code for us or designing our solutions and they will therefore be expected to work within the operational and strategic vision as set out by the Director and with the systems and services we have. We want to be clear about these constraints to avoid any future disappointment or disagreement.
In this role, the candidate will most of the time be able to work from home/remotely, but will be required to attend our London office as required to carry out in-person or on-site duties and meetings occasionally.
Our systems and services make heavy (and increasing) use of the AWS cloud, including services such as EC2, SES, IAM, Lambda, API Gateway, CloudFront.
As stated above, a key part of the role is to become intimately familiar with the ins and outs of the core business systems and services as part of our business continuity planning.
In terms of the skills, abilities and experience that we see as important for this role:
Substantial and proven experience in managing AWS resources, including EC2, S3, IAM, SES, Lambda, DynamoDB,…
Detailed understanding of the Windows environment, including desktop and server OS, Active Directory with Group Policy and Windows Server management.
In-depth understanding of core network technologies such as DHCP, DNS, RADIUS, …
Decent experience in managing Office365 email services.
A good grasp of good security and data protection practice.
Exposure to languages such as PHP, Javascript and node.js and decent expertise with MySQL databases is beneficial.
Some real-world experience with programming in the area of web/cloud applications would be an added benefit, but not an absolute requirement.
With regard to the sort of personality we are am looking for:
Someone friendly and good at engaging with people at all levels and in all functions.
Someone who communicates effectively and in a constructive manner.
Someone who operates at all times with the highest degree of integrity and honesty.
Someone who is organised and methodical.
The Royal Society of Biology is a single unified voice for biology: advising Government and influencing policy; advancing education and professional development; supporting our members, and engaging and encouraging public interest in the life sciences. The Society represents a diverse membership of individuals, learned societies and other organisations.
Individual members include practising scientists, students at all levels, professionals in academia, industry and education, and non-professionals with an interest in biology.
Our vision is of a world that understands the true value of biology and how it can contribute to improving life for all.
Our mission is to be the unifying voice for biology, to facilitate the promotion of new discoveries in biological science for national and international benefit, and to engage the wider public with our work.
Description
We are looking for a Data Engineer to help us build and maintain scalable and resilient pipelines that will ingest, process, and deliver the data needed for predictive and descriptive analytics. These data pipelines will further connect to machine learning pipelines to facilitate automatic retraining of our models.
We are a diverse group of data scientists, data engineers, software engineers, machine learning engineers from over 30 different countries. We are smart and fast moving, operating in small teams, with freedom for independent work and fast decision making.
To empower scientists and radically improve how science is published, evaluated and disseminated to researchers, innovators and the public, we have built our own state-of-the-art Artificial Intelligence Review Assistant (AIRA), backed by cutting-edge machine learning algorithms.
Key Responsibilities
Work in a team of machine learning engineers responsible for the productization of prototypes developed by data scientists.
Collaborate with data scientists, machine learning engineers, and other data engineers to design scalable, reliable, and maintainable ETL processes that ensure data scientists and automated ML processes have the necessary data available
Research and adopt the best DataOps & MLOps standards to design and develop scalable end-to-end data pipelines.
Identify opportunities for data process automation.
Establish and enforce best practices (e.g. in development, quality assurance, optimization, release, and monitoring).
Requirements
Degree in Computer Science or similar
Proven experience as a Data Engineer
Proficiency in Python
Experience with a Cloud Platform (e.g. Azure, AWS, GCP)
Experience with a workflow engine (e.g. Data Factory, Airflow)
Experience with SQL and NoSQL (e.g. MongoDB) databases
Experience with Hadoop & Spark
Great communication, teamwork, problem-solving, and organizational skills.
Nice To Have
Understanding of supervised and unsupervised machine learning algorithms
Stream-processing frameworks (e.g. Kafka)
Benefits
Competitive salary.
Participation in Frontiers annual bonus scheme
25 leave days + 4 well-being days (pro rata and expiring each year on 31st of December)
Great work-life balance.
Opportunity to work remotely
Fresh fruit, snacks and coffee.
English classes.
Team building/sport activities and monthly social events.
Lots of opportunities to work with exciting technologies and solve challenging problems
Who we are
Frontiers is an award-winning open science platform and leading open access scholarly publisher. We are one of the largest and most cited publishers globally. Our journals span science, health, humanities and social sciences, engineering, and sustainability and we continue to expand into new academic disciplines so more researchers can publish open access.
23/12/2021
Full time
Description
We are looking for a Data Engineer to help us build and maintain scalable and resilient pipelines that will ingest, process, and deliver the data needed for predictive and descriptive analytics. These data pipelines will further connect to machine learning pipelines to facilitate automatic retraining of our models.
We are a diverse group of data scientists, data engineers, software engineers, machine learning engineers from over 30 different countries. We are smart and fast moving, operating in small teams, with freedom for independent work and fast decision making.
To empower scientists and radically improve how science is published, evaluated and disseminated to researchers, innovators and the public, we have built our own state-of-the-art Artificial Intelligence Review Assistant (AIRA), backed by cutting-edge machine learning algorithms.
Key Responsibilities
Work in a team of machine learning engineers responsible for the productization of prototypes developed by data scientists.
Collaborate with data scientists, machine learning engineers, and other data engineers to design scalable, reliable, and maintainable ETL processes that ensure data scientists and automated ML processes have the necessary data available
Research and adopt the best DataOps & MLOps standards to design and develop scalable end-to-end data pipelines.
Identify opportunities for data process automation.
Establish and enforce best practices (e.g. in development, quality assurance, optimization, release, and monitoring).
Requirements
Degree in Computer Science or similar
Proven experience as a Data Engineer
Proficiency in Python
Experience with a Cloud Platform (e.g. Azure, AWS, GCP)
Experience with a workflow engine (e.g. Data Factory, Airflow)
Experience with SQL and NoSQL (e.g. MongoDB) databases
Experience with Hadoop & Spark
Great communication, teamwork, problem-solving, and organizational skills.
Nice To Have
Understanding of supervised and unsupervised machine learning algorithms
Stream-processing frameworks (e.g. Kafka)
Benefits
Competitive salary.
Participation in Frontiers annual bonus scheme
25 leave days + 4 well-being days (pro rata and expiring each year on 31st of December)
Great work-life balance.
Opportunity to work remotely
Fresh fruit, snacks and coffee.
English classes.
Team building/sport activities and monthly social events.
Lots of opportunities to work with exciting technologies and solve challenging problems
Who we are
Frontiers is an award-winning open science platform and leading open access scholarly publisher. We are one of the largest and most cited publishers globally. Our journals span science, health, humanities and social sciences, engineering, and sustainability and we continue to expand into new academic disciplines so more researchers can publish open access.
We are looking for a junior or medior data science engineer to complement our data science team working with many of our industrial customers from various sectors, including (petro-) chemical, paper and pulp, automotive, metallurgy, telecom and food-and beverage. You will use various ML / AI / data science libraries and work on a variety of applications. You will get to use various state of the art technologies including Elastic, Kafka, Kubernetes and Luigi. Finally, you will have the opportunity to look behind the scenes of many domains and data. What are your responsibilities? You will typically work together with a more senior member of the team on projects and your day job typically consists of: Help build and improve the algorithms in a scalable manner for AI-based anomaly detection and predictive modelling. Apply and sometimes (co-)invent and implement AI/ML algorithms for processing various types of data (timeseries, tabular, etc.). Develop computer models and perform predictive and prescriptive analytics for various applications. Build Proof of Concepts in notebooks, integrate these algorithms into the operational flow of the customer, train the users, and provide support. Interface with various data sources over various connector pipelines (SQL, Elasticsearch, Kafka, REST APIs, etc.). Tune algorithms and data pipelines for optimal performance. Train, tune, and deploy anomaly detection and predictive models on industrial or IoT data. Knack/experience in consultancy services. Qualifications: Previous hands-on experience in Data Science, delivering machine learning models to production. Bachelor's or Master's degree in Data Science, Statistics, Computer Science, Mathematics, or Engineering - or equivalent. Proficiency in Python and relevant data science libraries (NumPy, pandas, scikit-learn, etc.). Experience with SQL, Power BI, Git & GitHub. Strong knowledge of Machine Learning Algorithms and respective theory. Ability to work within a team, collaborating effectively with colleagues. Strong stakeholder management skills and the ability to influence. A drive to learn new technologies and techniques. Experience/aptitude towards research and openness to learn new technologies. Experience with Azure, Spark (PySpark), and Kubeflow - desirable. We pay competitive salaries based on experience of the candidates. Along with this, you will be entitled to an award-winning range of benefits including: Access to our company car scheme or car allowance. Free confidential 24/7 GP service. Hundreds of discounts (including retail, childcare + gym). Affordable loans & enhanced pension scheme.
14/05/2025
Full time
We are looking for a junior or medior data science engineer to complement our data science team working with many of our industrial customers from various sectors, including (petro-) chemical, paper and pulp, automotive, metallurgy, telecom and food-and beverage. You will use various ML / AI / data science libraries and work on a variety of applications. You will get to use various state of the art technologies including Elastic, Kafka, Kubernetes and Luigi. Finally, you will have the opportunity to look behind the scenes of many domains and data. What are your responsibilities? You will typically work together with a more senior member of the team on projects and your day job typically consists of: Help build and improve the algorithms in a scalable manner for AI-based anomaly detection and predictive modelling. Apply and sometimes (co-)invent and implement AI/ML algorithms for processing various types of data (timeseries, tabular, etc.). Develop computer models and perform predictive and prescriptive analytics for various applications. Build Proof of Concepts in notebooks, integrate these algorithms into the operational flow of the customer, train the users, and provide support. Interface with various data sources over various connector pipelines (SQL, Elasticsearch, Kafka, REST APIs, etc.). Tune algorithms and data pipelines for optimal performance. Train, tune, and deploy anomaly detection and predictive models on industrial or IoT data. Knack/experience in consultancy services. Qualifications: Previous hands-on experience in Data Science, delivering machine learning models to production. Bachelor's or Master's degree in Data Science, Statistics, Computer Science, Mathematics, or Engineering - or equivalent. Proficiency in Python and relevant data science libraries (NumPy, pandas, scikit-learn, etc.). Experience with SQL, Power BI, Git & GitHub. Strong knowledge of Machine Learning Algorithms and respective theory. Ability to work within a team, collaborating effectively with colleagues. Strong stakeholder management skills and the ability to influence. A drive to learn new technologies and techniques. Experience/aptitude towards research and openness to learn new technologies. Experience with Azure, Spark (PySpark), and Kubeflow - desirable. We pay competitive salaries based on experience of the candidates. Along with this, you will be entitled to an award-winning range of benefits including: Access to our company car scheme or car allowance. Free confidential 24/7 GP service. Hundreds of discounts (including retail, childcare + gym). Affordable loans & enhanced pension scheme.
Job Description The Applied Innovation of AI (AI2) team is an elite machine learning group strategically located within the CTO office of JP Morgan Chase. AI2 tackle business critical priorities using innovative machine learning techniques and technologies with a focus on AI for Data, Software, Cybersecurity & Controls and Technology Infrastructure. The team partners closely with all lines of business and engineering teams across the firm to execute long-term projects in these areas that require significant machine learning development to support JPMC businesses as they grow. We are looking for excellent data engineers to help us with the design, development, deployment, delivery, and maintenance of AI products to our clients. In this role, you will be working with other engineers and data scientists in building and maintaining software and infrastructure that supports our team in developing and delivering disruptive AI products that serve our customers in production. Responsibilities Collaborate with data scientists and research/machine learning engineers to deliver products to production. Build and maintain scalable infrastructure as code in the cloud (private & public). Manage infrastructure for model training/serving and governance. Manage data infrastructure supporting the inference pipelines. Contribute significantly to architecture and software management discussions & tasks Rapid prototyping & shorten development cycles for our software and AI/ML products: Build infrastructure for our AI/ML data pipelines & workstreams from data analysis, experimentation, model training, model evaluation, deployment, operationalization, and tuning to visualization. Improve and maintain our automated CI/CD pipeline while collaborating with our stakeholders, various testing partners and model contributors. Increase our deployment velocity, including the process for deploying models and data pipelines into production. Requirements Minimum Bachelor of Science degree in Computer Science, Software Engineering, Electrical Engineering, Computer Engineering or related field. Experience in containerization - Docker/Kubernetes. Experience in AWS cloud and services (S3, Lambda, Aurora, ECS, EKS, SageMaker, Bedrock, Athena, Secrets Manager, Certificate Manager etc.) Proven DevOps/MLOps experience provisioning and maintaining infrastructure leveraging some of the following: Terraform, Ansible, AWS CDK, CloudFormation. Experience with CI/CD pipelines ex. Jenkins/Spinnaker. Experience with monitoring tools such as Prometheus, Grafana, Splunk and Datadog. Proven programming/scripting skills with some of the modern programming languages like Python. Solid software design, problem solving and debugging skills. Strong interpersonal skills; able to work independently as well as in a team. Desirable You have a strong commitment to development best practices and code reviews. You believe in continuous learning, sharing best practices, encouraging and elevating less experienced colleagues as they learn. Experience with data labelling, validation, provenance and versioning. About Us J.P. Morgan is a global leader in financial services, providing strategic advice and products to the world's most prominent corporations, governments, wealthy individuals and institutional investors. Our first-class business in a first-class way approach to serving clients drives everything we do. We strive to build trusted, long-term partnerships to help our clients achieve their business objectives. We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants' and employees' religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation. About The Team Our professionals in our Corporate Functions cover a diverse range of areas from finance and risk to human resources and marketing. Our corporate teams are an essential part of our company, ensuring that we're setting our businesses, clients, customers and employees up for success.
14/05/2025
Full time
Job Description The Applied Innovation of AI (AI2) team is an elite machine learning group strategically located within the CTO office of JP Morgan Chase. AI2 tackle business critical priorities using innovative machine learning techniques and technologies with a focus on AI for Data, Software, Cybersecurity & Controls and Technology Infrastructure. The team partners closely with all lines of business and engineering teams across the firm to execute long-term projects in these areas that require significant machine learning development to support JPMC businesses as they grow. We are looking for excellent data engineers to help us with the design, development, deployment, delivery, and maintenance of AI products to our clients. In this role, you will be working with other engineers and data scientists in building and maintaining software and infrastructure that supports our team in developing and delivering disruptive AI products that serve our customers in production. Responsibilities Collaborate with data scientists and research/machine learning engineers to deliver products to production. Build and maintain scalable infrastructure as code in the cloud (private & public). Manage infrastructure for model training/serving and governance. Manage data infrastructure supporting the inference pipelines. Contribute significantly to architecture and software management discussions & tasks Rapid prototyping & shorten development cycles for our software and AI/ML products: Build infrastructure for our AI/ML data pipelines & workstreams from data analysis, experimentation, model training, model evaluation, deployment, operationalization, and tuning to visualization. Improve and maintain our automated CI/CD pipeline while collaborating with our stakeholders, various testing partners and model contributors. Increase our deployment velocity, including the process for deploying models and data pipelines into production. Requirements Minimum Bachelor of Science degree in Computer Science, Software Engineering, Electrical Engineering, Computer Engineering or related field. Experience in containerization - Docker/Kubernetes. Experience in AWS cloud and services (S3, Lambda, Aurora, ECS, EKS, SageMaker, Bedrock, Athena, Secrets Manager, Certificate Manager etc.) Proven DevOps/MLOps experience provisioning and maintaining infrastructure leveraging some of the following: Terraform, Ansible, AWS CDK, CloudFormation. Experience with CI/CD pipelines ex. Jenkins/Spinnaker. Experience with monitoring tools such as Prometheus, Grafana, Splunk and Datadog. Proven programming/scripting skills with some of the modern programming languages like Python. Solid software design, problem solving and debugging skills. Strong interpersonal skills; able to work independently as well as in a team. Desirable You have a strong commitment to development best practices and code reviews. You believe in continuous learning, sharing best practices, encouraging and elevating less experienced colleagues as they learn. Experience with data labelling, validation, provenance and versioning. About Us J.P. Morgan is a global leader in financial services, providing strategic advice and products to the world's most prominent corporations, governments, wealthy individuals and institutional investors. Our first-class business in a first-class way approach to serving clients drives everything we do. We strive to build trusted, long-term partnerships to help our clients achieve their business objectives. We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants' and employees' religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation. About The Team Our professionals in our Corporate Functions cover a diverse range of areas from finance and risk to human resources and marketing. Our corporate teams are an essential part of our company, ensuring that we're setting our businesses, clients, customers and employees up for success.
Company TEC Partners are representing a nationally recognised IT company with an emphasis on educational tech. They develop tech with a focus on educational enrichment and development for schools and those with a keen interest in software development. About this Data Scientist role As a Data Scientist, you will craft pipelines used for managing business critical data and work cross-functionally to ensure the data sets managed will produce key insight and relevant data to drive business decisions and solutions. Why work as a Data Scientist with our client? Basic salary up to 70,000 Hybrid working pattern 25 days holiday + bank holidays Private healthcare Career progression What is expected of you as a Data Scientist with our client? Ideally 5+ years' experience working as a Data Scientist within a service driven industry A strong academic background with a degree in Computer Science or a related field Proficiency with a data focused coding language such as Python, R or SQL Experience using ML frameworks including TensorFlow and PyTorch The ability to work individually and within a team Strong written and verbal communication skills and the ability to explain complex computer science theories to people with varying levels of data literacy within the organisation Responsibilities of a Data Scientist with our client To work collaboratively within the wider Data Science team Advise of any delays or challenges relating to your projects To work with a problem solving attitude and an inquisitorial nature to understand the importance of your work Provide support and mentorship to junior members of the Data Science team If you are interested in this Data Scientist vacancy and you would like to hear more about it or other AI, ML, Data or Python Software Developer vacancies, please contact Stuart at TEC Partners today.
14/05/2025
Full time
Company TEC Partners are representing a nationally recognised IT company with an emphasis on educational tech. They develop tech with a focus on educational enrichment and development for schools and those with a keen interest in software development. About this Data Scientist role As a Data Scientist, you will craft pipelines used for managing business critical data and work cross-functionally to ensure the data sets managed will produce key insight and relevant data to drive business decisions and solutions. Why work as a Data Scientist with our client? Basic salary up to 70,000 Hybrid working pattern 25 days holiday + bank holidays Private healthcare Career progression What is expected of you as a Data Scientist with our client? Ideally 5+ years' experience working as a Data Scientist within a service driven industry A strong academic background with a degree in Computer Science or a related field Proficiency with a data focused coding language such as Python, R or SQL Experience using ML frameworks including TensorFlow and PyTorch The ability to work individually and within a team Strong written and verbal communication skills and the ability to explain complex computer science theories to people with varying levels of data literacy within the organisation Responsibilities of a Data Scientist with our client To work collaboratively within the wider Data Science team Advise of any delays or challenges relating to your projects To work with a problem solving attitude and an inquisitorial nature to understand the importance of your work Provide support and mentorship to junior members of the Data Science team If you are interested in this Data Scientist vacancy and you would like to hear more about it or other AI, ML, Data or Python Software Developer vacancies, please contact Stuart at TEC Partners today.
Our Company: QiH is a fast-growing, innovative, and progressive scale-up business headquartered in London with a collective of brilliant brains in Skopje. We are at the start of an exciting journey as we build out our internal engineering capability, spearheading our tech transformation, building best in class products and tackling exciting and complex challenges along the way! Data is at the core of what we do at QiH, but our people are at the heart of our success! At QiH, we have created an energetic and target-driven culture and continuously invest in each individual. The Role Our Technology team is growing! We're looking for a highly skilled and experienced Senior Data Engineer to design and build a robust, scalable, and high-performance real-time data backbone. In this role you will lead the design, development, and implementation of a real-time data backbone. This position focuses on building and maintaining scalable, high-performance data pipelines that support real-time data processing and analytics. You will work closely with various teams to ensure the infrastructure meets the business's growing data needs while leveraging modern technologies for efficient data ingestion, storage, and analysis. Key Responsibilities: Design, develop, and maintain scalable real-time data pipelines and infrastructure. Integrate various data sources and ensure seamless real-time data flow across the organisation. Build efficient, fault-tolerant, and highly available data ingestion processes. Monitor and improve data pipeline performance and scalability for low-latency and high-throughput. Work closely with cross-functional teams, including software engineers, data scientists, and business stakeholders, to ensure data infrastructure aligns with business objectives. About You: Hands-on experience in data engineering with a proven track record of building and managing real-time data pipelines across a track record of multiple initiatives. Expertise in developing data backbones using distributed streaming platforms (Kafka, Spark Streaming, Flink, etc.). Experience working with cloud platforms such as AWS, GCP, or Azure for real-time data ingestion and storage. Programming skills in Python, Java, Scala, or a similar language. Proficiency in database technologies (SQL, NoSQL, time-series databases) and data modelling. Strong understanding of data pipeline orchestration tools (e.g., Apache Airflow, Kubernetes). You thrive when working as part of a team. Comfortable in a fast-paced environment. Have excellent written and verbal English skills. Last but not least, you'll have no ego! What You'll Get: Competitive Basic Salary Quarterly Bonuses Hybrid working Private Health Care (Bupa) Market Leading Training Programme Recognition & Reward Scheme Annual Company Conference (previous destinations Bologna, Dubrovnik, Belgrade and Thessaloniki) Regular Happy Hour / Team Lunches Free Coffee, Drinks & Snacks What's the next step? Our hiring process ensures we're recruiting the right people for the role. We ensure that people are as suitable for us as we are for them. If you like the sound of what we're all about at QiH and want to join a team where you can make an impact, please apply or contact us at .
14/05/2025
Full time
Our Company: QiH is a fast-growing, innovative, and progressive scale-up business headquartered in London with a collective of brilliant brains in Skopje. We are at the start of an exciting journey as we build out our internal engineering capability, spearheading our tech transformation, building best in class products and tackling exciting and complex challenges along the way! Data is at the core of what we do at QiH, but our people are at the heart of our success! At QiH, we have created an energetic and target-driven culture and continuously invest in each individual. The Role Our Technology team is growing! We're looking for a highly skilled and experienced Senior Data Engineer to design and build a robust, scalable, and high-performance real-time data backbone. In this role you will lead the design, development, and implementation of a real-time data backbone. This position focuses on building and maintaining scalable, high-performance data pipelines that support real-time data processing and analytics. You will work closely with various teams to ensure the infrastructure meets the business's growing data needs while leveraging modern technologies for efficient data ingestion, storage, and analysis. Key Responsibilities: Design, develop, and maintain scalable real-time data pipelines and infrastructure. Integrate various data sources and ensure seamless real-time data flow across the organisation. Build efficient, fault-tolerant, and highly available data ingestion processes. Monitor and improve data pipeline performance and scalability for low-latency and high-throughput. Work closely with cross-functional teams, including software engineers, data scientists, and business stakeholders, to ensure data infrastructure aligns with business objectives. About You: Hands-on experience in data engineering with a proven track record of building and managing real-time data pipelines across a track record of multiple initiatives. Expertise in developing data backbones using distributed streaming platforms (Kafka, Spark Streaming, Flink, etc.). Experience working with cloud platforms such as AWS, GCP, or Azure for real-time data ingestion and storage. Programming skills in Python, Java, Scala, or a similar language. Proficiency in database technologies (SQL, NoSQL, time-series databases) and data modelling. Strong understanding of data pipeline orchestration tools (e.g., Apache Airflow, Kubernetes). You thrive when working as part of a team. Comfortable in a fast-paced environment. Have excellent written and verbal English skills. Last but not least, you'll have no ego! What You'll Get: Competitive Basic Salary Quarterly Bonuses Hybrid working Private Health Care (Bupa) Market Leading Training Programme Recognition & Reward Scheme Annual Company Conference (previous destinations Bologna, Dubrovnik, Belgrade and Thessaloniki) Regular Happy Hour / Team Lunches Free Coffee, Drinks & Snacks What's the next step? Our hiring process ensures we're recruiting the right people for the role. We ensure that people are as suitable for us as we are for them. If you like the sound of what we're all about at QiH and want to join a team where you can make an impact, please apply or contact us at .
The Platform mission creates the technology that enables Spotify to learn quickly and scale easily, enabling rapid growth in our users and our business around the globe. Spanning many disciplines, we work to make the business work; creating the frameworks, capabilities and tools needed to welcome a billion customers. Join us and help to amplify productivity, quality and innovation across Spotify. Location London Job type Permanent What You'll Do Work in a fast-paced environment to build an analytic tool for our developers. Collaborate with other engineers, data scientists, designers, user researchers and product managers to craft the best possible user experience on our internal tools and infrastructure. Ensure that our APIs will have smooth integration with the current web clients - the desktop and TV app to name a few. Identify pain points in adoption and integration experience of our team's internal tools and SDKs. Ensure the developer experience is seamless for all developers. Take part in on-call rotation to help support the product area. Who You Are You have 5+ years of experience as a software engineer, comfortable around backend, frontend and infrastructure. You are passionate about building a good developer experience by creating robust APIs on tracking user behaviour data. You have a quality-mindset and are interested in all parts of software development: coding, testing, deployment and monitoring. You care about the user experience and know how to build user-friendly web applications. Experience with any of the following areas: GCP, Kubernetes, GRPC, protobuf, large scale data pipelines, would be a bonus! Where You'll Be This role is based in London (UK). We offer you the flexibility to work where you work best! There will be some in person meetings, but still allows for flexibility to work from home. Extensive learning opportunities, through our dedicated team, GreenHouse. Flexible share incentives letting you choose how you share in our success. Global parental leave, six months off - fully paid - for all new parents. All The Feels, our employee assistance program and self-care hub. Flexible public holidays, swap days off according to your values and beliefs. Learn about life at Spotify You are welcome at Spotify for who you are, no matter where you come from, what you look like, or what's playing in your headphones. Our platform is for everyone, and so is our workplace. The more voices we have represented and amplified in our business, the more we will all thrive, contribute, and be forward-thinking! So bring us your personal experience, your perspectives, and your background. It's in our differences that we will find the power to keep revolutionizing the way the world listens. Spotify transformed music listening forever when we launched in 2008. Our mission is to unlock the potential of human creativity by giving a million creative artists the opportunity to live off their art and billions of fans the chance to enjoy and be passionate about these creators. Everything we do is driven by our love for music and podcasting. Today, we are the world's most popular audio streaming subscription service with a community of more than 500 million users.
14/05/2025
Full time
The Platform mission creates the technology that enables Spotify to learn quickly and scale easily, enabling rapid growth in our users and our business around the globe. Spanning many disciplines, we work to make the business work; creating the frameworks, capabilities and tools needed to welcome a billion customers. Join us and help to amplify productivity, quality and innovation across Spotify. Location London Job type Permanent What You'll Do Work in a fast-paced environment to build an analytic tool for our developers. Collaborate with other engineers, data scientists, designers, user researchers and product managers to craft the best possible user experience on our internal tools and infrastructure. Ensure that our APIs will have smooth integration with the current web clients - the desktop and TV app to name a few. Identify pain points in adoption and integration experience of our team's internal tools and SDKs. Ensure the developer experience is seamless for all developers. Take part in on-call rotation to help support the product area. Who You Are You have 5+ years of experience as a software engineer, comfortable around backend, frontend and infrastructure. You are passionate about building a good developer experience by creating robust APIs on tracking user behaviour data. You have a quality-mindset and are interested in all parts of software development: coding, testing, deployment and monitoring. You care about the user experience and know how to build user-friendly web applications. Experience with any of the following areas: GCP, Kubernetes, GRPC, protobuf, large scale data pipelines, would be a bonus! Where You'll Be This role is based in London (UK). We offer you the flexibility to work where you work best! There will be some in person meetings, but still allows for flexibility to work from home. Extensive learning opportunities, through our dedicated team, GreenHouse. Flexible share incentives letting you choose how you share in our success. Global parental leave, six months off - fully paid - for all new parents. All The Feels, our employee assistance program and self-care hub. Flexible public holidays, swap days off according to your values and beliefs. Learn about life at Spotify You are welcome at Spotify for who you are, no matter where you come from, what you look like, or what's playing in your headphones. Our platform is for everyone, and so is our workplace. The more voices we have represented and amplified in our business, the more we will all thrive, contribute, and be forward-thinking! So bring us your personal experience, your perspectives, and your background. It's in our differences that we will find the power to keep revolutionizing the way the world listens. Spotify transformed music listening forever when we launched in 2008. Our mission is to unlock the potential of human creativity by giving a million creative artists the opportunity to live off their art and billions of fans the chance to enjoy and be passionate about these creators. Everything we do is driven by our love for music and podcasting. Today, we are the world's most popular audio streaming subscription service with a community of more than 500 million users.
Curve is a next-gen insights, analytics and technology consultancy that leverages digital consumer data and advanced technology to help businesses unlock consumer opportunities. Digital consumer data is powerful; it's big, it's real, and it's always updating. We use a combination of in-house technology and bespoke solutions, powered by AI, to transform data from sources such as Social, Reviews, Search, and broader marketing and sales data. These reveal fresh insights for our clients; helping them to build better products and brands, to deliver effective marketing to consumers. Our software, machine learning and AI are key to how we deliver impact, centred on: Natural Language Processing, GPT & other LLMs : unearthing trends, themes and other patterns from large text-based data sets, and deploying state-of-the-art AI to automate and empower consumer facing businesses and their insights & analytics functions Marketing Data Science & Personalisation : using first party consumer data to understand each client's consumer base, building personalisation and other machine learning models to better engage with and excite consumers Data Engineering & Data Architecture : data engineering across a variety of tools to integrate these leading technologies into optimised and efficient data models and ecosystems, feeding into best-in-class analytics dashboards, marketing activation and front-end platforms Software Engineering: full stack expertise to build, maintain and support internal and externally facing Software & Data as a Service solutions, in AWS, that accelerate delivery and unlock deeper insights for our clients As a start-up, we can move faster than most companies and do things differently. We have experienced rapid growth so far and we're looking for a AI Engineer to join our growing team. ABOUT THE ROLE You'll play a crucial role in designing, prototyping and evaluating innovative data science solutions, powered by digital consumer data sources. Through your proactive research, and driven by your passion for data science, you'll surface the latest AI and NLP innovations, connecting the dots with Curve's use cases to propose novel solutions to our most challenging problems. You'll work on a mix of small proof of concepts and larger projects, both of which push the boundaries of what we can do with digital data and technology; building innovative models on top of the latest open source tools, as well as making use of cutting edge AI APIs from OpenAI, Google, Microsoft and AWS. You'll regularly share your knowledge of the rapidly changing and exciting world of AI to ensure Curve benefits from your ongoing research. Additionally, you'll work to shape the future of our fast-growing London start-up, driving and enhancing our data science capabilities and best practices as we continue to grow, playing a key part in maintaining our Technology Team's competitive edge. We are looking for a passionate and ambitious data scientist, enthusiastic about continuously learning through exploring and experimenting with trends in data science and NLP, with a keen interest in applying innovative approaches to digital consumer data sources for the benefit of Curve and our clients. WHAT YOU'LL BE DOING Develop innovative data science models and prototypes in Python, from scratch Proactively research, experiment with, and share the latest trends in data science, AI and NLP Translate unstructured business problems into clear requirements Interrogate and data mine rich digital consumer data sources such as social posts, search engine data, product reviews, clickstream data and beyond Proactively and continuously identify and propose ways to improve the impact of our data solutions Explore new and novel ways to understand and enrich digital consumer data sources Work closely with an amazing cross-functional Curve team from specialists in data science & engineering to strategy, insights and analytics experts WHAT WE'RE LOOKING FOR Bachelor's degree or higher in an applicable field such as Computer Science, Statistics, Maths or similar Science or Engineering discipline Experience designing, developing, evaluating and optimising data science models and NLP solutions 3+ years' experience with Python, with strong code design skills, and preferably OOP experience Passionate about AI and NLP, with demonstrable experience researching, experimenting with and communicating the benefits of new data science technologies and methods Some experience with defining hypotheses and running effective data science experiments SQL and some data warehousing knowledge NICE TO HAVES OR EXCITED TO LEARN: Experience analysing data in a marketing or other consumer-centric context Experience developing data solutions in cloud environments such as Azure, AWS or GCP Experience utilising social listening tools and/or search/web analytics tools Get to know Curve's journey and meet some of the minds fuelling our passion
14/05/2025
Full time
Curve is a next-gen insights, analytics and technology consultancy that leverages digital consumer data and advanced technology to help businesses unlock consumer opportunities. Digital consumer data is powerful; it's big, it's real, and it's always updating. We use a combination of in-house technology and bespoke solutions, powered by AI, to transform data from sources such as Social, Reviews, Search, and broader marketing and sales data. These reveal fresh insights for our clients; helping them to build better products and brands, to deliver effective marketing to consumers. Our software, machine learning and AI are key to how we deliver impact, centred on: Natural Language Processing, GPT & other LLMs : unearthing trends, themes and other patterns from large text-based data sets, and deploying state-of-the-art AI to automate and empower consumer facing businesses and their insights & analytics functions Marketing Data Science & Personalisation : using first party consumer data to understand each client's consumer base, building personalisation and other machine learning models to better engage with and excite consumers Data Engineering & Data Architecture : data engineering across a variety of tools to integrate these leading technologies into optimised and efficient data models and ecosystems, feeding into best-in-class analytics dashboards, marketing activation and front-end platforms Software Engineering: full stack expertise to build, maintain and support internal and externally facing Software & Data as a Service solutions, in AWS, that accelerate delivery and unlock deeper insights for our clients As a start-up, we can move faster than most companies and do things differently. We have experienced rapid growth so far and we're looking for a AI Engineer to join our growing team. ABOUT THE ROLE You'll play a crucial role in designing, prototyping and evaluating innovative data science solutions, powered by digital consumer data sources. Through your proactive research, and driven by your passion for data science, you'll surface the latest AI and NLP innovations, connecting the dots with Curve's use cases to propose novel solutions to our most challenging problems. You'll work on a mix of small proof of concepts and larger projects, both of which push the boundaries of what we can do with digital data and technology; building innovative models on top of the latest open source tools, as well as making use of cutting edge AI APIs from OpenAI, Google, Microsoft and AWS. You'll regularly share your knowledge of the rapidly changing and exciting world of AI to ensure Curve benefits from your ongoing research. Additionally, you'll work to shape the future of our fast-growing London start-up, driving and enhancing our data science capabilities and best practices as we continue to grow, playing a key part in maintaining our Technology Team's competitive edge. We are looking for a passionate and ambitious data scientist, enthusiastic about continuously learning through exploring and experimenting with trends in data science and NLP, with a keen interest in applying innovative approaches to digital consumer data sources for the benefit of Curve and our clients. WHAT YOU'LL BE DOING Develop innovative data science models and prototypes in Python, from scratch Proactively research, experiment with, and share the latest trends in data science, AI and NLP Translate unstructured business problems into clear requirements Interrogate and data mine rich digital consumer data sources such as social posts, search engine data, product reviews, clickstream data and beyond Proactively and continuously identify and propose ways to improve the impact of our data solutions Explore new and novel ways to understand and enrich digital consumer data sources Work closely with an amazing cross-functional Curve team from specialists in data science & engineering to strategy, insights and analytics experts WHAT WE'RE LOOKING FOR Bachelor's degree or higher in an applicable field such as Computer Science, Statistics, Maths or similar Science or Engineering discipline Experience designing, developing, evaluating and optimising data science models and NLP solutions 3+ years' experience with Python, with strong code design skills, and preferably OOP experience Passionate about AI and NLP, with demonstrable experience researching, experimenting with and communicating the benefits of new data science technologies and methods Some experience with defining hypotheses and running effective data science experiments SQL and some data warehousing knowledge NICE TO HAVES OR EXCITED TO LEARN: Experience analysing data in a marketing or other consumer-centric context Experience developing data solutions in cloud environments such as Azure, AWS or GCP Experience utilising social listening tools and/or search/web analytics tools Get to know Curve's journey and meet some of the minds fuelling our passion
Compliance Engineering - Full Stack Software Engineer - Vice President - Birmingham location_on Birmingham, West Midlands, England, United Kingdom YOUR IMPACT Are you passionate about developing mission-critical, high quality software solutions, using cutting-edge technology, in a dynamic environment? OUR IMPACT We are Compliance Engineering, a global team of more than 400 engineers and scientists who work on the most complex, mission-critical problems. We: build and operate a suite of platforms and applications that prevent, detect, and mitigate regulatory and reputational risk across the firm. have access to the latest technology and to massive amounts of structured and unstructured data. leverage modern frameworks to build responsive and intuitive front end and Big Data applications. The firm is making a significant investment to uplift and rebuild the Compliance application portfolio in 2025 and as part of that initiative we are setting up a new Compliance Engineering team in Birmingham. You have a unique opportunity to be part of this new, exciting team and play a crucial role in not just building this team but also help forge its vision. To achieve this, Compliance Engineering is looking to fill several full stack developer roles across different teams. HOW YOU WILL FULFILL YOUR POTENTIAL As a member of our team, you will: partner globally with sponsors, users and engineering colleagues across multiple divisions to create end-to-end solutions, learn from experts, be able to innovate and incubate new ideas, have an opportunity to work on a broad range of problems, often dealing with large data sets, including real-time processing, messaging, workflow and UI/UX, be involved in the full life cycle; defining, designing, implementing, testing, deploying, and maintaining software across our products. QUALIFICATIONS A successful candidate will possess the following attributes: A Bachelor's or Master's degree in Computer Science, Computer Engineering, or a similar field of study. 5+ years of experience in Java development. 5+ years of experience in automated testing and SDLC concepts. An ability to drive to goals and milestones while valuing and maintaining a strong attention to detail. Strong communication and documentation skills - both verbally and in writing to effectively interact with multiple people and global teams. Excellent judgment, analytical thinking, and problem-solving skills. Strong team player & experience working with global teams. Self-motivated individual that possesses excellent time management and organizational skills. ABOUT GOLDMAN SACHS Goldman Sachs is a leading global investment banking, securities and investment management firm that provides a wide range of financial services to a substantial and diversified client base that includes corporations, financial institutions, governments and individuals. Founded in 1869, the firm is headquartered in New York and maintains offices in all major financial centers around the world. At Goldman Sachs, we commit our people, capital and ideas to help our clients, shareholders and the communities we serve to grow. We believe who you are makes you better at what you do. We're committed to fostering and advancing diversity and inclusion in our own workplace and beyond by ensuring every individual within our firm has a number of opportunities to grow professionally and personally, from our training and development opportunities and firmwide networks to benefits, wellness and personal finance offerings and mindfulness programs. Learn more about our culture, benefits, and people at We're committed to finding reasonable accommodations for candidates with special needs or disabilities during our recruiting process. Learn more: Disability Statement . Goldman Sachs is an equal opportunity employer and does not discriminate on the basis of race, color, religion, sex, national origin, age, veterans status, disability, or any other characteristic protected by applicable law.
14/05/2025
Full time
Compliance Engineering - Full Stack Software Engineer - Vice President - Birmingham location_on Birmingham, West Midlands, England, United Kingdom YOUR IMPACT Are you passionate about developing mission-critical, high quality software solutions, using cutting-edge technology, in a dynamic environment? OUR IMPACT We are Compliance Engineering, a global team of more than 400 engineers and scientists who work on the most complex, mission-critical problems. We: build and operate a suite of platforms and applications that prevent, detect, and mitigate regulatory and reputational risk across the firm. have access to the latest technology and to massive amounts of structured and unstructured data. leverage modern frameworks to build responsive and intuitive front end and Big Data applications. The firm is making a significant investment to uplift and rebuild the Compliance application portfolio in 2025 and as part of that initiative we are setting up a new Compliance Engineering team in Birmingham. You have a unique opportunity to be part of this new, exciting team and play a crucial role in not just building this team but also help forge its vision. To achieve this, Compliance Engineering is looking to fill several full stack developer roles across different teams. HOW YOU WILL FULFILL YOUR POTENTIAL As a member of our team, you will: partner globally with sponsors, users and engineering colleagues across multiple divisions to create end-to-end solutions, learn from experts, be able to innovate and incubate new ideas, have an opportunity to work on a broad range of problems, often dealing with large data sets, including real-time processing, messaging, workflow and UI/UX, be involved in the full life cycle; defining, designing, implementing, testing, deploying, and maintaining software across our products. QUALIFICATIONS A successful candidate will possess the following attributes: A Bachelor's or Master's degree in Computer Science, Computer Engineering, or a similar field of study. 5+ years of experience in Java development. 5+ years of experience in automated testing and SDLC concepts. An ability to drive to goals and milestones while valuing and maintaining a strong attention to detail. Strong communication and documentation skills - both verbally and in writing to effectively interact with multiple people and global teams. Excellent judgment, analytical thinking, and problem-solving skills. Strong team player & experience working with global teams. Self-motivated individual that possesses excellent time management and organizational skills. ABOUT GOLDMAN SACHS Goldman Sachs is a leading global investment banking, securities and investment management firm that provides a wide range of financial services to a substantial and diversified client base that includes corporations, financial institutions, governments and individuals. Founded in 1869, the firm is headquartered in New York and maintains offices in all major financial centers around the world. At Goldman Sachs, we commit our people, capital and ideas to help our clients, shareholders and the communities we serve to grow. We believe who you are makes you better at what you do. We're committed to fostering and advancing diversity and inclusion in our own workplace and beyond by ensuring every individual within our firm has a number of opportunities to grow professionally and personally, from our training and development opportunities and firmwide networks to benefits, wellness and personal finance offerings and mindfulness programs. Learn more about our culture, benefits, and people at We're committed to finding reasonable accommodations for candidates with special needs or disabilities during our recruiting process. Learn more: Disability Statement . Goldman Sachs is an equal opportunity employer and does not discriminate on the basis of race, color, religion, sex, national origin, age, veterans status, disability, or any other characteristic protected by applicable law.
We are looking for a highly skilled, strategic and tenacious EMEA ISV Leader with strong vision and unparalleled execution. You will design, develop and execute the GTM strategy and programs with ISVs. You have driven scale and produced multiplier effects. You will define priorities, drive high activity, work closely with the leadership team and roll up your sleeves to execute. Your program will drive competitive advantage to our ISV partners, accelerate Databricks' growth and unlock business impact to joint customers. The impact you will have GTM Strategy & Execution : Develop and execute EMEA GTM strategy implementing innovative strategies that expand our market presence. Sales Programs: Launch sales plays and programs within Databricks, ISVs and our shared ecosystem with an industry-first lens. Launch programs sales plays, partner power plays (with Cloud, SIs and other ISVs). Demand Generation: Work with Sales Dev, Demand Gen, and Marketing to launch outbound and inbound campaigns to spread awareness and support pipeline creation for the joint solutions. Operational Excellence: Create and drive operational rigor for cadence, reporting, KPIs, escalation process, stakeholder updates and QBRs. Market Research: Conduct market research including customer discussions and aggregated customer feedback to identify trends, customer needs, and competitive landscape to inform solution development and partner strategy. Provide data-driven insights on ISV consumption trends and represent 'the voice of the partner'. Sales Enablement: Collaborate with enablement, sales and solution engineering teams to drive ISV-related sales motions, including training, messaging, and co-selling efforts. Deal Support: Anticipate (and proactively solve) channel conflict, support deal creation through close, drive account field engagement with BU leaders and strategic priority accounts. One-Team: Develop an environment for winning and success to further nurture a 'one team' collaborative culture. Examples of responsibilities Set a winning Strategy: Understand and align the right ISVs to EMEA Sales Leader priorities, company priorities and critical industry imperatives. Drive Partner Success: Work with ISV Partner C-suite, Alliance and Sales teams to build and execute on GTM plans in region. Facilitate Regional QBRs with important partners. Build for Scale: Work with Sales Programs to embed ISV Partners into core motions and priorities. Activate and Enable the field: Create and deliver enablement through EMEA on how best to work with ISVs and which ISVs to work with for certain industry imperatives and company priorities. Lead Sales workshops between partner and Databricks to unlock new use cases and progress said pipeline. Represent the business: Have the depth to handle discussions on data collaboration, technology partners and built ons with customers and partners. Make our ISV Program a competitive differentiator: Build mindshare with top ISVs by engaging top Data and AI Leadership, Industry Leadership and C-suite. Build a program that is just as much a competitive differentiator for partners as our product. Triangulate and drive ecosystem success: Drive Partner Power Play, aligning the right ISVs to the right SIs. Define repeatable use cases and work with industry team and partners to build Brickbuilder Solutions. Measure, Iterate and Improve: Identify cross functional gaps and work to bring teams together to solve them. Identify gaps in our ecosystem portfolio and assist to recruit the appropriate partners. Drives efficiencies and productivity across their region, guiding on accurate activity tracking from a depth of experience. Collaborate Cross Functionally: Interlock and build effective relationship with Sales teams, Business Development, Product, Engineering, Pre-sales, Post-sales, Marketing, Partners and other partners in the ecosystem. What we look for Experience Extensive experience selling Software, SaaS, Cloud Sales. 8+ years experience in securing and supporting ISV partners at scale with track record of success in planning and executing ISV co-sell programs. Demonstrated history of consistent goal achievement in a highly competitive environment (top 10% performer). Degree in business, economics, engineering, finance, science or math preferable. Track record of building strong ecosystems of lucrative customer relationships and cross-functional partnerships (Sales, Engineering, Marketing). Skills Outstanding communication skills (verbal, written and presentation) for both technical and executive audiences. Technically knowledgeable in the open source software, big data, IoT, and/or cloud computing space. Ability to translate technical concepts into business value, interacting with both business executives and technical audiences (data scientists and engineers). In-depth understanding of alliance/partner organisations, key stakeholder management, joint value proposition development and delivering field enablement programs. Possess aptitude to learn quickly and establish credibility. Proactive, entrepreneurial spirit and tenacious team player. About Databricks Databricks is the data and AI company. More than 10,000 organizations worldwide - including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500 - rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe and was founded by the original creators of Lakehouse, Apache Spark, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook. Benefits At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region, please visit . Our Commitment to Diversity and Inclusion At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards. Individuals looking for employment at Databricks are considered without regard to age, color, disability, ethnicity, family or marital status, gender identity or expression, language, national origin, physical and mental ability, political affiliation, race, religion, sexual orientation, socio-economic status, veteran status, and other protected characteristics. Compliance If access to export-controlled technology or source code is required for performance of job duties, it is within Employer's discretion whether to apply for a U.S. government license for such positions, and Employer may decline to proceed with an applicant on this basis alone.
14/05/2025
Full time
We are looking for a highly skilled, strategic and tenacious EMEA ISV Leader with strong vision and unparalleled execution. You will design, develop and execute the GTM strategy and programs with ISVs. You have driven scale and produced multiplier effects. You will define priorities, drive high activity, work closely with the leadership team and roll up your sleeves to execute. Your program will drive competitive advantage to our ISV partners, accelerate Databricks' growth and unlock business impact to joint customers. The impact you will have GTM Strategy & Execution : Develop and execute EMEA GTM strategy implementing innovative strategies that expand our market presence. Sales Programs: Launch sales plays and programs within Databricks, ISVs and our shared ecosystem with an industry-first lens. Launch programs sales plays, partner power plays (with Cloud, SIs and other ISVs). Demand Generation: Work with Sales Dev, Demand Gen, and Marketing to launch outbound and inbound campaigns to spread awareness and support pipeline creation for the joint solutions. Operational Excellence: Create and drive operational rigor for cadence, reporting, KPIs, escalation process, stakeholder updates and QBRs. Market Research: Conduct market research including customer discussions and aggregated customer feedback to identify trends, customer needs, and competitive landscape to inform solution development and partner strategy. Provide data-driven insights on ISV consumption trends and represent 'the voice of the partner'. Sales Enablement: Collaborate with enablement, sales and solution engineering teams to drive ISV-related sales motions, including training, messaging, and co-selling efforts. Deal Support: Anticipate (and proactively solve) channel conflict, support deal creation through close, drive account field engagement with BU leaders and strategic priority accounts. One-Team: Develop an environment for winning and success to further nurture a 'one team' collaborative culture. Examples of responsibilities Set a winning Strategy: Understand and align the right ISVs to EMEA Sales Leader priorities, company priorities and critical industry imperatives. Drive Partner Success: Work with ISV Partner C-suite, Alliance and Sales teams to build and execute on GTM plans in region. Facilitate Regional QBRs with important partners. Build for Scale: Work with Sales Programs to embed ISV Partners into core motions and priorities. Activate and Enable the field: Create and deliver enablement through EMEA on how best to work with ISVs and which ISVs to work with for certain industry imperatives and company priorities. Lead Sales workshops between partner and Databricks to unlock new use cases and progress said pipeline. Represent the business: Have the depth to handle discussions on data collaboration, technology partners and built ons with customers and partners. Make our ISV Program a competitive differentiator: Build mindshare with top ISVs by engaging top Data and AI Leadership, Industry Leadership and C-suite. Build a program that is just as much a competitive differentiator for partners as our product. Triangulate and drive ecosystem success: Drive Partner Power Play, aligning the right ISVs to the right SIs. Define repeatable use cases and work with industry team and partners to build Brickbuilder Solutions. Measure, Iterate and Improve: Identify cross functional gaps and work to bring teams together to solve them. Identify gaps in our ecosystem portfolio and assist to recruit the appropriate partners. Drives efficiencies and productivity across their region, guiding on accurate activity tracking from a depth of experience. Collaborate Cross Functionally: Interlock and build effective relationship with Sales teams, Business Development, Product, Engineering, Pre-sales, Post-sales, Marketing, Partners and other partners in the ecosystem. What we look for Experience Extensive experience selling Software, SaaS, Cloud Sales. 8+ years experience in securing and supporting ISV partners at scale with track record of success in planning and executing ISV co-sell programs. Demonstrated history of consistent goal achievement in a highly competitive environment (top 10% performer). Degree in business, economics, engineering, finance, science or math preferable. Track record of building strong ecosystems of lucrative customer relationships and cross-functional partnerships (Sales, Engineering, Marketing). Skills Outstanding communication skills (verbal, written and presentation) for both technical and executive audiences. Technically knowledgeable in the open source software, big data, IoT, and/or cloud computing space. Ability to translate technical concepts into business value, interacting with both business executives and technical audiences (data scientists and engineers). In-depth understanding of alliance/partner organisations, key stakeholder management, joint value proposition development and delivering field enablement programs. Possess aptitude to learn quickly and establish credibility. Proactive, entrepreneurial spirit and tenacious team player. About Databricks Databricks is the data and AI company. More than 10,000 organizations worldwide - including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500 - rely on the Databricks Data Intelligence Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe and was founded by the original creators of Lakehouse, Apache Spark, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook. Benefits At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. For specific details on the benefits offered in your region, please visit . Our Commitment to Diversity and Inclusion At Databricks, we are committed to fostering a diverse and inclusive culture where everyone can excel. We take great care to ensure that our hiring practices are inclusive and meet equal employment opportunity standards. Individuals looking for employment at Databricks are considered without regard to age, color, disability, ethnicity, family or marital status, gender identity or expression, language, national origin, physical and mental ability, political affiliation, race, religion, sexual orientation, socio-economic status, veteran status, and other protected characteristics. Compliance If access to export-controlled technology or source code is required for performance of job duties, it is within Employer's discretion whether to apply for a U.S. government license for such positions, and Employer may decline to proceed with an applicant on this basis alone.
Econometrician / Data Scientist London (Hybrid working 3 office days per week) Salary DOE £40,000-£45,000 Additional Benefits: Gym Membership, Pension and yearly bonus Job Reference: J12950 We're excited to be hiring for a unique opportunity to join a fast-growing, independent marketing effectiveness agency that genuinely puts its people first. This is a chance for someone who wants to be a bigger fish in a smaller sea to step into a role where you can truly make your mark, have real influence, and accelerate your career growth as we continue to scale. With a loyal and diverse client base, and a culture built on support and empowerment, you'll be part of a team where your ideas are heard and your impact is recognised. Roles and Responsibilities This role is well-suited for candidates who have a strong analytical mindset and prefer working behind the scenes with data. Leading the modelling process from briefing, data exploration, and variable selection through to model building, interpretation, and being involved in the presentation of results (interim and final debriefs will be presented by the Account Director). Creating clear and insightful output decks for both internal stakeholders and client presentations. Experience & Skills Required Strong econometric modelling skills using tools such as R, Python, or other statistical software packages (e.g., EViews, SAS). Experience with model validation, diagnostics, and performance metrics. Ability to handle large datasets, clean and transform raw data, and apply advanced statistical techniques such as regression, lag structures, adstock, saturation, and interaction effects. The successful candidate will be expected to take full ownership of modelling projects, from raw data ingestion through to final model delivery and client-ready outputs, with minimal supervision. If this sounds like you then please apply!
14/05/2025
Full time
Econometrician / Data Scientist London (Hybrid working 3 office days per week) Salary DOE £40,000-£45,000 Additional Benefits: Gym Membership, Pension and yearly bonus Job Reference: J12950 We're excited to be hiring for a unique opportunity to join a fast-growing, independent marketing effectiveness agency that genuinely puts its people first. This is a chance for someone who wants to be a bigger fish in a smaller sea to step into a role where you can truly make your mark, have real influence, and accelerate your career growth as we continue to scale. With a loyal and diverse client base, and a culture built on support and empowerment, you'll be part of a team where your ideas are heard and your impact is recognised. Roles and Responsibilities This role is well-suited for candidates who have a strong analytical mindset and prefer working behind the scenes with data. Leading the modelling process from briefing, data exploration, and variable selection through to model building, interpretation, and being involved in the presentation of results (interim and final debriefs will be presented by the Account Director). Creating clear and insightful output decks for both internal stakeholders and client presentations. Experience & Skills Required Strong econometric modelling skills using tools such as R, Python, or other statistical software packages (e.g., EViews, SAS). Experience with model validation, diagnostics, and performance metrics. Ability to handle large datasets, clean and transform raw data, and apply advanced statistical techniques such as regression, lag structures, adstock, saturation, and interaction effects. The successful candidate will be expected to take full ownership of modelling projects, from raw data ingestion through to final model delivery and client-ready outputs, with minimal supervision. If this sounds like you then please apply!
Machine Learning Engineer, AWS Generative AI Innovation Center DESCRIPTION The Generative AI Innovation Center at AWS helps AWS customers accelerate the use of Generative AI and realize transformational business opportunities. This is a cross-functional team of ML scientists, engineers, architects, and strategists working step-by-step with customers to build bespoke solutions that harness the power of generative AI. As an ML Engineer, you'll partner with technology and business teams to build solutions that surprise and delight our customers. You will work directly with customers and innovate in a fast-paced organization that contributes to game-changing projects and technologies. We're looking for Engineers and Architects capable of using generative AI and other ML techniques to design, evangelize, and implement state-of-the-art solutions for never-before-solved problems. Key job responsibilities Collaborate with ML scientists and engineers to research, design, and develop generative AI algorithms to address real-world challenges. Work across customer engagement to understand what adoption patterns for generative AI are working and rapidly share them across teams and leadership. Interact with customers directly to understand the business problem, help and aid them in the implementation of generative AI solutions, deliver briefing and deep dive sessions to customers and guide customers on adoption patterns and paths for generative AI. Create and deliver reusable technical assets that help to accelerate the adoption of generative AI on the AWS platform. Create and deliver best practice recommendations, tutorials, blog posts, sample code, and presentations adapted to technical, business, and executive stakeholders. Provide customer and market feedback to Product and Engineering teams to help define product direction. About the team Generative AI Innovation Center is a program that pairs you with AWS science and strategy experts with deep experience in AI/ML and generative AI techniques to: Imagine new applications of generative AI to address your needs. Identify new use cases based on business value. Integrate Generative AI into your existing applications and workflows. BASIC QUALIFICATIONS Bachelor's degree in computer science or equivalent. Experience in professional, non-internship software development. Experience coding in Python, R, Matlab, Java, or other modern programming languages. Several years of relevant experience in developing and deploying large scale machine learning or deep learning models and/or systems into production, including batch and real-time data processing, model containerization, CI/CD pipelines, API development, model training, and productionizing ML models. Experience contributing to the architecture and design (architecture, design patterns, reliability, and scaling) of new and current systems. PREFERRED QUALIFICATIONS Masters or PhD degree in computer science, or related technical, math, or scientific field. Proven knowledge of deep learning and experience using Python and frameworks such as Pytorch, TensorFlow. Proven knowledge of Generative AI and hands-on experience of building applications with large foundation models. Experiences related to AWS services such as SageMaker, EMR, S3, DynamoDB, and EC2, hands-on experience of building ML solutions on AWS. Strong communication skills, with attention to detail and ability to convey rigorous mathematical concepts and considerations to non-experts. Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central to our success. We make recruiting decisions based on your experience and skills. We value your passion to discover, invent, simplify and build.
14/05/2025
Full time
Machine Learning Engineer, AWS Generative AI Innovation Center DESCRIPTION The Generative AI Innovation Center at AWS helps AWS customers accelerate the use of Generative AI and realize transformational business opportunities. This is a cross-functional team of ML scientists, engineers, architects, and strategists working step-by-step with customers to build bespoke solutions that harness the power of generative AI. As an ML Engineer, you'll partner with technology and business teams to build solutions that surprise and delight our customers. You will work directly with customers and innovate in a fast-paced organization that contributes to game-changing projects and technologies. We're looking for Engineers and Architects capable of using generative AI and other ML techniques to design, evangelize, and implement state-of-the-art solutions for never-before-solved problems. Key job responsibilities Collaborate with ML scientists and engineers to research, design, and develop generative AI algorithms to address real-world challenges. Work across customer engagement to understand what adoption patterns for generative AI are working and rapidly share them across teams and leadership. Interact with customers directly to understand the business problem, help and aid them in the implementation of generative AI solutions, deliver briefing and deep dive sessions to customers and guide customers on adoption patterns and paths for generative AI. Create and deliver reusable technical assets that help to accelerate the adoption of generative AI on the AWS platform. Create and deliver best practice recommendations, tutorials, blog posts, sample code, and presentations adapted to technical, business, and executive stakeholders. Provide customer and market feedback to Product and Engineering teams to help define product direction. About the team Generative AI Innovation Center is a program that pairs you with AWS science and strategy experts with deep experience in AI/ML and generative AI techniques to: Imagine new applications of generative AI to address your needs. Identify new use cases based on business value. Integrate Generative AI into your existing applications and workflows. BASIC QUALIFICATIONS Bachelor's degree in computer science or equivalent. Experience in professional, non-internship software development. Experience coding in Python, R, Matlab, Java, or other modern programming languages. Several years of relevant experience in developing and deploying large scale machine learning or deep learning models and/or systems into production, including batch and real-time data processing, model containerization, CI/CD pipelines, API development, model training, and productionizing ML models. Experience contributing to the architecture and design (architecture, design patterns, reliability, and scaling) of new and current systems. PREFERRED QUALIFICATIONS Masters or PhD degree in computer science, or related technical, math, or scientific field. Proven knowledge of deep learning and experience using Python and frameworks such as Pytorch, TensorFlow. Proven knowledge of Generative AI and hands-on experience of building applications with large foundation models. Experiences related to AWS services such as SageMaker, EMR, S3, DynamoDB, and EC2, hands-on experience of building ML solutions on AWS. Strong communication skills, with attention to detail and ability to convey rigorous mathematical concepts and considerations to non-experts. Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central to our success. We make recruiting decisions based on your experience and skills. We value your passion to discover, invent, simplify and build.
Company Overview The SPTS division of KLA designs, manufactures and markets wafer processing solutions for the global semiconductor and related industries. SPTS provides industry leading etch and deposition process technologies on a range of single wafer handling platforms. End-market applications include micro-electromechanical systems (MEMS), advanced packaging, LED, high speed RF device IC's and power semiconductors. SPTS is part of KLA Corporation which develops industry-leading equipment and services that enable innovation throughout the electronics industry. We provide advanced process control and process-enabling solutions for manufacturing wafers and reticles, integrated circuits, packaging, printed circuit boards and flat panel displays. In close collaboration with leading customers across the globe, our expert teams of physicists, engineers, data scientists and problem-solvers design solutions that move the world forward. Job Description/Preferred Qualifications The Cybersecurity team at KLA is dedicated to safeguarding our critical assets and ensuring the security of our operations. As a member of the Digital Information Risk team, we are seeking a qualified AI Security Engineer to develop robust AI security review processes and risk assessments in line with responsible AI practices. Come join our team in this exciting role! This position will be responsible for identifying and mitigating AI-specific risks and vulnerabilities and proactively communicating potential AI-specific threats as a part of the broader KLA AI Security program. The ideal candidate will be able to demonstrate a breadth of knowledge across cybersecurity, threat intelligence, and artificial intelligence. Responsibilities: Conduct in-depth technical assessments of AI systems to identify security risks. Develop threat models for AI systems to anticipate and account for potential impacts due to misuse, abuse, or other adversarial attacks. Document all findings and develop mitigation strategies for identified risks, and coordinate with business partners to deploy countermeasures to reduce risk to systems and applications. Partner with the cyber threat intelligence team to research and remain current on threats and vulnerabilities to proactively find opportunities to amend our AI security strategy. Leverage research, industry trends, and internal data points to understand how AI systems could be abused and misused. Partner with monitoring teams to develop detections based on potential adversarial behaviours targeting LLMs. Be a champion of responsible AI practices, and reinforce responsibility at KLA through co-championing AI awareness campaigns. Ensure leadership is aware of key risks, potential threats, and if there are anticipated changes to ongoing projects. Support the growing cybersecurity team through mentoring junior analysts. Requirements: Bachelor's Degree in Computer Science, Cybersecurity or related field is required. Demonstrated ability in cybersecurity with a focus on artificial intelligence. Demonstrable experience with LLMs, deep understanding of AI/ML frameworks (PyTorch, Hugging Face, TensorFlow, etc.). Expertise in Python, knowledge of SQL a plus. Effective communication, interpersonal skills, and ability to work with partners across the business. Minimum Qualifications: Doctorate (Academic) Degree, Master's Level Degree and related work experience; Bachelor's Level Degree and related work experience.
14/05/2025
Full time
Company Overview The SPTS division of KLA designs, manufactures and markets wafer processing solutions for the global semiconductor and related industries. SPTS provides industry leading etch and deposition process technologies on a range of single wafer handling platforms. End-market applications include micro-electromechanical systems (MEMS), advanced packaging, LED, high speed RF device IC's and power semiconductors. SPTS is part of KLA Corporation which develops industry-leading equipment and services that enable innovation throughout the electronics industry. We provide advanced process control and process-enabling solutions for manufacturing wafers and reticles, integrated circuits, packaging, printed circuit boards and flat panel displays. In close collaboration with leading customers across the globe, our expert teams of physicists, engineers, data scientists and problem-solvers design solutions that move the world forward. Job Description/Preferred Qualifications The Cybersecurity team at KLA is dedicated to safeguarding our critical assets and ensuring the security of our operations. As a member of the Digital Information Risk team, we are seeking a qualified AI Security Engineer to develop robust AI security review processes and risk assessments in line with responsible AI practices. Come join our team in this exciting role! This position will be responsible for identifying and mitigating AI-specific risks and vulnerabilities and proactively communicating potential AI-specific threats as a part of the broader KLA AI Security program. The ideal candidate will be able to demonstrate a breadth of knowledge across cybersecurity, threat intelligence, and artificial intelligence. Responsibilities: Conduct in-depth technical assessments of AI systems to identify security risks. Develop threat models for AI systems to anticipate and account for potential impacts due to misuse, abuse, or other adversarial attacks. Document all findings and develop mitigation strategies for identified risks, and coordinate with business partners to deploy countermeasures to reduce risk to systems and applications. Partner with the cyber threat intelligence team to research and remain current on threats and vulnerabilities to proactively find opportunities to amend our AI security strategy. Leverage research, industry trends, and internal data points to understand how AI systems could be abused and misused. Partner with monitoring teams to develop detections based on potential adversarial behaviours targeting LLMs. Be a champion of responsible AI practices, and reinforce responsibility at KLA through co-championing AI awareness campaigns. Ensure leadership is aware of key risks, potential threats, and if there are anticipated changes to ongoing projects. Support the growing cybersecurity team through mentoring junior analysts. Requirements: Bachelor's Degree in Computer Science, Cybersecurity or related field is required. Demonstrated ability in cybersecurity with a focus on artificial intelligence. Demonstrable experience with LLMs, deep understanding of AI/ML frameworks (PyTorch, Hugging Face, TensorFlow, etc.). Expertise in Python, knowledge of SQL a plus. Effective communication, interpersonal skills, and ability to work with partners across the business. Minimum Qualifications: Doctorate (Academic) Degree, Master's Level Degree and related work experience; Bachelor's Level Degree and related work experience.
The role We are looking for a Data Engineer to join the Data Science & Engineering team in London. Working at WGSN Together, we create tomorrow A career with WGSN is fast-paced, exciting and full of opportunities to grow and develop. We're a team of consumer and design trend forecasters, content creators, designers, data analysts, advisory consultants and much more, united by a common goal: to create tomorrow. WGSN's trusted consumer and design forecasts power outstanding product design, enabling our customers to create a better future. Our services cover consumer insights, beauty, consumer tech, fashion, interiors, lifestyle, food and drink forecasting, data analytics and expert advisory. If you are an expert in your field, we want to hear from you. Role overview As a Data Engineer, you will be responsible for managing and improving the company's data ingestion, transformation, and analytical needs, directly supporting Data Science and Analytics functions. This role is essential in building and scaling new and existing data streams to support WGSN's growing set of data products and services. Despite a core set of responsibilities, this role requires an element of creativity and curiosity in developing new approaches to solve problems. The team You'll be joining a dynamic and diverse team of engineers and scientists within the wider decision science function. We thrive on collaboration, continuous learning, and building intelligent solutions that drive meaningful business impact. Whether it's testing new machine learning models or designing robust data pipelines, our team values creativity, technical excellence, and a shared passion for turning data into insight. Key accountabilities ETL Development Develop and implement robust Extract, Transform, Load (ETL) processes to ensure the seamless flow of data from source systems to data repositories. Optimise and streamline data pipelines for maximum performance and reliability. Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and be creative in designing appropriate solutions. Database Management Manage and administer databases, ensuring data integrity, security, and availability. Perform database tuning and optimisation for improved query performance. Data Quality and Governance Implement data alerting, testing, and quality checks, ensuring adherence to governance policies. Collaboration and Communication Collaborate with cross-functional teams, including data scientists, analysts, and software engineers, to understand data needs and deliver effective solutions. Clearly communicate technical concepts to non-technical stakeholders. New tools and technologies Stay abreast of emerging technologies and trends in data engineering and data science. Proactively evaluate and productionise new developments in AI that can enhance the data science and engineering workflow. This list is not exhaustive and there may be other activities you are required to deliver. Skills, experience & qualifications required Degree in Computer Science, Engineering, Mathematics, or equivalent experience. 3-5 years of data engineering experience, with a proven track record of designing, building, and maintaining scalable data pipelines and ETL processes. Proficiency in SQL and experience working with relational and non-relational databases (e.g. Snowflake, BigQuery, PostgreSQL, MySQL, MongoDB). Hands-on experience with big data technologies such as Apache Spark, Kafka, Hive, or Hadoop. Proficient in at least one programming language (e.g. Python, Scala, Java, R). Experience deploying and maintaining cloud infrastructure (e.g. AWS, GCP, or Azure). Familiarity with data modeling and warehousing concepts, and dimensional modeling techniques. dbt knowledge is preferable. Comfortable working with CI/CD tools, version control, and containers (e.g Git, Jenkins, Docker). Understanding of data governance, security best practices, and compliance frameworks such as GDPR or HIPAA. Familiarity with project management tools and Agile ways of working (e.g. Jira, Asana). Can craft KPIs to measure the impact of data-driven projects and initiatives. What we offer Our benefits and wellbeing package offers flexible benefits you can tailor to your own personal needs, including: - 25 days of holiday per year - with an option to buy/ sell up to 5 days. - Pension, Life Assurance and Income Protection Flexible benefits platform with options including Private Medical, Dental Insurance & Critical Illness. - Employee assistance programme, season ticket loans and cycle to work scheme. - Volunteering opportunities and charitable giving options. - Great learning and development opportunities. More about WGSN WGSN is the global authority on consumer trend forecasting. We help brands around the world create the right products at the right time for tomorrow's consumer. Our values We Are Everywhere The future is everything, it happens everywhere. WGSN is the world-leading forecaster because we track and analyse consumer behaviours, product innovation, design and creativity, everywhere. We Are Future Focused We utilise our global resources and intelligence to research, source and analyse quantitative and qualitative data to produce our forecasts. Everything we do is focused on working with our customers to create a successful and positive tomorrow. We Are Rigorous We source, review and assess quantitative and qualitative data to produce robust, actionable forecasts. To provide credible insights and design solutions for our clients, it is essential that rigour runs through everything we do. Our culture An inclusive culture is one of our key priorities. We want our people to truly be themselves and thrive. We love having a diverse team of people who bring new ideas, different strengths and perspectives & reflect the global audience we work with. Inclusive workforce We are committed to supporting the environment and sustainability, including ensuring our pension plan defaults to sustainable options and striving to be net zero by 2030. Recognising great performance is a key part of our culture. Our Awards schemes recognise and reward the brilliant achievements of our people. We offer a flexible working environment with a wide range of flexible, hybrid and agile working arrangements. Conversations about flexible working have always been-and will continue to be-actively encouraged here, but we do not offer full remote working. We want to ensure everyone has the opportunity to perform their best when interviewing, so if you require any reasonable adjustments that would make you more comfortable during the process, please let us know so that we can do our best to support you. A Note for Recruiters Thank you so much for your interest in working with us at WGSN! Our internal Talent Acquisition team takes care of all our recruitment efforts. When we need some extra help, we partner with agencies on our Preferred Supplier List (PSL) that truly understand our business, culture and ways of working together. Since we focus on these established partnerships, we're unable to respond to unsolicited contacts or CVs from outside our PSL. But don't worry! If we decide to explore new partnerships, we'll be sure to reach out.
14/05/2025
Full time
The role We are looking for a Data Engineer to join the Data Science & Engineering team in London. Working at WGSN Together, we create tomorrow A career with WGSN is fast-paced, exciting and full of opportunities to grow and develop. We're a team of consumer and design trend forecasters, content creators, designers, data analysts, advisory consultants and much more, united by a common goal: to create tomorrow. WGSN's trusted consumer and design forecasts power outstanding product design, enabling our customers to create a better future. Our services cover consumer insights, beauty, consumer tech, fashion, interiors, lifestyle, food and drink forecasting, data analytics and expert advisory. If you are an expert in your field, we want to hear from you. Role overview As a Data Engineer, you will be responsible for managing and improving the company's data ingestion, transformation, and analytical needs, directly supporting Data Science and Analytics functions. This role is essential in building and scaling new and existing data streams to support WGSN's growing set of data products and services. Despite a core set of responsibilities, this role requires an element of creativity and curiosity in developing new approaches to solve problems. The team You'll be joining a dynamic and diverse team of engineers and scientists within the wider decision science function. We thrive on collaboration, continuous learning, and building intelligent solutions that drive meaningful business impact. Whether it's testing new machine learning models or designing robust data pipelines, our team values creativity, technical excellence, and a shared passion for turning data into insight. Key accountabilities ETL Development Develop and implement robust Extract, Transform, Load (ETL) processes to ensure the seamless flow of data from source systems to data repositories. Optimise and streamline data pipelines for maximum performance and reliability. Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and be creative in designing appropriate solutions. Database Management Manage and administer databases, ensuring data integrity, security, and availability. Perform database tuning and optimisation for improved query performance. Data Quality and Governance Implement data alerting, testing, and quality checks, ensuring adherence to governance policies. Collaboration and Communication Collaborate with cross-functional teams, including data scientists, analysts, and software engineers, to understand data needs and deliver effective solutions. Clearly communicate technical concepts to non-technical stakeholders. New tools and technologies Stay abreast of emerging technologies and trends in data engineering and data science. Proactively evaluate and productionise new developments in AI that can enhance the data science and engineering workflow. This list is not exhaustive and there may be other activities you are required to deliver. Skills, experience & qualifications required Degree in Computer Science, Engineering, Mathematics, or equivalent experience. 3-5 years of data engineering experience, with a proven track record of designing, building, and maintaining scalable data pipelines and ETL processes. Proficiency in SQL and experience working with relational and non-relational databases (e.g. Snowflake, BigQuery, PostgreSQL, MySQL, MongoDB). Hands-on experience with big data technologies such as Apache Spark, Kafka, Hive, or Hadoop. Proficient in at least one programming language (e.g. Python, Scala, Java, R). Experience deploying and maintaining cloud infrastructure (e.g. AWS, GCP, or Azure). Familiarity with data modeling and warehousing concepts, and dimensional modeling techniques. dbt knowledge is preferable. Comfortable working with CI/CD tools, version control, and containers (e.g Git, Jenkins, Docker). Understanding of data governance, security best practices, and compliance frameworks such as GDPR or HIPAA. Familiarity with project management tools and Agile ways of working (e.g. Jira, Asana). Can craft KPIs to measure the impact of data-driven projects and initiatives. What we offer Our benefits and wellbeing package offers flexible benefits you can tailor to your own personal needs, including: - 25 days of holiday per year - with an option to buy/ sell up to 5 days. - Pension, Life Assurance and Income Protection Flexible benefits platform with options including Private Medical, Dental Insurance & Critical Illness. - Employee assistance programme, season ticket loans and cycle to work scheme. - Volunteering opportunities and charitable giving options. - Great learning and development opportunities. More about WGSN WGSN is the global authority on consumer trend forecasting. We help brands around the world create the right products at the right time for tomorrow's consumer. Our values We Are Everywhere The future is everything, it happens everywhere. WGSN is the world-leading forecaster because we track and analyse consumer behaviours, product innovation, design and creativity, everywhere. We Are Future Focused We utilise our global resources and intelligence to research, source and analyse quantitative and qualitative data to produce our forecasts. Everything we do is focused on working with our customers to create a successful and positive tomorrow. We Are Rigorous We source, review and assess quantitative and qualitative data to produce robust, actionable forecasts. To provide credible insights and design solutions for our clients, it is essential that rigour runs through everything we do. Our culture An inclusive culture is one of our key priorities. We want our people to truly be themselves and thrive. We love having a diverse team of people who bring new ideas, different strengths and perspectives & reflect the global audience we work with. Inclusive workforce We are committed to supporting the environment and sustainability, including ensuring our pension plan defaults to sustainable options and striving to be net zero by 2030. Recognising great performance is a key part of our culture. Our Awards schemes recognise and reward the brilliant achievements of our people. We offer a flexible working environment with a wide range of flexible, hybrid and agile working arrangements. Conversations about flexible working have always been-and will continue to be-actively encouraged here, but we do not offer full remote working. We want to ensure everyone has the opportunity to perform their best when interviewing, so if you require any reasonable adjustments that would make you more comfortable during the process, please let us know so that we can do our best to support you. A Note for Recruiters Thank you so much for your interest in working with us at WGSN! Our internal Talent Acquisition team takes care of all our recruitment efforts. When we need some extra help, we partner with agencies on our Preferred Supplier List (PSL) that truly understand our business, culture and ways of working together. Since we focus on these established partnerships, we're unable to respond to unsolicited contacts or CVs from outside our PSL. But don't worry! If we decide to explore new partnerships, we'll be sure to reach out.
RSMB is looking for an enthusiastic Data Scientist/Statistician who has an interest in media research to join their team based in Central London. You will join them on a full-time, permanent basis, and in return, you will receive a competitive salary of £28,000 per annum for graduate entry level, rising to £35,000 depending on degree of relevant post-graduate experience. RSMB is a leading company specialising in media measurement solutions. We work with various clients, including industry measurement bodies like Barb (UK TV audience measurement) and RAJAR (radio audience measurement), to help them understand, plan, and measure consumer behaviour across media. We focus on statistics and data science in media, developing models and methodologies for audience and viewer measurement. Our team of around 50 people operates in a hybrid working environment based in Holborn, London. The Data Scientist/Statistician role: RSMB is looking for a Data Scientist/Statistician to join our team working on some of the UK s most interesting media measurement projects like Barb, RAJAR, CFlight and TouchPoints. Whether you re a recent graduate or have a few years of experience in stats, data science, or media analytics, this is a great opportunity to work with big datasets, solve real-world problems, and help shape how the UK media industry understands audiences. Benefits you will receive as their Data Scientist/Statistician: Pension scheme 25 days holiday per annum (rising to 30 days) Private medical insurance Season ticket loan Group life and permanent health insurance. Key responsibilities as their Data Scientist/Statistician will include: Providing statistical expertise across RSMB s work, gaining in depth knowledge of methodologies used in media measurement services such as Barb, RAJAR, and TouchPoints. Developing complex methodologies to deliver cross-platform measurement solutions, including contributions to projects like CFlight. Running data fusion processes to support comprehensive audience insights, particularly for projects such as TouchPoints. Evaluating third-party methodologies through rigorous audits, to validate and ensure the integrity and reliability of data sources and analytical approaches. Utilizing programming skills in PL/SQL, R and Python to extract, manipulate and analyse large datasets effectively. Communicating technical methodologies and insights clearly through written reports and presentations, contributing to both internal discussions and client-facing meetings. What they are looking for in their Data Scientist/Statistician: The Data Scientist/Statistician should have broad knowledge of statistical/mathematical techniques. Ideally the candidate should possess the following skills: Education: At least a 2:1 Bachelor's degree in Data Science, Statistics, Mathematics, or a related field. If applying with a predicted grade, any job offer will be subject to achieving this grade. Technical Skills: Competent with Microsoft packages including Excel. Analytical Skills: Numerate with the ability to interpret and present complex data. Strong problem-solving skills and ability to think critically. Communication: Excellent verbal and written communication skills, with the ability to present data findings clearly and convey technic Personal Attributes? Strong interpersonal skills and the ability to liaise with people at all levels. Self-motivated and confident to manage their own projects as well as working within teams for larger projects. Excellent attention to detail and superb organisational skills. Able to use initiative to work independently with the ability to manage own time and organise priorities. Flexible and adaptable the needs of the job may change from week to week. Collaborative team player, committed to the collective success of the company. The ability to manage client relationships effectively, ensuring client satisfaction and addressing any concerns promptly. Please note: Applicants must be eligible to work in the UK & we are not accepting agency applications for this role. If you feel you have the skills and experience to become a Data Scientist/Statistician in this exciting role, then please click apply now They d love to hear from you!
13/05/2025
Full time
RSMB is looking for an enthusiastic Data Scientist/Statistician who has an interest in media research to join their team based in Central London. You will join them on a full-time, permanent basis, and in return, you will receive a competitive salary of £28,000 per annum for graduate entry level, rising to £35,000 depending on degree of relevant post-graduate experience. RSMB is a leading company specialising in media measurement solutions. We work with various clients, including industry measurement bodies like Barb (UK TV audience measurement) and RAJAR (radio audience measurement), to help them understand, plan, and measure consumer behaviour across media. We focus on statistics and data science in media, developing models and methodologies for audience and viewer measurement. Our team of around 50 people operates in a hybrid working environment based in Holborn, London. The Data Scientist/Statistician role: RSMB is looking for a Data Scientist/Statistician to join our team working on some of the UK s most interesting media measurement projects like Barb, RAJAR, CFlight and TouchPoints. Whether you re a recent graduate or have a few years of experience in stats, data science, or media analytics, this is a great opportunity to work with big datasets, solve real-world problems, and help shape how the UK media industry understands audiences. Benefits you will receive as their Data Scientist/Statistician: Pension scheme 25 days holiday per annum (rising to 30 days) Private medical insurance Season ticket loan Group life and permanent health insurance. Key responsibilities as their Data Scientist/Statistician will include: Providing statistical expertise across RSMB s work, gaining in depth knowledge of methodologies used in media measurement services such as Barb, RAJAR, and TouchPoints. Developing complex methodologies to deliver cross-platform measurement solutions, including contributions to projects like CFlight. Running data fusion processes to support comprehensive audience insights, particularly for projects such as TouchPoints. Evaluating third-party methodologies through rigorous audits, to validate and ensure the integrity and reliability of data sources and analytical approaches. Utilizing programming skills in PL/SQL, R and Python to extract, manipulate and analyse large datasets effectively. Communicating technical methodologies and insights clearly through written reports and presentations, contributing to both internal discussions and client-facing meetings. What they are looking for in their Data Scientist/Statistician: The Data Scientist/Statistician should have broad knowledge of statistical/mathematical techniques. Ideally the candidate should possess the following skills: Education: At least a 2:1 Bachelor's degree in Data Science, Statistics, Mathematics, or a related field. If applying with a predicted grade, any job offer will be subject to achieving this grade. Technical Skills: Competent with Microsoft packages including Excel. Analytical Skills: Numerate with the ability to interpret and present complex data. Strong problem-solving skills and ability to think critically. Communication: Excellent verbal and written communication skills, with the ability to present data findings clearly and convey technic Personal Attributes? Strong interpersonal skills and the ability to liaise with people at all levels. Self-motivated and confident to manage their own projects as well as working within teams for larger projects. Excellent attention to detail and superb organisational skills. Able to use initiative to work independently with the ability to manage own time and organise priorities. Flexible and adaptable the needs of the job may change from week to week. Collaborative team player, committed to the collective success of the company. The ability to manage client relationships effectively, ensuring client satisfaction and addressing any concerns promptly. Please note: Applicants must be eligible to work in the UK & we are not accepting agency applications for this role. If you feel you have the skills and experience to become a Data Scientist/Statistician in this exciting role, then please click apply now They d love to hear from you!
AI Engineer / Data Scientist - Contract Position in London An exciting contract opportunity has arisen for a skilled AI Engineer / Data Scientist in the vibrant heart of London. We are seeking a dynamic individual who can harness the power of data and machine learning technologies to drive innovation and improve processes. Role Overview: Location: London, United Kingdom Type: Contract Requires commuting to Dublin once a month Sector: Technology Required Skills: LLM (Latent Log-linear Model): Expertise in using LLM for advanced predictive modelling and analysis. Data Bricks: Proficiency with Databricks platform for big data processing and analytics. Machine Learning: Strong background in developing and deploying machine learning algorithms and models. Python: Excellent coding skills in Python, particularly for data science and machine learning applications. This role is ideal for someone who thrives in a fast-paced environment and is eager to contribute to cutting-edge projects in artificial intelligence and data science. If you are ready to take on this challenging role, we would love to hear from you. Please click here to find out more about our Key Information Documents. Please note that the documents provided contain generic information. If we are successful in finding you an assignment, you will receive a Key Information Document which will be specific to the vendor set-up you have chosen and your placement. To find out more about Computer Futures please visit (url removed) Computer Futures, a trading division of SThree Partnership LLP is acting as an Employment Business in relation to this vacancy Registered office 8 Bishopsgate, London, EC2N 4BQ, United Kingdom Partnership Number OC(phone number removed) England and Wales
13/05/2025
Contractor
AI Engineer / Data Scientist - Contract Position in London An exciting contract opportunity has arisen for a skilled AI Engineer / Data Scientist in the vibrant heart of London. We are seeking a dynamic individual who can harness the power of data and machine learning technologies to drive innovation and improve processes. Role Overview: Location: London, United Kingdom Type: Contract Requires commuting to Dublin once a month Sector: Technology Required Skills: LLM (Latent Log-linear Model): Expertise in using LLM for advanced predictive modelling and analysis. Data Bricks: Proficiency with Databricks platform for big data processing and analytics. Machine Learning: Strong background in developing and deploying machine learning algorithms and models. Python: Excellent coding skills in Python, particularly for data science and machine learning applications. This role is ideal for someone who thrives in a fast-paced environment and is eager to contribute to cutting-edge projects in artificial intelligence and data science. If you are ready to take on this challenging role, we would love to hear from you. Please click here to find out more about our Key Information Documents. Please note that the documents provided contain generic information. If we are successful in finding you an assignment, you will receive a Key Information Document which will be specific to the vendor set-up you have chosen and your placement. To find out more about Computer Futures please visit (url removed) Computer Futures, a trading division of SThree Partnership LLP is acting as an Employment Business in relation to this vacancy Registered office 8 Bishopsgate, London, EC2N 4BQ, United Kingdom Partnership Number OC(phone number removed) England and Wales
Junior Data Scientist London SaaS Data Platform Avanti Recruitment is working with a growing software house who make a SaaS based Data platform for customers to help them Enterprise businesses transform their data into actionable insights. Working with household names in Finance, Retail, Travel, Telco, and Healthcare, they turn Enterprise data into competitive advantage. About the role: Build and refine Data Science/ AI/ML models that solve genuine commercial challenges Work across predictive analytics, personalization, and decision-making tools Deploy models on major cloud platforms (AWS, Azure, GCP) Collaborate with AI Engineers and Product Teams on scalable solutions Help increase business value through AI implementation, particularly in the private equity space About the company: You will be joining a team of 5 and they are looking to expand with multiple new hires Their software is deployed with clients in the UK, Germany, France, and Ireland What they're looking for: 1-2 years experience in Data Science(perfect for second-jobbers) First-class STEM degree from a prestigious university Strong Python skills (Pandas, NumPy, Scikit-Learn) Cloud computing experience (primarily Azure but they work with all major cloud providers depending on their clients needs). SQL knowledge and database experience (SQL assessment is part of the interview process) Problem-solving mindset and good communication skills Experience optimizing code for large-scale, high-volume datasets Why join them? Work on Data Science and AI challenges that actually matter Fast-track your career with mentorship from senior experts Gain valuable experience across multiple industries and cloud platforms (Azure, AWS, GCP, Databricks) Competitive salary with genuine development opportunities Currently serving clients in four countries with two live products and two cutting-edge AI solutions in development. This is your chance to apply machine learning at scale while building solutions that shape the future of data-driven decision-making. Target Salary: £45,000 (considering applications in the range of £40,000 - £50,000) Location: Central London hybrid working Duration: Permanent N.B. They don t sponsor visas and won t consider those on short term visas. APPLY NOW FOR IMMEDIATE CONSIDERATION.
13/05/2025
Full time
Junior Data Scientist London SaaS Data Platform Avanti Recruitment is working with a growing software house who make a SaaS based Data platform for customers to help them Enterprise businesses transform their data into actionable insights. Working with household names in Finance, Retail, Travel, Telco, and Healthcare, they turn Enterprise data into competitive advantage. About the role: Build and refine Data Science/ AI/ML models that solve genuine commercial challenges Work across predictive analytics, personalization, and decision-making tools Deploy models on major cloud platforms (AWS, Azure, GCP) Collaborate with AI Engineers and Product Teams on scalable solutions Help increase business value through AI implementation, particularly in the private equity space About the company: You will be joining a team of 5 and they are looking to expand with multiple new hires Their software is deployed with clients in the UK, Germany, France, and Ireland What they're looking for: 1-2 years experience in Data Science(perfect for second-jobbers) First-class STEM degree from a prestigious university Strong Python skills (Pandas, NumPy, Scikit-Learn) Cloud computing experience (primarily Azure but they work with all major cloud providers depending on their clients needs). SQL knowledge and database experience (SQL assessment is part of the interview process) Problem-solving mindset and good communication skills Experience optimizing code for large-scale, high-volume datasets Why join them? Work on Data Science and AI challenges that actually matter Fast-track your career with mentorship from senior experts Gain valuable experience across multiple industries and cloud platforms (Azure, AWS, GCP, Databricks) Competitive salary with genuine development opportunities Currently serving clients in four countries with two live products and two cutting-edge AI solutions in development. This is your chance to apply machine learning at scale while building solutions that shape the future of data-driven decision-making. Target Salary: £45,000 (considering applications in the range of £40,000 - £50,000) Location: Central London hybrid working Duration: Permanent N.B. They don t sponsor visas and won t consider those on short term visas. APPLY NOW FOR IMMEDIATE CONSIDERATION.
Jobs - Frequently Asked Questions
Use the location filter to find IT jobs in cities like London, Manchester, Birmingham, and across the UK.
Entry-level roles include IT support technician, junior developer, QA tester, and helpdesk analyst.
New jobs are posted daily. Set up alerts to be notified as soon as new roles match your preferences.
Key skills include problem-solving, coding, cloud computing, networking, and familiarity with tools like AWS or SQL.
Yes, many employers offer training or junior roles. Focus on building a strong CV with relevant coursework or personal projects.