Senior Data Engineer (GCP)

Ipsos
Harrow
1 month ago

Role Overview:

The Audience Measurement team at Ipsos uses a deep understanding of people to make sense of audiences and how they consume media. We use these insights to shape media strategy, helping clients answer crucial questions such as how to target audiences, maximise attention across platforms, enhance audience experience, and demonstrate or increase audience value.


We are recruiting a Data Engineer for one of our flagship accounts, Out Of Home – Route.


Your work will be essential to the Route project's success. By building robust, scalable infrastructure and acting as the linchpin between the data platform and data science teams in an advisory role, you will enable data scientists to focus on research and innovation in synthetic data, enhancing data-driven solutions for clients and for Ipsos. You will often support the productionisation and scaling of locally developed ML models, guiding the data science teams and providing guardrails as they develop and iterate.


What will I be doing?


As a Data Engineer on the ground-breaking Route project, you'll develop and maintain the data infrastructure for a first-of-its-kind synthetic travel survey, generating audience figures for Out-of-Home media in Great Britain. Reporting to a Principal Data and Platform Engineer, you will collaborate with data scientists to design, build, and optimise data pipelines for high-quality audience measurement data.


Your key responsibilities will include:

- Develop robust, scalable data pipelines and optimise data storage solutions on GCP.
- Implement ETL processes and CI/CD pipelines to ensure clean, structured data ready for use (a minimal sketch of this kind of pipeline step follows this list).
- Work with data scientists to integrate synthetic models into production environments, providing guardrails and advice as they develop.
- Provide technical support, troubleshoot issues, and research new technologies to enhance capabilities.
- Document pipelines and the platform, including architectures and user guides, helping to enforce data management standards.
- Participate in agile ceremonies, with occasional client interaction.
- Engage in DataOps practices and improve data delivery performance.
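As a rough illustration of the kind of pipeline step described above, the sketch below loads a curated file from GCS into BigQuery using the official Python client. The project, bucket, and table names are invented for the example and are not taken from the Route project.

```python
# Minimal sketch: load a curated CSV from GCS into a BigQuery table.
# Bucket, dataset and table names are hypothetical examples.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

source_uri = "gs://example-route-curated/audience/daily_extract.csv"  # hypothetical path
table_id = "example-project.audience_measurement.daily_extract"       # hypothetical table

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,   # skip the header row
    autodetect=True,       # infer the schema for the sketch; real pipelines pin schemas
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(source_uri, table_id, job_config=job_config)
load_job.result()  # block until the load completes

table = client.get_table(table_id)
print(f"Loaded {table.num_rows} rows into {table_id}")
```

In practice a step like this would sit inside an orchestrated workflow (for example an Argo Workflows task) rather than run as a standalone script.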

Our Tech Stack:

GCP: GCS, BigQuery, GKE, Artifact Registry, Vertex AI, App Engine, Datastore, Secret Manager, Pub/Sub
Open-Source Tools: Argo Workflows, Argo Events, dbt, Elementary, Cerberus, Terraform, JupyterHub, Docker, Kubernetes
Coding Languages: Python, SQL, JavaScript (minimal)
CI/CD & PM Tools: Azure DevOps, Confluence

What do I need to bring with me?


It is essential that your personal attributes complement your technical skills. To be successful in this role, you will need the following skills and experience:

- A minimum of 3 years' relevant commercial experience, including building scalable data pipelines using Argo on GKE, SQL on BigQuery, and Python libraries such as Pandas.
- Comfortable with APIs and cloud storage systems.
- Basic web application development knowledge using Flask, HTML and JavaScript.
- Experience with containerisation (Docker) and orchestration (Kubernetes).
- Familiarity with Terraform and data systems optimisation.
- Commitment to data quality, plus experience with synthetic data and AI/ML model deployment (a small example of the kind of quality check involved follows this list).
- Excellent communication skills and a collaborative, positive mindset.
- Willingness to learn.
- GCP certifications are a plus.
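To give a flavour of the data-quality work mentioned above, here is a minimal, hypothetical Pandas check of a pipeline output; the column names and thresholds are invented for the illustration rather than taken from the role.

```python
# Minimal sketch: basic data-quality checks on a pipeline output with Pandas.
# Column names and thresholds are hypothetical examples, not project specifics.
import pandas as pd

def check_daily_extract(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable data-quality failures (empty list = pass)."""
    failures = []

    if df.empty:
        failures.append("extract is empty")

    # Required columns must be present before the remaining checks run.
    required = {"panel_id", "timestamp", "frame_id"}
    missing = required - set(df.columns)
    if missing:
        failures.append(f"missing columns: {sorted(missing)}")
        return failures

    # No duplicate observations per panellist, frame and timestamp.
    if df.duplicated(subset=["panel_id", "frame_id", "timestamp"]).any():
        failures.append("duplicate observations found")

    # Null rate on the key identifier should be negligible.
    null_rate = df["panel_id"].isna().mean()
    if null_rate > 0.001:
        failures.append(f"panel_id null rate too high: {null_rate:.3%}")

    return failures

# Example usage (path is hypothetical):
# df = pd.read_parquet("gs://example-route-curated/audience/daily_extract.parquet")
# problems = check_daily_extract(df)
# if problems:
#     raise ValueError("Data-quality check failed: " + "; ".join(problems))
```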

Benefits:


We offer a comprehensive benefits package designed to support you as an individual. Our standard benefits include 25 days' annual leave, pension contribution, income protection and life assurance. In addition, there is a range of health & wellbeing and financial benefits, as well as professional development opportunities.


We realise you may have commitments outside of work and will consider flexible working applications - please highlight what you are looking for when you make your application. We have a hybrid approach to work and ask people to be in the office or with clients for 3 days per week.


We are committed to equality, treating people fairly, promoting a positive and inclusive working environment and ensuring we have diversity of people and views. We recognise that this is important for our business success - a more diverse workforce will enable us to better reflect and understand the world we research and ultimately deliver better research and insight to our clients. We are proud to be a member of the Disability Confident scheme, certified as Level 1 Disability Confident Committed. We are dedicated to providing an inclusive and accessible recruitment process. 
