Data Engineer

London
1 month ago
Applications closed

Data Engineer | Hybrid Role | £50k-£60k | London

Route delivers world-class audience measurement for out-of-home advertising across Great Britain. We're building new data models and collection methods to better understand how people move and which OOH ads they see. You'll build the data pipelines and infrastructure that makes this happen.

Note: Applicants must be eligible to work in the UK, as visa sponsorship is not available.

Why this role?

From day one, you'll have the freedom to innovate and make decisions that directly affect the business. This isn't a traditional data engineering role: you'll wear multiple hats. You'll design and maintain local and cloud (GCP) data pipelines that power our analytics and dashboards and automate key processes. You'll also manage our local infrastructure, including Proxmox virtualisation hosts and network configuration, ensuring seamless integration between on-premises and cloud systems. Working closely with the team, you'll get hands-on experience with everything from data engineering to infrastructure management.

What's in it for you?

Immediate Impact: Your work directly supports how audience data is used to measure advertising effectiveness across GB. You'll see results quickly, whether it's improving data quality, automating tasks, or building tools that surface insights.
Career Growth: You'll work with Python, Go, SQL, PostgreSQL, BigQuery, Google Cloud, and DevOps practices. With dedicated study days and a supportive team, you'll accelerate your learning and development.
Visibility: In a small team, your ideas matter. You'll have direct access to senior leadership and influence how data is used across the business.
Flexibility: Hybrid working (2-3 days in office) with flexible remote options.
Work-Life Balance: 25 days holiday, private healthcare, and a social, relaxed working environment.
What you'll do:

Build and maintain data pipelines that are fast, reliable, and ready for analysis.
Manage local server and cloud infrastructure in a hybrid environment, applying DevOps practices.
Run quality checks and validation on release datasets. Automate routine checks, loading, and reporting.
Create dashboards that turn complex data into clear, actionable insights for internal teams and external stakeholders.
Support the insight team with data analysis and handle queries from industry stakeholders.
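To make the quality-check and automation responsibilities above concrete, here is a minimal, hedged sketch of an automated validation pass over a release dataset. This is illustrative only, not Route's actual tooling: the column names (panel_id, impacts) and the rules are invented for demonstration.

```python
# Illustrative data-quality check for a release dataset.
# Column names and rules are hypothetical, not Route's real schema.

def validate_rows(rows):
    """Return a list of (row_index, problem) tuples for failing rows."""
    problems = []
    for i, row in enumerate(rows):
        if row.get("panel_id") is None:
            problems.append((i, "missing panel_id"))
        impacts = row.get("impacts")
        if impacts is None or impacts < 0:
            problems.append((i, "impacts must be a non-negative number"))
    return problems

sample = [
    {"panel_id": "P001", "impacts": 1520.0},
    {"panel_id": None, "impacts": 300.0},   # fails: missing panel_id
    {"panel_id": "P003", "impacts": -5.0},  # fails: negative impacts
]

issues = validate_rows(sample)
```

In practice a check like this would run as a scheduled job and feed a report or alert, rather than being a one-off script.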
What we're looking for:

Experience building ETL pipelines, preferably with Python.
Strong SQL skills. Experience with PostgreSQL and BigQuery preferred.
Experience with cloud platforms, ideally Google Cloud.
Hands-on experience with Infrastructure-as-Code tools such as Terraform.
Solid Linux/Unix skills, especially for server and network config.
Ability to communicate with both technical and non-technical people.
Ability to work independently with minimal supervision.
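As a rough illustration of the ETL experience described above, here is a minimal extract-transform-load sketch in Python. It uses the stdlib sqlite3 module purely so the example is self-contained; in the role the targets would be PostgreSQL or BigQuery, and every table and field name here is an invented assumption.

```python
import sqlite3

def run_etl(raw_records, conn):
    """Minimal ETL: filter bad records, normalise, load into a database."""
    # Extract: raw_records stands in for an API pull or file read.
    # Transform: drop records with no site code, uppercase the code,
    # coerce impacts to float.
    cleaned = [
        (r["site"].upper(), float(r["impacts"]))
        for r in raw_records
        if r.get("site")
    ]
    # Load: idempotent table create, then a parameterised bulk insert.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS impacts (site TEXT, impacts REAL)"
    )
    conn.executemany("INSERT INTO impacts VALUES (?, ?)", cleaned)
    conn.commit()
    return len(cleaned)

conn = sqlite3.connect(":memory:")
loaded = run_etl(
    [{"site": "ld01", "impacts": "120.5"},
     {"site": "", "impacts": "10"},        # dropped: no site code
     {"site": "mc02", "impacts": "88"}],
    conn,
)
```

The same shape (extract, validate/transform, parameterised load) carries over directly to a PostgreSQL or BigQuery client, with sqlite3 swapped for the relevant driver.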
