Senior Data Engineer

Lloyds Banking Group
Bristol

WORKING PATTERN: Our work style is hybrid, which involves spending at least two days per week, or 40% of our time, at our Bristol office.


About this opportunity…

The Finance Platform - AI & Agentic team is looking for a passionate and experienced Senior Data Engineer to join our growing team! If you thrive on enabling high-performing software, streamlining data pipelines and automating ETL processes, you're looking in the right place.


You’ll play a key role in crafting and delivering our Finance AI & Agentic Data strategy, applying modern cloud-native technologies to accelerate data delivery, improve operational resilience, and ensure compliance with security and engineering standards. You’ll work with Data Analytics managers and business stakeholders to build finance-focused AI solutions - your role will be to source and understand data, build data pipelines, and identify how to automate them.


You’ll be comfortable with the architecture and use of all our data products - CDPs, ODPs and FDPs - enabling strong reuse and sourcing of data for our AI & ML models. You’ll work with solution architects to design software and data solutions that support new AI and ML applications, incorporating security, control and lineage into your detailed designs. You’ll also collaborate closely with data scientists, ML & AI engineers, and software and DevOps engineers to deliver solutions that best serve our internal finance customers.


About us…

We’re on an exciting journey and there couldn’t be a better time to join us. The investments we’re making in our people, data, and technology are leading to innovative projects, fresh possibilities, and countless new ways for our people to work, learn, and thrive.


What you’ll do…

  • Lead the design, implementation, and maintenance of scalable, secure, and performant data engineering pipelines and tooling on Google Cloud Platform (GCP).
  • Collaborate with engineers, architects, and product teams to build data pipelines, data transformations and scripted data loads.
  • Build BigQuery datasets and materialised views to support the Data Analytics teams, keeping our data costs low and our queries streamlined.
  • Implement and use tools such as dbt, Dataflow and EasyIngest to manage data effectively (see the pipeline sketch after this list).
  • Champion the use of endorsed technologies and common build patterns to minimise technical debt.
  • Mentor junior engineers and support recruitment to grow Data Engineering capability across the team, lab and platform.
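
As an illustration of the kind of pipeline work described above, here is a minimal sketch of a Python Apache Beam job that could run on Dataflow, reading CSV files from Cloud Storage and loading them into BigQuery. The project, bucket, dataset and table names are hypothetical placeholders rather than references to our actual estate.

    # Minimal sketch of a batch pipeline: Cloud Storage CSV -> BigQuery.
    # Project, bucket and table names below are hypothetical placeholders.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def parse_row(line):
        # Assumes a simple two-column CSV: transaction id and amount.
        txn_id, amount = line.split(",")
        return {"txn_id": txn_id, "amount": float(amount)}

    def run():
        options = PipelineOptions(
            runner="DataflowRunner",            # use "DirectRunner" for local testing
            project="example-gcp-project",      # hypothetical project id
            region="europe-west2",
            temp_location="gs://example-bucket/tmp",
        )
        with beam.Pipeline(options=options) as pipeline:
            (
                pipeline
                | "ReadCsv" >> beam.io.ReadFromText("gs://example-bucket/finance/*.csv")
                | "ParseRows" >> beam.Map(parse_row)
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    "example-gcp-project:finance_ds.transactions",
                    schema="txn_id:STRING,amount:FLOAT",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                )
            )

    if __name__ == "__main__":
        run()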

What you’ll need…

  • Understanding of DevOps and data engineering practices with cloud-native tooling, including YAML scripting, CI/CD, source code management, and orchestration.
  • Proficiency with automation and scripting in Python, Bash, or your preferred language, and with frameworks such as Apache Beam.
  • A passion for process improvement, operational excellence, and platform reliability.
  • Ability to lead initiatives independently and influence engineering best practices.
  • Excellent stakeholder engagement, communication, and teamwork skills.
  • Experience using gcloud commands for provisioning resources for containerised applications.
  • Data integration with multiple internal platforms - AI tools (Vertex AI, Cortex), Machine Learning as a Service, and on-premises databases (see the sketch after this list).
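
By way of illustration only, the sketch below shows how a pipeline step might score records against a deployed Vertex AI endpoint using the google-cloud-aiplatform client. The project, region, endpoint ID and feature names are hypothetical placeholders.

    # Hypothetical sketch: scoring records against a deployed Vertex AI endpoint.
    from google.cloud import aiplatform

    # Assumed project and region; replace with real values.
    aiplatform.init(project="example-gcp-project", location="europe-west2")

    # Endpoint ID is a placeholder for a model already deployed to Vertex AI.
    endpoint = aiplatform.Endpoint(
        "projects/example-gcp-project/locations/europe-west2/endpoints/1234567890"
    )

    # Instances must match the schema the deployed model expects.
    response = endpoint.predict(instances=[{"feature_a": 1.0, "feature_b": "retail"}])
    print(response.predictions)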

And any experience of these would be useful…

  • Understanding of cost optimisation in cloud environments.
  • Understanding of how to utilise API & MCP connectivity to support our applications and data pipelines.

About working for us…

Our focus is to ensure we’re inclusive every day, building an organisation that reflects modern society and celebrates diversity in all its forms. We want our people to feel that they belong and can be their best, regardless of background, identity or culture. And it’s why we especially welcome applications from under-represented groups. We’re disability confident. So if you’d like reasonable adjustments to be made to our recruitment processes, just let us know.


We also offer a wide-ranging benefits package, which includes…

  • A generous pension contribution of up to 15%
  • An annual bonus award, subject to Group performance
  • Share schemes including free shares
  • Benefits you can adapt to your lifestyle, such as discounted shopping
  • 30 days’ holiday, with bank holidays on top
  • A range of wellbeing initiatives and generous parental leave policies

Want to do amazing work that’s interesting and makes a difference to millions of people? Join our journey!



