Senior Data Engineer

Genio News
Leeds
4 days ago
Hi 👋 we’re Genio. We create beautifully simple learning tools that boost knowledge, skills, and confidence.

  • We’re a SaaS scale-up and one of the fastest-growing tech companies in the North.
  • There’s 100+ of us around the UK with our HQ in Leeds.
  • Our software is award-winning and used by 100,000s of students at over 800 universities & colleges worldwide.
  • We’re growing so we can achieve our mission to support 1 million students to become better learners by 2030.

🚀 Join our growing team as a Senior Data Engineer at Genio!

As a Senior Data Engineer at Genio, you’ll own the maintenance and development of Genio’s data architecture and platforms. You’ll help build a scalable data platform to support both analytics and product development. This is an exciting time to join the data team as we look to champion a culture of data-driven innovation and continuous improvement.


👥 Meet the team:

You’ll be joining the Technology function, made up of Engineering, Product, Product Marketing, UX and Data – all working closely together in cross-functional squads to build the best product for our users.


You’ll work closely with Data Engineers, fellow Analysts, and teams across the business like Customer Experience, Learning, and Engineering. You’ll report into our Head of Data and be part of the Data team’s mission to empower everyone at Genio to use data with confidence and clarity.


To give you an idea of who you may work closely with:



  • Phil - Head of Data, father of two under 3, and sports enthusiast
  • Matt - Senior Analytics Engineer, enjoys funky basslines and gifs, occasionally makes ice cream
  • Emma - Senior Data Analyst, dog lover and currently on maternity leave
  • Alex - Data Engineer, loves Krav Maga, furious knitter
  • Nathan - Data Scientist, AI and cooking enthusiast

What you’ll be doing:



  • Design + Develop Architecture: Architect and build scalable, high-performance data structures within our GCP and Databricks Lakehouse.
  • Lead Data-Latency Strategy: Implement and manage low-latency solutions to power near-real-time analytical and product features.
  • Own ELT Ingestion: Own the end-to-end ingestion process using Airbyte + Python, transforming raw data into actionable datasets (there’s a small illustrative sketch of this kind of step below).
  • Operationalise Data (Reverse ETL): Close the loop by syncing processed data back into SaaS tools to drive business workflows and automation.
  • Drive Technical Innovation: Ensure our data stack remains at the leading edge by evaluating and implementing emerging tools and frameworks as they become available.
  • Collaborate on Product Integration: Partner with product and engineering teams to embed data solutions directly into our core offerings.

What we’re looking for:

  • A Seasoned Engineer: You have extensive experience building production-grade data systems and thrive in a senior or lead capacity.
  • Python Expert: You are highly proficient in Python, using it to build robust data processes and applications.
  • Cloud & Lakehouse Native: You have a solid understanding of GCP/AWS and are an expert in navigating the Databricks (or similar) ecosystem.
  • Streaming Specialist: You’ve "been there and done that" with technology like Kafka or Flink, successfully implementing near-real-time or real-time data solutions.
  • DataOps Advocate: You believe in CI/CD, containerisation (Docker/K8s), and automated testing; you don't consider a task "done" until it's observable and repeatable.
  • Architectural Thinker: You have an in-depth knowledge of database internals, data warehousing, and modern Data Lake design.
  • Detail-Oriented & Communicative: You have a sharp eye for the "small things" that break pipelines and the ability to explain complex technical trade-offs to non-technical partners.
  • Lifelong Learner: You are naturally curious and stay ahead of the curve by experimenting with and adopting the latest industry frameworks.

Not every one of the above is essential, but hopefully it gives an idea of what we find useful day to day.
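
To give a flavour of the “transforming raw data into actionable datasets” step mentioned above, here’s a small, purely illustrative Python sketch. The file names, fields and aggregation are invented for illustration and aren’t taken from our actual pipeline (which, as described above, uses Airbyte for ingestion into a Databricks Lakehouse on GCP).

```python
# Illustrative only: not Genio's production code. A toy "raw events ->
# tidy daily activity" transformation; files and field names are made up.
import csv
import json
from collections import defaultdict
from datetime import datetime, timezone

RAW_EVENTS = "raw_events.jsonl"        # hypothetical raw extract, one JSON object per line
OUTPUT_DATASET = "daily_activity.csv"  # hypothetical "actionable" output for downstream teams


def parse_event(line: str):
    """Parse one raw event; return None for malformed rows instead of failing the whole run."""
    try:
        event = json.loads(line)
        ts = datetime.fromtimestamp(event["timestamp"], tz=timezone.utc)
        return {"user_id": str(event["user_id"]), "day": ts.date().isoformat()}
    except (json.JSONDecodeError, KeyError, TypeError, ValueError, OSError):
        return None


def build_daily_activity(path: str) -> dict:
    """Count events per (user, day): the kind of tidy dataset analysts can use directly."""
    counts = defaultdict(int)
    with open(path, encoding="utf-8") as f:
        for line in f:
            event = parse_event(line)
            if event is not None:
                counts[(event["user_id"], event["day"])] += 1
    return counts


def write_dataset(counts: dict, path: str) -> None:
    """Write the aggregated rows as CSV so other tools (or a reverse-ETL sync) can consume them."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["user_id", "day", "event_count"])
        for (user_id, day), n in sorted(counts.items()):
            writer.writerow([user_id, day, n])


if __name__ == "__main__":
    write_dataset(build_daily_activity(RAW_EVENTS), OUTPUT_DATASET)
```

In practice, logic like this would live inside orchestrated, tested pipeline jobs rather than a stand-alone script, but it shows the shape of the work.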


💰 Salary and benefits:

🏖️ 33 days annual leave (inclusive of bank holidays)


🎄 3 gifted days off at Christmas


💰 EMI Share Options Scheme


🎓 Generous individual learning and training allowance


⌚ Truly flexible hours to suit when you work best


💻 Full home-working setup and beautiful collaborative office space


🚗 Free Leeds City Centre office parking


🌴 Nomad working policy with family travel insurance


🍼 Enhanced maternity leave (26 weeks) and paternity leave (4 weeks), fully paid


🤝 2 volunteering days per year


🤍 Health cash plan (from glasses to massages)


Location:

We have a beautiful office space in Leeds and we love it when we get together to collaborate in person. We typically operate a hybrid way of working; however, some of our roles support remote working within the UK if you live more than 50 miles from the office.


We will discuss ways of working with you at interview; however, if you have any questions before you apply, please reach out to us.


💡 What to expect next:

We’ll review your application and provide a response from Monday 5th January onwards. Even if it’s not the news you’d hoped for, we appreciate it’s good to know either way.


If we invite you to meet with us for interview, here’s an overview of what the process will look like:



  • Screening interview with someone in our Recruitment team (30 minutes).
  • Technical test: you will have one week to complete it whenever suits you best. The test isn’t timed, but we suggest spending no longer than 3 hours on it.
  • Interview with the Head of Data and one other data team member. This consists of competency-based questions and a task review.
  • Final stage: a 1-hour culture and values interview with our Head of Data and another Genio team member.

Ahead of your interviews you will receive a confirmation email outlining who you’ll be meeting and when, anything you’ll need to prepare in advance and any resources we think you might find helpful.


👀 Interested in learning more about a career at Genio?
💌 Not quite the right role for you, but you’d love to be a part of Genio’s journey?

Let’s connect! Reach out to us and we’ll add you to our network so we can keep you updated with any future opportunities we think you might be interested in.


🤝 Our Commitment to Equality


We are committed to equality of opportunity for all staff, and applications are encouraged from individuals regardless of age, disability, sex, gender reassignment, sexual orientation, pregnancy and maternity, race, religion or belief, marriage and civil partnerships, trade union membership, and caring responsibilities.


We think it’s important that you understand how we use and handle your personal information, so here’s a link to our privacy notice. By submitting your application, you’re confirming that you’ve read and understood this notice.


