Senior Big Data Engineer (Java Focus)

Global Relay
City of London
1 month ago
Applications closed


Who we are:

For over 20 years, Global Relay has set the standard in enterprise information archiving with industry-leading cloud archiving, surveillance, eDiscovery, and analytics solutions. We securely capture and preserve the communications data of the world’s most highly regulated firms, giving them greater visibility and control over their information and ensuring compliance with stringent regulations.


Though we offer competitive compensation, benefits, and all the other perks one would expect from an established company, we are not your typical technology company. Global Relay is a career-building company. A place for big ideas. New challenges. Groundbreaking innovation. It’s a place where you can genuinely make an impact – and be recognized for it.


We believe great businesses thrive on diversity, inclusion, and the contributions of all employees. To that end, we recruit candidates from different backgrounds and foster a work environment that encourages employees to collaborate and learn from each other, completely free of barriers.


Your Role:

Joining the Reporting product line, you would work as a member of a highly focused team. This team specialises in Java-based data engineering, designing and delivering large-scale ELT/ETL workflows on a data lakehouse platform. You will be working with modern big data technologies to move, transform, and optimise data for high-performance analytics and regulatory reporting. The environment encourages autonomy, problem-solving, and system-level thinking. If you’re passionate about clean, well-tested, performant code and enjoy working on complex data pipelines at scale, you’ll thrive here.


Tech Stack:

  • Microservices container platforms (Kubernetes, CRC, Docker)
  • Big data technologies (Apache Spark, Flink, Hadoop, Airflow, Trino, Iceberg)
  • Dependency injection frameworks (Spring)
  • Observability (Loki/Grafana)
  • Large-scale data processing (Kafka)
  • CI/CD build tools (Maven, Git, Jenkins, Ansible)
  • NoSQL DBs (Cassandra, ZooKeeper, HBase)

Your Responsibilities:

  • Develop ETL, ELT and streaming processes using big data frameworks primarily in Java
  • Design, implement and provide architectural guidance in deploying microservices as a part of an agile development team
  • Write unit and integration tests for your Java code
  • Collaborate with testers in development of functional test cases
  • Develop deployment systems for Java-based systems
  • Collaborate with product owners on user story generation and refinement
  • Monitor and support the operation of production systems
  • Participate in knowledge sharing activities with colleagues
  • Participate in pair programming and peer reviews

About you:
Required Experience:

  • Minimum 5 years of Java development experience in an Agile environment, building scalable applications and services with a focus on big data solutions and analytics
  • 3+ years’ experience in developing ETL/ELT processes using relevant technologies and tools.
  • Experienced in working with data lakes and data warehouse platforms for both batch and streaming data sources.
  • Experience with ANSI SQL or other SQL dialects
  • Experience of unstructured, semi-structured and structured data processing.
  • A good understanding of ETL/ELT principles, best practices and patterns used.
  • Experienced in some big data technologies such as Hadoop, Spark and Flink
  • Experience in web services technologies
  • Experience in Test Driven Development
  • Experience in CI/CD

Attributes:

  • Good communication skills
  • Problem Solving
  • Self-starter
  • Team player

What you can expect:

At Global Relay, there’s no ceiling to what you can achieve. It’s the land of opportunity for the energetic, the intelligent, the driven. You’ll receive the mentoring, coaching, and support you need to reach your career goals. You’ll be part of a culture that breeds creativity and rewards perseverance and hard work. And you’ll be working alongside smart, talented individuals from diverse backgrounds, with complementary knowledge and skills.


Global Relay is an equal-opportunity employer committed to diversity, equity, and inclusion.


We seek to ensure reasonable adjustments, accommodations, and personal time are tailored to meet the unique needs of every individual.


To learn more about our business, culture, and community involvement, visit www.globalrelay.com.



Machine learning has moved from experimentation to production at scale. As a result, MLOps jobs have become some of the most in-demand and best-paid roles in the UK tech market. For job seekers with experience in machine learning, data science, software engineering or cloud infrastructure, MLOps represents a powerful career pivot or progression. This guide is designed to help you understand what MLOps roles involve, which skills employers are hiring for, how to transition into MLOps, salary expectations in the UK, and how to land your next role using specialist platforms like MachineLearningJobs.co.uk.