Senior Data Engineer

Domestic & General Service GmbH
London
1 month ago
Applications closed

This role requires senior data engineering experience building automated data pipelines on IBM DataStage & DB2, AWS and Databricks, from source systems through operational databases to the curation layer, using modern cloud technologies. Experience of delivering complex pipelines will be highly valuable to how D&G maintains and delivers world-class data pipelines.

Job summary:

D&G is transforming into a technology-powered product business serving customers around the world. Our products and services rely heavily on compelling digital experiences and data-led journeys for our B2B and B2C clients and customers.

This is a key lead engineering role within D&G’s technology team. It presents a challenging and exciting opportunity, requiring real enthusiasm and modern data engineering experience, to stabilise, enhance and transform D&G’s operational customer databases as they move from legacy systems to new, scalable cloud solutions across the UK, EU and US. The role calls for an experienced data engineer with good knowledge of IBM DataStage & DB2, AWS and Databricks pipelines, able to excel in challenging environments and with the confidence to help the teams steer the right course in developing the data platform, as well as supporting any required tooling decisions.

The role will enable D&G to deliver a modern data services layer, delivered as a product, that can be consumed by key service channels and stakeholders on demand.

Strategic Impact:

Quality customer data is the lifeblood of D&G’s operations, allowing us to serve our customers with outstanding propositions and outcomes. This role will be integral to supporting this through the following areas of delivery:

This role will initially help stabilise the existing on-prem customer data platforms to help serve our customers and protect the one-billion-pound revenue across the UK and EU. Targets will be to reduce the merge and compliance incident backlog, promote more automation, and support the onboarding of a third party to provide a managed break/fix service.

Support Data Growth in UK and US Markets

Supporting further growth in the UK and EU markets through enhancement of the on-prem IBM customer platforms, ensuring they remain available, robust and secure for growing data demands, whilst leading on the delivery of cloud-based solutions for the US pipelines and data platform.

Knowledge, Expertise, Complexity and Scope:

Knowledge in the following areas is essential:

  1. Databricks: Expertise in managing and scaling Databricks environments for ETL, data science, and analytics use cases.
  2. AWS Cloud: Extensive experience with AWS services such as S3, Glue, Lambda, RDS, and IAM.
  3. IBM Skills: DB2, DataStage, Tivoli Workload Scheduler, UrbanCode.
  4. Programming Languages: Proficiency in Python and SQL.
  5. Data Warehousing & ETL: Experience with modern ETL frameworks and data warehousing techniques.
  6. DevOps & CI/CD: Familiarity with DevOps practices for data engineering, including infrastructure-as-code (e.g., Terraform, CloudFormation), CI/CD pipelines, and monitoring (e.g., CloudWatch, Datadog).
  7. Big Data: Familiarity with big data technologies such as Apache Spark, Hadoop, or similar.
  8. ETL/ELT Tools: Experience creating common data sets across on-prem (IBM DataStage ETL) and cloud data stores.
  9. Leadership & Strategy: Lead data engineering team(s) in designing, developing, and maintaining highly scalable and performant data infrastructures.
  10. Customer Data Platform Development: Architect and manage our data platforms using IBM (legacy platform) and Databricks on AWS technologies (e.g., S3, Lambda, Glacier, Glue, EventBridge, RDS) to support real-time and batch data processing needs.
  11. Data Governance & Best Practices: Implement best practices for data governance, security, and data quality across our data platform. Ensure data is well documented, accessible, and meets compliance standards.
  12. Pipeline Automation & Optimisation: Drive the automation of data pipelines and workflows to improve efficiency and reliability.
  13. Team Management: Mentor and grow a team of data engineers, ensuring alignment with business goals, delivery timelines, and technical standards.
  14. Cross-Company Collaboration: Work closely with business stakeholders at all levels, including data scientists, finance analysts, MI and cross-functional teams, to ensure seamless data access and integration with various tools and systems.
  15. Cloud Management: Lead efforts to integrate and scale cloud data services on AWS, optimising costs and ensuring the resilience of the platform.
  16. Performance Monitoring: Establish monitoring and alerting solutions to ensure the high performance and availability of data pipelines and systems, with no impact to downstream consumers.
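The pipeline automation and data-quality items above can be made concrete with a minimal sketch. This is plain Python rather than the DataStage, Glue, or Databricks tooling the role actually uses, and all record shapes, quality rules, and function names are invented for illustration:

```python
# Illustrative extract -> transform -> load flow with simple data-quality rules.
# Records, rules, and names are invented for illustration only.

SOURCE_ROWS = [
    {"customer_id": "C001", "email": "a@example.com", "country": "UK"},
    {"customer_id": "C002", "email": None, "country": "US"},             # fails quality rule
    {"customer_id": "C001", "email": "a@example.com", "country": "UK"},  # duplicate
]

def extract() -> list[dict]:
    """Extract: a real pipeline would read from DB2, S3 landing zones, etc."""
    return list(SOURCE_ROWS)

def transform(rows: list[dict]) -> list[dict]:
    """Transform: drop rows failing quality rules, deduplicate on customer_id."""
    seen, curated = set(), []
    for row in rows:
        if row["email"] is None:        # data-quality rule: email must be present
            continue
        if row["customer_id"] in seen:  # deduplicate merged customer records
            continue
        seen.add(row["customer_id"])
        curated.append(row)
    return curated

def load(rows: list[dict]) -> int:
    """Load: a real pipeline would write to the curation layer (e.g. Delta tables)."""
    return len(rows)

def run_pipeline() -> int:
    return load(transform(extract()))
```

Whatever the tooling, the shape recurs: an extract step landing raw data, quality and deduplication rules in the transform, and a load into the curated layer consumed downstream.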

Key Responsibilities:

  1. Manage outcomes for the on-prem customer platform break/fix service.
  2. Build and deliver automated, secure data pipelines that provision data for all business users and applications (both operational and insight).
  3. Work with the DevOps developer and testers to help support and create our AWS & Databricks infrastructure and continuous delivery pipelines.
  4. Ensure all developments are tested and deployed within the automated CI/CD pipeline where appropriate.
  5. Version and store all development artefacts in the agreed repository.
  6. Ensure all data are catalogued and that appropriate documentation is created and maintained for all ETL code and associated NFRs.
  7. Collaborate with the product owner (Data) and business stakeholders to understand the requirements and capabilities.
  8. Collaborate with the lead architect and CCOE to align with the best-practice delivery strategy.
  9. Participate in the team’s agile planning and delivery process to ensure work is delivered in line with the Product Owner’s priorities.
  10. Create low-level designs for Epics and Stories and, where required, support the lead architect in creating the designs that enable the realisation of the Data Lake, operational Customer DB, warehouse and marts, while ensuring scalability, security by design, ease of use, and high availability and reliability.
  11. Identify the key capabilities needed for success, along with the technology choices, coding standards, testing techniques and delivery approach required to deliver reliable data services.
  12. Learn emerging technologies to keep abreast of new or better ways of delivering the data pipeline.
  13. Welcome a challenge as an opportunity to learn new things and make new friends, while always thinking of better techniques to solve problems.
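The monitoring responsibilities above — alerting before downstream consumers are affected — often start with a freshness check against an agreed SLA per pipeline. Here is a minimal sketch in plain Python; the pipeline names and thresholds are invented for illustration:

```python
import datetime as dt

# Hypothetical freshness SLAs per pipeline, in hours; names are illustrative.
FRESHNESS_SLA_HOURS = {"customer_merge": 4, "us_curation": 24}

def breached_slas(last_success: dict[str, dt.datetime],
                  now: dt.datetime) -> list[str]:
    """Return the pipelines whose last successful run is older than their SLA."""
    late = []
    for pipeline, sla_hours in FRESHNESS_SLA_HOURS.items():
        if now - last_success[pipeline] > dt.timedelta(hours=sla_hours):
            late.append(pipeline)
    return late
```

In practice the alerting would come from a tool such as CloudWatch or Datadog rather than hand-rolled code; the point is the shape of the check, not the tooling.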

At Domestic & General, we are proud of our 100-year legacy and excited about our future growth plans. We are expanding our horizons, entering new markets and territories internationally and we need your expertise to help us on the journey.

Remote

Full Time


