Senior BigQuery Data Engineer - Contract

Augustinus Bader
London

Role title

Senior BigQuery Data Engineer / Contract


Contract type

Day-rate contractor

Initial term: four to five months


Context

The business is consolidating a growing number of data sources into BigQuery as a core enterprise data platform. Initial focus has been on DTC and ecommerce, with planned expansion across finance, operations, logistics, marketing and others.

Current data sources include ecommerce platforms, subscription systems, customer service tools, personalisation platforms, and marketplace integrations. Data is actively consumed via SQL and AI-assisted analysis to power internal reporting applications built in Laravel.

The role is required to stabilise, structure, and future-proof the BigQuery environment so it can support scale, governance, and enterprise-wide adoption. There is a preference for infrastructure as code (IaC), such as Terraform.

Primary objectives

BigQuery architecture and data model ownership

  • Review the current BigQuery structure, ingestion patterns, and table design.
  • Design and implement a scalable, well-governed data architecture suitable for a global enterprise business.
  • Define and implement golden datasets with clear ownership, access rules, and change control.
  • Introduce appropriate schema- and field-level controls to prevent uncontrolled changes and data drift.
  • Ensure the data model supports downstream analytics, AI-driven querying, and application-level reporting.
  • Produce clear documentation explaining the architecture, data model, and usage patterns for both technical and non-technical stakeholders.

Delivery oversight and operating model support

  • Work alongside the existing data engineering resource to review current data pipelines, models, and delivery practices.
  • Assess the effectiveness of current ways of working, technical approaches, and delivery processes against current and future business needs.
  • Provide an evidence-based view of strengths, gaps, and areas for improvement across data engineering capability and the operating model.
  • Make pragmatic recommendations on role scope, process improvements, upskilling opportunities, and the resourcing required to support the target state.
  • The required tech stack is Google BigQuery; the use of infrastructure as code and dbt is to be confirmed.

Key deliverables

  • Documented target state BigQuery architecture and data model.
  • Defined and implemented golden tables with clear ownership and governance.
  • Standards for data ingestion, transformation, and consumption.
  • A practical roadmap for scaling BigQuery usage across additional business functions.
  • Clear documentation that enables confident use of data across the organisation.

Required experience

  • Strong hands-on experience designing and operating BigQuery environments at scale.
  • Deep understanding of data modelling, analytics architecture, and data governance.
  • Experience working with complex, multi-source data environments, ideally including ecommerce and subscription data.
  • Experience with data pipeline orchestration tools such as Cloud Composer, Airflow, or equivalent.
  • Comfort working in fast-moving environments with imperfect starting points.
  • Ability to balance best practice with pragmatism and delivery speed.
  • Strong communication skills and the ability to explain complex concepts clearly.

Nice to have

  • Experience supporting AI-driven analytics or natural language querying of data.
  • Experience working closely with application teams consuming data directly in products or dashboards.
  • Background in DTC, retail, or consumer brands.

Working style

  • Hands-on and delivery-focused.
  • Pragmatic and outcome-driven.
  • Two days per week in the office (Central London).
  • Comfortable operating with autonomy.
  • Able to challenge existing approaches constructively.
  • Focused on clarity, documentation, and long-term sustainability.

Success looks like

  • BigQuery is trusted as a scalable, governed enterprise data platform.
  • Golden datasets (curated, business-validated tables that serve as the single source of truth) are clearly defined, locked down, and actively used.
  • The business is unblocked to expand data usage across finance, operations, and other functions.

  • There is clear visibility of the current operating model and of what is required to support future growth.

