📍 Location: Remote-first (UK-based)
💰 Rate: Up to £550 p/d
📆 Contract: 6-12 months (Outside IR35)
🛠 Tech Stack: BigQuery, SQL, GCP (Dataflow, Cloud Storage, Pub/Sub), ETL/ELT, Python
We’re working with a fast-moving team that’s building high-performance data infrastructure on Google Cloud Platform. They’re looking for a Senior Data Engineer to help design, implement, and optimise scalable pipelines while supporting both data and product teams with reliable, secure, and efficient systems.
If you enjoy working with large-scale datasets and solving performance challenges in cloud environments, this is a great opportunity to lead with impact and collaborate across teams.
What You’ll Be Doing:
🔹 Designing and implementing scalable data pipelines using BigQuery and other GCP tools
🔹 Developing and maintaining ETL/ELT workflows for transforming and aggregating data
🔹 Optimising database performance for cost-effective, efficient querying
🔹 Collaborating with analysts, scientists, and engineers on analytics and product use cases
🔹 Automating workflows and monitoring pipeline performance
🔹 Mentoring junior engineers and fostering a strong engineering culture
What You Need:
✔ Strong SQL skills and experience with large-scale datasets
✔ Hands-on experience with BigQuery and GCP tools like Dataflow, Cloud Storage, and Pub/Sub
✔ Deep knowledge of query optimisation and performance tuning
✔ Proven experience with ETL/ELT design and data modelling best practices
✔ A collaborative mindset and solid communication skills
✔ Ability to mentor and guide junior team members
This is an outside IR35 contract, so you must be UK-based. Interested? Click Apply or get in touch with Ionut Roghina for more details!