Data Engineer – Python, Linux, Apache Airflow, AWS or GCP


Job details
  • London
  • 1 week ago

I’m working with a successful Data Analytics consultancy based in London to recruit an experienced Data Engineer to join their team.

In this role, you'll design, build, and optimize data pipelines and infrastructure to support their business intelligence and analytics work.

Key responsibilities:

  • Develop data pipelines using Apache Airflow, Python, and cloud platforms (AWS, Azure, GCP)

  • Integrate data from databases, data lakes, and other sources

  • Implement efficient ETL/ELT processes for high-quality, reliable data

  • Optimize pipeline performance and scalability

  • Collaborate with data teams to deliver impactful data solutions

Required skills:

  • 5+ years' experience in a data role

  • Expertise in Python, SQL, and workflow tools like Apache Airflow

  • Experience with relational databases and cloud data architectures – PostgreSQL, Microsoft SQL Server

  • Understanding of data modeling, ETL, and data quality best practices

  • Strong problem-solving and analytical skills

  • Excellent communication and collaboration abilities

This will be a great role in a modern data consultancy.

Salary: Circa £60k

Duration: Permanent

Location: Central London, hybrid working
