The Role
Join a high-impact data transformation programme within the aviation sector, where a leading global airline is undergoing a major data modernization journey.
The initiative goes far beyond a basic lift-and-shift; it’s a forward-looking transformation that blends the stability of mature legacy systems with the innovation of cloud-first, AI-driven architecture.
As a senior data engineer, you’ll play a critical role in building and optimizing modern, scalable data solutions that enable smarter decision-making, richer customer experiences and operational excellence.
You’ll be part of a highly collaborative network of teams working with cutting-edge cloud-based technologies whilst also navigating complex legacy on-premises environments.
This role offers engineers the opportunity to leave behind traditional approaches and contribute to a programme with long-term impact at the forefront of aviation data innovation.
Your responsibilities:
Design, build and maintain robust data pipelines that support critical business applications and analytics
Analyze, re-engineer and modernize existing ETL processes from legacy systems into scalable cloud-native solutions
Contribute to the development and optimization of a cloud-based data platform, leveraging tools like Snowflake, AWS and Airflow
Work closely with data architects, analysts and other engineers to deliver high-quality, production-ready code
Participate in code reviews, ensuring adherence to best practices and high engineering standards
Investigate data quality issues and implement monitoring and alerting systems to ensure pipeline reliability
Document workflows, data lineage and technical designs to support maintainability and knowledge sharing
Champion a culture of continuous improvement, experimentation and technical excellence within the team
Essential skills/knowledge/experience:
Strong hands-on experience with data engineering in both on-prem and cloud-based environments
Good working knowledge and hands-on experience with Teradata and Informatica
Proficiency in working with legacy systems and traditional ETL workflows
Solid experience building data pipelines using modern tools (Airflow, DBT, Glue, etc.) and working with large volumes of structured and semi-structured data
Demonstrated experience with SQL and Python for data manipulation, pipeline development and workflow orchestration
Strong grasp of data modelling, data warehousing concepts and performance optimization techniques
Hands-on exposure to cloud platforms, especially AWS
Experience working in agile teams and using version control and CI/CD practices
Desirable skills/knowledge/experience:
Experience with Snowflake or other cloud-native data warehouse technologies
Familiarity with GraphQL and its use in data-driven APIs
Exposure to data governance, data quality and metadata management tools
Interest or experience in applying machine learning / AI pipelines or features within a data engineering context
Understanding of DevOps concepts as applied to data (DataOps) and infrastructure-as-code tools like Terraform or CloudFormation
Previous experience in highly regulated industries or large-scale, enterprise-grade environments