Description
Department: Data Management – Data Insights team
The Data Management team plays a vital role in our organisation, transforming raw data into comprehensive and meaningful insights that drive informed decision-making. The team is also responsible for establishing and maintaining a robust and efficient data model that keeps data consistent and accurate across the organisation.
Position purpose
The Senior Data Engineer is a key role in the Data Management team, dedicated to converting large, complex data sets into clear, actionable insights that support the Front Office's (FO) data-driven decision-making.
This role combines exceptional technical ability in data modelling, transformation and visualisation with proficiency in data processing technologies and methods. The data engineer is tasked with designing and implementing effective data pipelines that support complex, interactive visualisations.
Candidates for this role should excel in event-driven architecture and advanced analytics modelling using Python and Dagster/Airflow, and possess in-depth experience with Tableau/Power BI and Alteryx, plus Data Warehouse management experience to ensure optimal database performance.
A significant aspect of this role includes engaging directly with business stakeholders and senior management to assess their requirements, suggest outcomes, and translate business objectives into effective data visualisation solutions while optimising the underlying data architecture.
Main responsibilities
Data Cleaning, Preparation and Transformation: Creating data pipelines that combine internal and external EDFT data from several APIs and databases (relational and non-relational), preparing and refining the data for analysis.
Data Quality Monitoring: Ensuring the accuracy and integrity of data at all times. This includes testing and validating data pipelines.
Data Analysis, Design and Development of Visualisations and Dashboards: Creating clear, compelling, and accurate visualisations and dashboards that make complex data easily understandable.
Database Architecture and Optimisation: Developing database structures for the Data Insights team that are both efficient and scalable, ensuring optimal performance.
Stakeholder Management and Collaboration: Engaging directly with business stakeholders and senior management to assess their requirements, suggest outcomes, and translate business objectives into effective data visualisation solutions.
Presentation of insights: Effectively presenting data visualisations and dashboards to stakeholders, transforming complex datasets into engaging, accessible stories, and ensuring that insights are not only informative but also resonate with the audience.
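The cleaning, transformation and quality-monitoring responsibilities above can be sketched in miniature. The following is an illustrative Python fragment only: all record shapes, field names and values are invented for the example, and a real pipeline would pull from actual APIs and databases, typically orchestrated with Dagster or Airflow.

```python
from datetime import datetime

# Hypothetical records as they might arrive from two sources:
# an internal relational database and an external JSON API.
db_rows = [
    {"trade_id": 101, "volume_mwh": "12.5", "ts": "2024-03-01T09:00:00+00:00"},
    {"trade_id": 102, "volume_mwh": None, "ts": "2024-03-01T10:00:00+00:00"},
]
api_rows = [
    {"tradeId": "101", "price": 54.2},
    {"tradeId": "103", "price": 48.9},
]

def clean_db_row(row):
    """Normalise types and drop rows with missing volumes (a basic quality check)."""
    if row["volume_mwh"] is None:
        return None
    return {
        "trade_id": int(row["trade_id"]),
        "volume_mwh": float(row["volume_mwh"]),
        "ts": datetime.fromisoformat(row["ts"]),
    }

def merge(db_rows, api_rows):
    """Join cleaned database rows with API prices on trade_id."""
    prices = {int(r["tradeId"]): r["price"] for r in api_rows}
    out = []
    for row in db_rows:
        cleaned = clean_db_row(row)
        if cleaned and cleaned["trade_id"] in prices:
            cleaned["price"] = prices[cleaned["trade_id"]]
            out.append(cleaned)
    return out

merged = merge(db_rows, api_rows)
```

Here trade 102 is dropped by the quality check (missing volume) and trade 103 has no matching database row, so only trade 101 survives the join ready for analysis.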
Experience and technical requirements
Strong Python (3.x) and SQL coding experience is essential. Experience with scripting languages (Bash, shell scripting). Proficiency with Git and CI/CD (ideally using Azure DevOps). Proficiency in data orchestration tools (Dagster/Airflow). Proficiency in ETL processes and data modelling (Alteryx). Proficiency in visualisation tools such as Tableau and Power BI. Proficiency with relational databases (SQL Server, Oracle or PostgreSQL). Strong skills in data analysis, including the use of statistical software and the ability to clean, process, and analyse large datasets.
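As a small illustration of the Python-plus-SQL skill set described above, here is a self-contained sketch using Python's built-in sqlite3 module as a stand-in for a production relational database (SQL Server, Oracle or PostgreSQL in practice); the table and column names are invented for the example.

```python
import sqlite3

# In-memory database standing in for a production relational store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (id INTEGER, desk TEXT, volume REAL)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?, ?)",
    [(1, "power", 10.0), (2, "gas", 5.5), (3, "power", 7.5)],
)

# Aggregate volumes per desk: the kind of query that feeds a dashboard.
rows = conn.execute(
    "SELECT desk, SUM(volume) FROM trades GROUP BY desk ORDER BY desk"
).fetchall()
# rows == [("gas", 5.5), ("power", 17.5)]
```

In a real pipeline the query results would be pushed to a visualisation layer such as Tableau or Power BI rather than read in-process.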
Person Specification:
Excellent communication skills. Strong leadership and mentoring skills to guide and develop junior team members. Effective problem-solving and critical-thinking abilities to tackle data-related challenges. A keen eye for detail. Self-aware as to the value of current work versus the overall goal. Able to multitask, switch focus and prioritise own tasks, and comfortable working under pressure with demanding front-office users. Takes ownership of issues as they arise and facilitates their resolution quickly on own initiative while managing expectations. A thirst for the latest technologies and automation, coupled with the curiosity to research and innovate on new approaches. Interest in energy trading and willingness to work across the business to understand the needs of different teams.
Beneficial Requirements:
Experience with Docker and containers is desirable. Experience with cloud computing (e.g. Microsoft Azure) is beneficial. Some experience of working in the energy market, or prior exposure to a trading environment, would be beneficial. Experience with real-time streams (Kafka) and visualisations (Grafana and time-series databases) is beneficial. Experience interacting with the Tableau API is beneficial. Experience interacting with the Alteryx API is beneficial. Experience interacting with the backend of Airflow/Dagster is beneficial.
Hours of work:
40 hours per week - Monday to Friday – Hybrid