Senior Data Engineer opportunity to join a well-known financial services company, working on a remote-first basis (in the office roughly once a month).
Senior Data Engineer - Remote First (offices in the Lake District) - Up to £65,000 + Great Benefits
Purpose of Job
Advise on and ensure the maximum return of value from the company's data assets, using best-practice data engineering techniques that support delivery of the company's data management strategy and roadmap and align with regulatory requirements.
Contribute towards the support, maintenance, and improvement of the company's platform, including data pipelines, and ensure data quality and governance.
Responsibilities
- Advise on the design, implementation, and maintenance of complex data engineering solutions.
- Ensure the implementation and maintenance of complex data engineering solutions to acquire and prepare data (Extract, Transform, Load), adhering to best practice.
- Contribute towards and ensure the delivery and maintenance of data pipelines to connect data within and between data stores, applications, and organisations.
- Contribute towards and ensure the delivery of complex data quality checking and remediation.
- Identify data sources, data processing concepts and methods.
- Evaluate, design and implement on-premise, cloud-based and hybrid data engineering solutions.
- Structure and store data for uses including, but not limited to, analytics, machine learning, data mining, and sharing with applications and organisations.
- Harvest structured and unstructured data.
- Integrate, consolidate, and cleanse data.
- Migrate and convert data.
- Apply ethical principles and regulatory requirements in handling data.
- Ensure appropriate storage of data in line with relevant legislation and company requirements.
- Guide and contribute towards the development of junior and trainee Data Engineers.
- Provide technical guidance to Data Engineers.
Knowledge, Experience and Qualifications
- Excellent knowledge of and experience with Azure data services, including Data Lake, Data Factory, Databricks, and Azure SQL (indicative experience: 5+ years).
- Build and test processes supporting data extraction, data transformation, data structures, metadata, dependency and workload management.
- Knowledge of Spark architecture and modern data warehouse, data lake, and lakehouse techniques.
- Build transformation tables using SQL.
- Moderate knowledge of Python/PySpark or an equivalent programming language.
- Power BI Data Gateways, Dataflows, and permissions.
- Creation, utilisation, optimisation and maintenance of relational SQL and NoSQL databases.
- Experience with CI/CD tooling such as Azure DevOps or GitHub, including repos, source control, pipelines, and actions.
- Awareness of Informatica or similar data governance tools (desirable).
- Experience of working in both agile (Scrum) and waterfall delivery teams.
- Experienced with Confluence and Jira.
- Experience in financial services or another highly regulated environment.
- Any experience of or interest in AI is desirable.