Senior Data Engineer
Salary: around £75k, plus a very attractive bonus on top
Location: London or Leeds, with flexible hybrid working if preferred
About Us
A sports forecasting company specialising in developing player-level, play-by-play simulators that generate highly accurate, near-instant outcome projections for sporting events. Since its establishment, the company has focused on serving the emerging sports betting market and, over the past six years, has grown into a globally recognised leader in sports modelling and analytics.
Purpose of role:
The purpose of this role is to design, build, and maintain scalable and efficient data transformation workflows that empower the business with actionable insights and impactful visualisations. This role bridges the gap between data engineering and business intelligence by owning the transformation layer and enabling clear, trusted, and timely analytics across the organisation. The successful candidate will have a strong grasp of modern data modelling practices, analytics tooling, and interactive dashboard development in Power BI and Plotly/Dash.
Key responsibilities:
Designing and maintaining robust data transformation pipelines (ELT) using SQL, Apache Airflow, or similar tools.
Building and optimising data models that power dashboards and analytical tools.
Developing clear, insightful, and interactive dashboards and reports using Power BI and Plotly/Dash.
Collaborating closely with stakeholders to understand reporting needs and translate them into analytical solutions.
Ensuring high data quality, consistency, and documentation across business-critical datasets.
Managing the semantic layer and ensuring data definitions are aligned across teams.
Contributing to architecture and platform design decisions, especially regarding the analytics layer.
Driving adoption of best practices in analytics engineering, including version control, testing, and CI/CD for analytics code.
Mentoring and coaching junior analytics and data engineers.
Acting as a data evangelist across the company, promoting a self-serve data culture through training and enablement.
Skills and Competencies:
Significant experience in building, maintaining, and scaling modern data pipelines and transformation workflows (ELT), ideally within a cloud or lakehouse environment.
Strong experience with data modelling techniques (e.g. dimensional models, star/snowflake schemas) and analytics-layer design to support business intelligence and self-serve reporting.
Proficiency in analytics engineering tools such as Apache Airflow and SQL, and version control systems such as Git.
Hands-on experience developing dashboards and reports using Power BI, Plotly/Dash, or other modern visualisation tools.
Strong understanding of data governance, quality, and documentation best practices.
Strong ability to debug and optimise slow or failing data pipelines and queries.
Knowledge of data privacy and security practices, including data masking, row-level security, and encryption techniques.
Experience collaborating with cross-functional teams including data engineers, data scientists, and business stakeholders.
Familiarity with cloud-based data ecosystems such as AWS, Azure, or GCP, and working with data warehouse/lakehouse technologies such as Snowflake, BigQuery, Redshift, or Athena/Glue.
Essential:
Proficient in writing clean, efficient, and maintainable SQL and Python code, particularly for data transformation and analytics use cases.
Strong understanding of data modelling concepts, including star/snowflake schemas and designing models optimised for reporting and dashboarding.
Proficient in analytics tools such as Power BI, Plotly/Dash, or similar for building interactive and impactful visualisations.
Deep experience with modern ELT workflows and transformation tools (e.g. dbt or custom SQL models).
Familiarity with distributed systems (e.g., Spark, Kafka) and how they support scalable analytics solutions.
Experience designing and integrating with APIs and handling system integrations, including data migrations and networked data sources.
Practical experience with cloud platforms such as AWS, Azure, or GCP, and building scalable, secure data architectures.
Commitment to clean systems and documentation, including logging, reproducibility, and data quality tracking.
Strong communication and stakeholder engagement skills, with the ability to translate data needs into technical solutions.
Ability to define and implement data quality checks, profiling rules, and reconciliation processes.
Desirable:
Experience troubleshooting complex data issues and proposing resilient solutions in production environments.
Ability to clearly communicate complex technical concepts to non-technical stakeholders and guide decision-making with data.
Awareness of data governance frameworks and compliance standards (e.g., GDPR, CCPA) with experience embedding controls in analytics workflows.
Experience coding in Python, particularly for automation, API integrations, or dashboard development.
Sports industry experience or passion for applying analytics in the sports domain.
Feel free to follow my company page (OTA Recruitment Limited) on LinkedIn for more on current and future vacancies.