Commerce Data Solutions builds and maintains best-in-class data products that enable business teams to analyze and measure subscriber movements and support revenue-generation initiatives. The Senior Data Engineer will contribute to the Company's success by partnering with business, analytics, and infrastructure teams to design and build data pipelines that facilitate measuring subscriber movements and metrics. Collaborating across disciplines, they will identify internal and external data sources, design table structures, and define the ETL strategy and automated data quality checks.
- Partner with technical and non-technical colleagues to understand data and reporting requirements.
- Work with engineering teams to collect required data from internal and external systems.
- Design table structures and define ETL pipelines to build performant data solutions that are reliable and scalable in a fast growing data ecosystem.
- Develop automated data quality checks.
- Develop and maintain ETL routines using orchestration tools such as Airflow.
- Implement database deployments using tools such as Schema Change.
- Perform ad hoc analysis as necessary.
- Perform SQL and ETL tuning as necessary.
- 5+ years of relevant data engineering experience.
- Strong understanding of data modeling principles, including dimensional modeling and data normalization.
- Good understanding of SQL engines and the ability to conduct advanced performance tuning.
- Ability to think strategically, analyze and interpret market and consumer information.
- Strong communication skills - written and verbal presentations.
- Excellent conceptual and analytical reasoning competencies.
- Comfortable working in a fast‑paced and highly collaborative environment.
- Familiarity with Agile Scrum principles and ceremonies.
- 2+ years of work experience implementing and reporting on business key performance indicators in data warehousing environments, required.
- 2+ years of experience using analytic SQL, working with traditional relational databases and/or distributed systems (Snowflake or Redshift), required.
- 1+ years of experience with programming languages (e.g., Python, PySpark), preferred.
- 1+ years of experience with data orchestration/ETL tools (Airflow, NiFi), preferred.
- Experience with Snowflake, Databricks/EMR/Spark, and/or Airflow.
Required Education
- Bachelor's Degree in computer science, information systems, or related field or equivalent work experience.
The hiring range for this position is $138,900 – $186,200 per year in Santa Monica, CA; $152,100 – $203,900 per year in San Francisco, CA; $145,400 – $195,000 per year in Seattle, WA; and $145,400 – $195,000 per year in New York, NY. The base pay actually offered will take into account internal equity and may vary depending on the candidate's geographic region, job-related knowledge, skills, and experience, among other factors. A bonus and/or long-term incentive units may be provided as part of the compensation package, in addition to the full range of medical, financial, and/or other benefits, dependent on the level and position offered.