Responsibilities:
- Design, develop, and operationalize data pipelines for batch, streaming, and event-based processing
- Create and review conceptual, logical, and physical data models for transactional and analytical use cases
- Design and implement data processes to support DW, ODS, Data Lake, and Lakehouse architectural patterns
- Use open-source and proprietary technologies on the cloud to solve common data engineering problems
- Perform data analysis to solve complex and unique business problems
- Evaluate data quality for existing use cases and implement improvements
- Conduct quality assurance (QA) to ensure reliability and accuracy of solutions before production releases
- Troubleshoot and resolve production issues related to the data platform in a timely manner
- Adapt and stay agile in a fast-paced, value-driven environment
- Provide technical support to junior engineers collaborating on your projects

Business Knowledge / Technical Skills:
- Data structures and algorithms
- Experience with database solutions (OLAP and OLTP)
- Knowledge of web application deployment processes and application architecture
- Ability to establish and follow an SDLC
- Proficiency in cloud technologies (Azure, AWS, or GCP) and their data-related services
- Understanding of horizontal and vertical scaling
- Some experience with Infrastructure as Code (IaC) tools such as ARM templates, Bicep, Terraform, or AWS CloudFormation is a plus
- Excellent interpersonal and communication skills with the ability to work effectively across all levels of the organization
- Working knowledge of project management fundamentals, including agile and continuous improvement methodologies
- Experience with cloud cost optimization
- Knowledge of modern technology platforms, tools, and techniques (e.g., cloud, machine learning, IoT/IIoT)
- Understanding of and experience with data security best practices (encryption, tokenization, masking)
- Basic knowledge of regulatory and compliance policies for data management (CCPA, GDPR, PCI, PII, HIPAA, etc.)
- Error catching and handling in batch and streaming pipelines
- Proficiency in one of the following languages: Python, C#, Java, JavaScript
- Proficiency in SQL
- Experience with warehousing solutions such as Snowflake, BigQuery, Redshift, or Synapse
- Proficiency in effective code management, collaboration, and version control with Git
- Working knowledge of and experience with some of the following: APIs, YAML, Kafka, Airflow, JSON, Avro, Parquet

Professional Experience & Education:
- 7+ years of experience in data engineering
- STEM, Finance, or Economics degree preferred; Master's degree a bonus
- Relevant certification from Azure, AWS, GCP, or IBM
- Cross-industry exposure and experience preferred

Other Requirements:
- Willingness to work in the Eastern Time (ET) zone if required
- Comfortable working from various locations, including the office or home

About Cerberus:
Established in 1992, Cerberus Capital Management, L.P., together with its affiliates, is one of the world's leading private investment firms. Through its team of investment and operations professionals, Cerberus specializes in providing both financial resources and operational expertise to help transform undervalued and underperforming companies into industry leaders for long-term success and value creation.
Cerberus holds controlling or significant minority interests in companies around the world.

The Firm’s proprietary operations team, Cerberus Operations and Advisory Company, LLC (COAC), employs world-class operating executives to support Cerberus’ investment teams in the following areas: sourcing opportunities, conducting highly informed due diligence, taking interim management roles, monitoring the performance of investments, and assisting in the planning and implementation of operational improvement initiatives at Cerberus’ portfolio companies.

Cerberus Technology Solutions is an operating company and subsidiary of Cerberus Capital Management focused exclusively on leveraging emerging technology, data, and advanced analytics to drive transformations. Our expert technologists work closely with Cerberus investment and operating professionals across our global businesses and platforms on a variety of operating initiatives targeted at improving systems and generating value from data.