- Company Name
- Cedar
- Job Title
- Software Engineer III (Data Platform)
- Job Description
Role Summary: Design and build scalable, high‑quality data pipelines and platform services that enable client‑facing engineering, product integration, and data science teams. These pipelines process tens of millions of records daily, with observability and performance as first‑class concerns.
Expectations: Deliver robust, maintainable code; participate in architecture and platform vision; collaborate cross‑functionally; operate effectively in a fast‑paced, dynamic environment with competing priorities.
Key Responsibilities:
- Evolve integration pipelines from legacy ETL scripts to a modern, scalable stack (dbt, Airflow, Snowflake).
- Engineer for scale, optimizing performance of daily data processing and ensuring data quality and observability.
- Develop a data platform to support current products and future experimentation.
- Partner with delivery engineering to improve integration quality, speed, and scalability.
- Apply engineering best practices: version control, automated unit/integration testing, CI/CD, structured logging, and observability.
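The best‑practices bullet above (data quality, structured logging, observability) can be sketched in miniature. Everything below is illustrative, not Cedar's actual code: the `staging` schema and quality rules are hypothetical, and in‑memory SQLite stands in for a real warehouse such as Snowflake.

```python
import json
import logging
import sqlite3

# Structured (JSON) log lines so pipeline events are machine-parseable
# by whatever observability stack consumes them.
logging.basicConfig(level=logging.INFO, format="%(message)s")
log = logging.getLogger("pipeline")

def run_load(conn, rows):
    """Load rows into a staging table, rejecting records that fail a
    simple data-quality check. Schema (id, amount) is hypothetical."""
    conn.execute("CREATE TABLE IF NOT EXISTS staging (id INTEGER, amount REAL)")
    accepted, rejected = 0, 0
    for row in rows:
        # Quality gate: require an id and a non-negative amount.
        if row.get("id") is None or row.get("amount", 0) < 0:
            rejected += 1
            continue
        conn.execute("INSERT INTO staging VALUES (?, ?)", (row["id"], row["amount"]))
        accepted += 1
    conn.commit()
    # Emit one structured event per load for dashboards and alerting.
    log.info(json.dumps({"event": "load_complete",
                         "accepted": accepted, "rejected": rejected}))
    return accepted, rejected

conn = sqlite3.connect(":memory:")
result = run_load(conn, [{"id": 1, "amount": 9.5},
                         {"id": None, "amount": 2.0},
                         {"id": 2, "amount": -1.0}])
```

Counting accepted and rejected rows, and logging them as a single structured event, is the kind of small habit that makes a daily pipeline observable at tens of millions of records.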
Required Skills:
- 3+ years of software design and systems engineering experience with large, complex datasets.
- Fluency in Python; proficiency with data processing libraries.
- Strong SQL skills and experience with Postgres.
- Familiarity with data ecosystem components: storage, query, ETL orchestration, governance, analytics, visualization, AI/ML pipelines.
- Self‑starter initiative and strong communication.
- Comfortable with version control, automated testing, CI/CD, logging, and observability.
- Nice to have: experience with dbt, Snowflake, Airflow, Dagster, SQLAlchemy, Fivetran, Kafka; SQL query optimization; Kotlin, Go; ETL/ELT, Medallion Architecture, Data Mesh concepts.
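As a small illustration of the SQL query optimization mentioned in the nice‑to‑haves, the sketch below uses SQLite's `EXPLAIN QUERY PLAN` (standing in for Postgres's `EXPLAIN`) to show an index turning a full table scan into an index search; the `claims` table and its columns are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE claims (id INTEGER PRIMARY KEY, patient_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO claims (patient_id, amount) VALUES (?, ?)",
    [(i % 100, float(i)) for i in range(1000)])

query = "SELECT * FROM claims WHERE patient_id = 42"

# Without an index, filtering on patient_id forces a full table scan.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# With an index on the filter column, the planner can seek directly.
conn.execute("CREATE INDEX idx_claims_patient ON claims (patient_id)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
```

Reading the plan before and after adding an index is the basic workflow behind most query tuning, whether in SQLite, Postgres, or Snowflake.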
Required Education & Certifications:
- Bachelor’s degree in Computer Science, Data Science, Engineering, or a related field; certifications are not required.