- Company Name: Swapcard
- Job Title: Senior Data Engineer – Remote
- Job Description:
Role Summary:
Design, build, and maintain production-grade data pipelines and transformations that support large-scale analytics and reporting. Manage cloud-based data warehousing, enforce engineering standards, and collaborate with product, analytics, and infrastructure teams to deliver reliable, high-quality data solutions.
Expectations:
- Deliver production code, drawing on 5+ years of professional experience.
- Lead end-to-end pipeline design and ongoing optimization.
- Maintain data quality and reliability in a remote, cross-functional environment.
- Communicate status, risks, and improvements to stakeholders promptly.
Key Responsibilities:
1. Construct efficient data ingestion pipelines from internal databases and SaaS sources via Fivetran, Airbyte, or custom Python scripts.
2. Develop, test, and deploy transformation models using Redshift, dbt, and Dagster.
3. Monitor and troubleshoot the cloud data warehouse (Redshift) to ensure performance, availability, and data integrity.
4. Establish and uphold data engineering best practices, coding standards, and documentation.
5. Collaborate with analytics, product, and infrastructure teams to address data needs and provide architectural guidance.
6. Maintain comprehensive documentation of architecture, processes, and operational procedures.
Required Skills:
- Strong SQL expertise with deep understanding of data warehouse concepts.
- 2+ years of hands‑on experience delivering dbt transformation pipelines.
- Proven experience in building and tuning ETL/ELT pipelines.
- Proficient in Python for data ingestion, transformation, and automation.
- Familiarity with workflow orchestrators (Airflow, Dagster, Prefect, or similar).
- 4+ years of experience operating a cloud data warehouse (AWS Redshift).
- Excellent problem‑solving and debugging capabilities.
- Fluent in English (written and spoken).
Bonus Skills:
- Additional experience with AWS services beyond Redshift (e.g., S3, Glue, Lambda).
- Hands‑on production experience with Dagster (beyond basic familiarity).
- Experience with data visualization tools such as Tableau.
- Infrastructure tooling (Docker, Terraform, Helm).
Required Education & Certifications:
- Bachelor’s degree in Computer Science, Data Engineering, Statistics, or related field.
- AWS Certified Solutions Architect, Data Analytics, or similar certifications are a plus.