**Company Name**
VDart Digital
**Job Title**
Senior Snowflake Developer – Python Data Architect
**Job Description**
**Role Summary**
Design, develop, and manage enterprise Snowflake data platforms that process petabyte‑scale analytics workloads. Architect data models, optimize query performance, and build production‑grade Python pipelines that deliver 99.99% reliability and sub‑15‑minute data freshness.
**Expectations**
- 8–12 years of data engineering experience, with 4+ years in Snowflake.
- Proven track record leading Snowflake architecture and Python data pipelines at scale.
- Strong hands‑on experience with DevOps practices, CI/CD, and data orchestration tools.
**Key Responsibilities**
- Design and maintain the Snowflake data warehouse: multi‑warehouse sizing, clustering strategies, materialized views, and search optimization.
- Own Snowflake advanced features: Snowpark Python UDFs, streams/tasks, zero‑copy cloning, auto‑clustering, and secure sharing across accounts.
- Architect and build scalable ETL/ELT pipelines using Snowpark DataFrames, Python stored procedures, and external functions (see the Snowpark sketch after this list).
- Integrate Snowflake with cloud services (AWS S3/Glue, Azure Data Factory) and optimize Pandas/Arrow data movement.
- Implement automation for dynamic warehouse sizing, resource monitoring, query tagging, and cost controls.
- Validate data pipelines with Great Expectations and reconciliation logic, and enforce data governance (RBAC, dynamic data masking, secure views, Time Travel).
- Drive performance tuning: achieve ≥100× query acceleration via clustering, materialized views, and caching.
- Collaborate with data science, analytics, and operations teams to ensure data availability and quality.
- Lead DevOps practices: Git branching, CI/CD (GitHub Actions), dbt deployments, Airflow DAGs, and Snowflake task orchestration.
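
To give a concrete sense of the Snowpark work described above, here is a minimal ELT sketch: it aggregates an assumed `RAW.ORDERS` table into an assumed `CURATED.DAILY_ORDER_TOTALS` table using Snowpark DataFrames so the transformation is pushed down into Snowflake. All table, column, warehouse, and role names are hypothetical placeholders, not specifics of this role.

```python
# Minimal Snowpark ELT sketch. RAW.ORDERS, CURATED.DAILY_ORDER_TOTALS,
# and the connection details below are illustrative assumptions only.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_, to_date

# In production these would come from a secrets manager or environment variables.
connection_parameters = {
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "role": "TRANSFORMER",
    "warehouse": "ELT_WH",
    "database": "ANALYTICS",
}

def build_daily_totals(session: Session) -> None:
    """Aggregate raw orders into a curated daily-totals table (pushdown ELT)."""
    orders = session.table("RAW.ORDERS")

    daily_totals = (
        orders
        .filter(col("STATUS") == "COMPLETED")               # keep settled orders only
        .with_column("ORDER_DATE", to_date(col("ORDER_TS")))
        .group_by("ORDER_DATE", "REGION")
        .agg(sum_(col("AMOUNT")).alias("TOTAL_AMOUNT"))
    )

    # Overwrite keeps the sketch idempotent; a production pipeline would likely MERGE.
    daily_totals.write.mode("overwrite").save_as_table("CURATED.DAILY_ORDER_TOTALS")

if __name__ == "__main__":
    session = Session.builder.configs(connection_parameters).create()
    try:
        build_daily_totals(session)
    finally:
        session.close()
```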
**Required Skills**
- Snowflake: Virtual warehouses, Snowpark Python, streams/tasks, clustering, auto‑clustering, materialized views, zero‑copy cloning, query acceleration.
- Python: Snowpark DataFrames, Pandas optimization, multiprocessing, logging, external function integration.
- SQL: Advanced window functions, lateral joins, query profiling, result caching.
- Cloud Platforms: AWS S3/Glue, Azure Data Factory, serverless functions.
- DevOps & Orchestration: Git, GitHub Actions, CI/CD pipelines, dbt, Airflow, Azure Data Factory.
- Monitoring & Validation: Great Expectations, resource monitors, performance dashboards (a cost‑monitoring sketch follows this list).
- Security & Governance: RBAC, dynamic data masking, secure views, Time Travel compliance.
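
As an example of the monitoring and cost‑control work listed above, the following sketch uses the snowflake-connector-python package to tag its session queries and check 24‑hour credit consumption per warehouse from SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY. The warehouse name, role, environment variables, and the 100‑credit threshold are illustrative assumptions, not requirements of the role.

```python
# Hedged sketch: query tagging plus a simple daily credit check per warehouse.
# MONITOR_WH, the env var names, and ALERT_CREDITS are assumptions for illustration.
import os
import logging

import snowflake.connector

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("wh_cost_check")

ALERT_CREDITS = 100.0  # assumed daily credit budget per warehouse

def main() -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="MONITOR_WH",
        role="ACCOUNTADMIN",  # ACCOUNT_USAGE views require a privileged role
    )
    try:
        cur = conn.cursor()
        # Tag the session so these queries are attributable in QUERY_HISTORY.
        cur.execute("ALTER SESSION SET QUERY_TAG = 'cost-monitoring-job'")

        # Credits consumed per warehouse over the last 24 hours.
        cur.execute(
            """
            SELECT warehouse_name, SUM(credits_used) AS credits
            FROM snowflake.account_usage.warehouse_metering_history
            WHERE start_time >= DATEADD('day', -1, CURRENT_TIMESTAMP())
            GROUP BY warehouse_name
            ORDER BY credits DESC
            """
        )
        for warehouse_name, credits in cur.fetchall():
            if credits > ALERT_CREDITS:
                log.warning("Warehouse %s used %.1f credits in 24h", warehouse_name, credits)
            else:
                log.info("Warehouse %s used %.1f credits in 24h", warehouse_name, credits)
    finally:
        conn.close()

if __name__ == "__main__":
    main()
```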
**Required Education & Certifications**
- Bachelor’s (or higher) degree in Computer Science, Software Engineering, Data Engineering, or a related field.
- Snowflake SnowPro Advanced certification is highly preferred.
- Experience with cloud‑native data tools (AWS/GCP/Azure) and data orchestration frameworks.