- Company Name: TechShack
- Job Title: Databricks Data Architect
- Job Description:
**Job Title:** Databricks Data Architect
**Role Summary:**
Deliver data platform modernization and large‑scale architecture solutions on a 6‑month rolling contract. Provide technical leadership for data lakes, lakehouses, and enterprise data warehouses using Databricks and associated tooling.
**Expectations:**
- Lead design, development, and deployment of enterprise‑grade data architecture.
- Collaborate with stakeholders to translate business requirements into scalable technical solutions.
- Maintain technical excellence, code quality, and best practices in a hybrid environment.
**Key Responsibilities:**
- Architect and implement data lakes, lakehouses, and data warehouse solutions on Azure, AWS, and/or GCP.
- Design and build batch and streaming pipelines using PySpark, Python, SQL, Databricks, SSIS, ADF, Informatica, or DataStage.
- Set up CI/CD pipelines with Azure DevOps, Terraform, or AWS CodePipeline.
- Model enterprise data using ERwin, ER Studio, or Power Designer; integrate with SQL Server, PostgreSQL, Oracle, Redshift, Synapse, Snowflake, or Fabric.
- Implement data governance and MDM with Unity Catalog, Profisee, Alation, or DQ Pro.
- Enable BI reporting in Power BI, Tableau, or QlikView.
- Conduct architecture reviews, performance tuning, and capacity planning.
- Communicate architecture decisions to technical and non‑technical stakeholders.
**Required Skills:**
- 10+ years in BI/Data Warehousing, 4+ years in data/solution architecture.
- Expertise in Databricks, PySpark, Python, SQL, and ETL/ELT tools.
- Experience with batch/streaming pipelines and CI/CD (Azure DevOps, Terraform, CodePipeline).
- Cloud platform experience across Azure, AWS, and/or GCP.
- Enterprise data modeling (ERwin, ER Studio, Power Designer).
- Integration with relational and cloud databases (SQL Server, PostgreSQL, Oracle, Redshift, Synapse).
- Data governance and MDM knowledge (Unity Catalog, Profisee, Alation, DQ Pro).
- BI tool proficiency (Power BI, Tableau, QlikView).
- Strong communication, stakeholder management, and project delivery.
- Sector knowledge in Insurance or Finance is preferred.
**Required Education & Certifications:**
- Bachelor’s or Master’s degree in Computer Science, Information Systems, or related field.
- Relevant certifications (e.g., Databricks Certified Data Engineer, Azure Data Engineer Associate, AWS Certified Data Analytics, Big Data certifications) are highly desirable.