Company Name: Sia
Job Title: Senior Data Engineer Consultant
Job Description:
Role Summary: Lead client-facing data engineering engagements, co‑design scalable data architectures, and deliver end‑to‑end solutions from strategy to execution. Drive internal offerings, thought leadership, and business development while mentoring junior consultants.
Expectations:
- Own end‑to‑end delivery of data engineering projects, acting as trusted advisor to senior stakeholders.
- Advance the firm’s data engineering portfolio through research, training, and publications.
- Represent the firm at industry events and contribute to external thought leadership.
- Develop proposals, identify partnership opportunities, and nurture long‑term client relationships.
- Mentor and coach junior- to mid-level consultants, reviewing technical deliverables and supporting career growth.
Key Responsibilities:
- Design robust, scalable data pipelines and architectures for structured and unstructured data.
- Build and maintain data solutions on cloud platforms (AWS, Azure, GCP) and distributed frameworks (Spark, Kafka).
- Develop logical and physical data models; optimize relational, NoSQL, time‑series, and graph databases.
- Implement batch, streaming, and real‑time data integration best practices across diverse data sources.
- Apply CI/CD and DataOps practices to ensure continuous delivery, version control, and automation of data workflows.
- Deliver training, workshops, and internal knowledge‑sharing sessions on data engineering topics.
- Author articles, white papers, and webinars to position the firm as a data engineering thought leader.
- Draft proposals and support new business acquisition through technical expertise and relationship management.
- Supervise project teams, conduct code reviews, and provide constructive feedback to junior consultants.
Required Skills:
- Strong background in data architecture, ETL/ELT pipeline design, and data modeling.
- Proficiency with cloud services (AWS, Azure, GCP), Databricks, Snowflake, Spark, Kafka, and other distributed frameworks.
- Experience with relational, NoSQL, time‑series, and graph database technologies.
- Familiarity with batch, streaming, and real‑time data integration patterns.
- Knowledge of CI/CD tooling, DataOps concepts, version control, and automation.
- Excellent stakeholder management, communication, and presentation skills.
- Ability to mentor and develop less experienced team members.
- Fluency in English and in either Dutch or French required.
Required Education, Experience & Certifications:
- Master’s degree (or equivalent) in Data Science, Econometrics, Applied Mathematics, Computer Science, Engineering, or a related technical field with substantial programming emphasis.
- 3–6 years of relevant professional experience in data engineering, preferably within the Energy or Finance sectors.
- Relevant certifications (e.g., AWS Certified Solutions Architect, Google Cloud Professional Data Engineer, Databricks Certified Data Engineer) are a plus.