Insight Global

Data Engineer

Hybrid

Toronto, Canada

Freelance

04-02-2026


Skills

Communication, Leadership, Adaptability, Critical Thinking, Python, SQL, Big Data, CI/CD, DevOps, Docker, Kubernetes, Effective Communication, Machine Learning, PyTorch, TensorFlow, Analytical Skills, Azure, AWS, Cloud Platforms, Analytics, Snowflake, Data Science, Artificial Intelligence, Hadoop, Spark, Kafka

Job Specifications

Required Skills & Experience

Qualifications

Expertise in big data technologies (Hadoop, Spark, Kafka)

Advanced SQL proficiency for data investigation and query optimization

Strong Python development skills

Experience with cloud platforms (AWS or Azure)

Knowledge of Snowflake and dimensional modeling

Experience with DevOps tooling (CI/CD, Docker, Kubernetes)

Familiarity with ML frameworks (TensorFlow, PyTorch)

Strong debugging and analytical skills

Behavioral Competencies

Leadership and mentorship

Effective communication with technical and non‑technical audiences

Problem‑solving and critical thinking

Ownership and accountability

Adaptability and continuous learning

Job Description

The Big Data Engineer is responsible for architecting, developing, and maintaining enterprise‑scale data platforms that support analytics, operational reporting, and machine learning initiatives. This role requires deep technical expertise in distributed systems, cloud platforms, and data modeling, combined with strong communication and leadership capabilities.

Responsibilities

Design and implement large‑scale ETL/ELT pipelines using Python, Spark, and distributed processing frameworks

Develop and maintain big data infrastructure leveraging Hadoop, Spark, Kafka, and Kafka Streams

Architect cloud‑native data solutions on AWS or Azure, including serverless components

Build and optimize Snowflake data warehouses using dimensional modeling best practices

Conduct data discovery and source analysis to support new integrations and transformations

Model complex datasets using normalized, denormalized, star, and snowflake schemas

Integrate external systems through RESTful APIs and automated ingestion frameworks

Implement DevOps practices including CI/CD, containerization, and orchestration

Develop real‑time streaming applications using Kafka, Storm, Kinesis, or Pub/Sub

Operationalize machine learning models in collaboration with data science teams

Optimize performance across queries, pipelines, and distributed workloads

Troubleshoot and resolve complex data issues across upstream and downstream systems

Maintain comprehensive documentation for data pipelines, models, and integrations

We may use artificial intelligence tools to assist with the screening, assessment, or selection of potential applicants for this position.

About the Company

Insight Global is an international professional services and staffing company specializing in delivering talent and technical solutions to Fortune 1000 companies across the IT, Non-IT, Healthcare, and Engineering industries. Fueled by staffing and talent experts, Evergreen, our professional services brand, brings technical advisors and culture consultants to help customers tackle their biggest challenges. Insight Global has over 70 locations across North America, Europe, and Asia, and global staffing capabilities in 50+ countries.