York Digital Consulting Inc.

Data Pipeline Architect, SIEM & Observability

Hybrid

Toronto, Canada

Freelance

14-10-2025


Skills

Apache Airflow, Encryption, Splunk, CI/CD, Monitoring, Version Control, Azure Data Factory, Data Architecture, Azure, AWS, Terraform, Infrastructure as Code

Job Specifications

Job Title: Data Pipeline Architect, SIEM and Observability

Location: Toronto, Occasional Hybrid

Duration: 4-5 month contract, with possible extensions.

Position Summary

Our customer is seeking a Data Pipeline and Observability Specialist to support the integration and optimization of data ingestion processes using a Data Pipeline Management Platform. The specialist will work with cross-functional technical and business teams to reduce Splunk license costs, archive data to AWS S3 buckets, and develop standardized ingestion templates and transformation models.

Key Responsibilities

Design and support the implementation of data pipelines using Data Pipeline Management Platforms such as DataBahn or Cribl.
Create filtering rules to prevent non-SIEM data from being ingested into Splunk.
Develop additional data pipelines to enable new data-driven opportunities or to streamline existing processes.
Create standard templates and transformation models for ingesting billing and consumption data provided by external service providers.
Evaluate archival storage options and retention policies using AWS S3 and Splunk SmartStore.
Enable Observability Data Pipelines for tools such as Dynatrace and SolarWinds.
Establish governance and controls for data ingestion, retention, and management.
Facilitate knowledge transfer through continuous collaboration with customer resources.
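To illustrate the filtering responsibility above, the sketch below shows the kind of routing logic a pipeline management platform such as Cribl or DataBahn expresses: SIEM-relevant events go to Splunk, everything else to cheaper S3 archival. The field names and sourcetype list are hypothetical, not a real platform configuration.

```python
# Hypothetical sketch: route only SIEM-relevant events to Splunk,
# sending everything else to S3 archival to reduce license costs.
# The "sourcetype" field mirrors a common Splunk convention, but the
# rule set itself is illustrative only.

SIEM_SOURCETYPES = {"wineventlog", "syslog", "aws:cloudtrail", "pan:traffic"}

def route_event(event: dict) -> str:
    """Return the destination for a single event: 'splunk' or 's3_archive'."""
    sourcetype = event.get("sourcetype", "").lower()
    if sourcetype in SIEM_SOURCETYPES:
        return "splunk"
    # Non-security telemetry (metrics, debug logs, etc.) bypasses Splunk.
    return "s3_archive"
```

In a real deployment these rules would live in the platform's own routing configuration rather than application code; the sketch only captures the decision being described.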

Qualifications

Architectural Design & Strategy

Experience designing scalable, fault-tolerant, and distributed data pipelines across hybrid cloud environments.
Ability to define data architecture blueprints, including ingestion, transformation, storage, and access layers.
Expertise in data mesh or data fabric concepts for decentralized data ownership and governance.
Proven ability to integrate heterogeneous data sources (e.g., SaaS, on-prem, cloud-native) into unified pipelines.
Experience with API-based data ingestion and streaming data integration.
Experience consulting with business leaders and technical SMEs in the finance industry to:
Discover new data-driven opportunities.
Streamline existing data-driven processes.
Support existing data-driven initiatives to deliver more value or enable additional outcomes.

Security & Compliance

Experience implementing data encryption, masking, and anonymization techniques within pipelines.
Experience implementing role-based access controls (RBAC) and audit logging for data pipeline components.
Familiarity with compliance frameworks (e.g., SOC 2, ISO 27001) and how they impact data architecture.
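As a small illustration of the masking and anonymization techniques listed above, the sketch below pseudonymizes sensitive fields inside a pipeline stage by replacing them with salted, truncated hashes, so downstream analytics keep referential integrity without exposing raw values. The field list and salt handling are hypothetical.

```python
# Hypothetical sketch of field-level masking in a pipeline stage.
# The sensitive-field list is illustrative; in practice the salt
# would come from a secrets manager, not a default argument.
import hashlib

SENSITIVE_FIELDS = {"username", "src_ip", "account_id"}

def mask_event(event: dict, salt: str = "pipeline-salt") -> dict:
    """Return a copy of the event with sensitive fields pseudonymized."""
    masked = dict(event)
    for field in SENSITIVE_FIELDS & masked.keys():
        digest = hashlib.sha256((salt + str(masked[field])).encode()).hexdigest()
        masked[field] = digest[:16]  # truncated hash as a stable pseudonym
    return masked
```

Because the same input always yields the same pseudonym, joins and counts still work on masked data, which is typically the point of masking rather than outright redaction.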

Performance & Optimization

Ability to optimize pipeline performance through parallel processing, batch vs. stream trade-offs, and resource tuning.
Experience with monitoring and alerting for pipeline health using observability platforms.

Tooling & Automation

Experience with Infrastructure as Code (IaC) tools like Terraform or CloudFormation for pipeline deployment.
Familiarity with workflow orchestration tools such as Apache Airflow, Prefect, or Azure Data Factory.
Knowledge of version control and CI/CD practices for data pipeline lifecycle management.

About the Company

Drawing upon decades of experience in software development, support, operations, and testing, York Digital specializes in accelerating digital system modernization. We work with companies to enhance business agility and deliver maximum value while minimizing costs.