Senior Data Engineer (Data Quality & Governance)

Job ID: 74610
Posted today
Location: Toronto, Ontario (Remote)
Type: Contract
Rate: 50 - 65/hr

Job Details

SENIOR DATA ENGINEER - DATA QUALITY & GOVERNANCE (Canada)

This role supports the expansion of a newly established Data Centre of Excellence within the CTO organization as the company centralizes data ownership across Core, RAN, Transport, and Network Operations on Google Cloud Platform. The need is driven by a growing skills gap as the organization shifts toward a quality-first data engineering model and scales its enterprise data platforms.

The Data Engineer will act as a quality gate for GCP-based data pipelines, embedding automated data validation, monitoring, and governance into data workflows to ensure reliable, audit-ready, and trusted data. The ideal candidate is a senior-level data engineer with strong hands-on GCP experience who is passionate about data quality, pipeline reliability, and operational excellence. This individual is comfortable working across engineering, architecture, and security teams, proactively identifying issues before they impact downstream consumers, and continuously improving data pipelines through root cause analysis and infrastructure-as-code practices.

Must-Have Skills
  • Senior-level experience as a Data Engineer or Data Platform Engineer in cloud environments

  • Strong hands-on experience with Google Cloud Platform, including BigQuery and Cloud Storage

  • Experience building and operating batch and streaming pipelines using GCP-native services (e.g., Dataflow, Composer, Pub/Sub)

  • Familiarity with GCP data governance and metadata tooling (e.g., Dataplex or equivalent)

  • Experience with Infrastructure as Code (Terraform or similar) in compliant environments

  • Strong SQL and Python skills for data validation, automation, and tooling

  • Experience with data quality, observability, monitoring, and incident response

  • Experience working with containerized workloads and/or Kubernetes


Nice to Have

  • Telecom or network data experience

  • Exposure to AI/ML use cases, such as anomaly detection or intelligent alerting

  • Experience in regulated or large enterprise environments

  • GCP or AWS certifications


Key Responsibilities

  • Design, build, and operate reliable batch and streaming data pipelines on GCP

  • Embed automated data validation, quality checks, and monitoring into data workflows

  • Implement and support data governance, metadata management, and lineage practices

  • Ensure pipelines meet enterprise standards for quality, performance, security, and compliance

  • Proactively identify data and pipeline issues before they impact downstream consumers

  • Perform root cause analysis and implement improvements using Infrastructure as Code

  • Partner closely with architects, cloud engineers, security, and platform teams

  • Support audit readiness through documentation, controls, and observability


    TSG is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
    #LI-KG1
