Senior Snowflake Data Engineer
Location: Toronto (Hybrid – 200 Bay Street)
Engagement: One-year contract with potential for conversion to full-time
Openings: 1
Overview
We are seeking a Senior Snowflake Data Engineer to lead the design, development, and operation of enterprise-scale data platforms built on Snowflake. This role requires deep hands-on expertise with Snowflake-native features, modern ELT pipelines, cloud data architecture, and production-grade data engineering practices. You will play a key role in shaping platform standards, mentoring engineers, and evolving the Snowflake ecosystem.
Responsibilities
- Lead the design and implementation of Snowflake-based data architectures, including schemas, data vault, star schema, and lakehouse models, materialized views, and zero-copy cloning strategies.
- Build, maintain, and optimize production ETL/ELT pipelines using Snowflake-native tools such as Snowpipe, Snowpark, and Streams & Tasks, along with partner tools (e.g., dbt, Fivetran, Matillion, Airbyte, StreamSets).
- Develop Snowflake-native utilities and applications using Snowpark for Python, UDFs, external functions, and internal tooling to accelerate development and data delivery.
- Optimize query performance and cost through warehouse sizing, clustering keys, micro-partition pruning, workload isolation, and resource monitoring.
- Implement data governance, security, and access controls, including role-based access control (RBAC), masking policies, object tagging, audit logging, and data lineage.
- Automate infrastructure and deployments using Infrastructure as Code (IaC), CI/CD pipelines, and automated testing for SQL and Snowpark workloads.
- Build operational observability tooling, including monitoring, alerting, usage and cost reporting, and incident response playbooks.
- Mentor engineers, review technical designs, and contribute to roadmap decisions for the evolution of the Snowflake platform.
Required Skills and Experience
- Strong hands-on experience designing, deploying, and operating Snowflake in production environments.
- Deep expertise with Snowflake features, including Snowpark, Streams & Tasks, Snowpipe, Time Travel, zero-copy cloning, materialized views, external functions, and UDFs.
- Proven ETL/ELT development experience using SQL, dbt, and one or more ingestion tools (e.g., Fivetran, Matillion, Airbyte, StreamSets, Kafka connectors).
- Strong proficiency in Python (Snowpark and connectors), SQL tuning, and performance optimization.
- Experience with Infrastructure as Code and automation tools such as Terraform, GitHub Actions, Jenkins, or equivalent.
- Solid knowledge of cloud platforms (AWS, Azure, or GCP) and their integration with Snowflake.
- Strong understanding of modern data architecture patterns, including medallion architecture, data modeling, governance, and secure data sharing.
- Demonstrated experience implementing CI/CD, automated testing, and production operational best practices for data workloads.
Preferred Qualifications
- SnowPro Core or advanced SnowPro certifications.
- Experience with dbt (Core or Cloud) for modular, scalable SQL transformations.
- Exposure to data catalogs, data lineage tools, or data virtualization technologies.
- Familiarity with BI and analytics integrations such as Looker, Tableau, or Power BI, including Snowflake-optimized semantic layers.
- Experience building internal developer tools or lightweight data applications using Snowpark or web frameworks.
