Snowflake Data Engineer (with Terraform)
Location: Washington, DC and NYC
Hybrid role
Role summary
We are seeking a highly skilled Snowflake Data Engineer with strong expertise in Terraform to design, build, and manage scalable data workloads on Snowflake. The ideal candidate will be responsible for automating infrastructure deployment, orchestrating data pipelines, and ensuring efficient and secure data operations in a cloud environment.
Key responsibilities
- Design and implement Snowflake architecture components using Terraform modules, including accounts, databases, schemas, virtual warehouses, roles, users, grants, stages, pipes, tasks, and streams (see the first sketch after this list)
- Develop reusable, versioned Terraform modules and maintain remote state backends with state locking (S3, Azure Blob Storage, or GCS)
- Integrate Terraform workflows into CI/CD pipelines (GitHub Actions, GitLab CI, Jenkins, etc.) to enable automated plan/apply and PR-based change control
- Automate deployment of Snowflake TASK objects (scheduled and stream-driven) and implement safe migration strategies for production workloads
- Implement security controls using least-privilege RBAC, object-level grants, and secrets management (HashiCorp Vault or cloud secret stores)
- Collaborate with data engineering teams to onboard pipelines (Snowpipe, ingestion stages, external tables) and ensure Terraform-managed resources match runtime needs
- Monitor, tune, and cost-optimize Snowflake compute usage and storage; implement resource monitors and alerting (see the second sketch after this list)
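To illustrate the kind of work the first responsibilities describe, here is a minimal sketch assuming the Snowflake-Labs/snowflake Terraform provider and an S3 remote state backend with DynamoDB locking. All bucket, table, and object names are placeholders, and resource and argument names (for example, snowflake_role vs. snowflake_account_role and the grant resources) vary across provider versions, so treat this as illustrative rather than a drop-in module.

```hcl
# Minimal sketch: remote state, provider, one database, one warehouse, one role, one grant.
terraform {
  required_providers {
    snowflake = {
      source = "Snowflake-Labs/snowflake"
    }
  }

  # Remote state in S3 with DynamoDB-based state locking (hypothetical bucket/table names).
  backend "s3" {
    bucket         = "example-terraform-state"
    key            = "snowflake/analytics.tfstate"
    region         = "us-east-1"
    dynamodb_table = "example-terraform-locks"
    encrypt        = true
  }
}

# Credentials are expected via SNOWFLAKE_* environment variables or a CI/CD secret store.
provider "snowflake" {}

resource "snowflake_database" "analytics" {
  name    = "ANALYTICS"
  comment = "Managed by Terraform"
}

resource "snowflake_warehouse" "transform" {
  name                = "TRANSFORM_WH"
  warehouse_size      = "XSMALL"
  auto_suspend        = 60
  auto_resume         = true
  initially_suspended = true
}

# Older provider versions use snowflake_role; newer ones rename it snowflake_account_role.
resource "snowflake_role" "transformer" {
  name = "TRANSFORMER"
}

# Grant resources have changed significantly across provider versions
# (snowflake_database_grant vs. snowflake_grant_privileges_to_account_role);
# the block below follows the newer style and may need adjusting.
resource "snowflake_grant_privileges_to_account_role" "db_usage" {
  account_role_name = snowflake_role.transformer.name
  privileges        = ["USAGE"]
  on_account_object {
    object_type = "DATABASE"
    object_name = snowflake_database.analytics.name
  }
}
```

In a PR-based workflow, CI would typically run terraform fmt -check, terraform validate, and terraform plan on pull requests, with terraform apply gated behind review and merge, which is the plan/apply automation referred to above.

Continuing the same sketch, the cost-control and task-automation responsibilities might translate into resources along these lines. Again, this is only an illustration: the credit quota, CRON schedule, and REFRESH_REPORTING procedure are hypothetical, and task and monitor argument names have shifted noticeably between provider versions.

```hcl
# Sketch only: quota, thresholds, and schedule are placeholders.
resource "snowflake_resource_monitor" "monthly_cap" {
  name            = "MONTHLY_CAP"
  credit_quota    = 100
  frequency       = "MONTHLY"
  start_timestamp = "IMMEDIATELY"
  notify_triggers = [75, 90] # notify at 75% and 90% of the monthly credit quota
}

# The monitor can then be attached through a warehouse's resource_monitor argument
# (or set account-wide), with suspend thresholds added if hard caps are needed.

# A scheduled TASK. Older provider versions use `enabled`, newer ones `started`;
# stream-driven tasks would add a WHEN condition on SYSTEM$STREAM_HAS_DATA.
resource "snowflake_task" "hourly_refresh" {
  database      = snowflake_database.analytics.name
  schema        = "PUBLIC"
  name          = "HOURLY_REFRESH"
  warehouse     = snowflake_warehouse.transform.name
  schedule      = "USING CRON 0 * * * * UTC"
  sql_statement = "CALL ANALYTICS.PUBLIC.REFRESH_REPORTING()" # hypothetical stored procedure
  enabled       = true
}
```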
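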
Suggested qualifications (optional)
- Strong hands-on experience with Snowflake platform internals and best practices
- Proven experience designing and implementing Terraform modules for Snowflake and cloud resources
- Familiarity with Git-based CI/CD workflows and automated infrastructure testing
- Experience with secrets management solutions (HashiCorp Vault, AWS Secrets Manager, Azure Key Vault, GCP Secret Manager)
- Good understanding of data ingestion patterns (Snowpipe, external tables, streaming) and production migration strategies