Lead and mentor a team of platform and data engineers responsible for building and maintaining shared data and visualization platforms
Define and drive the platform roadmap across Kubernetes-based services, Snowflake data environments, and dbt-based transformation layers
Architect, implement, and oversee Kubernetes clusters supporting data services, microservices, and visualization backends, including autoscaling, observability, and security best practices
Own the Snowflake analytics platform, including warehouse strategy, schema design, performance optimization, cost governance, and role-based access control
Implement and maintain dbt (dbt Core / dbt Cloud) projects, standards, and development workflows to ensure reliable, version-controlled data transformations
Partner with Analytics and BI teams to support visualization platforms (Tableau, Power BI, Looker, and custom dashboards), ensuring fast, secure, and reliable access to curated data models
Establish and enforce engineering standards for CI/CD, infrastructure as code, testing, monitoring, and incident response
Collaborate with security, compliance, and data governance teams to implement controls such as RBAC, row- and column-level security, secrets management, and audit logging
Translate business and product requirements into scalable platform capabilities and reusable engineering patterns
Track, measure, and optimize platform reliability and performance metrics (SLIs/SLOs), including pipeline SLAs, query performance, and dashboard response times
Manage vendor relationships and licensing for tools such as Snowflake, dbt, BI platforms, and Kubernetes ecosystem tooling
Drive continuous improvements in developer experience, including self-service onboarding, documentation, templates, and golden paths
8+ years of experience in software or data engineering, with 3+ years in a technical leadership or engineering management role
Strong, hands-on experience with Kubernetes (EKS, AKS, GKE, or on-prem), including deployment, scaling, networking, and observability
Proven experience designing and operating Snowflake as a centralized analytics and data warehouse platform
Practical experience implementing and managing dbt (dbt Core / dbt Cloud), including best practices for modularity, testing, and documentation
Solid understanding of BI and visualization tools (Tableau, Power BI, Looker, or similar) and how to design data models for reporting and dashboard use cases
Strong knowledge of modern data engineering practices, including ELT/ETL, batch and streaming pipelines, data quality, and data observability
Proficiency in SQL and at least one programming language (e.g., Python), along with Git-based development workflows
Experience with CI/CD pipelines and infrastructure-as-code tools such as Terraform, Helm, Argo CD, GitHub Actions, Azure DevOps, or Jenkins
Demonstrated ability to lead, mentor, and grow engineering teams, including goal setting, performance management, and fostering a culture of ownership
Excellent communication and stakeholder management skills, with the ability to clearly articulate technical concepts and business value