Snowflake Tech Lead / 53196-1
Dallas, TX
6 months
Onsite Work
Role Description:
Architecture & Design
• Design and implement end-to-end Snowflake-based data architectures for analytics, reporting, and advanced data use cases
• Define data modeling strategies (dimensional, data vault, and analytical models) optimized for Snowflake
• Establish standards for data ingestion, transformation, storage, and consumption
Snowflake Platform Management
• Architect and manage Snowflake features including Warehouses, Databases, Schemas, Cloning, Time Travel, Secure Data Sharing, Data Clean Rooms, and Resource Monitors
• Optimize performance and cost using warehouse sizing, clustering, caching, and query optimization
• Implement security best practices including RBAC, masking policies, row access policies, and data governance
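To illustrate the masking-policy responsibility above: a Snowflake masking policy is a CASE expression that returns the real value only to authorized roles. The following is a minimal Python sketch of that role-based logic (the role name ANALYST_ROLE and the email format are hypothetical, not from the posting):

```python
def mask_email(value: str, current_role: str) -> str:
    """Mimic the CASE logic of a Snowflake masking policy on an email column.

    ANALYST_ROLE is a hypothetical authorized role; all other roles
    receive a partially masked value, as a masking policy would return.
    """
    if current_role == "ANALYST_ROLE":
        return value  # authorized role sees the raw value
    _, _, domain = value.partition("@")
    return "*****@" + domain  # everyone else sees only the domain


print(mask_email("jane@acme.com", "PUBLIC"))
```

In Snowflake itself this logic would live in a `CREATE MASKING POLICY` statement and be attached to the column, so it is enforced centrally rather than in application code.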
Data Transformation & ETL/ELT
• Lead ELT pipeline development using dbt (models, macros, tests, documentation, and deployments)
• Design and implement ETL/ELT pipelines using Snowpark and third-party tools; implement real-time streaming and batch data processing
• Ensure data quality, lineage, and observability across pipelines
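The data-quality bullet above is the kind of thing dbt tests or a custom framework would enforce. As a minimal, framework-agnostic sketch (the check name and result shape are illustrative assumptions, not a real dbt or Snowflake API):

```python
def check_not_null(rows: list[dict], column: str) -> dict:
    """A single data-quality check: no NULLs in the given column.

    Returns a small result record suitable for logging to an
    observability table; the key names here are hypothetical.
    """
    failing_rows = [i for i, row in enumerate(rows) if row.get(column) is None]
    return {
        "check": f"not_null:{column}",
        "passed": not failing_rows,
        "failing_rows": failing_rows,
    }


result = check_not_null([{"id": 1}, {"id": None}], "id")
print(result)
```

In practice the same assertion would typically be declared as a dbt `not_null` test on the model column, with results captured for lineage and observability tooling.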
Cloud & Big Data Integration
• Architect solutions leveraging cloud data services (AWS, Azure, or Google Cloud Platform) such as object storage, messaging, and orchestration services
• Integrate Apache Spark (Databricks or equivalent) for large-scale data processing and advanced transformations
• Support hybrid and multi-cloud data architectures
Development & Automation
• Develop data processing and automation solutions using Python
• Build reusable frameworks for ingestion, transformation, validation, and monitoring
• Implement CI/CD pipelines for data workloads, including dbt and Snowpark deployments
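The "reusable frameworks" bullet above amounts to composing small ingestion/transformation/validation steps into one pipeline. A minimal sketch of that composition pattern (step names and the `source` tag are hypothetical examples):

```python
def build_pipeline(*steps):
    """Compose per-record transformation steps into a single callable."""
    def run(records: list[dict]) -> list[dict]:
        for step in steps:
            records = [step(rec) for rec in records]
        return records
    return run


def strip_whitespace(rec: dict) -> dict:
    # Normalization step: trim string fields on ingestion.
    return {k: v.strip() if isinstance(v, str) else v for k, v in rec.items()}


def tag_source(rec: dict) -> dict:
    # Enrichment step: stamp each record with a (hypothetical) source name.
    return {**rec, "source": "demo_feed"}


pipeline = build_pipeline(strip_whitespace, tag_source)
print(pipeline([{"name": " Ada "}]))
```

The same shape scales up by swapping in real steps (Snowpark transformations, validation checks, monitoring hooks) without changing the framework itself.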
Leadership & Collaboration
• Partner with business stakeholders, analytics, and data science teams to translate requirements into scalable solutions
• Mentor data engineers and analysts on Snowflake, dbt, Snowpark, and data engineering best practices
• Provide architectural guidance, documentation, and design reviews
Required Skills:
• Strong hands-on experience with Snowflake architecture and performance tuning
• Expertise in dbt (models, testing, macros, documentation, environments)
• Solid experience with ETL/ELT frameworks and data integration patterns
• Proficiency in Python for data engineering and automation
• Experience with Snowpark implementation
• Strong knowledge of cloud data services (AWS, Azure, or Google Cloud Platform)
• Advanced SQL and data modeling skills
Skills: Amazon Web Services (AWS) Cloud Computing, Snowflake, PySpark
Experience Required: 8-10 years