Snowflake Data Architect

  • Atlanta, GA

Overview

On Site
Based on experience
Full Time
Contract - Independent
Contract - W2
Contract - 5+ mo(s)

Skills

Analytics
Reporting
Migration
Data Warehouse
Storage
Collaboration
Data Governance
Regulatory Compliance
Innovation
Data Architecture
Data Modeling
Informatica
Extract, Transform, Load (ETL)
ELT
SQL
Cloud Computing
Amazon Web Services
Data Security
Access Control
Data Masking
Performance Tuning
Optimization
Snowflake Schema
Continuous Integration
Continuous Delivery
Workflow
Python
Scala
Data Processing
Orchestration
Real-time
Streaming
Agile
Scrum

Job Details

Role: Snowflake Data Architect
Work location: Atlanta, GA - onsite.
Rate: $100/hr

JOB DESCRIPTION
Key Responsibilities:
* Architect and implement scalable data solutions using Snowflake.
* Design and optimize data models (star, snowflake, normalized) to support analytics and reporting.
* Lead the migration of legacy data warehouses to Snowflake.
* Define and enforce best practices for data ingestion, transformation, and storage.
* Collaborate with data engineers, analysts, and business stakeholders to align data architecture with business goals.
* Implement security, data governance, and compliance standards.
* Monitor and optimize Snowflake performance and cost efficiency.
* Stay current with Snowflake features and industry trends to drive innovation.

Must-Have Skills:
* 5+ years of experience in data architecture or engineering roles.
* 3+ years of hands-on experience with Snowflake.
* Strong expertise in SQL and data modeling (dimensional and normalized).
* Experience with ETL/ELT tools (e.g., dbt, Matillion, Informatica).
* Strong understanding of building ETL/ELT workloads using Snowflake stored procedures and Snowflake SQL.
* Proficiency in cloud platforms (AWS).
* Knowledge of data security, role-based access control, and data masking in Snowflake.
* Experience with performance tuning and cost optimization in Snowflake.

Nice-to-Have Skills:
* Experience in the life
* Snowflake certification (SnowPro Core or Advanced Architect).
* Experience with CI/CD pipelines for data workflows.
* Familiarity with Python or Scala for data processing.
* Experience with orchestration tools (e.g., Airflow, Prefect).
* Knowledge of data cataloging tools (e.g., Alation, Collibra).
* Exposure to real-time data streaming (e.g., Snowpipe).
* Experience in Agile/Scrum environments.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.

About Apolis