Sr ETL Architect (Snowflake & AWS)

Overview

On Site
Depends on Experience
Accepts corp to corp applications
Contract - W2
Contract - Independent
Contract - 6 Month(s)

Skills

ETL architecture and framework design
Snowflake data warehousing (data modeling, virtual warehouses, schema design, performance tuning, query optimization, security & access control)
AWS services (S3, Redshift, Glue, Lambda, EC2, RDS, CloudFormation, IAM)
ETL/ELT tools (Informatica, Talend, Matillion, AWS Glue, or equivalent)
SQL, Python, Shell, Scala, Java
Data lake and data warehouse architectures
Data integration from diverse sources (databases, APIs, streaming platforms, flat files, SaaS applications)
Data governance, metadata management, and security best practices
Migration of data platforms to AWS cloud-native solutions
Automation of data workflows, monitoring, and error handling
Streaming technologies (Kafka, Kinesis, Spark Streaming)
DevOps, CI/CD pipelines, and Infrastructure as Code (IaC)
Data orchestration frameworks (Airflow, dbt, Step Functions)
Big data ecosystems (Hadoop, Spark)
Technical leadership, mentoring, and architectural guidance

Job Details

Hi,

Greetings from DIA SOFTWARE SOLUTIONS LLC!

We are reaching out about an exciting direct-client opportunity with one of our clients. Please review the requirements and let me know if you are interested in this position.

Direct Client Req: Sr ETL Architect (Snowflake & AWS), Onsite

Please send resumes to SKUMAR AT DIASOFTWARESOLUTIONS DOT COM.

Job Description:

About the Role
We are seeking an experienced ETL Architect with strong expertise in Snowflake and the AWS Cloud ecosystem to design, develop, and optimize data integration solutions. The ideal candidate will be responsible for architecting scalable ETL pipelines and enabling efficient data movement, transformation, and integration across enterprise systems to support business intelligence, analytics, and advanced data initiatives.

Key Responsibilities
Design and implement ETL architecture, frameworks, and best practices for large-scale data integration projects.
Architect and optimize Snowflake-based data warehouse solutions including schema design, performance tuning, and query optimization.
Lead the migration of existing data platforms to AWS cloud-native solutions.
Develop robust data pipelines using ETL/ELT tools, SQL, and Python/Scala.
Integrate data from diverse sources (databases, APIs, streaming platforms, flat files, SaaS applications).
Implement data governance, metadata management, and security best practices across data pipelines.
Collaborate with data engineers, analysts, and business stakeholders to ensure high-quality, reliable data delivery.
Drive automation of data workflows, monitoring, and error handling.
Provide technical leadership, mentoring, and architectural guidance to engineering teams.
Required Skills & Qualifications
10+ years of experience in data engineering, ETL development, or data architecture.
Proven expertise with Snowflake (Data Modeling, Virtual Warehouses, Query Performance Tuning, Security & Access Control).
Strong knowledge of AWS services such as S3, Redshift, Glue, Lambda, EC2, RDS, CloudFormation, and IAM.
Hands-on experience with ETL/ELT design and implementation using tools like Informatica, Talend, Matillion, AWS Glue, or equivalent.
Strong programming and scripting skills (SQL, Python, Shell, Scala, or Java).
Solid understanding of data lake and data warehouse architectures.
Experience with streaming technologies (Kafka, Kinesis, Spark Streaming) is a plus.
Familiarity with DevOps, CI/CD pipelines, and Infrastructure as Code (IaC) for data deployments.
Excellent problem-solving, communication, and leadership skills.
Preferred Qualifications
Snowflake certification (SnowPro Architect/Advanced).
Experience with modern data orchestration frameworks (Airflow, dbt, Step Functions).
Background in big data ecosystems (Hadoop, Spark).

DIA SOFTWARE SOLUTIONS LLC.

Austin, TX 78727 | Direct:

DIA SOFTWARE SOLUTIONS is an Affirmative Action/Equal Opportunity Employer that supports workplace diversity. All employment decisions are made without regard to race, color, religion, sex, national origin, age, disability, veteran status, marital or family status, sexual orientation, gender identity, or genetic information. All Diasoft staff must be able to demonstrate the legal right to work in the United States. DIA SOFTWARE SOLUTIONS is an E-Verify employer.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
