Overview
On Site
Depends on Experience
Contract - W2
Contract - 12 Month(s)
100% Travel
Skills
Agile
Amazon Redshift
Amazon S3
Amazon Web Services
Analytical Skill
Apache Airflow
Cloud Computing
Collaboration
Communication
Computer Science
Continuous Delivery
Continuous Integration
Data Engineering
Data Governance
Data Integration
Data Modeling
Data Processing
Data Quality
Data Visualization
Data Warehouse
Database
Decision-making
Documentation
Extract, Transform, Load (ETL)
Flat File
Git
Information Systems
Microsoft Power BI
Orchestration
Performance Tuning
Python
Regulatory Compliance
SQL
Scala
Scripting
Scrum
Snowflake Schema
AWS Step Functions
Tableau
Version Control
Workflow
Job Details
Position: ETL Developer (Snowflake & AWS)
Location: Stamford, CT (Day 1 On-site)
Mode: Contract (W2)
Job Summary:
We are seeking a highly skilled ETL Developer with strong expertise in Snowflake and AWS to design, develop, and maintain scalable data pipelines and ETL workflows. The ideal candidate will have hands-on experience in cloud-based data warehousing, data integration, and data transformation, enabling data-driven decision-making across the organization.
Key Responsibilities:
- Design, develop, and optimize ETL workflows and data pipelines using Snowflake and AWS services.
- Build data ingestion processes from multiple sources including APIs, databases, and flat files.
- Ensure data quality, integrity, and consistency across all ETL layers.
- Collaborate closely with data architects, business analysts, and stakeholders to gather and refine data requirements.
- Monitor, troubleshoot, and improve ETL job performance.
- Implement automation and CI/CD practices for ETL deployments.
- Maintain clear documentation for ETL logic, data models, and workflows.
Required Skills & Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 9+ years of experience in ETL development and data engineering.
- Proven hands-on experience with Snowflake data modeling, performance tuning, and advanced SQL scripting.
- Strong proficiency with AWS services (S3, Glue, Lambda, Redshift, CloudWatch, etc.).
- Programming experience in Python or Scala for data processing.
- Experience with orchestration tools such as Apache Airflow or AWS Step Functions.
- Familiarity with Git, version control, and CI/CD pipelines.
- Excellent analytical, troubleshooting, and communication skills.
- Snowflake certification (SnowPro Core or SnowPro Advanced: Architect).
- Experience with data visualization tools such as Tableau, Power BI, or QuickSight.
- Knowledge of data governance, security, and compliance practices.
- Experience working in Agile/Scrum development environments.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.