Overview
- Location type: On Site
- Compensation: Depends on Experience
- Employment types: Contract - Independent; Contract - W2
- Accepts corp-to-corp applications
Skills
- ETL workflows and data pipelines using Snowflake and AWS services
- Data ingestion from APIs, databases, and flat files
- Snowflake
- AWS services such as S3, Lambda, Glue, Redshift, and CloudWatch
- Python or Scala for data processing
- Apache Airflow or AWS Step Functions
Job Details
POSITION: ETL Developer
DURATION: Long term
LOCATION: Stamford, Connecticut - Onsite
Job Description
Roles & Responsibilities
- Design, develop, and optimize ETL workflows and data pipelines using Snowflake and AWS services.
- Implement data ingestion from various sources including APIs, databases, and flat files.
- Ensure data quality, integrity, and consistency across all ETL processes.
- Collaborate with data architects, analysts, and business stakeholders to understand data requirements.
- Monitor and troubleshoot ETL jobs and performance issues.
- Automate data workflows and implement CI/CD practices for data pipeline deployment.
- Maintain documentation for ETL processes, data models, and data flow diagrams.
Required Qualifications
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 12+ years of experience in ETL development and data engineering.
- Hands-on experience with Snowflake including data modeling, performance tuning, and SQL scripting.
- Proficiency in AWS services such as S3, Lambda, Glue, Redshift, and CloudWatch.
- Strong programming skills in Python or Scala for data processing.
- Experience with orchestration tools like Apache Airflow or AWS Step Functions.
- Familiarity with version control systems (e.g., Git) and CI/CD pipelines.