AWS Data Engineer with experience in data warehouses, data lakes, and ETL

Overview

Remote
$75 - $80 per hour
Contract - W2
Contract - Independent
Contract - 12 Month(s)
No Travel Required

Skills

AWS
data lake
data warehouse

Job Details

AWS Data Engineer
Remote
Experience: 10+ years
Candidates must complete a Glider assessment
At least 6-8 years of experience working with data warehouses, data lakes, and ETL pipelines
Proven experience building optimized data pipelines using Snowflake and dbt
Expert in orchestrating data pipelines using Apache Airflow, including authoring, scheduling, and monitoring workflows
Exposure to AWS and proficiency in cloud services such as EKS (Kubernetes), ECS, S3, RDS, IAM, etc.
Experience designing and implementing CI/CD workflows using GitHub Actions, Codeship, Jenkins, etc.
Experience with tools such as Terraform, Docker, and Kafka
Strong experience with Spark using Scala and Python
Advanced SQL knowledge, with experience authoring complex queries and strong familiarity with Snowflake and relational databases such as Redshift and Postgres
Experience with data modeling and system design, architecting scalable data platforms and applications for large enterprise clients
A dedicated focus on building high-performance systems
Exposure to building data quality frameworks
Strong problem-solving and troubleshooting skills, with the ability to identify and resolve data engineering issues and system failures
Excellent communication skills, with the ability to communicate technical information to non-technical stakeholders and collaborate effectively with cross-functional teams
The ability to envision and build scalable solutions that meet the diverse needs of enterprise clients with dedicated data teams