Overview
On Site
Depends on Experience
Accepts corp to corp applications
Contract - W2
Contract - 1 Month
Skills
Amazon S3
Amazon Web Services
Analytical Skills
Apache Parquet
Business Intelligence
Cloud Computing
Collaboration
Computer Science
Job Details
We are looking for an AWS Data Engineer for our client in Fort Mill, SC.
Job Title: AWS Data Engineer
Job Type: Contract
Pay Range: $55/hr - $60/hr
Job Description:
- The AWS Data Engineer will design, build, and maintain scalable data pipelines and infrastructure on AWS, leveraging Terraform, Glue, Airflow, and SQL to support ingestion, transformation, and curation of enterprise data.
Requirements:
- 6 years of experience as a Data Engineer or Cloud Engineer.
- Strong expertise in AWS services, including S3, Glue, Glue Catalog, Lake Formation, IAM, Athena, CloudWatch, and Lambda.
- Hands-on proficiency in Terraform (HCL) for infrastructure automation.
- Experience developing Airflow DAGs for orchestration of Glue, S3, and external data flows.
- Strong proficiency in PySpark and Python for ETL scripting.
- Ability to write and optimize complex SQL, including joins, window functions, CTEs, and analytical queries.
- Familiarity with data lake formats such as Iceberg, Parquet, and Delta.
- Experience with CI/CD pipelines using GitHub Actions, CodePipeline, or Jenkins.
- Experience with AWS data engineering, ETL development, and metadata management.
- Experience supporting batch and near real-time data ingestion pipelines.
Responsibilities:
- Design, develop, and deploy AWS infrastructure using Terraform, including S3, Glue, IAM, Lake Formation, and Athena resources.
- Develop and maintain AWS Glue ETL jobs using PySpark or Python Shell for ingestion, transformation, and curation across data layers.
- Integrate Airflow for orchestration of Glue jobs, pipelines, and task dependencies.
- Build and maintain the Glue Catalog and manage metadata aligned with Lake Formation policies.
- Write complex SQL queries for data validation, transformation, and reporting.
- Manage Terraform state files, backend setup, and environment-based deployments.
- Implement ingestion frameworks for batch and near real-time data pipelines.
- Collaborate with Snowflake and BI teams for downstream data consumption.
- Contribute to high availability and disaster recovery strategies for core data components.
Additional Qualifications:
- Familiarity with AWS security best practices, including encryption, IAM roles, and cross-account access.
- Data engineering and ETL development skills.
- AWS cloud engineering and Terraform automation skills.
- SQL optimization skills.
- Airflow orchestration experience.
- Bachelor's degree in computer science, information technology, engineering, or a related field preferred.