AWS Data Engineer

Overview

On Site
$60 - $70
Contract - W2
Contract - 6 Month(s)
No Travel Required

Skills

Data Engineering
Collaboration
Communication
Continuous Delivery
Continuous Integration
Apache Iceberg
Apache Spark
Attention To Detail
Business Intelligence
Agile
Amazon Redshift
Amazon S3
Amazon Web Services
Analytics
PySpark
Python
Regulatory Compliance
Scalability
Soft Skills
Redshift Spectrum
Extract, Transform, Load (ETL)
IT Management
Mentorship
Performance Tuning
Documentation
Data Governance
Data Quality
Data Validation
Decision-making
DevOps
Electronic Health Record (EHR)
Employment Authorization
GC
Workflow

Job Details

Must be able to work on PV W2

Looking for an AWS Data Engineer (PV W2)

Must be able to work onsite 4 days per week in Torrance, California

Must be able to attend an in-person (F2F) interview in Torrance, California

Job Description -

THIS IS A PV W2 ROLE

IN-PERSON INTERVIEW REQUIRED

AWS Data Engineer
Location: Torrance, CA (Hybrid 4 Days Onsite per Week)
Duration: 6+ months
Work Authorization: [, ]

Mandatory Skills Required:

5+ years of experience with AWS Glue or AWS EMR building data pipelines

4+ years working with Redshift and Redshift Spectrum

1+ year of hands-on experience with Apache Iceberg

1+ year of experience with AWS ECS/ELB

Job Description:

The AWS Data Engineer will be responsible for designing, building, and maintaining scalable data pipelines that integrate information from various sources to support analytics and decision-making. The role emphasizes ensuring data accuracy, performance optimization, and adherence to compliance standards.

Key Responsibilities:

Develop and Maintain Data Pipelines
Build scalable and efficient ETL workflows using AWS Glue/EMR, Lambda, and Redshift

Leverage PySpark, Apache Spark, and Python for large dataset processing

Develop fault-tolerant and high-performance pipelines for data ingestion and transformation

Ensure Data Quality & Integrity
Cleanse and validate data for consistency and reliability

Implement robust error handling and data validation mechanisms

Performance & Optimization
Tune Redshift queries and AWS services for optimal cost/performance

Improve SLA adherence, scalability, and efficiency of data pipelines

Business & Stakeholder Support
Work closely with data analysts, BI developers, and stakeholders

Translate business needs into technical specifications

Documentation & Governance
Maintain clear documentation of ETL processes and system architecture

Ensure compliance with data governance and security standards

Desired Qualifications (Nice-to-Haves):

Bachelor's/Master's degree in Computer Science or a related field

7-10+ years in data engineering, ETL design, and large-scale data systems

5+ years of hands-on experience with the AWS ecosystem (S3, Glue, Lambda, Athena, EMR, Redshift, etc.)

3+ years of experience working with relational databases/data lakes

Experience with CI/CD, DevOps, and agile delivery practices

Strong communication and collaboration skills

Prior mentoring/technical leadership experience

Soft Skills & Work Style:

Detail-oriented with a strong commitment to data quality

Problem-solving mindset with the ability to work independently and in a team

Willingness to stay current with the latest data engineering trends and tools


About eGrove Systems Corporation