Wilmington, Delaware
14d ago
W2 only.

Key Responsibilities:
- Develop and maintain scalable ETL/ELT data pipelines using PySpark and Python.
- Design, build, and optimize data processing workflows on AWS Cloud (e.g., S3, Glue, EMR, Lambda, Redshift).
- Work with structured and unstructured data to support data analytics and machine learning.
- Implement data quality checks, performance tuning, and monitoring of data processes.
- Collaborate with data scientists, analysts, and business stakeholders to understand data needs.

Required Skills:
Contract
Compensation: Depends on Experience