Overview
Work Arrangement - Hybrid
Compensation - Depends on Experience
Employment Type - Contract, W2
Contract Duration - 6 Month(s)
Skills
ETL
Python
PySpark
Spark
Agile
AWS data services
Git
Jenkins
Terraform
Airflow
Job Details
Job Title - Data Engineer
Contract Duration - 6+ Months
Location - Hybrid, 3-4 days per week onsite in McLean, VA
Former Capital One experience required
Key Responsibilities:
- Design, build, and optimize ETL pipelines using Python, PySpark, and Spark
- Develop scalable data solutions leveraging Databricks, AWS Glue, EMR, and S3
- Collaborate with cross-functional engineering and analytics teams to implement best practices in data ingestion, transformation, and storage
- Support data quality, performance tuning, and process automation across the data lifecycle
- Work in Agile environments with CI/CD and version control tools
Required Skills and Experience:
- 3 to 7+ years of experience in data engineering, preferably in cloud-based environments
- Strong proficiency in Python, PySpark, Spark, and SQL
- Hands-on experience with AWS data services (S3, Glue, EMR, Redshift, Lambda, Athena)
- Experience with Databricks or equivalent data lake platforms
- Familiarity with modern DevOps practices (Git, Jenkins, Terraform, Airflow, etc.)