Overview
Work Setting: On Site
Compensation: Depends on Experience
Employment Type: Contract - Independent or Contract - W2
Skills
Java
PySpark
Data
Airflow
AWS
GCP
Distributed Systems
Job Details
Direct Client
Two Rounds of Interviews
On W2
6+ years of professional experience in data engineering or software development.
Hands-on experience in Java and Python, with expertise in PySpark.
Proven experience building and automating pipelines with Apache Airflow.
Deep knowledge of AWS and Google Cloud Platform data tooling (e.g., EMR, Dataproc, BigQuery, S3, GCS, Lambda, Composer).
Familiarity with distributed systems architecture (Spark, Kafka, Hadoop ecosystem).
Strong SQL skills and experience with both relational (MySQL, PostgreSQL) and NoSQL databases.