Overview
On Site
$60 / hr
Contract - W2
Contract - 1 day(s)
Skills
Data Engineer
Job Details
Job Description:
Responsibilities:
- Design, develop, and automate data processing workflows using Airflow, PySpark, and Dataproc on Google Cloud Platform.
- Develop ETL (Extract, Transform, Load) processes that handle diverse data sources and formats.
- Manage and provision Google Cloud Platform resources including Dataproc clusters, serverless batches, Vertex AI instances, GCS buckets, and custom images.
- Provide platform and pipeline support to analytics and product teams, troubleshooting issues related to Spark, BigQuery, Airflow DAGs, and serverless workflows.
- Collaborate with data scientists and analysts to understand data needs and deliver robust solutions.
- Provide timely and effective technical support to internal users (e.g., data analysts, data scientists), addressing their data-related queries and problems.
- Optimize and fine-tune data systems for high performance, reliability, and cost efficiency.
- Perform root cause analysis for recurring issues and collaborate with data analysts and scientists to implement preventative measures to minimize future occurrences.
Requirements:
- Strong programming skills in Python and SQL.
- Hands-on experience with cloud platforms.
- Expertise in Google Cloud Platform data tools: BigQuery, Dataproc, Vertex AI, Pub/Sub, Cloud Functions.
- Strong hands-on experience with Apache Airflow (incl. Astronomer), PySpark, and Python.
- Familiarity with SQL, SparkSQL, Hive, PL/SQL, and data modeling.
- Comfortable supporting distributed data systems and large-scale batch/stream data processing.
- Experience optimizing and supporting Spark jobs and ETL pipelines running on Dataproc.