Sunnyvale, California
Data Engineer with Google Cloud Platform (Onsite)

Mandatory skills: Spark, Scala, Google Cloud Platform, Airflow DAG, ETL, PySpark

Job Description:
1. Design, develop, and automate data processing workflows using Airflow, PySpark, and Dataproc on Google Cloud Platform.
2. Develop ETL (Extract, Transform, Load) processes that handle diverse data sources and formats.
3. Manage and provision Google Cloud Platform resources, including Dataproc clusters, serverless batches, Vertex AI instances, and GCS buckets.
Contract
Depends on Experience
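
For context on the first responsibility listed above, here is a minimal sketch of the kind of workflow involved: an Airflow DAG that submits a PySpark ETL job to a Dataproc cluster on Google Cloud Platform. This is illustrative only and not part of the posting; the project ID, region, cluster name, and GCS path are placeholder assumptions.

# Minimal sketch: Airflow DAG submitting a PySpark job to Dataproc on GCP.
# All project/region/cluster/bucket values are placeholders, not from the posting.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator

PROJECT_ID = "example-project"      # placeholder GCP project ID
REGION = "us-central1"              # placeholder Dataproc region
CLUSTER_NAME = "example-dataproc"   # placeholder Dataproc cluster name

# PySpark job spec: the main ETL script is read from a GCS bucket (placeholder path).
PYSPARK_JOB = {
    "reference": {"project_id": PROJECT_ID},
    "placement": {"cluster_name": CLUSTER_NAME},
    "pyspark_job": {"main_python_file_uri": "gs://example-bucket/etl/transform.py"},
}

with DAG(
    dag_id="example_dataproc_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Submit the PySpark ETL job to the Dataproc cluster once per day.
    run_etl = DataprocSubmitJobOperator(
        task_id="run_pyspark_etl",
        project_id=PROJECT_ID,
        region=REGION,
        job=PYSPARK_JOB,
    )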