Role: Mid Level Data Engineer - Google Cloud Platform
Type: Contract
Rate: $30 - $35



Location: Remote
Visa: Any
Job Description:
Mandatory Skills: Google Cloud Platform, AWS, Databricks, PySpark, DBT, Airflow or similar tools
Overview
We are hiring multiple Data Engineers (contractors) to support the build-out of a next-generation data platform. These engineers will focus on pipeline development, data transformation, and the overall platform modernization effort.
Key Responsibilities
Build and maintain data pipelines using Databricks and PySpark
Work with structured and unstructured data sources (JSON, CSV, XML, etc.)
Support data ingestion, transformation, and validation processes
Collaborate with Lead Engineers on implementation and delivery
Ensure data quality and consistency across pipelines
Required Qualifications
Hands-on experience with PySpark and Databricks
Strong SQL and Python skills
Experience building and supporting data pipelines
Familiarity with Airflow or similar orchestration tools
Nice to Have
Experience with Google Cloud Platform / BigQuery
Exposure to DBT
Experience working in evolving or messy data environments