Job Details
Google Cloud Platform Data Migration Engineer - *REMOTE* - 6-month initial contract
Technical focus: Google Cloud Platform expertise, Python, PySpark, SQL
Must Have:
8+ years of experience in data processing, data engineering, or large-scale data systems.
Python & PySpark (Spark SQL): 3+ years
Airflow (or any orchestration tool): 2+ years
Google Cloud Platform (BigQuery, GCS, Pub/Sub, Cloud Run, Cloud Functions, Cloud SQL): 3+ years
Real-time data ingestion (Kafka, webhooks, file-based): 2+ years
API integration (REST/webhooks): 2+ years
Kubernetes (GKE preferred): 1-2 years
BigQuery SQL & PostgreSQL: 2+ years
YAML/config-driven pipeline design: 2+ years
Schema transformation, hashing, DQF: 2+ years
CI/CD, observability, lightweight dashboards (Grafana/Streamlit/Flask UI): 1+ year
Logistics or healthcare domain exposure: nice to have
Experience Needed:
Strong with PySpark (especially Spark SQL) for complex transformation pipelines
Hands-on with Airflow for orchestration and BigQuery SQL for querying and data modeling
Solid experience with Google Cloud Platform (BigQuery, GCS, Pub/Sub, Cloud Run, Cloud Functions, Cloud SQL)
Comfortable with real-time ingestion: Kafka, webhooks, file-based triggers
Solid API integration skills (REST/webhooks), with ability to handle payload-driven workflows
Experience working in Kubernetes (GKE) for deploying and scaling pipelines
Comfortable handling UI-driven configuration, YAML-based setups, and modular frameworks
Exposure to schema transformation, data validation, hashing, and DQF logic
Domain familiarity with logistics/healthcare (CVS context is a big plus)
Strong ownership mindset; able to work under pressure and balance speed with reliability
Basic understanding of Bash, SFTP transfers, networking, and access management
Bonus: PostgreSQL, CI/CD, monitoring, dashboarding, or lightweight UI development