Job Details
Position: Data Engineer
Location: 100% remote
Must Haves:
Python & PySpark (Spark SQL): 3+ years
Airflow (or any orchestration tool): 2+ years
Google Cloud Platform (BigQuery, GCS, Pub/Sub, Cloud Run, Cloud Functions, Cloud SQL): 3+ years
Real-time data ingestion (Kafka, webhooks, file-based): 2+ years
API integration (REST/webhooks): 2+ years
Kubernetes (GKE preferred): 1-2 years
BigQuery SQL & PostgreSQL: 2+ years
YAML/config-driven pipeline design: 2+ years
Schema transformation, hashing, DQF: 2+ years
CI/CD, observability, and lightweight dashboards (Grafana/Streamlit/Flask UI): 1+ year
Pluses:
Logistics or healthcare domain exposure
Deeper experience with PostgreSQL, CI/CD, monitoring, dashboarding, or lightweight UI development