Position: Data Engineer
Location: 100% Remote (EST time zone)
Contract Duration: 6+ months
Employment Type: W2 only

Must-Have Tech Stack:
- Python & PySpark (Spark SQL): 3+ years
- Airflow (or any orchestration tool): 2+ years
- Google Cloud Platform (BigQuery, GCS, Pub/Sub, Cloud Run, Cloud Functions, Cloud SQL): 3+ years
- Real-time data ingestion (Kafka, webhooks, file-based): 2+ years
- API integration (REST/webhooks): 2+ years
- Kubernetes (GKE preferred): 1-2 years
- BigQuery SQL & PostgreSQL: 2+ years
- YAML/config-driven pipelines