Data Engineer

Overview

Remote
$50 - $60 per hour
Contract - W2

Skills

SQL
Python
GCS
Apache Kafka
PySpark
API
PostgreSQL
Continuous Integration
Continuous Delivery

Job Details

Position: Data Engineer

Location: 100% remote

Must Haves:

Python & PySpark (Spark SQL): 3+ years
Airflow (or any orchestration tool): 2+ years
Google Cloud Platform (BigQuery, GCS, Pub/Sub, Cloud Run, Cloud Functions, Cloud SQL): 3+ years
Real-time data ingestion (Kafka, webhooks, file-based): 2+ years
API integration (REST/webhooks): 2+ years
Kubernetes (GKE preferred): 1-2 years
BigQuery SQL & PostgreSQL: 2+ years
YAML/config-driven pipeline design: 2+ years
Schema transformation, hashing, DQF: 2+ years
CI/CD, observability, lightweight dashboards (Grafana/Streamlit/Flask UI): 1+ year

Pluses:

Logistics or healthcare domain exposure
PostgreSQL, CI/CD, monitoring, dashboarding, or lightweight UI development

About DVARN