Data Engineer/Python Developer/Remote

Overview

Remote / On Site
Pay: DOE (depends on experience)
Accepts corp-to-corp applications
Contract - W2
Contract - Independent
Contract - month Contract

Skills

AWS / Azure / GCP
Luigi

Job Details

Key Responsibilities

  • Design, build, and maintain ETL/ELT data pipelines (see the pipeline sketch after this list)
  • Develop Python-based data processing applications
  • Work with structured and unstructured data at scale
  • Integrate data from multiple sources (APIs, databases, files, streams)
  • Optimize data workflows for performance and reliability
  • Ensure data quality, validation, and monitoring
  • Collaborate with data scientists, analysts, and backend teams
  • Manage and maintain data warehouses/lakes
  • Implement logging, error handling, and automation
  • Follow best practices for security and compliance
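
The sketch below illustrates the kind of pipeline described above. It is a minimal example only, assuming a Python stack with requests, pandas, and SQLAlchemy; the API URL, column names, table name, and connection string are hypothetical placeholders.

    # Minimal ETL sketch: extract from an API, validate, load to a database.
    # All endpoints, schemas, and credentials below are hypothetical.
    import logging

    import pandas as pd
    import requests
    from sqlalchemy import create_engine

    logging.basicConfig(level=logging.INFO)
    log = logging.getLogger("etl")

    def extract(url: str) -> pd.DataFrame:
        """Pull JSON records from a source API."""
        resp = requests.get(url, timeout=30)
        resp.raise_for_status()
        return pd.DataFrame(resp.json())

    def transform(df: pd.DataFrame) -> pd.DataFrame:
        """Basic validation and cleanup: required columns, duplicates, bad timestamps."""
        required = {"id", "created_at", "amount"}  # hypothetical schema
        missing = required - set(df.columns)
        if missing:
            raise ValueError(f"missing columns: {missing}")
        df = df.drop_duplicates(subset="id")
        df["created_at"] = pd.to_datetime(df["created_at"], errors="coerce")
        return df.dropna(subset=["created_at"])

    def load(df: pd.DataFrame, table: str, conn_str: str) -> None:
        """Append the cleaned frame into a warehouse or Postgres table."""
        engine = create_engine(conn_str)
        df.to_sql(table, engine, if_exists="append", index=False)
        log.info("loaded %d rows into %s", len(df), table)

    if __name__ == "__main__":
        frame = transform(extract("https://example.com/api/orders"))  # placeholder URL
        load(frame, "orders_staging", "postgresql://user:pass@host/db")  # placeholder DSN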

Required Skills

Programming

  • Strong Python (Pandas, NumPy, PySpark)
  • Writing clean, modular, and testable code (see the example after this list)
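
As a small illustration of clean, testable code, the hypothetical transform below keeps logic free of I/O so it can be unit-tested directly (for example with pytest); column names are placeholders.

    import pandas as pd

    def add_revenue(df: pd.DataFrame) -> pd.DataFrame:
        """Pure transform: derive revenue = price * quantity without mutating the input."""
        out = df.copy()
        out["revenue"] = out["price"] * out["quantity"]
        return out

    def test_add_revenue():
        df = pd.DataFrame({"price": [2.0, 3.0], "quantity": [5, 4]})
        assert add_revenue(df)["revenue"].tolist() == [10.0, 12.0]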

Databases & Storage

  • SQL (PostgreSQL, MySQL, SQL Server)
  • NoSQL (MongoDB; Cassandra optional)
  • Data Warehouses (Snowflake, Redshift, BigQuery)

Big Data & Processing

  • Apache Spark, Hadoop (preferred)
  • Batch and streaming data processing

Cloud Platforms

  • AWS / Azure / Google Cloud Platform
    • S3, Lambda, Glue, Dataflow, BigQuery, etc.

Data Engineering Tools

  • Airflow, Prefect, Luigi (orchestration; see the DAG sketch after this list)
  • Kafka / Pub/Sub (streaming, optional)
  • dbt (data transformation)
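
The sketch below shows how such a pipeline might be wired in an orchestrator. It assumes Airflow 2.4+; the DAG id, schedule, and task bodies are placeholders.

    # Minimal Airflow DAG sketch (assumes Airflow 2.4+); task bodies are placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pull raw data from the source")  # placeholder

    def transform():
        print("clean and validate records")  # placeholder

    def load():
        print("write results to the warehouse")  # placeholder

    with DAG(
        dag_id="daily_orders_pipeline",  # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform", python_callable=transform)
        t_load = PythonOperator(task_id="load", python_callable=load)

        t_extract >> t_transform >> t_load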

DevOps & Other

  • Git, CI/CD
  • Docker, Kubernetes (nice to have)
  • Linux basics