Senior Data Engineer - Python, AWS, Snowflake, Airflow | Contract | Remote

Remote • Posted 2 hours ago • Updated 2 hours ago

Job Details

Skills

  • Python
  • Snowflake
  • Airflow
  • AWS

Summary

Job Title: Senior Data Engineer (Python, AWS, Snowflake, Airflow)
Location: Remote (Preference for Chicago area candidates)
Duration: 6+ months

Start Date: Mid-May
Interview Process: 2 rounds
Pay rate: $70/HR - $80/HR W2 (negotiable)
 
Overview
We are seeking a Senior Data Engineer to design and build scalable data pipelines supporting enterprise data initiatives. This role will focus on extracting, transforming, and loading both structured and unstructured data into a modern cloud data platform, primarily leveraging AWS and Snowflake.
You will play a key role in building end-to-end data workflows, orchestrating pipelines, and enabling downstream data consumption through optimized data structures and user-facing interfaces.
 
Key Responsibilities
  • Design and develop end-to-end data pipelines using Python
  • Extract data and documents from enterprise data stores using connectors
  • Load and manage data in AWS (S3) and Snowflake
  • Process both structured and unstructured data for downstream consumption
  • Build and maintain data models and tables for business applications
  • Orchestrate workflows using Apache Airflow for scheduling and monitoring
  • Collaborate on building UI/data access layers on top of Snowflake
  • Ensure data quality, scalability, and performance across pipelines
  • Support cross-functional teams with data and reporting needs
 
Required Skills
  • Strong experience in Python (data pipeline development, scripting)
  • Hands-on experience with AWS services (especially S3, Lambda)
  • Expertise in Snowflake (data loading, modeling, performance tuning)
  • Experience with Apache Airflow for orchestration and scheduling
  • Solid understanding of ETL/ELT processes
  • Experience handling both structured and unstructured data
 
Nice-to-Have Skills
  • Experience building UI or data access layers on top of Snowflake
  • Familiarity with data visualization/reporting tools
  • Knowledge of Kafka (not required but beneficial for other projects)
  • Experience with traditional ETL tools
 
Project Overview
  • Build a cloud-based data pipeline using Python
  • Load data into AWS S3, then into Snowflake
  • Transform raw data into structured formats for business use
  • Orchestrate the entire workflow using Airflow
  • Enable end-user access via UI on top of Snowflake
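The project flow above (extract raw records, stage them for S3, load into Snowflake, transform for business use) can be sketched as a minimal Python skeleton. This is an illustrative assumption, not the team's actual code: the function names and columns are invented, and the real S3/Snowflake calls (e.g. via boto3 and the Snowflake Python connector) are replaced with stdlib-only stand-ins so the transform and staging logic stands alone.

```python
import csv
import io
import json

def extract_records(source_docs):
    """Extract raw records from enterprise documents (here: JSON strings).
    In a real pipeline, source-system connectors would produce these."""
    return [json.loads(doc) for doc in source_docs]

def transform_to_structured(records, columns):
    """Flatten semi-structured records into fixed-column rows,
    the 'structured formats for business use' step."""
    return [{col: rec.get(col) for col in columns} for rec in records]

def to_csv_bytes(rows, columns):
    """Serialize rows to CSV bytes, as would be staged in S3
    ahead of a Snowflake COPY INTO load."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=columns)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue().encode("utf-8")

if __name__ == "__main__":
    docs = ['{"id": 1, "name": "a", "extra": true}', '{"id": 2, "name": "b"}']
    rows = transform_to_structured(extract_records(docs), ["id", "name"])
    print(to_csv_bytes(rows, ["id", "name"]).decode("utf-8"))
```

In production, each of these functions would typically map to one Airflow task, with the DAG handling scheduling, retries, and monitoring across the extract, stage, load, and transform steps.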
 
Additional Notes
  • Remote candidates are acceptable; local candidates near Chicago are a plus
  • Interviews will begin shortly, with onboarding targeted for mid-May
 
Please apply with your interest. You may also reach out to me directly.
 
Thank you,
Ashu

 
We provide a comprehensive package, which includes:
Benefits
  • Medical insurance for full-time employees
  • Dental and Vision Insurance
  • Life Insurance, Short-Term Disability, Long-Term Disability, etc.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: 91140885
  • Position Id: 26-00254