Data Engineer - Python & AI

  • Washington, D.C.
  • Posted 1 day ago | Updated 1 hour ago

Overview

On Site
Depends on Experience
Contract - W2
Contract - Independent
Contract - 12 Month(s)

Skills

ADF
Amazon SageMaker
Databricks
Docker
ELT
Extract, Transform, Load (ETL)
Flask
PySpark
Python
NumPy
Data Storage
Machine Learning (ML)
Data Engineering
Data Cleansing
Amazon Kinesis
Amazon Web Services
Analytics
Apache Kafka
Artificial Intelligence
Cloud Computing
Collaboration
Google Cloud Platform
Kubernetes
Management
Microservices
Microsoft Azure
Orchestration
Pandas
Real-time
Relational Databases
SQL
Streaming
Training
Workflow

Job Details

Hi,
We have an urgent requirement for the position below with our direct client. Please submit your resume, rate, and contact details.
Role: Data Engineer - Python & AI
Duration: Long Term
Location: Washington, D.C.

Position Overview:

We are looking for a Data Engineer with strong Python skills and exposure to AI/ML pipelines. The candidate will be responsible for building and maintaining data pipelines, preparing data for analytics and machine learning, and supporting AI-driven solutions.

Key Responsibilities:

  • Develop and maintain ETL/ELT pipelines to process data from multiple sources.
  • Build data workflows using Python and SQL.
  • Work with Data Scientists to prepare data for model training and deployment.
  • Implement feature engineering, data cleaning, and transformation pipelines.
  • Support deployment of AI/ML models into production.
  • Manage and optimize data storage in data lakes and data warehouses.
  • Collaborate with teams to troubleshoot data and integration issues.

Required Skills:

  • 3-7 years of experience in data engineering.
  • Strong knowledge of Python (Pandas, NumPy, PySpark).
  • Solid experience with SQL and relational databases.
  • Exposure to AI/ML workflows (data prep, feature engineering).
  • Experience with cloud platforms (AWS, Azure, or Google Cloud Platform).
  • Good understanding of ETL pipelines and orchestration tools (Airflow, ADF, Databricks).

Nice to Have:

  • Experience with MLOps tools (MLflow, SageMaker, Azure ML).
  • Knowledge of APIs/microservices using FastAPI or Flask.
  • Familiarity with real-time streaming (Kafka, Kinesis).
  • Hands-on with Docker/Kubernetes.