Opening for Data Engineer :: Contract :: Texas - Remote

Overview

Remote
Accepts corp-to-corp applications
Contract - Independent
Contract - W2
Contract - 30 days

Skills

  • Python expert (idiomatic Python 3.11+)
  • Heavy data engineering experience
  • Experience working alongside Data Scientists, assisting with model training and prediction frameworks
  • Nice to have: Azure Machine Learning and Azure Data Factory experience

Job Details

Job Title: Data Engineer
Location: Texas (Remote)
Job Type: Contract
Travel: 10% to Dallas/Dallas Airport
Must Have:
  • Python expert (idiomatic Python 3.11+)
  • Heavy data engineering experience
  • Experience working alongside Data Scientists, assisting with model training and prediction frameworks
Job Summary
We're seeking an experienced Data Engineer with deep expertise in Python 3.11+ who can design and develop high-throughput data pipelines and collaborate closely with Data Scientists on model training and production prediction systems. Familiarity with Azure Machine Learning and Azure Data Factory is a strong advantage.
Key Responsibilities
  • Design, build, and maintain scalable ETL/ELT data pipelines in idiomatic Python 3.11+, handling both batch and streaming workloads
  • Collaborate with Data Scientists to operationalize ML models: assist with model training, deployment, and end-to-end prediction frameworks
  • Optimize data storage solutions (SQL/NoSQL/data warehouses) and implement transformations in Python and SQL for analytical and ML use cases
  • Integrate and manage message queues (e.g., Kafka, RabbitMQ) for asynchronous data processing workflows
  • Monitor, log, troubleshoot, and optimize data pipeline performance and reliability
  • (Nice to have) Design and maintain ML workflows using Azure Machine Learning, and manage data orchestration with Azure Data Factory
Required Qualifications
  • 4+ years of professional experience in Python (3.11+), including clean, idiomatic code practices
  • Proven history of building scalable data pipelines, data models, and ETL/ELT workflows
  • Strong SQL and database expertise (PostgreSQL, DynamoDB, or equivalents)
  • Experience integrating REST APIs and message-driven frameworks in data engineering contexts
  • Familiarity with version control and CI/CD systems (e.g., GitLab, Azure DevOps)
  • Excellent collaboration and communication skills with technical and data science stakeholders
Preferred (Nice-to-Have)
  • Hands-on experience with Azure Machine Learning and Azure Data Factory
  • Familiarity with Azure Databricks, Spark/PySpark, or similar big data frameworks
  • Experience with Airflow or similar orchestration tools
  • Experience containerizing workloads (Docker/Kubernetes) for deployment and scaling
  • Experience with MLOps or supporting production ML systems through monitoring, retraining, and versioning
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.