Overview
On Site
Depends on Experience
Accepts corp to corp applications
Contract - W2
Contract - 1 Month
Skills
Accountability
Analytics
Collaboration
Continuous Delivery
Continuous Integration
Data Engineering
Data Governance
Data Quality
Databricks
Job Details
We are looking for a Data Engineer (Remote / Telecommute) for our client in Dallas, TX.
Job Title: Data Engineer - Remote / Telecommute
Job Type: Contract
Job Description:
- The Data Engineer is responsible for building, optimizing, and maintaining data pipelines, ensuring high data quality, strong observability, and reliable platform performance.
- This role supports modern data infrastructure, migration efforts, automation workflows, and cross-team collaboration to ensure scalable and efficient data operations.
Required Skills and Experience:
- Advanced SQL skills for table management, deprecation, and data querying.
- Python scripting for automation, ETL workflows, and alert tooling.
- Experience with Airflow including DAG creation, dependency management, and alert configuration.
- Strong version control and CI/CD experience, including Git and deployment pipelines.
- Ability to configure alerts and reduce false positives using observability tools such as Monte Carlo.
- Experience integrating observability tooling with orchestration and monitoring platforms.
- Ability to perform root cause analysis on alert triggers and pipeline issues.
- Experience supporting legacy-to-modern data platform migrations.
- Ability to debug upstream dependencies and external data source failures.
- Experience documenting and handing off pipelines.
- Ability to create and maintain technical documentation including runbooks and alert resolution guides.
- Experience collaborating with cross-functional teams.
- Understanding of data governance and accountability models.
- Experience with table deprecation, cleanup, and alert consolidation.
- Relevant data engineering experience supporting large-scale data workflows, pipelines, and monitoring.
- Experience working with Snowflake, Databricks, Salesforce, Airflow, or equivalent tools.
Responsibilities:
- Build and maintain automated data pipelines and workflows.
- Manage data quality alerts, reduce noise, and resolve pipeline issues.
- Support migration of legacy data systems to modern platforms.
- Analyze and resolve upstream and dependency-related issues.
- Maintain clean, well-documented, and efficient data infrastructure.
- Collaborate with engineering and analytics teams to ensure alignment and data quality.
- Create technical documentation including wikis, runbooks, and process documentation.
- Consolidate and optimize alerts and monitoring configurations.
Key Skills:
- Advanced SQL and Python.
- Airflow DAG development and orchestration.
- Observability and monitoring tools.
- Troubleshooting and debugging skills.
- Strong technical communication and documentation ability.
Education:
- Relevant education or equivalent practical experience in data engineering, software engineering, or a related technical field.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.