Senior ETL Tools Developer - Dallas, TX


Overview

On Site
Depends on Experience
Contract - W2

Skills

Apache Spark
Big Data
Collaboration
Communication
Computer Science
Conflict Resolution
Continuous Delivery
Continuous Integration
Data Engineering
Data Governance
Databricks
Dimensional Modeling
ELT
Extract, Transform, Load (ETL)
Git
Data Lake
Data Processing
Data Quality
Data Warehouse
DevOps
IBM InfoSphere DataStage

Job Details

Role: Senior ETL Tools Developer
Location: Dallas, TX

Job Description

We are looking for a Senior ETL Tools Developer with strong hands-on experience in Microsoft Azure Data Factory (ADF) and modern data integration tools. The ideal candidate will design, develop, and optimize large-scale ETL pipelines and build seamless integrations between enterprise systems such as Coupa and SAP.

Key Responsibilities

Design, develop, and maintain scalable ETL pipelines using Azure Data Factory, Databricks, and related Azure services.

Build and manage integrations between internal systems and external platforms (Coupa, SAP).

Collaborate with data architects and business teams to define and deliver data solutions.

Optimize data workflows for performance, reliability, and cost-efficiency.

Implement robust data quality checks, error handling, and logging.

Participate in code reviews, testing, and performance tuning.

Ensure compliance with data governance and security policies.

Mentor junior developers and support best practices in data engineering.

Required Skills & Qualifications

Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.

5+ years of experience in ETL/ELT development; 2+ years in Azure Data Factory.

Strong SQL and PostgreSQL skills for data transformation.

Experience with Azure Blob Storage, Azure Functions, and Databricks.

Hands-on integration experience with Coupa and SAP.

Familiarity with CI/CD pipelines and version control tools (Git, Azure DevOps).

Experience with Unix/Linux shell scripting for automation.

Strong knowledge of data warehousing, dimensional modeling, and big data processing.

Excellent communication and problem-solving skills.

Preferred / Nice-to-Have

Experience with IBM DataStage.

Knowledge of Power BI, Snowflake, or Apache Spark.

Understanding of data lake, data mesh, or data fabric concepts.

Microsoft Azure certification (DP-203) or equivalent.

Experience with REST/SOAP APIs and middleware integrations.
