ETL Tools Developer - Dallas, TX (Onsite)


Overview

  • Workplace: On Site
  • Compensation: Depends on Experience
  • Employment type: Contract - W2

Skills

ELT
Linux
Microsoft Azure
Azure Data Factory
Coupa
SAP

Job Details

ETL Tools Developer:

Location: Dallas, TX (Day 1 onsite). Local candidates preferred.

Note: Candidates must attend an in-person client interview.

We are seeking a highly skilled and experienced Senior ETL Tools Developer with deep expertise in Microsoft Azure Data Factory (ADF) and modern data integration platforms. The ideal candidate will lead the design, development, and optimization of scalable ETL pipelines and build robust integration interfaces with enterprise platforms such as Coupa and SAP.

________________________________________

Key Responsibilities:

  • Design, develop, and maintain robust ETL pipelines using Microsoft Azure Data Factory, Azure Databricks, and other Azure services.
  • Build and manage data integration interfaces between internal systems and external platforms such as Coupa and SAP.
  • Collaborate with data architects, analysts, and business stakeholders to understand data requirements and deliver scalable solutions.
  • Optimize data workflows for performance, reliability, and cost-efficiency.
  • Implement data quality checks, error handling, and logging mechanisms.
  • Participate in code reviews, architecture discussions, unit testing, and performance tuning.
  • Ensure compliance with data governance, security, and privacy standards.
  • Mentor junior developers and contribute to best practices in data engineering.

________________________________________

Required Qualifications:

  • Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
  • 5+ years of experience in ETL/ELT development, with at least 2 years focused on Azure Data Factory.
  • Strong proficiency in SQL, PostgreSQL, and data transformation logic.
  • Experience with Azure Blob Storage, Azure Functions, and Databricks.
  • Hands-on experience integrating with Coupa and SAP platforms.
  • Familiarity with CI/CD pipelines and version control tools (e.g., Git, Azure DevOps).
  • Experience with Unix/Linux shell scripting for automation and process orchestration.
  • Expertise in extracting, transforming, and loading data across varied sources and targets (e.g., relational databases, flat files, XML, JSON).
  • Solid understanding of data warehousing concepts, dimensional modeling, and big data processing.
  • Excellent problem-solving, communication, and documentation skills.

________________________________________

Preferred / Good-to-Have Qualifications:

  • Experience with IBM DataStage development.
  • Experience with Power BI, Snowflake, or Apache Spark.
  • Knowledge of data lake architecture, data mesh, or data fabric concepts.
  • Microsoft Azure certifications (e.g., DP-203: Data Engineering on Microsoft Azure).
  • Experience with REST/SOAP APIs and middleware platforms for enterprise integration.