ETL Developer

  • Dallas, TX
  • Posted 19 hours ago | Updated 19 hours ago

Overview

Hybrid
$50 - $60
Contract - W2
No Travel Required

Skills

ETL + Azure + SQL + Data Warehouse + Postgres + CI/CD

Job Details

Description:

We are seeking a highly skilled and experienced Senior ETL Tools Developer with deep expertise in Microsoft Azure Data Factory (ADF) and modern data integration platforms. The ideal candidate will lead the design, development, and optimization of scalable ETL pipelines and build robust integration interfaces with enterprise platforms such as Coupa and SAP.
________________________________________
Key Responsibilities:
  • Design, develop, and maintain robust ETL pipelines using MS Azure Data Factory, MS Azure Databricks, and other Azure services.
  • Build and manage data integration interfaces between internal systems and external platforms such as Coupa and SAP.
  • Collaborate with data architects, analysts, and business stakeholders to understand data requirements and deliver scalable solutions.
  • Optimize data workflows for performance, reliability, and cost-efficiency.
  • Implement data quality checks, error handling, and logging mechanisms.
  • Participate in code reviews, architecture discussions, unit testing, and performance tuning.
  • Ensure compliance with data governance, security, and privacy standards.
  • Mentor junior developers and contribute to best practices in data engineering.
________________________________________
Required Qualifications:
  • Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
  • 5+ years of experience in ETL/ELT development, with at least 2 years focused on Azure Data Factory.
  • Strong proficiency in SQL, Postgres, and data transformation logic.
  • Experience with Azure Blob Storage, Azure Functions, and Databricks.
  • Hands-on experience integrating with Coupa and SAP platforms.
  • Familiarity with CI/CD pipelines and version control tools (e.g., Git, Azure DevOps).
  • Experience with Unix/Linux shell scripting for automation and process orchestration.
  • Expertise in extracting, transforming, and loading data across various sources (e.g., relational databases, flat files, XML, JSON).
  • Solid understanding of data warehousing concepts, dimensional modeling, and big data processing.
  • Excellent problem-solving, communication, and documentation skills.
________________________________________
Preferred / Good-to-Have Qualifications:
  • Experience with IBM DataStage development.
  • Experience with Power BI, Snowflake, or Apache Spark.
  • Knowledge of data lake architecture, data mesh, or data fabric concepts.
  • Microsoft Azure certifications (e.g., DP-203: Data Engineering on Microsoft Azure).
  • Experience with REST/SOAP APIs and middleware platforms for enterprise integration.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.