ETL Developer

Overview

Work Site: Remote / On Site
Compensation: Depends on Experience
Contract Type: Contract - W2

Skills

Agile
DevOps
Data Integrity
Data Quality
Data Governance
Regulatory Compliance
Microsoft SQL Server
Modeling
Oracle
Python
SQL
Scala
Snowflake Schema
Streaming
Scheduling
Problem Solving
RDBMS
Workflow

Job Details

Position: ETL Developer

Contract: W2 Only

Responsibilities

  • Design, develop, and maintain ETL pipelines to extract, transform, and load data from diverse sources into enterprise data warehouses.

  • Collaborate with business stakeholders and data analysts to gather requirements and define data integration objectives.

  • Develop and optimize data workflows using tools such as SSIS, Azure Data Factory, Databricks, or Informatica.

  • Implement data quality checks, validations, and transformation logic to ensure accuracy and consistency.

  • Build and maintain data models (star schema, snowflake, normalized) to support reporting and analytics needs.

  • Work with large-scale datasets across relational databases, cloud data warehouses, and big data platforms.

  • Monitor, troubleshoot, and resolve ETL failures, performance issues, and bottlenecks.

  • Apply version control (Git), CI/CD pipelines, and DevOps practices for ETL development and deployment.

  • Ensure adherence to data governance, security, and compliance standards in all ETL processes.

  • Collaborate with BI developers, data engineers, and architects to deliver scalable data integration solutions.

Required Skills

  • 8-10+ years of experience in ETL development, data integration, or data engineering.

  • Strong expertise in ETL tools (SSIS, Azure Data Factory, Informatica, Databricks, or equivalent).

  • Proficiency in SQL and relational database systems (SQL Server, Oracle, PostgreSQL, etc.).

  • Solid understanding of data warehousing concepts (star schema, slowly changing dimensions, fact/dimension modeling).

  • Experience with performance tuning of ETL workflows and SQL queries.

  • Familiarity with cloud data platforms (Azure Synapse, AWS Redshift, Snowflake, Google BigQuery).

  • Knowledge of scheduling, monitoring, and automation tools for ETL jobs.

  • Experience with Git-based workflows, DevOps, and CI/CD automation.

  • Strong understanding of data quality, governance, and security best practices.

Nice-to-Have

  • Exposure to big data frameworks (Spark, Hadoop, Databricks).

  • Hands-on experience with Python, Scala, or shell scripting for ETL automation.

  • Knowledge of real-time/streaming ETL (Kafka, Event Hub, Kinesis).

  • Experience with containerization and orchestration (Docker, Kubernetes).

  • Familiarity with Terraform, Jenkins, GitHub Actions, or other automation tools.

Soft Skills

  • Strong analytical and problem-solving abilities with attention to detail.

  • Excellent communication skills to collaborate across technical and business teams.

  • Ability to work in Agile or hybrid Agile/Waterfall environments.

  • Self-motivated with a passion for continuous learning and improvement.

  • Commitment to data integrity, scalability, and reliability in all solutions.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.