ETL Developer

Overview

Location: Remote / On Site
Compensation: Depends on Experience
Contract: W2, 3 Year(s)
Travel: 75%
Sponsorship: Available

Skills

API
Analytical Skills
Attention To Detail
Conflict Resolution
Continuous Integration
Data Engineering
Continuous Delivery
Big Data
Amazon Kinesis
FOCUS
IBM InfoSphere DataStage
Master Data Management
Extract, Transform, Load
Data Security
Data Integration
Cloud Computing
Apache Hadoop
Agile
Data Quality
Informatica
Microsoft SSIS
Performance Tuning
Pentaho
Google Cloud Platform
Debugging
Data Loading
Data Warehouse
Apache Spark
DevOps
Query Optimization
PL/SQL
PostgreSQL
Perl
Workflow
Streaming
Scalability
SQL

Job Details

Position: ETL Developer
Contract: W2 Only

Responsibilities

  • Design, develop, and maintain ETL processes for extracting, transforming, and loading data from multiple sources into data warehouses or data lakes.

  • Collaborate with business analysts, data architects, and stakeholders to gather requirements and translate them into ETL solutions.

  • Optimize ETL workflows for performance, scalability, and reliability.

  • Ensure data quality, consistency, and integrity across multiple systems and environments.

  • Develop reusable ETL components and maintain technical documentation.

  • Monitor, troubleshoot, and resolve ETL job failures, performance issues, and data discrepancies.

  • Support data integration, migration, and modernization initiatives.

  • Implement data security, compliance, and governance standards within ETL pipelines.

  • Participate in Agile ceremonies including sprint planning, daily standups, and retrospectives.

  • Stay current with emerging ETL tools, cloud-based platforms, and best practices.

Required Skills

  • 10+ years of experience as an ETL Developer in enterprise environments.

  • Proficient in ETL tools such as Informatica, Talend, DataStage, SSIS, or Pentaho.

  • Strong SQL skills for data analysis, transformations, and query optimization.

  • Solid understanding of relational databases (Oracle, SQL Server, PostgreSQL) and data modeling concepts.

  • Experience with data warehousing platforms (Snowflake, Redshift, BigQuery, Synapse).

  • Knowledge of scripting languages (Python, Shell, or Perl) for automation.

  • Proficiency in debugging, performance tuning, and root cause analysis of ETL processes.

  • Familiarity with version control (Git) and CI/CD for ETL deployments.

  • Experience with cloud platforms (AWS, Azure, or Google Cloud Platform) and their data services.

  • Understanding of data governance, lineage, and compliance practices.

Nice-to-Have

  • Experience with real-time/streaming ETL using Kafka, Kinesis, or Spark Streaming.

  • Exposure to big data ecosystems (Hadoop, Hive, Spark).

  • Familiarity with data catalog and metadata management tools.

  • Experience with API integration and web services.

  • Knowledge of DevOps tools for deployment automation (Jenkins, Ansible).

  • Exposure to Master Data Management (MDM) solutions.

Soft Skills

  • Strong analytical and problem-solving skills with attention to detail.

  • Excellent communication and collaboration skills with business and technical teams.

  • Ability to work independently and manage multiple priorities.

  • Commitment to delivering high-quality, reliable ETL solutions.

  • Proactive learner with a passion for continuous improvement in data engineering practices.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions, and AI may have been used to create this description. The position description has been reviewed for accuracy, and Dice believes it correctly reflects the job opportunity.