Overview
Remote
On Site
Depends on Experience
Contract - W2
Skills
API
Agile
Amazon Redshift
Apache Kafka
Apache Spark
Data Integration
Data Warehouse
Data Migration
Google Cloud Platform
Microsoft Azure
PL/SQL
Python
SQL
Shell Scripting
Snowflake Schema
Workflow
Job Details
Job Title: ETL Developer
Location: Austin, TX
Duration: W2 only
Job Summary:
We are seeking a skilled ETL Developer to design, develop, and maintain robust data integration solutions across various systems. The ideal candidate will have strong experience in ETL tools, data warehouse environments, and SQL performance tuning, along with a solid understanding of data modeling and data governance principles.
Key Responsibilities:
- Design, develop, and optimize ETL workflows and data pipelines to extract, transform, and load data from various sources into data warehouses or analytical systems.
- Collaborate with data architects, analysts, and business stakeholders to understand data requirements and deliver scalable ETL solutions.
- Perform data profiling, quality checks, and validation to ensure data accuracy and consistency.
- Troubleshoot ETL performance issues and optimize query performance.
- Maintain and enhance existing ETL processes and documentation.
- Participate in data migration and integration initiatives from legacy systems to modern platforms.
- Ensure data security and compliance with organizational policies and regulations.
Required Skills and Experience:
- 14+ years of experience in ETL development and data integration.
- Hands-on expertise with Informatica PowerCenter, IICS, Talend, or SSIS (as per project requirements).
- Strong proficiency in SQL and relational databases (e.g., Oracle, SQL Server, Redshift, Snowflake).
- Solid understanding of data warehousing concepts, dimensional modeling, and data architecture principles.
- Experience in performance tuning of ETL processes and complex SQL queries.
- Familiarity with cloud platforms (e.g., AWS, Azure, or Google Cloud Platform) and modern data integration tools (e.g., Azure Data Factory, AWS Glue).
- Working knowledge of version control, CI/CD, and Agile methodologies.
Preferred Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Exposure to Python, Shell scripting, or API-based integrations.
- Experience with data lake and real-time streaming environments (e.g., Kafka, Spark) is a plus.
- Strong analytical, problem-solving, and communication skills.