ETL Developer

Overview

Hybrid
$40 - $45 per hour
Contract - W2
Contract - 12 Month(s)

Skills

ETL
Pentaho Data Integration
PDI
SQL
PostgreSQL
MySQL
SQL Server
Oracle
Airflow

Job Details

ETL Developer

Pittsburgh, PA

Long-Term Contract, W2 Only

Note: Open to candidates in either Pittsburgh, PA or Lake Mary, FL. Only 3 years of experience are required.

Job Description:

We are seeking an ETL Developer with strong experience in ETL development and data integration, particularly using Pentaho Data Integration (PDI). The ideal candidate will be responsible for designing, developing, and maintaining robust data pipelines and ETL processes to support enterprise-level data platforms. You will collaborate closely with data architects, analysts, and other stakeholders to ensure data quality, availability, and performance across the organization.

Key Responsibilities

Design, develop, and optimize ETL workflows using Pentaho Data Integration (Kettle).

Maintain and improve existing ETL jobs and scheduling mechanisms for reliability and efficiency.

Work closely with business and technical teams to understand data requirements and implement solutions.

Integrate data from various sources (databases, flat files, APIs) into centralized data stores.

Support data warehousing initiatives and ensure timely and accurate data delivery.

Monitor and troubleshoot ETL jobs and database performance issues.

Assist with database administration tasks such as indexing, partitioning, and backup strategies.

Implement best practices for data security, quality, and governance.

Document data workflows, schemas, and processes clearly for future maintenance and scalability.

Required Qualifications

Bachelor's degree in Computer Science, Information Systems, or a related field.

3+ years of experience in database engineering or ETL development.

Strong proficiency in Pentaho Data Integration (PDI), including job/transformation design, parameterization, and logging.

Experience with SQL databases such as PostgreSQL, MySQL, SQL Server, or Oracle.

Solid understanding of ETL principles, data modeling, and data warehousing concepts.

Proficient in SQL and performance tuning for complex queries.

Experience working with large datasets and batch processing environments.

Familiarity with scheduling tools (e.g., cron, Airflow, or enterprise schedulers).

Knowledge of version control systems (e.g., Git) and CI/CD workflows.

Expertise in ETL tools, specifically Pentaho Data Integration, including transformations, jobs, and scheduling.

Strong hands-on experience with Oracle SQL and PL/SQL.

Experience writing complex queries, procedures, and packages, as well as performance tuning.

Understanding of dimensional modeling and database normalization.

Experience building and maintaining data warehouses and data marts.

Familiarity with shell scripting, Git, database administration, and SMTP.

Strong problem-solving and analytical skills.

Familiarity with Tableau and other BI tools.

Preferred Qualifications

Experience with cloud-based data platforms (e.g., AWS, Azure, Google Cloud Platform).

Familiarity with other ETL tools or frameworks (e.g., Talend, Informatica) is a plus.

Knowledge of scripting languages such as Python, Shell, or Groovy.

Exposure to big data technologies (e.g., Hadoop, Spark) or data lakes.

Understanding of data governance, compliance, and security standards.
