IBM DataStage Developer (SQL & Databricks)

Overview

On Site
$30 - $40
Accepts Corp-to-Corp applications
Contract - W2
Contract - 12 Month(s)
100% Travel
Able to Provide Sponsorship

Skills

Problem Solving
IBM InfoSphere DataStage
Microsoft Azure
Migration
Performance Tuning
Data Lake
Data Warehouse
Databricks
Debugging
Extract, Transform, Load (ETL)
Apache Spark
Collaboration
Communication
Conflict Resolution
Data Governance
PySpark
Regulatory Compliance
SQL
Scalability
Workflow

Job Details

Responsibilities:

  • Design, develop, and maintain ETL workflows using IBM DataStage.

  • Integrate and process large datasets using SQL, Databricks (PySpark/SQL), and Azure Data Lake environments.

  • Collaborate with data architects and analysts to translate business requirements into technical solutions.

  • Optimize ETL jobs for performance, scalability, and cost efficiency.

  • Troubleshoot, debug, and resolve ETL/data pipeline issues proactively.

  • Implement best practices in data governance, security, and compliance.

  • Support migration and integration of on-premises data pipelines to Azure Databricks as required.

Requirements:

  • Mandatory: Strong hands-on experience in IBM DataStage ETL development.

  • Strong proficiency in SQL (complex queries, performance tuning).

  • Experience with Azure Databricks (PySpark / Spark SQL) for data transformations.

  • Knowledge of ETL workflows, data warehousing concepts, and data lake architectures.

  • Strong problem-solving and debugging skills.

  • Good communication and collaboration skills.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.

About Padmas Technology LLC