Overview
Remote
Depends on Experience
Full Time
Skills
Analytics
CA Workload Automation AE
Cloud Computing
Collaboration
Continuous Delivery
Continuous Integration
Customer Relationship Management (CRM)
Data Engineering
Documentation
Extract, Transform, Load (ETL)
GC
IBM DB2
IBM InfoSphere DataStage
Data Integration
Data Link Layer
Data Migration
Databricks
Debugging
Onboarding
Oracle
Orchestration
PL/SQL
Performance Tuning
Production Support
DevOps
Job Scheduling
Management
Microsoft Azure
Migration
Netezza
Python
SQL
Salesforce.com
Scripting
Software Modernization
Job Details
Job Description: Senior Data Engineer / ETL Lead (Remote)
Position: Senior Data Engineer / ETL Lead
Location: Remote (U.S. only)
Duration: Long-term Contract
Client: SVB/FCB
Visa: -EAD, L2-EAD preferred
Start: ASAP (background check & onboarding ~3 weeks)
Overview:
We are seeking an experienced Senior Data Engineer / ETL Lead with strong hands-on expertise in IBM InfoSphere DataStage and DB2 to design, develop, and optimize large-scale ETL workflows for enterprise data integration and analytics platforms. The ideal candidate will also have experience with cloud migration, ETL performance tuning, and job orchestration; familiarity with Netezza or Azure Databricks is a plus.
Responsibilities:
- Design, develop, and maintain ETL workflows using IBM DataStage for large-scale data pipelines.
- Integrate and transform data from multiple sources into DB2, ensuring quality and consistency.
- Collaborate with data architects, analysts, and business teams to deliver scalable data solutions.
- Optimize ETL jobs for performance, reliability, and maintainability.
- Manage job scheduling and automation using enterprise tools (Autosys, Tidal, or equivalent).
- Participate in deployment planning, documentation, and production support.
- Contribute to CI/CD setup and best practices for ETL code deployment.
Required Skills:
- 7+ years of experience in ETL Development and Data Engineering.
- Expert in IBM InfoSphere DataStage (v8.x through v11.x).
- Strong experience with DB2, SQL, and PL/SQL.
- Hands-on experience with UNIX scripting, job scheduling, and data migration.
- Excellent debugging and performance-tuning skills.
Nice to Have:
- Experience with Netezza, Azure Databricks, or Oracle.
- Familiarity with Salesforce, Python automation, or Azure DevOps CI/CD.
- Exposure to CRM or legacy modernization projects.