Data Engineer - DataStage ETL III
BCforward is seeking a highly motivated and experienced Data Engineer - DataStage ETL III.
Job Title: Data Engineer - DataStage ETL III
Location: Remote
Duration: 8-month contract
Pay Rate: $62/hr on W2
Must Have
3+ years of experience in the ETL space.
3+ years of proven experience with IBM DataStage.
Business Intelligence - Systems/ETL
SQL
Nice To Have
Python
Salesforce
Snowflake
Job Summary:
We are seeking a Senior Data Engineer to execute end-to-end Salesforce data migrations, moving data from Snowflake and on-premises systems through IBM DataStage into Salesforce. This role focuses on source-to-target mapping, IBM DataStage-driven ETL development, data cleansing and validation, mock migration runs, cutover execution, and post-load reconciliation. You will collaborate closely with Salesforce administrators, solution architects, product owners, and SMEs to deliver repeatable, auditable, and automation-ready migration processes.
Key Responsibilities
Develop source-to-target mapping specifications including field-level transformations, deduplication logic, and validation rules.
Build and configure ETL workflows using IBM DataStage, SQL, and Python to automate extraction, transformation, and loading into Salesforce.
Profile source data; assess data quality issues; implement cleansing and standardization routines.
Execute mock migration runs, capture error logs, perform root-cause analysis, and implement remediation steps.
Perform final production data loads and verifications with business sign-off.
Produce reconciliation reports to validate accuracy, completeness, and referential integrity post-migration.
Author a comprehensive migration runbook covering workflows, schedules, controls, rollback, and support procedures.
Implement quality checks, lineage notes, and audit artifacts aligned to governance and compliance requirements.
Partner with Salesforce teams to respect object relationships, load order, and platform limits during migrations.
Contribute to continuous improvement by optimizing ETL performance and automating repetitive validation tasks.
Critical Deliverables
Source-to-target data mapping logic with transformation rules.
Configured ETL workflows/scripts for extraction, transformation, and loading.
Validated data extraction files from the source Salesforce org (or legacy systems).
Documented data cleansing and validation rules.
Test migration runs with error logs and remediation steps.
Three mock data loads into the target Salesforce org with verification.
Final data load into the target Salesforce org with verification.
Post-migration data quality and reconciliation report.
Comprehensive migration runbook and documentation.
Required Skills
Hands-on experience with IBM DataStage for ETL/ELT development.
Strong SQL with proven performance tuning on large datasets.
Demonstrated ability to perform data mapping, cleansing, deduplication, and validation.
Working understanding of data governance, quality checks, lineage, and compliance.
Experience optimizing ETL jobs and troubleshooting data load issues.
Nice to Have
Knowledge of the Salesforce data model (objects, relationships, metadata, constraints).
Experience with Salesforce Data Loader or comparable migration tools.
Familiarity with Snowflake or another cloud data warehouse for staging/landing zones.
Exposure to CI/CD for scheduling and deploying DataStage jobs and Python scripts.
Proficiency in Python (or similar scripting language) for data manipulation and automation.
Qualifications
Bachelor's degree in Computer Science, Information Systems, or equivalent practical experience.
7+ years of data engineering experience, including 3+ years building production ETL pipelines.
Prior delivery of at least one enterprise CRM or Salesforce migration is strongly preferred.
Interested candidates, please send your resume in Word format and reference job code 249720 when responding to this ad.