Must Have:
8+ years of experience in the ETL space.
8+ years of proven experience with IBM DataStage.
Salesforce
Business Intelligence - Systems/ETL
SQL
Python
Snowflake
Job Summary:
We are seeking a Senior Data Engineer to execute end-to-end Salesforce data migrations. Data will move from Snowflake and on-premise systems through IBM DataStage and then into Salesforce. This role focuses on source-to-target mapping, IBM DataStage-driven ETL development, data cleansing and validation, mock migration runs, cutover execution, and post-load reconciliation. You will collaborate closely with Salesforce administrators, solution architects, product owners, and SMEs to deliver repeatable, auditable, and automation-ready migration processes.
Key Responsibilities:
Develop source-to-target mapping specifications including field-level transformations, deduplication logic, and validation rules.
Build and configure ETL workflows using IBM DataStage, SQL, and Python to automate extraction, transformation, and loading into Salesforce.
Profile source data; assess data quality issues; implement cleansing and standardization routines.
Execute mock migration runs, capture error logs, perform root-cause analysis, and implement remediation steps.
Perform final production data loads and verification with business sign-off.
Produce reconciliation reports to validate accuracy, completeness, and referential integrity post-migration.
Author a comprehensive migration runbook covering workflows, schedules, controls, rollback, and support procedures.
Implement quality checks, lineage notes, and audit artifacts aligned to governance and compliance requirements.
Partner with Salesforce teams to respect object relationships, load order, and platform limits during migrations.
Contribute to continuous improvement by optimizing ETL performance and automating repetitive validation tasks.
Critical Deliverables:
Source-to-target data mapping logic with transformation rules.
Configured ETL workflows/scripts for extraction, transformation, and loading.
Validated data extraction files from source Salesforce org (or legacy systems).
Documented data cleansing and validation rules.
Test migration runs with error logs and remediation steps.
Three (3) mock data loads into the target Salesforce org with verification.
Final data load into target Salesforce org with verification.
Post-migration data quality and reconciliation report.
Comprehensive migration runbook and documentation.
Required Skills:
Hands-on experience with IBM DataStage for ETL/ELT development.
Strong SQL with proven performance tuning on large datasets.
Demonstrated ability to perform data mapping, cleansing, deduplication, and validation.
Working understanding of data governance, quality checks, lineage, and compliance.
Experience optimizing ETL jobs and troubleshooting data load issues.
Nice to Have:
Knowledge of the Salesforce data model (objects, relationships, metadata, constraints).
Experience with Salesforce Data Loader or comparable migration tools.
Familiarity with Snowflake or another cloud data warehouse for staging/landing zones.
Exposure to CI/CD for scheduling and deploying DataStage jobs and Python scripts.
Proficiency in Python (or similar scripting language) for data manipulation and automation.
Qualifications:
Bachelor's degree in Computer Science, Information Systems, or equivalent practical experience.
7+ years of data engineering experience, including 3+ years building production ETL pipelines.
Prior delivery of at least one enterprise CRM or Salesforce migration is strongly preferred.