Job Details
Job Role: Senior Data Engineer (Data Migration)
Location: San Diego, CA (onsite 2 days and remote 3 days each week)
Job Type: W2 Contract
About the Role
We are looking for an experienced Senior Data Engineer to play a key role in the migration of our legacy RDBMS platforms to PostgreSQL for a mission-critical billing and invoicing system. This position requires strong hands-on skills in data migration, transformation, and validation using Apache Spark as the preferred compute engine.
The Senior Data Engineer will work closely with the Lead Data Engineer to implement the migration strategy, ensure performance and accuracy, and deliver a seamless transition while safeguarding customer billing continuity.
Key Responsibilities
Data Migration Execution
- Build and optimize ETL/ELT pipelines for bulk data loads, transformations, and Change Data Capture (CDC).
- Assist in schema conversion, SQL optimization, and data validation processes.
- Implement Spark-based jobs for high-volume and high-performance migration workloads.
Collaboration & Support
- Work under the guidance of the Lead Data Engineer to deliver migration components.
- Partner with cross-functional teams (application engineers, DBAs, QA) to ensure smooth integration with PostgreSQL.
- Provide inputs on tool selection, migration best practices, and automation opportunities.
Data Quality & Reliability
- Develop data validation and reconciliation frameworks to confirm record-for-record accuracy of migrated data.
- Monitor pipeline performance and troubleshoot issues proactively.
- Maintain high availability, compliance, and security of sensitive customer and financial data.
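To illustrate the reconciliation responsibility above, here is a minimal pure-Python sketch of one common approach: comparing row counts and an order-independent table fingerprint between source and target. In practice this computation would run in Spark against the legacy RDBMS and PostgreSQL; the function and variable names here are illustrative, not part of any existing codebase.

```python
import hashlib

def table_fingerprint(rows):
    """Order-independent fingerprint of a table: row count plus the XOR
    of per-row SHA-256 hashes. `rows` is an iterable of tuples, one per
    record. Illustrative only -- a real migration would compute this
    distributed in Spark on both the source and target tables."""
    acc = 0
    count = 0
    for row in rows:
        digest = hashlib.sha256(repr(row).encode()).digest()
        acc ^= int.from_bytes(digest, "big")
        count += 1
    return count, acc

def reconcile(source_rows, target_rows):
    """True only when row counts and fingerprints both match."""
    return table_fingerprint(source_rows) == table_fingerprint(target_rows)
```

Because the XOR accumulation is order-independent, the check passes even when the two systems return rows in different orders, while any missing, extra, or altered row changes the fingerprint.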
Required Qualifications
- 5+ years of experience in Data Engineering, with proven work in database migrations.
- Strong experience in RDBMS-to-PostgreSQL data migration and SQL performance tuning.
- Hands-on expertise in Apache Spark (PySpark/Scala) for ETL/ELT workloads.
- Familiarity with AWS data services (Glue, EMR, RDS, S3, IAM) or similar cloud platforms.
- Knowledge of data validation frameworks and best practices for reconciliation.
- Solid programming skills in Python.
Preferred Skills
- Background in financial/billing systems with mission-critical data flows.
- Exposure to Terraform or Infrastructure-as-Code (IaC).
- Familiarity with DevOps/DataOps practices (CI/CD for data pipelines, monitoring, observability).
- Strong problem-solving, debugging, and optimization skills.