Data Engineer - ETL Migration & Pipeline Development
Remote
Experience: 10+ years
Role Overview
We are seeking a Data Engineer contractor (100% remote) to support and execute the migration of ETL pipelines from Matillion to Apache Airflow. This role focuses on rebuilding pipelines, improving reliability, and enabling scalable, code-based data workflows.
This is a hands-on role for someone who can ramp up quickly, work with existing ETL logic, and begin contributing within the first 1–2 weeks.
Responsibilities:
· Migrate existing ETL pipelines from Matillion to Apache Airflow while preserving logic and dependencies
· Develop and maintain Airflow DAGs with proper scheduling, retries, and failure handling
· Reverse-engineer existing Matillion jobs and translate them into Python-based Airflow workflows
· Build and optimize data pipelines across systems such as S3, Snowflake, and relational databases
· Perform data validation and reconciliation between source and target systems
· Write and optimize SQL transformations for large-scale datasets
· Implement monitoring, alerting, and error handling for pipelines
· Collaborate with data, platform, and analytics teams to ensure smooth migration and deployment
· Document pipelines, workflows, and operational processes
Required Qualifications:
· 3+ years of experience in Data Engineering or ETL development
· Strong hands-on experience with Apache Airflow or similar orchestration tools
· Proficiency in Python for building data pipelines and workflows
· Strong SQL skills and experience working with large datasets
· Experience with cloud platforms (AWS preferred: S3, RDS/Aurora, IAM)
· Experience with cloud data warehouses (Snowflake preferred; similar platforms considered)
· Experience building and maintaining ETL/ELT pipelines
· Familiarity with Git and CI/CD workflows
Preferred Qualifications:
· Experience migrating ETL workflows from tools like Matillion, Informatica, Talend, or SSIS to Airflow
· Experience with Airflow in containerized environments (Docker, Kubernetes/EKS)
· Familiarity with data validation, reconciliation, and pipeline testing strategies
· Experience with monitoring tools such as Datadog or CloudWatch
· Understanding of data modeling or medallion architecture