Job Details
We're seeking an experienced Python Developer to lead the automation and orchestration of complex data workflows. The ideal candidate will have hands-on experience designing robust, fault-tolerant, and auditable pipelines on on-prem Oracle systems, integrating with job schedulers such as RunMyJobs, and modernizing legacy processes with Apache Airflow. You will play a critical role in replacing legacy Perl/PL/SQL scheduling logic with modern, Python-based DAG orchestration while ensuring traceability, data quality, and recoverability.
location: Jersey City, New Jersey
job type: Contract
salary: $60 - 65 per hour
work hours: 8am to 5pm
education: Bachelor's degree
responsibilities:
- Develop, deploy, and maintain Python-based automation scripts to orchestrate jobs across Oracle 19c on-prem systems.
- Design and implement Airflow DAGs to manage complex, interdependent ETL workflows (a minimal illustrative sketch follows this list).
- Migrate existing job logic from Perl, RunMyJobs, and PL/SQL-based scheduling into modular, observable Airflow DAGs.
- Build custom Airflow operators/sensors for integration with Oracle, REST APIs, file drops (SFTP/FTP), and external triggers.
- Implement robust error handling, alerting, and retry mechanisms across job pipelines.
- Collaborate with DBAs and application teams to understand job dependencies, critical paths, and data lineage.
- Establish job execution logs, audit trails, and SLA monitoring dashboards.
- Participate in code reviews, documentation, and onboarding new jobs into the orchestrator.
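To give candidates a sense of the work, here is a minimal sketch of the kind of DAG this role would build: it waits for a vendor file drop over SFTP, loads it into Oracle, and retries and alerts on failure. The connection IDs ("vendor_sftp", "oracle_dw"), file path, stored procedure, and email address are illustrative placeholders, not references to our actual environment.

```python
# Illustrative Airflow DAG: SFTP file drop -> Oracle load, with retries and alerting.
# All connection IDs, paths, and the PL/SQL call are hypothetical examples.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator
from airflow.providers.sftp.sensors.sftp import SFTPSensor

default_args = {
    "owner": "data-eng",
    "retries": 3,                             # automatic retries on failure
    "retry_delay": timedelta(minutes=5),
    "email_on_failure": True,                 # route alerts to the on-call list
    "email": ["dataops-alerts@example.com"],  # placeholder address
}

with DAG(
    dag_id="positions_file_to_oracle",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 6 * * 1-5",          # weekday mornings
    catchup=False,
    default_args=default_args,
    tags=["oracle", "sftp", "example"],
) as dag:
    # Block until the upstream file drop arrives (external trigger pattern).
    wait_for_file = SFTPSensor(
        task_id="wait_for_positions_file",
        sftp_conn_id="vendor_sftp",
        path="/inbound/positions_{{ ds_nodash }}.csv",
        poke_interval=300,
        timeout=60 * 60 * 4,
    )

    # Run the load via a stored procedure in Oracle 19c (placeholder call).
    load_positions = SQLExecuteQueryOperator(
        task_id="load_positions",
        conn_id="oracle_dw",
        sql="BEGIN etl_pkg.load_positions(to_date('{{ ds }}', 'YYYY-MM-DD')); END;",
    )

    wait_for_file >> load_positions
```

In a typical migration, each RunMyJobs chain or Perl wrapper would map to a DAG along these lines, with dependencies expressed as task ordering and failures surfaced through Airflow's retry and alerting hooks rather than scheduler-level job chains.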
qualifications:
- 5+ years of Python development experience, with strong understanding of system/process automation.
- 2+ years of experience building production DAGs in Apache Airflow.
- Solid understanding of Oracle 19c database, SQL tuning, and PL/SQL concepts.
- Experience orchestrating jobs that move large volumes of data across enterprise systems.
- Familiarity with job schedulers (RunMyJobs, AutoSys, etc.) and how to replace/abstract them using orchestration tools.
- Strong debugging skills across logs, databases, and the filesystem when diagnosing failed jobs or partial runs.
skills:
- Prior experience modernizing legacy data workflows from Perl or PL/SQL stored procedures.
- Hands-on knowledge of Git/Bitbucket, Jenkins, and CI/CD pipelines for code-controlled job rollouts.
- Familiarity with financial data models (e.g., holdings, transactions, NAVs, tax lots).
- Basic understanding of data governance, audit, and operational risk in financial systems.
Equal Opportunity Employer: Race, Color, Religion, Sex, Sexual Orientation, Gender Identity, National Origin, Age, Genetic Information, Disability, Protected Veteran Status, or any other legally protected group status.
At Randstad Digital, we welcome people of all abilities and want to ensure that our hiring and interview process meets the needs of all applicants. If you require a reasonable accommodation to make your application or interview experience a great one, please contact
Pay offered to a successful candidate will be based on several factors including the candidate's education, work experience, work location, specific job duties, certifications, etc. In addition, Randstad Digital offers a comprehensive benefits package, including: medical, prescription, dental, vision, AD&D, and life insurance offerings, short-term disability, and a 401(k) plan (all benefits are based on eligibility).
This posting is open for thirty (30) days.