Python Developer with Airflow

Overview

Hybrid
Depends on Experience
Accepts corp to corp applications
Contract - Independent
Contract - W2

Skills

Python
Airflow

Job Details

Role: Python Developer with Airflow

Location: Reston, VA & Jersey City, NJ (Hybrid)

Duration: Full Time/Contract


We're seeking an experienced Python Developer to lead the automation and orchestration of complex data workflows. The ideal candidate will have hands-on experience designing robust, fault-tolerant, and auditable pipelines across on-prem Oracle systems, integrating with job schedulers such as RunMyJobs, and modernizing legacy processes with Apache Airflow.
You will play a critical role in replacing legacy Perl/PL/SQL scheduling logic with modern, Python-based DAG orchestration while ensuring traceability, data quality, and recoverability.

Key Responsibilities:

  • Develop, deploy, and maintain Python-based automation scripts to orchestrate jobs across Oracle 19c on-prem systems.
  • Design and implement Airflow DAGs to manage complex, interdependent ETL workflows (a minimal sketch follows this list).
  • Migrate existing job logic from Perl, RunMyJobs, and PL/SQL-based scheduling into modular, observable Airflow DAGs.
  • Build custom Airflow operators/sensors for integration with Oracle, REST APIs, file drops (SFTP/FTP), and external triggers.
  • Implement robust error handling, alerting, and retry mechanisms across job pipelines.
  • Collaborate with DBAs and application teams to understand job dependencies, critical paths, and data lineage.
  • Establish job execution logs, audit trails, and SLA monitoring dashboards.
  • Participate in code reviews, documentation, and onboarding of new jobs into the orchestrator.
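
To make the orchestration work concrete, here is a minimal, illustrative sketch of the kind of DAG involved: a daily Oracle 19c extract with retry and failure-alerting defaults. It assumes Airflow 2.4+ and the apache-airflow-providers-oracle package with a configured oracle_default connection; the DAG, task, and table names are hypothetical.

# Illustrative sketch only: daily Oracle extract with retries and alerting.
# Assumes apache-airflow-providers-oracle is installed and an "oracle_default"
# connection exists; table/task names are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.oracle.hooks.oracle import OracleHook

default_args = {
    "owner": "data-eng",
    "retries": 3,                          # retry transient failures
    "retry_delay": timedelta(minutes=5),
    "email_on_failure": True,              # alerting hook; requires SMTP config
}


def extract_positions(**context):
    """Pull the run date's rows from a (hypothetical) staging table."""
    hook = OracleHook(oracle_conn_id="oracle_default")
    rows = hook.get_records(
        "SELECT * FROM stg_positions WHERE load_date = :ds",
        parameters={"ds": context["ds"]},
    )
    context["ti"].xcom_push(key="row_count", value=len(rows))


def validate_counts(**context):
    """Fail the run (triggering retries/alerts) if nothing was extracted."""
    count = context["ti"].xcom_pull(task_ids="extract_positions", key="row_count")
    if not count:
        raise ValueError("No rows extracted; halting downstream tasks")


with DAG(
    dag_id="positions_daily",
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",   # Airflow 2.4+ parameter; older versions use schedule_interval
    catchup=False,
    default_args=default_args,
) as dag:
    extract = PythonOperator(task_id="extract_positions", python_callable=extract_positions)
    validate = PythonOperator(task_id="validate_counts", python_callable=validate_counts)
    extract >> validate

Centralizing retries and failure e-mails in default_args is one common way to cover the error-handling and alerting responsibility without repeating it on every task.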

Required Skills and Experience:

  • 10+ years of Python development experience, with a strong understanding of system and process automation.
  • 3+ years of experience building production DAGs with Apache Airflow.
  • Solid understanding of the Oracle 19c database, SQL tuning, and PL/SQL concepts.
  • Experience orchestrating jobs that move large volumes of data across enterprise systems.
  • Familiarity with job schedulers (RunMyJobs, Autosys, etc.) and how to replace or abstract them using orchestration tools.
  • Strong debugging skills across logs, databases, and filesystems when diagnosing failed jobs or partial runs.
  • Experience building REST API integrations, SFTP/file-movement logic, and parameter-driven automation (see the sensor sketch after this list).
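
For the file-drop and SFTP integration side, below is a hedged sketch of a small custom Airflow sensor that blocks downstream Oracle loads until an expected file lands. It assumes the apache-airflow-providers-sftp package; the connection id and remote path are hypothetical.

# Illustrative sketch only: custom sensor that pokes an SFTP drop zone until
# the expected file for the run date appears. Assumes
# apache-airflow-providers-sftp is installed; names are hypothetical.
from airflow.providers.sftp.hooks.sftp import SFTPHook
from airflow.sensors.base import BaseSensorOperator


class FileDropSensor(BaseSensorOperator):
    """Wait for an upstream file drop before kicking off downstream loads."""

    template_fields = ("remote_path",)  # allow {{ ds }} etc. in the path

    def __init__(self, *, sftp_conn_id: str, remote_path: str, **kwargs):
        super().__init__(**kwargs)
        self.sftp_conn_id = sftp_conn_id
        self.remote_path = remote_path

    def poke(self, context) -> bool:
        hook = SFTPHook(ssh_conn_id=self.sftp_conn_id)
        found = hook.path_exists(self.remote_path)
        self.log.info("Checked %s: %s", self.remote_path, "found" if found else "not yet")
        return found

In a DAG this might be wired in as FileDropSensor(task_id="wait_for_positions_file", sftp_conn_id="sftp_default", remote_path="/drop/positions_{{ ds }}.csv", poke_interval=300) ahead of the load tasks; remote_path is templated so the run date can be injected.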

Bonus / Preferred Experience:

  • Prior experience modernizing legacy data workflows from Perl or PL/SQL stored procedures.
  • Hands-on knowledge of Git/Bitbucket, Jenkins, and CI/CD pipelines for code-controlled job rollouts.
  • Familiarity with financial data models (e.g., holdings, transactions, NAVs, tax lots).
  • Basic understanding of data governance, audit, and operational risk in financial systems.

About Visionary Innovative Technology Solutions