Need ETL DAG - Airflow - Remote


Overview

Remote
Depends on Experience
Accepts corp to corp applications
Contract - W2
Contract - Independent
Contract - 18 Month(s)
Able to Provide Sponsorship

Skills

ETL DAG
Apache Airflow
Astronomer
AWS
EKS/ECS
Snowflake
Databricks
SQL
APIs
SLAs

Job Details

Job Title: ETL DAG Airflow
Location: Remote
Duration: Long Term
About the Role:
We are looking for an experienced ETL DAG architect with deep expertise in Apache Airflow and Astronomer to design, develop, and maintain complex data orchestration pipelines in our AWS environment. The role involves building robust, scalable DAGs to automate end-to-end workflows across banking systems including data ingestion, transformation, quality checks, and downstream integrations.
You will work closely with Data Engineers, Cloud Architects, and Business Analysts to ensure our Airflow ecosystem is secure, scalable, and optimized for performance.
Key Responsibilities:
Design, develop, and maintain Apache Airflow DAGs for complex, multi-step workflows in the banking and financial domain.
Deploy and manage Airflow environments using Astronomer on AWS (EKS / ECS).
Develop custom Airflow operators, hooks, and sensors to integrate with external systems (Snowflake, Databricks, SQL Server, APIs, etc.).
Implement error handling, retries, SLAs, and alerting mechanisms for all DAGs.
Collaborate with DevOps teams to containerize and deploy workflows using CI/CD pipelines.
Optimize DAG performance and scalability for large-volume banking data operations.
Ensure compliance with data security and governance standards (e.g., masking, auditing, logging).
Participate in code reviews, architecture discussions, and design documentation.
Automate environment setup, configuration, and deployment of Astronomer/Airflow clusters.
Required Skills & Experience:
8+ years of hands-on Python development experience.
3+ years of production experience with Apache Airflow (2.x), preferably on Astronomer.
Strong understanding of Airflow concepts: DAGs, Operators, Sensors, XComs, TaskFlow API, etc.
Experience deploying Airflow in AWS (EKS, ECS, S3, Lambda, Secrets Manager, CloudWatch).
Experience integrating with Snowflake, Databricks, SQL Server, REST APIs, and message queues (Kafka/SQS).
Good understanding of data orchestration patterns, dependency management, and parallel execution.
Proficiency in CI/CD pipelines (GitHub Actions, Jenkins, or GitLab CI).
Familiar with Docker and Kubernetes deployments.
Knowledge of banking data workflows (payments, transactions, reconciliation, reporting) is a plus.
Strong communication and documentation skills.
Preferred Qualifications:
Experience with Astronomer Cloud or Astronomer Enterprise.
Exposure to data governance frameworks and financial compliance.
Familiarity with IaC tools (Terraform or CloudFormation).
Basic knowledge of ETL frameworks or data transformation tools.
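The orchestration concepts listed above (dependency management, parallel execution) can be illustrated with a small stdlib-only sketch, independent of Airflow; the step names are hypothetical:

```python
from concurrent.futures import ThreadPoolExecutor
from graphlib import TopologicalSorter

# Hypothetical ETL graph: each step maps to the steps it depends on.
graph = {
    "extract_payments": set(),
    "extract_accounts": set(),
    "transform": {"extract_payments", "extract_accounts"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
}

results = {}

def run(name):
    # Placeholder for the real work of the step.
    results[name] = f"{name}:done"

def execute(graph, workers=4):
    ts = TopologicalSorter(graph)
    ts.prepare()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        while ts.is_active():
            # All steps whose dependencies are satisfied run in parallel.
            ready = list(ts.get_ready())
            list(pool.map(run, ready))
            for name in ready:
                ts.done(name)  # mark complete, unblocking successors

execute(graph)
```

Here the two extract steps run concurrently, while transform waits for both; this is the same pattern an Airflow scheduler applies to a DAG's task dependencies.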

Regards,

Radiantze Inc

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.