ETL DAG Architect (Airflow / Astronomer) / Senior Apache Airflow Engineer - Apache Airflow + Astronomer (must be strong) - Onsite - Atlanta, GA - Local candidates only

  • Stockton, GA
  • Posted 1 day ago | Updated 1 day ago

Overview

On Site
Depends on Experience
Accepts corp to corp applications
Contract - W2
Contract - Independent
Contract - 12 Month(s)
No Travel Required
Unable to Provide Sponsorship

Skills

Auditing
Apache Airflow
Cloud Computing
Collaboration
Communication
Apache Kafka
Continuous Integration
Continuous Delivery
Amazon Web Services
Amazon SQS
Banking
Data Governance
Amazon S3
Data Security
GitHub
Message Queues
Microsoft SQL Server
Kubernetes
Finance
Jenkins
Extract, Transform, Load
Docker
Design Documentation
Management
Documentation
Payments
GitLab
Orchestration
Sensors
Snowflake
Reporting
Scalability
Python
Regulatory Compliance
Terraform
Databricks
API
Workflow
DevOps

Job Details

ETL DAG Architect (Airflow / Astronomer) / Apache Airflow Developer

About the Role

We are looking for an experienced ETL DAG architect with deep expertise in Apache Airflow and Astronomer to design, develop, and maintain complex data orchestration pipelines in our AWS environment.

The role involves building robust, scalable DAGs to automate end-to-end workflows across banking systems — including data ingestion, transformation, quality checks, and downstream integrations.

You will work closely with Data Engineers, Cloud Architects, and Business Analysts to ensure our Airflow ecosystem is secure, scalable, and optimized for performance.

Key Responsibilities

Strong Airflow / Astronomer experience is required.

  • Design, develop, and maintain Apache Airflow DAGs for complex, multi-step workflows in the banking and financial domain.
  • Deploy and manage Airflow environments using Astronomer on AWS (EKS / ECS).
  • Develop custom Airflow operators, hooks, and sensors to integrate with external systems (Snowflake, Databricks, SQL Server, APIs, etc.).
  • Implement error handling, retries, SLAs, and alerting mechanisms for all DAGs.
  • Collaborate with DevOps teams to containerize and deploy workflows using CI/CD pipelines.
  • Optimize DAG performance and scalability for large-volume banking data operations.
  • Ensure compliance with data security and governance standards (e.g., masking, auditing, logging).
  • Participate in code reviews, architecture discussions, and design documentation.
  • Automate environment setup, configuration, and deployment of Astronomer/Airflow clusters.

Required Skills & Experience

  • 8+ years of hands-on Python development experience.
  • 3+ years of production experience with Apache Airflow (2.x), preferably Astronomer Airflow.
  • Strong understanding of Airflow concepts: DAGs, Operators, Sensors, XComs, TaskFlow API, etc.
  • Experience deploying Airflow in AWS (EKS, ECS, S3, Lambda, Secrets Manager, CloudWatch).
  • Experience integrating with Snowflake, Databricks, SQL Server, REST APIs, and message queues (Kafka/SQS).
  • Good understanding of data orchestration patterns, dependency management, and parallel execution.
  • Proficiency in CI/CD pipelines (GitHub Actions, Jenkins, or GitLab CI).
  • Familiarity with Docker and Kubernetes deployments.
  • Knowledge of banking data workflows (payments, transactions, reconciliation, reporting) is a plus.
  • Strong communication and documentation skills.
  • Experience with Astronomer Cloud or Astronomer Enterprise.
  • Exposure to data governance frameworks and financial compliance.
  • Familiarity with IaC tools (Terraform or CloudFormation).
  • Basic knowledge of ETL frameworks or data transformation tools.
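To give a flavor of the error-handling and retry responsibilities listed above: Airflow provides retries natively through task parameters such as `retries` and `retry_delay`, so no hand-rolled loop is needed in a real DAG. The plain-Python sketch below (all names hypothetical, not Airflow code) simply illustrates the underlying pattern of retrying a flaky task with exponential backoff and firing an alerting hook on final failure.

```python
import time


def run_with_retries(task, max_retries=3, base_delay=1.0, on_failure=None):
    """Run task() with retries and exponential backoff.

    Mirrors, in plain Python, the behavior Airflow gives each task via
    its `retries` / `retry_delay` parameters and failure callbacks.
    """
    for attempt in range(max_retries + 1):
        try:
            return task()
        except Exception as exc:
            if attempt == max_retries:
                if on_failure:            # alerting hook, e.g. notify on-call
                    on_failure(exc)
                raise
            time.sleep(base_delay * (2 ** attempt))  # exponential backoff


# Example: a flaky extract step that succeeds on the third attempt.
calls = {"n": 0}

def flaky_extract():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient upstream failure")
    return "rows_loaded=100"

result = run_with_retries(flaky_extract, max_retries=3, base_delay=0.01)
```

In an actual Airflow 2.x DAG the equivalent would be declared declaratively, e.g. `@task(retries=3, retry_delay=timedelta(minutes=5))`, keeping retry policy out of task logic.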

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.

About Keylent