Senior Data Engineer – Airflow, dbt Core, Kubernetes/OpenShift

Hybrid in Boston, MA, US • Posted 1 day ago • Updated 1 day ago
Contract (Corp-To-Corp, Independent, or W2) • No Travel Required • Hybrid • Compensation: Depends on Experience

Job Details

Skills

  • Data Engineering
  • Python
  • Modeling
  • Migration
  • Apache Airflow

Summary

Role: Senior Data Engineer – Airflow, dbt Core, Kubernetes/OpenShift

Location: Boston, MA (Hybrid)

Duration: Contract

Job Summary

We are seeking a highly skilled Senior Data Engineer with 10+ years of hands-on experience in enterprise data engineering, including deep expertise in Apache Airflow DAG development, dbt Core modeling and implementation, and cloud-native container platforms (Kubernetes / OpenShift).

This role is critical to building, operating, and optimizing scalable data pipelines that support financial and accounting platforms, including enterprise system migrations and high-volume data processing workloads.

The ideal candidate will have extensive hands-on experience in workflow orchestration, data modeling, performance tuning, and distributed workload management in containerized environments.

Required Skills & Qualifications

  • 10+ years of professional experience in data engineering, analytics engineering, or platform engineering roles
  • Proven experience designing and supporting enterprise-scale data platforms in production environments
  • Expert-level Apache Airflow (DAG design, scheduling, performance tuning)
  • Expert-level dbt Core (data modeling, testing, macros, implementation)
  • Strong proficiency in Python for data engineering and automation
  • Deep understanding of Kubernetes and/or OpenShift in production environments
  • Extensive experience with distributed workload management and performance optimization
  • Strong SQL skills for complex transformations and analytics
Cloud & Platform Experience

  • Experience running data platforms on cloud environments
  • Familiarity with containerized deployments, CI/CD pipelines, and Git-based workflows
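
The dbt Core expertise above centers on incremental, tested models. dbt models are written in SQL/Jinja, but the core semantics of an incremental model — load only rows newer than the high-water mark already in the target, upserting on a unique key — can be sketched in plain Python. Column names (`id`, `updated_at`) and the watermark scheme are illustrative, not from the posting:

```python
def incremental_load(target_rows, source_rows, key="id", updated_at="updated_at"):
    """Merge only new/changed source rows into the target, dbt-incremental style.

    Rows are plain dicts. The high-water mark is the max updated_at already
    loaded, loosely mirroring dbt's `is_incremental()` filter against
    `{{ this }}` plus a merge on the model's unique_key.
    """
    watermark = max((r[updated_at] for r in target_rows), default=None)
    # On the first (full-refresh) run there is no watermark: take everything.
    fresh = [r for r in source_rows
             if watermark is None or r[updated_at] > watermark]
    merged = {r[key]: r for r in target_rows}
    merged.update({r[key]: r for r in fresh})  # upsert on the unique key
    return list(merged.values())
```

A row whose `updated_at` is at or below the watermark is skipped entirely, which is what makes incremental models cheap on large tables; dbt adds the SQL generation, state management, and tests on top.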

Key Responsibilities

Data Pipeline & Orchestration

  • Design, develop, and maintain complex Airflow DAGs for batch and event-driven data pipelines
  • Implement best practices for DAG performance, dependency management, retries, SLA monitoring, and alerting
  • Optimize Airflow scheduler, executor, and worker configurations for high-concurrency workloads

dbt Core & Data Modeling

  • Lead dbt Core implementation, including project structure, environments, and CI/CD integration
  • Design and maintain robust dbt models (staging, intermediate, marts) following analytics engineering best practices
  • Implement dbt tests, documentation, macros, and incremental models to ensure data quality and performance
  • Optimize dbt query performance for large-scale datasets and downstream reporting needs

Cloud, Kubernetes & OpenShift

  • Deploy and manage data workloads on Kubernetes / OpenShift platforms
  • Design strategies for workload distribution, horizontal scaling, and resource optimization
  • Configure CPU/memory requests and limits, autoscaling, and pod scheduling for data workloads
  • Troubleshoot container-level performance issues and resource contention
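
The orchestration responsibilities above — dependency management, retries, only running a task once its upstreams succeed — reduce to a small set of semantics. A minimal, library-free Python sketch of those semantics (task names and retry counts are illustrative; a real implementation would use Airflow's DAG and operator APIs, which also add scheduling, SLA monitoring, and alerting):

```python
from collections import deque

def run_dag(tasks, deps, max_retries=2):
    """Run callables in dependency order with bounded retries.

    tasks: {name: callable}; deps: {name: [upstream names]}.
    Very loosely mirrors what a scheduler does for one DAG run:
    a task becomes ready only when all its upstreams have succeeded.
    """
    # Kahn's algorithm: count unmet upstream dependencies per task.
    pending = {t: len(deps.get(t, [])) for t in tasks}
    downstream = {t: [] for t in tasks}
    for t, ups in deps.items():
        for u in ups:
            downstream[u].append(t)

    ready = deque(t for t, n in pending.items() if n == 0)
    results = {}
    while ready:
        name = ready.popleft()
        succeeded = False
        for attempt in range(max_retries + 1):  # initial try + retries
            try:
                results[name] = tasks[name]()
                succeeded = True
                break
            except Exception:
                if attempt == max_retries:
                    results[name] = "failed"
        # Downstream tasks are released only on success, akin to
        # Airflow's default all_success trigger rule.
        if succeeded:
            for d in downstream[name]:
                pending[d] -= 1
                if pending[d] == 0:
                    ready.append(d)
    return results
```

With `deps = {"transform": ["extract"], "load": ["transform"]}` the familiar extract → transform → load ordering falls out of the dependency counts alone.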

Performance & Reliability

  • Monitor and tune end-to-end pipeline performance across Airflow, dbt, and data platforms
  • Identify bottlenecks in query execution, orchestration, and infrastructure
  • Implement observability solutions (logs, metrics, alerts) for proactive issue detection
  • Ensure high availability, fault tolerance, and resiliency of data pipelines

Collaboration & Governance

  • Work closely with data architects, platform engineers, and business stakeholders
  • Support financial reporting, accounting, and regulatory data use cases
  • Enforce data engineering standards, security best practices, and governance policies
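
Identifying bottlenecks, as the performance bullets above call for, usually starts from per-task duration metrics compared against an SLA. A hedged sketch of that check (the threshold and task names are invented for illustration; in production the durations would come from Airflow metadata or a metrics backend, not a literal dict):

```python
def find_bottlenecks(task_durations, sla_seconds):
    """Flag tasks whose runtime exceeds the SLA, worst offenders first.

    task_durations: {task_name: seconds}. Returns task names sorted by
    duration descending, so the biggest bottleneck is listed first.
    """
    breaches = {t: d for t, d in task_durations.items() if d > sla_seconds}
    return sorted(breaches, key=breaches.get, reverse=True)

find_bottlenecks({"extract": 120, "transform": 900, "load": 60}, 300)
# → ["transform"]
```

Wiring the same comparison into an alerting hook (page when the list is non-empty) is the "proactive issue detection" half of the observability bullet.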

Preferred Qualifications

  • Experience supporting financial services or accounting platforms
  • Exposure to enterprise system migrations (e.g., legacy platform to modern data stack)
  • Experience with data warehouses (Oracle)
  • Dice Id: 91020323
  • Position Id: 8948346

Company Info

About Visionary Innovative Technology Solutions

VITS provides staffing, recruitment, and technology consulting services to more than 50 clients globally. Our skilled professionals help clients manage varying skill needs, skills gaps, and changing staffing demands to meet project deadlines. VITS staff augmentation services provide skilled resources that help clients develop, maintain, manage, and support their applications. Our rigorous pursuit of excellence in hiring, delivery model, work ethics, and approach has made us a highly trusted and preferred recruitment solutions provider.


