DATA ENGINEER with PROJECT MANAGEMENT

Hybrid in Washington, DC, US • Posted 14 hours ago • Updated 14 hours ago
Full Time
Travel Required
Hybrid
$7 - $12/hr

Job Details

Skills

  • SQL
  • Project Management
  • Jira
  • Documentation

Summary

About the Role

We are seeking a highly skilled Data Engineer with demonstrated project management capabilities to join our growing data and analytics team in the Washington, DC metropolitan area. This role bridges technical data engineering with cross-functional coordination, making it ideal for professionals who thrive at the intersection of technology and delivery leadership. You will architect and build enterprise-scale data infrastructure while concurrently managing project timelines, stakeholder expectations, and team deliverables in a dynamic, mission-focused environment.

 

Key Responsibilities

 

Data Engineering

  • Design, develop, and maintain scalable data pipelines using Apache Spark, Airflow, dbt, and cloud-native services (AWS, Azure, or Google Cloud Platform)
  • Build and optimize data warehouse and lakehouse architectures (Redshift, Snowflake, BigQuery, Databricks Delta Lake)
  • Implement ELT/ETL processes to ingest structured and unstructured data from diverse federal, commercial, and third-party sources
  • Develop and enforce data quality frameworks, validation rules, and monitoring alerts using tools such as Great Expectations or dbt tests
  • Collaborate with data scientists and analysts to productionize ML models and analytical datasets
  • Design and maintain data models (star/snowflake schemas, OBT) in alignment with business and reporting needs
  • Ensure data platform security, access control, and compliance with federal regulations (FedRAMP, FISMA, NIST) where applicable
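The data-quality bullet above mentions frameworks such as Great Expectations and dbt tests. As a rough, hand-rolled sketch of the kind of row-level validation rule those tools encode declaratively (this is illustrative Python only, not the API of either tool; the field names are made up):

```python
# Minimal sketch of a row-level data-quality check.
# Real pipelines would express these rules as Great Expectations
# expectations or dbt tests rather than hand-rolled Python.

def validate_rows(rows, required_fields, non_negative_fields=()):
    """Return (valid_rows, errors) after applying simple validation rules:
    required fields must be present and non-empty, and numeric fields
    listed in non_negative_fields must not be negative."""
    valid, errors = [], []
    for i, row in enumerate(rows):
        missing = [f for f in required_fields if row.get(f) in (None, "")]
        negative = [f for f in non_negative_fields
                    if isinstance(row.get(f), (int, float)) and row[f] < 0]
        if missing or negative:
            errors.append({"row": i, "missing": missing, "negative": negative})
        else:
            valid.append(row)
    return valid, errors

# Hypothetical ingested records for illustration
records = [
    {"id": 1, "amount": 42.0},
    {"id": None, "amount": 10.0},   # flagged: missing id
    {"id": 3, "amount": -5.0},      # flagged: negative amount
]
good, bad = validate_rows(records, required_fields=("id",),
                          non_negative_fields=("amount",))
print(len(good), len(bad))  # 1 valid row, 2 flagged
```

In production, the same rules would typically run as a gate inside the orchestration layer (e.g. an Airflow task) so that bad batches fail loudly before reaching the warehouse.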

 

Project Management

  • Lead end-to-end delivery of data engineering projects — from scoping and requirements gathering through deployment and post-launch support
  • Develop and maintain project plans, roadmaps, risk registers, and status reports for executive and stakeholder audiences
  • Facilitate Agile ceremonies (sprint planning, standups, retrospectives) and manage backlog prioritization in Jira or Azure DevOps
  • Coordinate cross-functional teams including data scientists, analysts, DevOps engineers, and business stakeholders
  • Proactively identify and mitigate technical and schedule risks; escalate blockers with proposed solutions
  • Manage vendor and contractor relationships, SOWs, and deliverable acceptance criteria
  • Track and report on project KPIs including velocity, burn rate, and milestone completion

 

Required Qualifications

  • 3 to 5+ years of experience in data engineering roles with progressively increasing responsibility
  • 2+ years of project or program management experience in a technical environment
  • Proficiency in SQL and at least one scripting/programming language (Python or Scala strongly preferred)
  • Hands-on experience with cloud platforms — AWS (Glue, Redshift, S3, Lambda), Azure (ADF, Synapse, ADLS), or Google Cloud Platform (Dataflow, BigQuery, Cloud Composer)
  • Experience with workflow orchestration tools (Apache Airflow, Prefect, or Dagster)
  • Solid understanding of data modeling principles, data warehousing, and lakehouse paradigms
  • Demonstrated experience managing Agile/Scrum delivery using Jira, Azure DevOps, or similar tools
  • Excellent written and verbal communication skills with the ability to present technical concepts to non-technical audiences
  • Bachelor’s degree in Computer Science, Information Systems, Engineering, or a related field

 

Preferred Qualifications

  • Experience supporting federal government clients or working in a cleared environment (clearance a plus)
  • Familiarity with data governance frameworks, data cataloging tools (Collibra, Alation, Apache Atlas), and lineage tracking
  • PMP, PMI-ACP, or Scrum Master (CSM/PSM) certification
  • AWS Certified Data Analytics, Azure Data Engineer Associate, or equivalent cloud certification
  • Experience with real-time streaming data (Apache Kafka, Kinesis, or Pub/Sub)
  • Knowledge of CI/CD pipelines and infrastructure-as-code (Terraform, CloudFormation) for data platform deployments
  • Master's degree in a quantitative or technical discipline

 

Technical Stack

Languages

Python, SQL, Scala, Bash

Cloud Platforms

AWS, Azure, Google Cloud Platform

Data Warehouses

Snowflake, Redshift, BigQuery, Synapse

Orchestration

Apache Airflow, dbt, Prefect

Streaming

Apache Kafka, AWS Kinesis

BI & Visualization

Power BI, Tableau, Looker

DevOps / Infra

Terraform, Docker, Kubernetes, GitHub Actions

PM Tools

Jira, Confluence, Azure DevOps, MS Project

 

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: RTX1e861f
  • Position Id: 677-40829-
