Job Title: Senior Data Engineer
Location: District of Columbia (Hybrid onsite 3-4 days a week)
Interview: Webcam or In Person (locals or candidates from nearby states)
Client: DHCF
Can submit: USC / H1B (own W2)
Note: DHCF is looking for a senior data engineer for its data modernization efforts associated with its Medicaid data ecosystem.
Job Description
1. Position Purpose
The Senior Data Engineer serves as the primary technical engine for the agency's Medicaid data ecosystem. The role is unusual in its breadth: it demands deep mastery of our current legacy environment (SSIS ETL processes managed via Team Foundation Server, or TFS) while actively spearheading the execution of our cloud modernization roadmap. Under the direction of the Lead Data Warehouse Solution Architect, you will ensure the stability of current Medicaid reporting while building the future-state Azure Synapse and Databricks Lakehouse.
2. Key Responsibilities
A. Legacy Maintenance & Operational Excellence (Current State)
ETL Management: Maintain, troubleshoot, and modify complex SSIS packages handling high-volume Medicaid claims, provider, and member data.
Version Control: Manage code deployments and branching strategies within TFS, ensuring continuous integration of legacy SQL assets.
Legacy Reporting Support: Support and optimize SSRS report queries and SSAS tabular/multidimensional models to ensure federal and state compliance reporting remains uninterrupted.
B. Modernization & Migration Execution (Future State)
Cloud Development: Implement Medallion Architecture (Bronze/Silver/Gold) using Azure Databricks (PySpark/SQL) as designed by the Lead Architect.
Pipeline Refactoring: Lead the transition of legacy SSIS logic into Azure Data Factory (ADF) pipelines and Databricks notebooks.
DevOps Transformation: Facilitate the migration of source control and CI/CD pipelines from TFS to Azure DevOps (Git).
Synapse Integration: Build and tune Dedicated and Serverless SQL Pools within Azure Synapse to facilitate advanced analytics and AI-readiness.
C. Data Governance & Security
Medicaid Compliance: Implement Row-Level Security (RLS) and automated data masking for PHI/PII in accordance with HIPAA, CMS MARS-E, and NIST standards.
Data Quality: Develop automated data validation frameworks to ensure data parity between legacy SQL systems and the new Cloud Lakehouse.
3. Competencies for Success
Technical Agility: The ability to pivot between a 10-year-old SSIS package and a modern Databricks Spark job in the same day.
Collaboration: Ability to take high-level architectural blueprints from the Lead Architect and translate them into high-performance, production-ready code.
Attention to Detail: Absolute precision in Medicaid data handling, where an error in logic can impact member benefits or federal funding.
---------------------------------------------
CONTRACT JOB DESCRIPTION
Responsibilities:
1. Leads the adoption or implementation of an advanced technology or platform.
2. Expert on the functionality or usage of a particular system, platform, or technology product.
3. Serves as a consultant to clients, guiding the efficient use or adoption of a particular IT product or platform.
4. Creates implementation, testing, and/or integration plans.
5. Demonstrates expertise in a particular IT platform or service, allowing for maximum IT investment.
Minimum Education/ Certification Requirements:
Bachelor’s degree in Information Technology or a related field, or equivalent experience
Training or certification in a particular product or IT platform/service, as required
Required/Desired Skills
Skill | Required/Desired | Amount of Experience
Maintaining SQL Server (SSIS/SSAS/SSRS) while concurrently deploying Azure cloud solutions | Required | 7 Years
Expert-level proficiency in SSIS and T-SQL; advanced proficiency in Azure Databricks (Unity Catalog, Delta Lake) and Azure Synapse | Required | 7 Years
Deep experience with TFS (Team Foundation Server) and a strong desire to migrate workflows to Git/Azure DevOps | Required | 7 Years
Mastery of SQL and Python (PySpark) | Required | 7 Years
6-10 yrs. leading advanced technology projects or service projects | Required | 7 Years
6-10 yrs. full system engineering lifecycle | Required | 7 Years
Bachelor’s degree | Required |
Experience with Microsoft Purview for data cataloging | Highly desired | 3 Years
Extensive experience with Medicaid/Medicare data structures (e.g., MMIS, EDI 837/835, claims processing) | Highly desired | 3 Years
Microsoft Certified: Azure Data Engineer Associate or Databricks Certified Professional Data Engineer | Highly desired |
Experience in a government or highly regulated environment | Highly desired | 7 Years