Job Details
Snowflake Data Engineer - CareFirst 47347
Duration: 12+ months, contract-to-perm
Location: Hybrid; on-site a few times a month in Columbia, MD
Candidates must reside in MD, DC, VA, WV, PA (within 1 hour of Columbia), or DE
LAST INTERVIEW WILL BE ON-SITE IN COLUMBIA, MD
The client is seeking an experienced Senior Data Engineer (contractor) to join its Federal Employee Program (FEP) technology platform team. This role is a critical contributor to designing, developing, and optimizing cloud-based data solutions using Snowflake.
You'll leverage advanced Snowflake capabilities, build modern data pipelines, and enable scalable analytics and reporting for enterprise healthcare operations. The ideal candidate will demonstrate deep Snowflake and SQL expertise, hands-on experience with Snowpark (Python), and a strong foundation in data architecture, governance, and automation.
 ________________________________________
 Key Responsibilities
    • Design, develop, and optimize data pipelines and transformations within Snowflake using SQL and Snowpark (Python).
    • Build and maintain Streams, Tasks, Materialized Views, and Dashboards to enable real-time and scheduled data operations.
    • Develop and automate CI/CD pipelines for Snowflake deployments (Jenkins).
    • Collaborate with data architects, analysts, and cloud engineers to design scalable and efficient data models.
    • Implement data quality, lineage, and governance frameworks aligned with enterprise standards and compliance requirements (e.g., HIPAA, PHI/PII).
    • Monitor data pipelines for performance, reliability, and cost efficiency; proactively optimize workloads and resource utilization.
    • Integrate Snowflake with dbt and Kafka for end-to-end orchestration and streaming workflows.
    • Conduct root-cause analysis and troubleshooting for complex data and performance issues in production.
    • Collaborate across technology and business teams to translate complex data needs into elegant, maintainable solutions.
 ________________________________________
 Required Skills & Experience
    • 5+ years of experience in data engineering or an equivalent field.
    • 3+ years of hands-on experience with Snowflake Data Cloud, including:
        o Streams, Tasks, Dashboards, and Materialized Views
        o Performance tuning, resource monitors, and warehouse optimization
    • Strong proficiency in SQL (complex queries, stored procedures, optimization).
    • Proficiency in Python, with demonstrated experience using Snowpark for data transformations.
    • Experience building CI/CD pipelines for Snowflake using modern DevOps tooling.
    • Solid understanding of data modeling methodologies (Kimball, Data Vault, or 3NF).
    • Experience with data governance, lineage, and metadata tools (Collibra, Alation, or Azure Purview).
    • Strong troubleshooting, analytical, and communication skills, with the ability to engage both technical and business audiences.
 ________________________________________
 Preferred Qualifications
    • Experience with dbt or Kafka for orchestration and streaming.
    • Exposure to data quality frameworks such as Great Expectations or Monte Carlo.
    • Understanding of real-time and batch data ingestion architectures.
    • Snowflake certification (SnowPro Core or Advanced).
    • Prior experience in healthcare, insurance, or other regulated data environments.
 ________________________________________
 Soft Skills & Professional Attributes
    • Excellent problem-solving and root-cause analysis capabilities.
    • Strong communication and documentation skills across technical and non-technical audiences.
    • Proven ability to work collaboratively in Agile or cross-functional DevOps teams.
    • A growth mindset with a commitment to continuous learning and process improvement.
    • Ability to thrive in a fast-paced, mission-driven environment supporting critical healthcare data operations.