FYI: Client does not sponsor. This was previously a contract role, but when I pushed, they said perm only.
Senior Data Engineer
Department: Information Technology
Location: Pittsburg/Kansas City, KS (no relocation assistance is offered; must be local)
Salary Range: $115K-$155K
Position Overview
We are seeking a Senior Data Engineer to design, build, and optimize the client's modern data platform.
This role is responsible for developing robust data pipelines, implementing the Medallion Architecture (Bronze, Silver, Gold), and driving adoption of Snowflake data services.
The ideal candidate is hands-on, detail-oriented, and thrives on solving complex data challenges in a fast-paced environment.
Key Responsibilities
- Design, develop, and maintain scalable ETL/ELT pipelines using Fivetran and Snowflake.
- Implement and maintain Medallion Architecture layers (Bronze, Silver, Gold) for data curation and analytics readiness.
- Develop data models (conceptual, logical, and physical) supporting analytics, operational systems, and business reporting.
- Ensure data quality, lineage, and governance across multiple data sources and domains.
- Collaborate with analysts, application teams, and business stakeholders to understand data requirements and optimize data flows.
- Optimize Snowflake performance through warehouse sizing, query tuning, and storage management.
- Support the integration of structured and unstructured data from diverse systems (ERP, CRM, Shopify, SQL Server, Twilio, etc.).
- Contribute to the data platform roadmap, architecture standards, and automation of data engineering processes.
- Provide technical mentorship to junior data team members and help establish data engineering best practices.
Required Qualifications
- Bachelor’s degree in Computer Science, Information Systems, or a related field (or equivalent experience).
- 7+ years of experience in data engineering or similar roles.
- Expertise with Snowflake, including schema design, Snowpipe, Streams/Tasks, and SnowSQL.
- Experience with Azure cloud services.
- Strong proficiency in data modeling, SQL, Python, and ETL/ELT best practices.
- Hands-on experience implementing Medallion Architecture or equivalent layered data lake design.
- Familiarity with data governance, security, and CI/CD practices for data workflows.
- Excellent problem-solving skills and ability to work collaboratively across technical and business teams.
- Experience with Power BI or other BI/reporting platforms.
- Experience with API integrations, REST, and JSON data formats.
Preferred Qualifications
- Exposure to machine learning pipelines or advanced analytics frameworks.
- Experience in a manufacturing, logistics, or e-commerce environment.