Sr. Snowflake Data Engineer
Agency: Ohio Department of Medicaid (ODM)
Location: 50 W. Town Street, Columbus, Ohio 43215
On-site: 5 days a week (No remote option)

Overview:
The Ohio Department of Medicaid is seeking an experienced Technical Specialist 4 (TS4) to support its Enterprise Data Warehouse (EDW) operations. This critical role involves migrating data from ODM's current Big Data environment to the Snowflake environment, running complex ELT jobs, and validating the quality of data ingested from various sources.
You will play a key role in supporting ODM's strategic initiatives by ensuring high-quality, efficient, and secure data migration and integration operations across the Snowflake ecosystem.
Responsibilities:
- Participate in team stand-ups, design reviews, and sprint planning.
- Provide technical support and implementation expertise in the Snowflake environment.
- Ingest data from Big Data (Hadoop/IOP) to Snowflake using industry best practices.
- Develop Snowpark features and build pipelines using Python (a minimal sketch follows this list).
- Interface with and contribute to open-source Snowflake libraries (e.g., the Python Connector).
- Manage Snowflake environments including role-based access, virtual warehouses, tasks, streams, and Snowpipe.
- Perform performance tuning, query optimization, and monitoring within Snowflake.
- Maintain technical documentation and ensure compliance with data governance/security policies.
- Analyze, profile, and ensure the quality of ingested data using Hadoop ecosystem tools.
- Develop ETL/ELT workflows using PySpark, Hive, Impala, and UNIX shell scripting.
- Conduct unit testing, mock data creation, and performance tuning.
- Update documentation including Run Books and Deployment Plans.
- Monitor production data loads, troubleshoot and track issues, and ensure successful load operations.
- Conduct code reviews, develop reusable frameworks, and support code deployment activities.
- Collaborate with Admin teams (Snowflake, Hadoop, SAS, ETL) for deployment and maintenance.
- Participate in functional and technical meetings to continuously enhance skill sets.
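
As an illustration of the Snowpark pipeline work described in the responsibilities above, here is a minimal Python sketch. The connection parameters, role, warehouse, database, schema, and the CLAIMS_RAW/CLAIMS_SUMMARY table names are hypothetical placeholders, not actual ODM objects.

```python
# Minimal Snowpark sketch, for illustration only. All connection parameters
# and object names below are hypothetical placeholders.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

connection_parameters = {
    "account": "<account_identifier>",   # placeholder
    "user": "<user>",                    # placeholder
    "password": "<password>",            # use a secrets manager in practice
    "role": "DATA_ENGINEER",             # hypothetical role
    "warehouse": "ELT_WH",               # hypothetical virtual warehouse
    "database": "EDW",                   # hypothetical database
    "schema": "STAGING",                 # hypothetical schema
}

session = Session.builder.configs(connection_parameters).create()

# Read a staged table, apply a simple transformation, and persist the result.
claims = session.table("CLAIMS_RAW")
summary = (
    claims
    .filter(col("CLAIM_STATUS") == "PAID")
    .group_by("PROVIDER_ID")
    .agg(sum_(col("PAID_AMOUNT")).alias("TOTAL_PAID"))
)
summary.write.mode("overwrite").save_as_table("CLAIMS_SUMMARY")

session.close()
```

In practice, a pipeline like this would typically be scheduled as a Snowflake task or through an external orchestrator, with credentials retrieved from a secrets manager rather than hard-coded.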
Qualifications:
- 4-6 years of experience with Cloud Databases, Snowflake, and Data Warehousing.
- 2-3 years of hands-on Snowflake platform experience including Snowpipe, Snowpark, and data migration from Big Data environments (a migration sketch follows this list).
- Expertise in SnowSQL, PL/SQL, and SQL-, Python-, and Java-based procedures in Snowflake.
- Experience in performance tuning, monitoring, and data security in Snowflake.
- Knowledge of AWS platform services.
- 8+ years of experience in Big Data technologies (Hadoop, Sqoop, PySpark, Hive, Impala, Kafka, etc.).
- Extensive experience with UNIX shell scripting, Oracle SQL, HDFS, and StreamSets.
- Strong ETL/ELT development background, especially in data integration and data transformation logic.
- Experience handling PHI/PII data and adhering to data governance policies.
- Excellent written and verbal communication skills.
- Familiarity with Agile and Waterfall methodologies.
- Bachelor's degree in Computer Science, Information Systems, or related field (or equivalent experience).
- Snowflake certification highly desirable.
- Experience with Snowpipe, Streams, Tasks, and data masking in Snowflake.
- Security experience with SAML, OAuth, Kerberos, etc.
- Experience with System Disaster Recovery Plans for Snowflake.
- Leadership experience and ability to work both independently and in teams.
- Familiarity with tools like Visio, PowerPoint, Excel, Word.
- Strong analytical skills and ability to solve complex technical challenges.
- Ability to identify patterns, drive continuous improvement, and innovate new solutions.
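
As a rough illustration of the Big Data-to-Snowflake migration experience called for above, the following PySpark sketch copies a Hive table into Snowflake using the Spark-Snowflake connector. The Hive table, Snowflake objects, and credentials are hypothetical placeholders, and the connector is assumed to be available on the Spark classpath.

```python
# Minimal PySpark-to-Snowflake migration sketch, for illustration only.
# The Hive table, Snowflake objects, and credentials are hypothetical.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("hive-to-snowflake-migration")
    .enableHiveSupport()
    .getOrCreate()
)

# Read a source table from the Hadoop/Hive environment.
source_df = spark.table("edw_raw.member_enrollment")   # hypothetical Hive table

sf_options = {
    "sfURL": "<account>.snowflakecomputing.com",        # placeholder
    "sfUser": "<user>",                                 # placeholder
    "sfPassword": "<password>",                         # use a secrets manager
    "sfDatabase": "EDW",                                # hypothetical
    "sfSchema": "STAGING",                              # hypothetical
    "sfWarehouse": "MIGRATION_WH",                      # hypothetical
}

# Write the data into Snowflake via the Spark connector.
(
    source_df.write
    .format("net.snowflake.spark.snowflake")
    .options(**sf_options)
    .option("dbtable", "MEMBER_ENROLLMENT")
    .mode("overwrite")
    .save()
)
```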
| Skill | Required / Desired | Years of Experience |
| --- | --- | --- |
| Experience in Hadoop, MapReduce, Sqoop, PySpark, Spark, HDFS, Hive, Impala, StreamSets, Kudu, Oozie, Hue, Kafka, Yarn, Python, Flume, Zookeeper, Sentry | Required | 9 |
| Strong development experience in creating Sqoop scripts, PySpark programs, HDFS commands, and HDFS file formats | Required | 9 |
| Writing Hadoop/Hive/Impala scripts for gathering stats on tables after data loads (see the sketch below this table) | Required | 9 |
| Hands-on experience with Cloud databases | Required | 6 |
| Hands-on data migration experience from the Big Data environment to the Snowflake environment | Required | 3 |
| Hands-on experience with the Snowflake platform, along with Snowpipe and Snowpark | Required | 3 |
| BS/BA degree or combination of education & experience | Required | |
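
The post-load stats requirement in the table above can be illustrated with a short PySpark sketch. The Hive table name is a hypothetical placeholder, and the checks shown (row count, per-column null counts, ANALYZE TABLE) are only one possible validation approach, not ODM's prescribed process.

```python
# Minimal post-load validation sketch, for illustration only; the table
# name is a hypothetical placeholder.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("post-load-table-stats")
    .enableHiveSupport()
    .getOrCreate()
)

table_name = "edw_raw.claims"          # hypothetical Hive table
df = spark.table(table_name)

# Row count for the freshly loaded table.
print(f"{table_name} row count: {df.count()}")

# Null counts per column as a quick data-quality profile.
null_counts = df.select(
    [F.sum(F.col(c).isNull().cast("int")).alias(c) for c in df.columns]
)
null_counts.show(truncate=False)

# Refresh optimizer statistics in the Hive metastore.
spark.sql(f"ANALYZE TABLE {table_name} COMPUTE STATISTICS")
```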