Overview
Skills
Job Details
To prepare for the transition to Florida PALM, our agency is seeking a resource knowledgeable in Snowflake
to assist in the agency's effort to modernize legacy mainframe flat-file data into Snowflake-compatible
formats. Our goal is to transform our data into actionable insights through a modern data platform, enabling
our organization to deliver true management decision support.
Our management has a vision of transforming our department into a truly data-driven organization, and we
need the right resource to help us execute that vision.
Primary Job Duties/Tasks
The submitted candidate must be able to perform the following duties and/or tasks. Duties of the selected candidate will include, but not be limited to:
- Analyze the current data environment, including data sources, pipelines, and legacy structures, to determine required transformations and optimal migration strategies into Snowflake.
- Collaborate with stakeholders and data architects to design and implement scalable, secure, and cost-effective data architecture using Snowflake.
- Re-engineer legacy reporting logic (e.g., WebFOCUS, Mainframe FOCUS, and T-SQL) by translating it into Snowflake SQL and optimizing performance.
- Develop and automate ELT/ETL data pipelines using Snowflake's native features and tools such as Snowpipe, Streams, Tasks, and Informatica, along with integration with external orchestration tools (e.g., dbt, Airflow).
- Partner with analysts and business users to build efficient, reusable data models and secure views within Snowflake that support downstream reporting (e.g., Power BI, Tableau, or Looker).
- Optimize query performance and data governance by implementing best practices in Snowflake for security, access control, caching, clustering, and cost monitoring.
- Support training, documentation, and knowledge transfer to internal teams, ensuring smooth adoption and use of Snowflake-based solutions.
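The Snowpipe/Streams/Tasks pattern named in the duties above can be sketched in Snowflake SQL. This is a minimal illustration only; the table, stream, task, and warehouse names are assumptions, not part of this posting:

```sql
-- Capture new rows landing in a staging table (object names are illustrative).
CREATE OR REPLACE STREAM stg_orders_stream ON TABLE staging.orders;

-- A scheduled task moves the captured changes into the reporting model,
-- firing only when the stream actually has data.
CREATE OR REPLACE TASK load_orders
  WAREHOUSE = transform_wh
  SCHEDULE  = '15 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('stg_orders_stream')
AS
  INSERT INTO analytics.orders
  SELECT order_id, customer_id, amount, order_ts
  FROM stg_orders_stream
  WHERE METADATA$ACTION = 'INSERT';

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK load_orders RESUME;
```

In practice a pipeline like this is usually paired with an orchestrator such as dbt or Airflow for dependency management and monitoring, as the duties list suggests.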
The candidate must have a minimum of 8 years of experience in data engineering, analytics, or cloud data
warehousing, with at least 6 years of hands-on experience designing and implementing solutions on the Snowflake Data Cloud platform.
- Expert-level SQL programming is REQUIRED for this position.
- Proven experience with Snowflake platform architecture and data warehousing concepts.
- Expertise in building efficient, secure, and scalable data models in Snowflake using views, materialized views, and secure shares.
- Strong knowledge of ELT/ETL patterns and tools (e.g., dbt, Airflow, Talend, Informatica, MS SSIS, Fivetran).
- Solid understanding of data governance, security roles, masking policies, and RBAC within Snowflake.
- Experience working with cloud storage integrations (e.g., AWS S3, Azure Blob) and external tables in Snowflake.
- Familiarity with dimensional modeling (Star/Snowflake schema), OLAP concepts, and reporting layers for BI tools.
- Strong communication and analytical skills for working with cross-functional teams and converting data requirements into technical solutions.
- Strong understanding of current data governance concepts and best practices.
- Knowledge of data migration best practices from external data sources and legacy systems (e.g., mainframe, DB2, MS SQL Server, Oracle) into Snowflake.
- Experience with data visualization tools (Power BI, Tableau, Looker) and building BI semantic models using Snowflake as a backend.
- Experience working with financial, ERP, or general ledger data in a reporting or analytics capacity.
- Exposure to mainframe systems, legacy flat files, and their integration with cloud-based platforms.
- Familiarity with Agile/SCRUM frameworks and experience working in iterative development cycles.
- Experience with Oracle Data Warehouse.
- Understanding of DevOps and CI/CD practices in data engineering (e.g., Git, dbt Cloud, or
GitHub Actions).
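As a rough illustration of the masking-policy and RBAC qualifications above, the following Snowflake SQL sketch shows both techniques together. The role, schema, table, and column names are illustrative assumptions:

```sql
-- Mask a sensitive column for everyone except an authorized role
-- (role and object names are illustrative).
CREATE OR REPLACE MASKING POLICY ssn_mask AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('HR_ADMIN') THEN val
    ELSE '***-**-****'
  END;

ALTER TABLE hr.employees MODIFY COLUMN ssn SET MASKING POLICY ssn_mask;

-- Grant read access through a role rather than to users directly (RBAC).
CREATE ROLE IF NOT EXISTS reporting_reader;
GRANT USAGE ON DATABASE analytics TO ROLE reporting_reader;
GRANT USAGE ON SCHEMA analytics.public TO ROLE reporting_reader;
GRANT SELECT ON ALL TABLES IN SCHEMA analytics.public TO ROLE reporting_reader;
```

Centralizing access through roles and column-level policies like these is what keeps secure views and downstream BI tools from leaking data they were never granted.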