Application Development Analyst

Overview

On Site
Up to $80
Contract - W2

Skills

Snowflake
SQL
ETL
RBAC
Cloud
Storage
SSIS
Azure
OLAP
DB2
Mainframe
Data
BI
Agile
Scrum
Oracle
ODW
DevOps
Access Control
Amazon S3
Amazon Web Services
Analytical Skill
Analytics
Cloud Storage
Collaboration
Communication
Continuous Delivery
Continuous Integration
Application Development
Business Intelligence
Caching
Cloud Computing
Clustering
Data Visualization
Decision Support
Dimensional Modeling
ELT
Data Architecture
Data Engineering
Data Governance
Data Migration
Data Warehouse
General Ledger
Git
GitHub
IBM DB2
Informatica
Instructional Design
Enterprise Resource Planning
Extract, Transform, Load
FOCUS
Finance
Microsoft Power BI
Microsoft SQL Server
Microsoft SSIS
Migration
Orchestration
Flat File
Knowledge Transfer
Snowflake Schema
Tableau
Talend
Transact-SQL
Legacy Systems
Management
Microsoft Azure
Reporting
Semantics
Streaming
WebFOCUS

Job Details

To prepare for the transition to Florida PALM, our agency is seeking a resource knowledgeable in Snowflake
to assist in the agency's effort to modernize legacy mainframe flat file data into Snowflake-compatible data
formats. Our goal is to transform our data into actionable insights through a modern data platform, enabling
our organization to deliver true management decision support.

Our management has the vision to transform our department into a truly data-driven organization, and we
just need the right resource to help us execute that vision.

Primary Job Duties/Tasks

The submitted candidate must be able to perform the following duties and/or tasks. Duties of the selected candidate will include, but not be limited to:

  1. Analyze the current data environment, including data sources, pipelines, and legacy
    structures, to determine required transformations and optimal migration strategies into
    Snowflake.
  2. Collaborate with stakeholders and data architects to design and implement scalable, secure,
    and cost-effective data architecture using Snowflake.
  3. Re-engineer legacy reporting logic (e.g., WebFOCUS, Mainframe FOCUS, and T-SQL) by
    translating it into Snowflake SQL and optimizing performance.
  4. Develop and automate ELT/ETL data pipelines using Snowflake's native features and tools
    such as Snowpipe, Streams, Tasks, Informatica, and integration with external orchestration
    tools (e.g., dbt, Airflow); a minimal sketch of this pattern follows this list.
  5. Partner with analysts and business users to build efficient, reusable data models and secure
    views within Snowflake that support downstream reporting (e.g., Power BI, Tableau, or
    Looker).
  6. Optimize query performance and data governance by implementing best practices in
    Snowflake for security, access control, caching, clustering, and cost monitoring.
  7. Support training, documentation, and knowledge transfer to internal teams, ensuring smooth
    adoption and use of Snowflake-based solutions.
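
For illustration only, the sketch below shows the kind of Streams/Tasks pipeline and secure view described in duties 4 and 5. It is a minimal example under assumed conditions, not the agency's implementation; every database, schema, table, warehouse, and column name (legacy_db, analytics, transform_wh, gl_reporting, etc.) is a hypothetical placeholder.

  -- Hypothetical sketch: capture changes to a staged mainframe extract with a Stream,
  -- merge them into a reporting table on a schedule with a Task, and expose a secure view.
  CREATE OR REPLACE STREAM legacy_db.staging.gl_extract_stream
    ON TABLE legacy_db.staging.raw_gl_extract;

  CREATE OR REPLACE TASK legacy_db.staging.load_gl_reporting
    WAREHOUSE = transform_wh
    SCHEDULE = '60 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('legacy_db.staging.gl_extract_stream')
  AS
  MERGE INTO analytics.finance.gl_reporting tgt
  USING legacy_db.staging.gl_extract_stream src
    ON tgt.journal_id = src.journal_id
  WHEN MATCHED THEN UPDATE SET
    tgt.account_code = src.account_code,
    tgt.amount = src.amount,
    tgt.updated_at = CURRENT_TIMESTAMP()
  WHEN NOT MATCHED THEN INSERT (journal_id, account_code, amount, updated_at)
    VALUES (src.journal_id, src.account_code, src.amount, CURRENT_TIMESTAMP());

  ALTER TASK legacy_db.staging.load_gl_reporting RESUME;

  -- Secure view that supports downstream reporting (Power BI, Tableau, or Looker).
  CREATE OR REPLACE SECURE VIEW analytics.finance.v_gl_monthly_summary AS
  SELECT account_code,
         DATE_TRUNC('month', updated_at) AS period,
         SUM(amount) AS total_amount
  FROM analytics.finance.gl_reporting
  GROUP BY 1, 2;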

Candidate must have a minimum of 8 years of experience in data engineering, analytics, or cloud data
warehousing, with at least 6 years of hands-on experience designing and implementing solutions using the Snowflake Data Cloud platform.

  1. Expert-level SQL programming is REQUIRED for this position.
  2. Proven experience with Snowflake platform architecture and data warehousing concepts.
  3. Expertise in building efficient, secure, and scalable data models in Snowflake using views, materialized views, and secure shares.
  4. Strong knowledge of ELT/ETL patterns and tools (e.g., dbt, Airflow, Talend, Informatica, MS SSIS, Fivetran).
  5. Solid understanding of data governance, security roles, masking policies, and RBAC within
    Snowflake (illustrated in the sketch following this list).
  6. Experience working with cloud storage integrations (e.g., AWS S3, Azure Blob) and external
    tables in Snowflake.
  7. Familiarity with dimensional modeling (Star/Snowflake Schema), OLAP concepts, and reporting layers for BI tools.
  8. Strong communication and analytical skills for working with cross-functional teams and converting data requirements into technical solutions.
  9. Strong understanding of current data governance concepts and best practices.
  10. Knowledge of data migration best practices from external data sources and legacy systems
    (e.g., mainframe, DB2, MS SQL Server, Oracle) into Snowflake.
  11. Experience with data visualization tools (Power BI, Tableau, Looker) and building BI semantic
    models using Snowflake as a backend.
  12. Experience working with financial, ERP, or general ledger data in a reporting or analytics
    capacity.
  13. Exposure to mainframe systems, legacy flat files, and their integration with cloud-based platforms.
  14. Familiarity with Agile/SCRUM frameworks and experience working in iterative development
    cycles.
  15. Experience with Oracle Data Warehouse.
  16. Understanding of DevOps and CI/CD practices in data engineering (e.g., Git, dbt Cloud, or
    GitHub Actions).
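
For illustration only, the sketch below shows the governance and cloud-storage patterns referenced in items 5 and 6 above: an RBAC reporting role, a dynamic masking policy, and an external table over Amazon S3. All role, policy, stage, integration, bucket, and table names are hypothetical placeholders, and the storage integration (s3_int) is assumed to already exist.

  -- Role-based access control: a read-only reporting role.
  CREATE ROLE IF NOT EXISTS reporting_reader;
  GRANT USAGE ON DATABASE analytics TO ROLE reporting_reader;
  GRANT USAGE ON SCHEMA analytics.finance TO ROLE reporting_reader;
  GRANT SELECT ON ALL VIEWS IN SCHEMA analytics.finance TO ROLE reporting_reader;

  -- Dynamic data masking: unprivileged roles see a masked account code.
  CREATE OR REPLACE MASKING POLICY analytics.finance.mask_account_code
    AS (val STRING) RETURNS STRING ->
    CASE WHEN CURRENT_ROLE() IN ('ACCOUNTADMIN', 'FINANCE_ADMIN') THEN val ELSE '*****' END;

  ALTER TABLE analytics.finance.gl_reporting
    MODIFY COLUMN account_code SET MASKING POLICY analytics.finance.mask_account_code;

  -- External stage and external table over legacy flat files staged in Amazon S3.
  CREATE OR REPLACE STAGE legacy_db.staging.mainframe_extracts
    URL = 's3://example-bucket/mainframe-extracts/'
    STORAGE_INTEGRATION = s3_int
    FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = '|' SKIP_HEADER = 1);

  CREATE OR REPLACE EXTERNAL TABLE legacy_db.staging.ext_gl_extract
    WITH LOCATION = @legacy_db.staging.mainframe_extracts
    FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = '|' SKIP_HEADER = 1)
    AUTO_REFRESH = FALSE;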

About MVP Consulting Plus