Snowflake Data Engineer (Snowflake Specialist) / Cloud Engineer - Local to CO

Hybrid in Denver, CO, US • Posted 1 day ago • Updated 1 day ago
Contract: Corp-to-Corp, Independent, or W2
No travel required
Hybrid
Compensation: Depends on experience


Job Details

Skills

  • Operations
  • Administration
  • Data Engineering
  • Data Transformation
  • ETL
  • Snowflake
  • Snowflake Certification

Summary

Title: Cloud Engineer-12380132 Data Ops Snowflake
Location: Denver, CO 80203
Duration: 12+ months

Solicitation: Senior Operations & Data Engineer (Snowflake Specialist); Cloud Engineer

Security Clearance: OIT, FTI (IRS Pub 1075), and CJIS (Fingerprint-based)
1. Position Objective
The Office of Information Technology (OIT) is seeking a highly specialized Senior Operations and Data Engineer to serve as the primary administrator and technical lead for our Snowflake ecosystem. This role is a hybrid of platform operations and high-level data engineering, ensuring that sensitive state and federal data (FTI/CJIS) is managed within a secure, high-uptime, and cost-effective environment.
2. Preferred Qualifications
To be considered for this role, candidates should provide proof of the following:
  • Active Snowflake Certification
  • Background Clearance Readiness: Eligibility to pass OIT, FTI (Federal Tax Information), and CJIS (Criminal Justice Information Services) background checks.

3. Key Responsibilities
Platform Operations & Administration
  • Snowflake Mastery: Act as the lead administrator for Snowflake environments; manage platform uptime, vendor escalations, and patch/versioning communications.
  • Environment Provisioning: Configure Snowflake, including complex RBAC (Role-Based Access Control) and security permissions.
  • Governance & CI/CD: Implement and manage DataOps and CI/CD pipelines to automate deployments for the broader implementation team.
  • Financial Stewardship: Configure cost-management features such as Snowflake resource monitors, budgets, and consumption tracking; consult on chargeback models.
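The cost-management and RBAC duties above can be sketched in Snowflake SQL. This is a minimal illustration, not the OIT configuration: the monitor, warehouse, database, schema, and role names are hypothetical placeholders.

```sql
-- Cap daily credit spend on a warehouse (hypothetical names throughout).
CREATE RESOURCE MONITOR daily_ops_monitor
  WITH CREDIT_QUOTA = 100
  FREQUENCY = DAILY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 75 PERCENT DO NOTIFY    -- warn stewards early
           ON 100 PERCENT DO SUSPEND; -- hard stop at quota

ALTER WAREHOUSE analytics_wh SET RESOURCE_MONITOR = daily_ops_monitor;

-- RBAC sketch: a read-only analyst role scoped to one curated schema.
CREATE ROLE IF NOT EXISTS analyst_ro;
GRANT USAGE  ON DATABASE state_data            TO ROLE analyst_ro;
GRANT USAGE  ON SCHEMA   state_data.curated    TO ROLE analyst_ro;
GRANT SELECT ON ALL TABLES IN SCHEMA state_data.curated TO ROLE analyst_ro;
```

Per-warehouse monitors like this also make chargeback reporting straightforward, since consumption is already partitioned along the lines a cost model would bill.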
Data Engineering & Transformation
  • Pipeline Architecture: Develop robust ETL/ELT pipelines to ingest data from transactional (Line of Business) systems into the analytical Snowflake environment.
  • Analytical Modeling: Translate Data Architect visions into technical reality by building complex transformations and target schemas.
  • Quality Management: Design and deploy automated data cleansing and quality-check pipelines.
  • Performance Engineering: Optimize data flows for specific latency and frequency requirements while maintaining credit efficiency.
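A quality-check step of the kind described above might be expressed as a Snowflake SQL view that isolates failing rows before promotion to the curated layer. The table, schema, and column names here are illustrative assumptions only:

```sql
-- Hypothetical raw/curated objects; adapt names to the real environment.
-- Surfaces rows that fail basic completeness and validity checks.
CREATE OR REPLACE VIEW curated.claims_quality_failures AS
SELECT *,
       CASE
         WHEN claim_id   IS NULL           THEN 'missing_claim_id'
         WHEN filed_date > CURRENT_DATE()  THEN 'future_filed_date'
         WHEN amount     < 0               THEN 'negative_amount'
       END AS failure_reason
FROM raw.claims
WHERE claim_id IS NULL
   OR filed_date > CURRENT_DATE()
   OR amount < 0;
```

Downstream loads can then select only rows absent from this view, and the view itself doubles as an audit trail for the cleansing pipeline.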

4. Primary Deliverables
  • Architectural Contributions: Design reviews, architectural plans, and scope documents.
  • Deployment Assets: New account/environment deployments, security schemas, and permission assignments.
  • Engineering Assets: Comprehensive ETL pipeline design documents, mapping documents, and production-ready pipelines.
  • Product backlog and support-ticket management; performance reports.
  • Weekly status reports.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: 10499223
  • Position Id: 8927605
