*** WORK MODE: HYBRID – LOCAL CANDIDATES PREFERRED ***
While OIT supports telecommuting/remote work, the contractor is expected to perform the work within the State of Colorado.
*********************************************************************************************************************************
Title : Cloud Engineer - 12380132 - Data Ops (Snowflake)
Location : Denver, CO 80203
Duration : 12+ Months
Job Type : Contract
Description :
Solicitation: Senior Operations & Data Engineer (Snowflake Specialist); Cloud Engineer
Security Clearance: OIT, FTI (IRS Pub 1075), and CJIS (Fingerprint-based)
1. Position Objective
The Office of Information Technology (OIT) is seeking a highly specialized Senior Operations and Data Engineer to serve as the primary administrator and technical lead for our Snowflake ecosystem. This role is a hybrid of platform operations and high-level data engineering, ensuring that sensitive state and federal data (FTI/CJIS) is managed within a secure, high-uptime, and cost-effective environment.
2. Preferred Qualifications
To be considered for this role, candidates should provide proof of the following:
- Active Snowflake Certification
- Background Clearance Readiness: Eligibility to pass OIT, FTI (Federal Tax Information), and CJIS (Criminal Justice Information Services) background checks.
3. Key Responsibilities
Platform Operations & Administration
- Snowflake Mastery: Act as the lead administrator for Snowflake environments; manage platform uptime, vendor escalations, and patch/versioning communications.
- Environment Provisioning: Configure Snowflake, including complex RBAC (Role-Based Access Control) and security permissions.
- Governance & CI/CD: Implement and manage DataOps and CI/CD pipelines to automate deployments for the broader implementation team.
- Financial Stewardship: Configure cost-management features such as Snowflake resource monitors, budgets, and consumption tracking; consult on chargeback models.
Data Engineering & Transformation
- Pipeline Architecture: Develop robust ETL/ELT pipelines to ingest data from transactional Line-of-Business systems into the analytical Snowflake environment.
- Analytical Modeling: Translate Data Architect visions into technical reality by building complex transformations and target schemas.
- Quality Management: Design and deploy automated data cleansing and quality-check pipelines.
- Performance Engineering: Optimize data flows for specific latency and frequency requirements while maintaining credit efficiency.
Skills :
Primary Deliverables
- Architectural Contributions: Design reviews, Architectural Plans, and Scope Documents.
- Deployment Assets: New account/environment deployments, security schemas, and permission assignments.
- Engineering Assets: Comprehensive ETL Pipeline Design Documents, Mapping Documents, and production-ready Pipelines.
- Product Backlog and Support Ticket Management; Performance Reports
- Weekly Status Reports
****************************************************************************************************
If interested, please send a reply with your updated resume and the following details:
Full Name:
Phone Number:
Email ID:
Current Location:
Work Authorization:
Expected Rate/hr :
Availability:
Relocation:
Last 5 digits of SSN:
****************************************************************************************************
Awaiting your reply...