Snowflake Lead Engineer

Overview

Hybrid
$60 - $70
Contract - W2
Contract - 6 Month(s)

Skills

Conflict Resolution
Change Data Capture
Cloud Computing
Code Review
Collaboration
Continuous Delivery
Analytics
Apache Parquet
Auditing
Business Intelligence
Data Security
Data Masking
Data Migration
Data Modeling
Data Profiling
Data Quality
Clarity
DevOps
Extract
Transform
Load
Finance
Flat File
Functional Requirements
Data Visualization
Data Warehouse
Insurance
Internet Explorer
JSON
Database Design
Design Architecture
ELT
GitHub
Information Engineering
JavaScript
Continuous Integration
Data Architecture
Data Integration
Data Loading
Database
Access Control
Kanban
Meta-data Management
Microsoft BI
Microsoft SQL Server
Agile
Amazon EC2
Amazon S3
Amazon Web Services
Problem Solving
Replication
SQL
Specification Gathering
Stored Procedures
Systems Design
Tableau
Talend
Transact-SQL
UDF
Unstructured Data
User Stories
RDBMS
Scrum
Snowflake Schema
Software Development Methodology
Storage
CPU
Jenkins
Microsoft SSIS
Microsoft SSRS
Nexus
Performance Tuning
Version Control
XML

Job Details

Job ID: H#12678-1 - Snowflake Lead Engineer

PLEASE NOTE: This is a 3-6 month contract and must meet the Client's full-time conversion policies. Candidates who depend on work-permit sponsorship now or at any time in the future (i.e., H-1B, OPT, CPT, etc.) do not meet the Client's requirements for this opening.

**MUST BE HYBRID in Hartford, CT; Charlotte, NC; Chicago, IL; Columbus, OH; or Danbury, CT.** This role has a hybrid work arrangement, with the expectation of working in an office location 3 days a week (Tuesday through Thursday).

**MUST BE W2; No Corp-to-Corp**

**MUST HAVES:**

1. 3-5 years of Snowflake experience (required)

2. 5+ years of experience in a lead/senior role (required)

3. Experience leading a team (required)

4. Very strong hands-on coding skills in SQL (required)

5. Snowflake certifications (required)

The Enterprise Data Services department's IT team supporting Global Specialty is seeking a hands-on Senior Staff Data Engineer to enhance and support its data assets on the Snowflake and SQL Server platforms. We are looking for a talented professional with a proven track record of engineering ELT development and integration using Snowflake. Our ideal candidate will leverage deep technical expertise and problem-solving skills to deliver investment, maintenance, and enhancement projects within the Data & Analytics value stream. This will be a contract-to-hire role.

The Senior Staff Data Engineer will be proficient in data platform architecture, design, data curation, and multi-dimensional models, with a strong understanding of data architecture and the principles of ETL and data warehousing. Responsibilities also include technical delivery review and resolution of architecture issues on the AWS Snowflake platform.

Responsibilities:

  • Demonstrate expertise in Snowflake's cloud-native architecture and Microsoft SQL Server technology.
  • Create, troubleshoot, and enhance complex code in Snowflake and SQL Server.
  • Build data pipelines (ELT) with the Snowflake cloud data platform using AWS compute (EC2) and storage (S3) layers.
  • Build the Snowflake SQL data warehouse using virtual warehouses according to best practices.
  • Hands-on experience working with Talend or SSIS as an ELT tool for Snowflake and SQL Server data integration.
  • Implement and leverage materialized views, data sharing, the clone feature, and dynamic data masking.
  • Have a solid understanding of delivery methodology (SDLC) and lead teams in implementing the solution according to the design/architecture.
  • Hands-on experience with SnowSQL, stored procedures, UDFs using JavaScript, Snowpipe, and other Snowflake utilities.
  • Experience migrating data from an RDBMS to the Snowflake cloud data warehouse.
  • Experience in data security and data access controls and design.
  • Design the data loading and unloading activities to/from Snowflake.
  • Experience working with data lakes, loading disparate data sources: structured and semi-structured data (flat files, XML, JSON, Parquet) as well as unstructured data.
  • Experience in building data pipelines using Talend and automation of data ingestion including change data capture (CDC).
  • Integrate data pipelines with a source control repository and build CI/CD pipelines and DevOps practices.
  • Experience in Performance Tuning of Talend / SQL Agent Jobs to reduce the CPU time/load timing.
  • Deep knowledge of the Snowflake licensing model and its continuous data protection life cycle.
  • Architect reusable Talend components, such as job audit and reconciliation.
  • Research and evaluate alternative solutions and recommend the most efficient and cost-effective solution for the systems design.
  • Support and quickly respond to Production issues and requirements clarifications.
  • Coordinate as needed between multiple disciplines, such as architects, business analysts, Scrum Masters, and developers, to get the technical clarity that leads to the design, development, and implementation of the business solution.
  • Oversight of quality and completeness of detailed technical specifications, solution designs, and code reviews as well as adherence to the non-functional requirements.
  • Experience delivering technical solutions in an iterative, agile environment (Scrum/Kanban).
  • Participate as active agile team member to help drive feature refinement, user story completion, code review, etc.
  • Identify, document, and communicate technical risks, issues and alternative technical solutions discovered during project.
  • Collaborate with a high-performing, forward-focused team, the Release Train Engineer, Product Owner(s), and business stakeholders.
  • Ability to work on innovative and new projects with a "fail-fast" approach to provide optimal solutions that bring the most value to the business.
  • Passion for learning new skills and the ability to adjust priorities on multiple projects based on changing demands/needs.
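The Snowflake features called out in the responsibilities above (staged loads from S3, Snowpipe ingestion, dynamic data masking, zero-copy cloning) can be sketched in a few Snowflake SQL statements. This is an illustrative sketch only: all database, stage, table, bucket, and role names are placeholder assumptions, not client specifics.

```sql
-- External stage over S3 (placeholder bucket), used for bulk loads
CREATE STAGE raw_db.public.claims_stage
  URL = 's3://example-bucket/claims/'
  FILE_FORMAT = (TYPE = PARQUET);

-- Continuous ingestion via Snowpipe as new files land in the stage
CREATE PIPE raw_db.public.claims_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw_db.public.claims_raw
  FROM @raw_db.public.claims_stage
  FILE_FORMAT = (TYPE = PARQUET)
  MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;

-- Dynamic data masking: unmask SSNs only for a privileged (hypothetical) role
CREATE MASKING POLICY mask_ssn AS (val STRING) RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() = 'PII_ADMIN' THEN val ELSE 'XXX-XX-XXXX' END;

ALTER TABLE raw_db.public.claims_raw
  MODIFY COLUMN ssn SET MASKING POLICY mask_ssn;

-- Zero-copy clone of the whole database for a test environment
CREATE DATABASE raw_db_test CLONE raw_db;
```

The clone at the end shares the source database's micro-partitions rather than copying data, which is why it is cheap to create test environments this way.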

Qualifications & Key Skills:

  • Bachelor's degree
  • 5+ years with Snowflake on AWS and Talend Data Integration or other BI/ETL tools.
  • 7+ years of hands-on experience in Data warehouse and Data Integration (ELT/ETL)
  • 7+ years of proficiency in ETL with Microsoft Business Intelligence tools (SSIS, SSRS) and others.
  • 2+ years of hands-on experience with Data Visualization (preferably Tableau).
  • Strong background and problem-solving skills with Enterprise Data warehouse, ETL/ELT development, Database Replication, metadata management and data quality.
  • Hands-on experience in all phases of SDLC developing ETL solutions using T-SQL code, Stored Procedures, SSIS.
  • Strong data warehouse applications knowledge, preferably in the financial/insurance domain, is required.
  • Knowledge of version control, CI/CD, and DevOps tools such as GitHub, Jenkins, Nexus, and uDeploy.
  • Knowledge of Data Profiling, Data Modeling and Database design is key to this role.