Snowflake Data Developer

Chicago, IL, US • Posted 6 hours ago • Updated 6 hours ago
Contract (Independent, W2, Corp-to-Corp)
No Travel Required
On-site
Depends on Experience

Job Details

Skills

  • Snowflake (advanced SQL, stored procedures, performance optimization)
  • Data ingestion patterns (bulk loading, micro-batching, streaming)
  • AWS (S3, Lambda, CloudWatch)
  • Azure (Data Factory, Event Hubs, Blob Storage, Azure Functions)
  • Python (data processing, API integrations, automation scripts)
  • Data modeling and dimensional modeling
  • Data security, governance, and compliance
  • SnowPro certifications (preferred)
  • Terraform, CloudFormation, Docker, Kubernetes (preferred)
  • Data visualization / BI tools, data quality frameworks (preferred)

Summary

Local candidates only.

A LinkedIn profile ID is required.

 

Snowflake Data Developer, Chicago, IL

Required Qualifications:

•           Bachelor’s degree in Computer Science, Information Systems, or related technical field

•           8+ years of experience in data engineering or related roles

•           Proven track record of delivering production-ready data solutions at scale

•           Experience with version control systems (Git) and collaborative development practices

•           5+ years of hands-on experience with Snowflake data platform including advanced SQL, stored procedures, and performance optimization

•           Strong experience with data ingestion patterns including bulk loading, micro-batching, and streaming data processing

•           Proficiency with AWS services such as S3, Lambda, and CloudWatch

•           Experience with Azure data services including Data Factory, Event Hubs, Blob Storage, and Azure Functions

•           Solid Python programming skills for data processing, API integrations, and automation scripts

•           Experience with data modeling concepts and dimensional modeling techniques

•           Understanding of data security, governance, and compliance best practices

Preferred Qualifications:

•           Snowflake certifications (SnowPro Core or Advanced certifications)

•           Experience with Infrastructure as Code tools (Terraform, CloudFormation)

•           Knowledge of containerization technologies (Docker, Kubernetes)

•           Familiarity with data visualization tools and business intelligence platforms

•           Experience with data quality frameworks and monitoring tools

We are seeking an experienced Snowflake Developer to join our data engineering team. The ideal candidate will have strong expertise in Snowflake data platform capabilities, with proven experience in designing and implementing robust data ingestion solutions across both file-based and streaming architectures on AWS and Azure cloud platforms.

Job Duties:

•           Design, develop, and maintain scalable data pipelines using Snowflake as the core data warehouse platform

•           Build and optimize data ingestion processes for both batch file-based loads and real-time streaming data from various sources

•           Implement data transformation logic using Snowflake SQL, stored procedures, and Python integration

•           Collaborate with data architects and analysts to understand business requirements and translate them into technical solutions

•           Monitor and troubleshoot data pipeline performance, ensuring high availability and data quality

•           Develop and maintain documentation for data processes, data models, and system architecture

•           Work closely with DevOps teams to implement CI/CD practices for data pipeline deployments
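As an illustrative sketch only (not part of the posting's requirements), the batch file-based loads described above typically come down to composing and running a Snowflake COPY INTO statement against an external stage. The table name, stage name, and file format name below are hypothetical:

```python
def build_copy_into(table: str, stage: str, file_format: str = "CSV_FMT") -> str:
    """Compose a Snowflake COPY INTO statement for a bulk load
    from a named external stage (e.g., one backed by S3)."""
    return (
        f"COPY INTO {table} "
        f"FROM @{stage} "
        f"FILE_FORMAT = (FORMAT_NAME = '{file_format}') "
        f"ON_ERROR = 'ABORT_STATEMENT'"
    )

# In practice this string would be executed via a Snowflake cursor
# (e.g., snowflake-connector-python); here we only build the SQL.
sql = build_copy_into("RAW.EVENTS", "S3_LANDING_STAGE")
print(sql)
```

In a production pipeline, the same statement would usually be scheduled via a Snowflake task or orchestrator and paired with monitoring of the load history for failed files.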

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: 91171196
  • Position Id: Mike245

Company Info

About Lifelink Healthtech LLC

Lifelink Healthtech is a global firm bringing on-demand talent, strategic workforce management, and IT consulting to Fortune 500 giants and growing businesses alike.

Our expert team works closely with you, helping you embrace change, adapt and scale through a highly tailored, strategy-based consulting approach.

