Senior Data Engineer

Remote • Posted 30+ days ago • Updated 4 days ago
Full Time
Remote
$100,000 - $120,000/yr

Job Details

Skills

  • Snowflake
  • Spark
  • Azure
  • Data Factory
  • ETL
  • Kafka

Summary

Senior Data Engineer (Snowflake)


Join an amazing company where you can work with cutting-edge technologies and platforms. Give your career an Infinite edge in a stimulating environment and a global work culture. Be part of an organization that celebrates integrity, innovation, collaboration, teamwork, and passion; a culture where every employee is a leader, delivering ideas that make a difference in the world.


In the Database Analysis - Sr. Professional II role, responsibilities include, but are not limited to:

  • Architect and build enterprise-grade data solutions leveraging Snowflake as the core data platform.
  • Lead the implementation of scalable data models, robust data pipelines, and governed Lakehouse architectures supporting analytics and machine learning.
  • Design and optimize Snowflake schemas using best practices in secure data sharing, micro-partitioning, storage strategies, and compute tuning.
  • Build and manage real-time ingestion and event-driven architectures using Kafka integrated with Snowflake.
  • Implement and enforce enterprise data governance, privacy, data quality, lineage, and strong access controls (RBAC) in Snowflake environments.
  • Establish standards for Snowflake performance tuning, cost optimization, workload management, and query efficiency using platform metrics.
  • Integrate Snowflake with APIs, cloud services, microservices, and ingestion/orchestration tools such as Databricks, ADF, Spark, or Kafka Connect.
  • Solve complex issues related to ingestion failures, pipeline orchestration, schema evolution, and transformation logic across platforms.
  • Mentor developers and data engineers on Snowflake usage patterns, modeling strategies, and best practices for building reusable data products.


In addition to the qualifications listed below, the ideal candidate will demonstrate the following traits:

  • Strong analytical mindset with ability to optimize systems at scale.
  • Passion for platform maturity, best practices, and mentoring others.
  • Ability to troubleshoot complex data challenges across distributed systems.
  • Clear communication when collaborating with engineering, architecture, and business teams.
  • Ownership-driven and proactive approach to cost efficiency, data quality, and platform security.


You must possess the minimum qualifications below to be considered for this position. Preferred qualifications are in addition to the minimum requirements and are a plus factor in identifying top candidates.


Minimum Qualifications:

  • Bachelor’s degree in Computer Science, Data Engineering, Information Systems, Industrial Engineering, or related field.
  • 7+ years in data engineering, including hands-on experience designing and optimizing Snowflake environments.
  • Expert-level SQL skills, with strong knowledge of micro-partitions, warehouses, clustering, caching, and query tuning in Snowflake.
  • Proven experience building ETL/ELT pipelines specifically optimized for Snowflake workloads.
  • Experience using Kafka for real-time or event-driven ingestion into Snowflake or dependent systems.
  • Proficiency in Python, Scala, C#, or Java for transformations, orchestration logic, and automation.
  • Solid knowledge of data warehousing, dimensional modeling, and/or Lakehouse architectures.
  • Strong Snowflake security expertise (RBAC, encryption, masking, access policies) and governance frameworks.
  • Ability to independently lead design decisions and drive Snowflake platform adoption.
  • Strong English verbal and written communication skills, including executive presentation capabilities.


Preferred Qualifications:

  • Experience with advanced Snowflake capabilities: Snowpipe, Streams, Tasks, Replication, Time Travel, Search Optimization Service.
  • Background with complementary tools such as Spark, Kafka Connect, Databricks, Azure Data Factory, Airflow, or Prefect.
  • Experience with Snowflake cost governance, workload isolation, marketplace integration, or cross-cloud data sharing.
  • Familiarity with Agile/Scrum methodologies and ability to collaborate in cross-functional engineering teams.


Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: 10199915
  • Position Id: 8847747
