Senior Lead Data Engineer (Remote)

Remote • Posted 2 hours ago • Updated 2 hours ago
  • Contract: Corp-to-Corp, W2, or Independent
  • No Travel Required
  • Compensation: Depends on Experience


Job Details

Skills

  • Snowflake, Databricks, Apache Iceberg, Apache Spark, SQL

Summary


Job Title: Senior Lead Data Engineer

Location: Remote

Job Summary

We are seeking a highly skilled Senior Lead Data Engineer with strong experience in modern data platforms including Snowflake, Databricks, Apache Iceberg, and Apache Spark. The ideal candidate will lead the design, development, and optimization of scalable data pipelines and analytics platforms while ensuring high performance for large-scale SQL workloads.

This role requires strong expertise in data architecture, performance tuning, and big data technologies to support enterprise-level analytics and data-driven decision-making.

Key Responsibilities

  • Design and implement scalable data pipelines and data lakehouse architectures using Snowflake, Databricks, and Apache Iceberg.

  • Lead the development and optimization of Spark-based ETL/ELT pipelines for large-scale data processing.

  • Optimize complex SQL workloads for performance, cost efficiency, and scalability.

  • Build and maintain high-performance data models supporting analytics, reporting, and machine learning workloads.

  • Implement data governance, security, and data quality frameworks.

  • Collaborate with data scientists, analysts, and business stakeholders to deliver reliable data solutions.

  • Tune the performance of distributed processing workloads on Spark and Databricks.

  • Guide engineering teams on best practices for data architecture, pipeline orchestration, and cloud data platforms.

  • Monitor and troubleshoot data pipeline performance and reliability issues.

  • Mentor junior data engineers and lead technical design discussions.

Required Skills & Qualifications

  • 10+ years of experience in Data Engineering or Big Data Engineering.

  • Strong expertise with Snowflake and the Databricks Lakehouse Platform.

  • Hands-on experience with Apache Spark (PySpark / Spark SQL).

  • Experience with Apache Iceberg or similar modern table formats.

  • Advanced knowledge of SQL performance tuning and query optimization.

  • Experience designing data lake / lakehouse architectures.

  • Strong programming experience in Python, Scala, or Java.

  • Experience with workflow orchestration tools (Airflow, Prefect, or similar).

  • Knowledge of cloud platforms such as Amazon Web Services, Microsoft Azure, or Google Cloud.

  • Strong understanding of data modeling, partitioning, indexing, and storage optimization.

Preferred Qualifications

  • Experience with data lakehouse architecture and open table formats.

  • Knowledge of streaming data pipelines using Kafka or Spark Streaming.

  • Experience with CI/CD pipelines and infrastructure-as-code tools.

  • Strong leadership and mentoring experience.

  • Experience supporting enterprise-scale analytics platforms.

Nice to Have

  • Experience with data governance tools.

  • Knowledge of machine learning data pipelines.

  • Certifications in cloud platforms or data engineering technologies.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: 10513292
  • Position Id: 71860-12895-
