Senior Data Engineer (Lakehouse Specialist)
12+ Months Contract
Remote (CST)
Visa: GC (Green Card)
Key Skills:
- Databricks, Snowflake, AWS, Google Cloud Platform, SQL, Python
- Lakehouse platform development
- Architecting and maintaining Lakehouse platforms with Delta Lake, Apache Iceberg, or Hudi
- Spark: developing and optimizing batch and streaming ETL pipelines
- Data Lakehouse design patterns and modern data architectures
Job Description:
We are seeking a Senior Data Engineer with strong experience building and maintaining Data Lakehouse architectures. In this role, you'll design scalable data pipelines, optimize data models, and ensure high-performance data availability across structured and unstructured sources. You'll work closely with data analysts, data scientists, and business stakeholders to deliver reliable, high-quality datasets that support analytics, machine learning, and real-time decision-making.
Key Responsibilities:
- Architect and maintain Lakehouse platforms using tools like Delta Lake, Apache Iceberg, or Hudi
- Develop and optimize batch and streaming ETL pipelines
- Implement data quality, governance, and security best practices
- Collaborate with cross-functional teams to deliver business-critical datasets
- Tune performance and ensure scalability for large-scale workloads
Qualifications:
- 5+ years of data engineering experience
- Strong knowledge of cloud data platforms (e.g., Databricks, Snowflake, AWS, Google Cloud Platform)
- Proficiency in SQL, Python, and big data tools (Spark, dbt, etc.)
- Experience with data lakehouse design patterns and modern data architectures
Skill | Years of Experience |
Lakehouse platforms (Delta Lake, Apache Iceberg, or Hudi) | |
Data architecture | |
Spark (batch and streaming ETL pipelines) | |
AWS | |
Snowflake | |
Databricks | |
Google Cloud Platform | |
dbt | |
SQL | |
Python | |
Thanks,
Ankush