Lead Data Engineer

Overview

Hybrid
$140,000 - $160,000
Full Time
50% Travel
Able to Provide Sponsorship

Skills

ETL
Python
SQL
Snowflake
AWS-Glue
Spark
Big data

Job Details

Job Title: Lead Data Engineer

Hybrid: Jersey City, NJ

Job Type: Full-time only

Key Skills: Snowflake, SQL, Python, Spark, AWS Glue, Big Data Concepts

Responsibilities:

Lead the design, development, and implementation of data solutions using AWS and Snowflake.

Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.

Develop and maintain data pipelines, ensuring data quality, integrity, and security.

Optimize data storage and retrieval processes to support data warehousing and analytics.

Provide technical leadership and mentorship to junior data engineers.

Work closely with stakeholders to gather requirements and deliver data-driven insights.

Ensure compliance with industry standards and best practices in data engineering.

Utilize knowledge of insurance, particularly claims and loss, to enhance data solutions.

Must have:

Snowflake expertise (hands-on experience and architectural understanding), including the ability to describe a challenging project situation; it must be data-related or implementation-related (a commonly asked question)

ETL/data management proficiency, with mastery of query optimization, data quality, performance tuning, big data problems, RBAC, row/column-level security, CDC, incremental vs. batch loads, Unix commands, job scheduling, and CI/CD pipelines

Team leadership (8-10+ members) and end-to-end project implementation; ability to clearly describe a complex project challenge, the specific problem, and its solution

Excellent communication skills

Stakeholder management and cross-functional collaboration (knowledge of data quality checks, profiling, STTM, data integration from multiple sources, and reusable frameworks)

Release planning, change management, training, knowledge transfer (KT), and L2/L3 support

Apache Iceberg + AWS Glue

Good to have:

10+ years of relevant experience in Data Engineering and delivery.

10+ years of relevant work experience with Big Data concepts, including cloud implementations.

Strong experience with SQL, Python, and PySpark

Good understanding of data ingestion and data processing frameworks

Good experience with Snowflake, SQL, and AWS (Glue, EMR, S3, Aurora, RDS, and AWS architecture)

Good aptitude, strong problem-solving and analytical skills, and the ability to take ownership as appropriate.

Should be able to code, debug, performance-tune, and deploy applications to the production environment.

Experience working with Agile methodology

Requirements:

Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.

Proven experience as a Data Engineer, with a focus on AWS and Snowflake.

Strong understanding of data warehousing concepts and best practices.

Excellent communication skills, with the ability to convey complex technical concepts to non-technical stakeholders.

Experience in the insurance industry, preferably with knowledge of claims and loss processes.

Proficiency in SQL, Python, and other relevant programming languages.

Strong problem-solving skills and attention to detail.

Ability to work independently and as part of a team in a fast-paced environment.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.

About Clairvoyant AI, Inc.