Principal Data Engineer

Overview

Hybrid (Remote / On Site)
Depends on Experience
Contract - W2

Skills

ARM
Apache Spark
Continuous Integration
Dataflow
DevOps
Google Cloud Platform
GitHub
Microsoft Azure
PySpark
Python
RBAC
SQL
Snowflake
Terraform

Job Details

Job Title: Principal Data Engineer

Location: Jersey City, NJ (Hybrid)

Duration: 12+ Months

Job Type: W2 Only

About the Role

We are seeking an accomplished Principal Data Engineer with deep expertise in Snowflake and cloud data platforms (Azure/Google Cloud Platform) to architect and implement enterprise-scale data solutions. You will lead complex data initiatives, optimize our modern data stack, and mentor engineering teams while solving cutting-edge data challenges.

Key Responsibilities

  • Architect and implement high-performance Snowflake solutions (RBAC, CDC, query optimization)
  • Design scalable ETL frameworks using Spark, Python, and cloud-native services
  • Lead end-to-end data projects from ingestion to consumption, heading teams of 8-10 engineers
  • Solve complex data challenges, e.g., "How would you optimize a slowly changing dimension process handling 10TB daily in Snowflake?"
  • Implement data quality frameworks with profiling, source-to-target mapping (STTM), and reusable validation modules
  • Build CI/CD pipelines for data applications (Azure DevOps/GitHub Actions)
  • Drive cloud migrations (Azure Data Factory/Google Cloud Platform Dataflow to Snowflake)
  • Provide L3 support and knowledge transfer to engineering teams

Technical Requirements

  • 14+ years in data engineering with 5+ years focused on Snowflake implementations
  • Expert-level skills in:
    • Snowflake architecture (time travel, zero-copy cloning, resource monitoring)
    • Advanced SQL (query tuning, window functions, stored procedures)
    • PySpark optimizations (partitioning, broadcast joins, Delta Lake)
  • Proven experience with:
    • Azure/Google Cloud Platform data services (Synapse/Dataform, BigQuery, Cloud Composer)
    • Data orchestration (Airflow, Dagster)
    • Infrastructure-as-code (Terraform, ARM templates)
  • Strong Python coding (unit testing, logging, packaging)
  • Experience resolving critical production issues (e.g., "Describe a time you debugged a Snowflake warehouse timeout during month-end close")
  • Ability to translate business needs to technical specs (insurance/finance domain preferred)

Preferred Qualifications

  • Snowflake certifications (SnowPro Advanced)
  • Experience with Apache Iceberg and lakehouse architectures
  • Knowledge of insurance claims/loss data models
