Data Engineer - Snowflake, Databricks

Overview

On Site
Accepts corp to corp applications
Contract - W2

Skills

Advanced Analytics
Reporting
Machine Learning (ML)
Unstructured Data
Collaboration
Data Quality
Database
Computer Science
Information Systems
Data Engineering
Python
PySpark
Extract, Transform, Load (ETL)
ELT
Orchestration
ADF
Data Modeling
Query Optimization
Performance Tuning
Snowflake Schema
Apache Spark
SQL
Databricks
Cloud Computing
Amazon Web Services
Microsoft Azure
Google Cloud Platform (GCP)
Data Governance
Regulatory Compliance
Problem Solving
Conflict Resolution
Communication
Real-time
Streaming
Apache Kafka
Amazon Kinesis
Continuous Integration
Continuous Delivery
DevOps
Git
Workflow
Business Intelligence
Analytics
Microsoft Power BI
Tableau

Job Details

Job Title: Data Engineer - Snowflake & Databricks
Location: Berkeley Heights, NJ

Job Summary:

We are seeking a highly skilled Data Engineer with strong expertise in Snowflake and Databricks to join our team in Berkeley Heights, NJ. The ideal candidate will be responsible for designing, developing, and optimizing scalable data pipelines, ensuring data availability, reliability, and performance to support advanced analytics and business intelligence initiatives.

Key Responsibilities:

  • Design, build, and maintain scalable data pipelines and ETL processes using Snowflake and Databricks.
  • Develop and optimize data models to support reporting, analytics, and machine learning workloads.
  • Implement data ingestion frameworks to integrate structured and unstructured data from various sources.
  • Collaborate with data scientists, analysts, and business stakeholders to deliver high-quality data solutions.
  • Ensure data quality, security, and governance across all data platforms.
  • Monitor and optimize performance of Snowflake databases and Databricks clusters.
  • Automate workflows and implement best practices for CI/CD in data engineering.
  • Troubleshoot data-related issues and provide timely resolutions.

Required Skills & Qualifications:

  • Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field.
  • 9+ years of experience in Data Engineering, with a focus on Snowflake and Databricks.
  • Strong proficiency in SQL, Python, PySpark, and data transformation techniques.
  • Hands-on experience with ETL/ELT frameworks and pipeline orchestration tools (e.g., Apache Airflow, Azure Data Factory).
  • Expertise in data modeling, query optimization, and performance tuning in Snowflake.
  • Experience with Delta Lake, Spark SQL, and Databricks notebooks.
  • Familiarity with cloud platforms (AWS, Azure, or Google Cloud Platform) and modern data architectures.
  • Knowledge of data governance, security, and compliance best practices.
  • Strong problem-solving and communication skills.

Preferred Skills:

  • Experience with real-time/streaming data pipelines (Kafka, Kinesis, etc.).
  • Familiarity with CI/CD, DevOps practices, and Git-based workflows.
  • Knowledge of BI/Analytics tools (Power BI, Tableau, etc.).
