W2 - (New Jersey) Remote :: Azure Data Engineer with Azure Data OR Azure Databricks Certification || 10+ Years Exp || (U.S.C. & G.C. only)

Overview

Remote
Depends on Experience
Full Time
No Travel Required

Skills

Azure
Data Engineer
Databricks
PySpark
Python
Spark
SQL
Synapse
Agile

Job Details

Azure Data Engineer with Azure Data OR Azure Databricks Certification || 10+ Years Exp || (U.S.C. & G.C. only)

6-12+ Months

(New Jersey) Remote

 

Experience: 10+ Years (Commercial Insurance experience a plus)

Certifications: Azure Data certification and/or Databricks certification required

We are seeking a highly experienced Azure Data Engineer with 10+ years of expertise in designing and implementing data engineering solutions on Azure. The role requires strong hands-on experience with PySpark/Python, Spark SQL, Azure Synapse, and Databricks. Commercial Insurance industry knowledge is a strong plus. Candidates must hold relevant Azure or Databricks certifications.

Job Overview:
We are looking for a well-qualified Data Engineer with deep knowledge of Azure and/or Databricks to design, build, and refine data architectures, analytics solutions, and Lakehouse infrastructure. The ideal candidate has strong experience in data modeling, troubleshooting complex data issues, and building data processing pipelines, along with hands-on skills in PySpark, Python, Spark SQL, and either Azure Synapse or Databricks. The role requires strong problem-solving ability, comfort working in a fast-paced Agile environment, and excellent communication skills for collaborating with cross-functional teams and raising issues proactively.

Key Responsibilities:
Design, build, and maintain data models in both traditional data warehouse and Lakehouse platforms.
Identify and resolve data quality and operational issues using advanced SQL.
Build and manage data pipelines for data ingestion, transformation, and loading.
Develop notebooks in PySpark, Python, and Spark SQL to transform data and implement business logic.
Work with Azure Synapse, Databricks, and data lake technologies, with a focus on performance optimization.
Participate actively in a fast-paced Agile environment, proactively raising potential issues and recommending solutions.
