Databricks Architect with MDM Experience - Iselin, NJ (Hybrid)

Overview

Accepts corp to corp applications
Contract - Independent
Contract - W2
Contract - 21 day(s)

Skills

Python
Microservices
SQL
Snowflake
Databricks
MDM
GraphQL

Job Details

Role: Databricks Architect with MDM Experience
Location: Iselin, NJ (Hybrid)

Required Skills

  • MDM
  • Databricks
  • Kafka
  • Snowflake
  • GraphQL
  • SQL
  • Python
  • Microservices

Job Summary

We are seeking a highly skilled Databricks Architect with strong expertise in MDM, SQL, Python, data warehousing, and cloud ETL tools to join our data team. The ideal candidate will design, implement, and optimize large-scale data pipelines, ensuring scalability, reliability, and performance. This role involves close collaboration with multiple teams and business stakeholders to deliver cutting-edge data solutions.

Key Responsibilities

Data Pipeline Development

  • Build and maintain scalable ETL/ELT pipelines using Databricks.
  • Utilize PySpark/Spark and SQL to process and transform large datasets.
  • Integrate data from Azure Blob Storage, ADLS, and various relational/non-relational systems.
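As an illustration of the kind of ELT step described above, an incremental upsert into a Delta table is commonly written as a Databricks SQL MERGE; a minimal sketch (table and column names are hypothetical):

```sql
-- Hypothetical incremental upsert from a staging table into a Delta table.
-- Matching rows are updated in place; new rows are inserted.
MERGE INTO sales.orders AS target
USING staging.orders_daily AS source
  ON target.order_id = source.order_id
WHEN MATCHED THEN
  UPDATE SET *
WHEN NOT MATCHED THEN
  INSERT *;
```

The same pattern can be expressed in PySpark via the `DeltaTable.merge` API when the transformation logic lives in a notebook rather than a SQL pipeline.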

Collaboration & Analysis

  • Work with multiple teams to prepare data for dashboards and BI tools.
  • Collaborate with cross-functional teams to understand requirements and deliver tailored data solutions.

Performance & Optimization

  • Optimize Databricks workloads for performance and cost efficiency.
  • Monitor and troubleshoot pipelines to ensure accuracy and reliability.

Governance & Security

  • Implement data security, access controls, and governance standards using Unity Catalog.
  • Ensure compliance with organizational and regulatory data policies.
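The governance work above is typically expressed as Unity Catalog grants; a minimal sketch granting read-only access to an analyst group (catalog, schema, and group names are hypothetical):

```sql
-- Allow a hypothetical analyst group to browse and query one schema.
GRANT USE CATALOG ON CATALOG main TO `data_analysts`;
GRANT USE SCHEMA ON SCHEMA main.sales TO `data_analysts`;
GRANT SELECT ON SCHEMA main.sales TO `data_analysts`;
```

Granting `SELECT` at the schema level applies to all tables within it, which keeps access controls coarse-grained and auditable.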

Deployment

  • Use Databricks Asset Bundles for seamless deployment of jobs, notebooks, and configurations across environments.
  • Manage version control for Databricks artifacts and support development best practices.
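A Databricks Asset Bundle is defined by a `databricks.yml` file at the project root; a minimal sketch, assuming a hypothetical job, notebook path, and workspace host:

```yaml
# databricks.yml -- minimal bundle definition (all names and the host are illustrative).
bundle:
  name: etl_pipelines

resources:
  jobs:
    nightly_etl:
      name: nightly-etl
      tasks:
        - task_key: transform
          notebook_task:
            notebook_path: ./notebooks/transform.py

targets:
  dev:
    mode: development
    default: true
    workspace:
      host: https://adb-1234567890123456.7.azuredatabricks.net
```

The bundle is then deployed per environment with `databricks bundle deploy -t dev`, which is how jobs, notebooks, and configurations move consistently across workspaces.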

Technical Skills

  • Strong expertise in Databricks: Delta Lake, Unity Catalog, Lakehouse architecture, table triggers, Delta Live Tables pipelines, Databricks Runtime, etc.
  • Proficiency in Azure Cloud Services.
  • Solid understanding of Spark and PySpark for big data processing.
  • Experience with relational databases.
  • Knowledge of Databricks Asset Bundles and GitLab.

Preferred Experience

  • Familiarity with Databricks runtimes and advanced configurations.
  • Knowledge of streaming frameworks like Spark Streaming.
  • Experience building real-time data solutions.

Certifications (Optional)

  • Azure Data Engineer Associate
  • Databricks Certified Data Engineer Associate


About NeoTech Solutions