Azure Data Engineer

Overview

On Site
Full Time
Contract - W2
Contract - 6 month(s)

Skills

Databricks
Azure
SQL
Data Engineer
Azure Data Factory
ADF

Job Details

Azure Data Engineer

Location: Raleigh, NC (Hybrid: onsite 3 days per week)

Job Type: Full-time, W2 only

Note: Local consultants only

Position Overview

We are seeking Azure Data Engineers with strong hands-on experience in Azure, Databricks, and Azure Data Factory (ADF). These engineers will play a key role in data cleanup and transformation within OLTP systems to support critical enterprise data initiatives.

The ideal candidates will have a strong foundation in cloud-native data engineering and extensive experience implementing scalable data pipelines, ensuring data quality, and leveraging modern Azure ecosystem tools.

Responsibilities
  • Design, build, and optimize data integration pipelines leveraging Azure Data Factory (ADF).

  • Implement complex Copy activities and Mapping Data Flows, orchestrate data workflows, and implement error handling across pipelines.

  • Develop and maintain Databricks solutions, including Interactive Clusters, Job Clusters, Delta Live Tables Pipelines, and Notebooks.

  • Leverage ADLS Gen2 (hot access tier) for scalable storage and implement data access and security best practices.

  • Work with SQL Warehouse and Unity Catalog to manage, query, and organize enterprise datasets.

  • Establish and monitor data quality frameworks, leveraging Great Expectations in Databricks.

  • Implement metadata, lineage, and governance solutions with Microsoft Purview.

  • Monitor and troubleshoot pipeline and system performance using Azure Monitor and Log Analytics.

  • Collaborate with stakeholders to understand data requirements and drive data cleanup activities in OLTP systems.

  • Optimize performance and scalability of data pipelines to support enterprise reporting and analytics.

Required Skills & Experience
  • 5+ years of professional experience in data engineering, with a focus on Azure cloud data platforms.

  • Strong hands-on experience with Azure Data Factory (ADF), including Copy activities, Mapping Data Flows, and pipeline orchestration.

  • Deep technical expertise with Databricks, including:

    • Interactive & Jobs Clusters

    • Advanced Databricks Notebooks

    • Delta Live Tables Pipelines

    • SQL Warehouse

    • Great Expectations for quality validation

  • Strong knowledge of Azure Data Lake Storage (ADLS Gen2), including experience with hot-tier storage.

  • Hands-on experience with Microsoft Purview for governance, lineage, and compliance.

  • Experience with Azure Monitor & Log Analytics for pipeline/system troubleshooting.

  • Proficiency in writing SQL and optimizing queries for large datasets.

  • Solid understanding of data modeling, ETL/ELT patterns, and OLTP system data handling.

Preferred Skills
  • Previous experience working in a state government, transportation, or large enterprise data environment.

  • Exposure to Unity Catalog for secure data access management in Databricks.

  • Knowledge of Python or PySpark for advanced data transformations within Databricks.

  • Strong communication skills to collaborate with a diverse technology and business team.
