Azure Databricks Engineer

Overview

Hybrid
$60 - $65
Accepts corp to corp applications
Contract - W2
Contract - Independent
Contract - 3 Year(s)
No Travel Required

Skills

Databricks
Agile
Python
Data Factory
Data Lake
DevOps

Job Details

Role: Azure Databricks Engineer
Location: Jersey City, NJ - Hybrid (2 days onsite)
Note: The initial onsite pre-screening round is mandatory.
The final interview is onsite; candidates local to New York, New Jersey, or Connecticut are preferred.

Updated JD:

Responsibilities:

1. Design and implement a scalable data warehouse on Azure Databricks using data modeling and dimensional modeling techniques to support analytical and reporting requirements.

2. Develop and optimize ETL/ELT pipelines using Python, Azure Databricks, and PySpark for large-scale data processing, ensuring data quality, consistency, and integrity.

3. Establish and implement best practices for data ingestion, transformation, and storage using the medallion architecture (Bronze, Silver, Gold); a brief PySpark sketch of this pattern follows the list.

4. Architect and develop highly scalable data applications using Azure Databricks and distributed computing.

5. Optimize Databricks clusters and ETL/ELT workflows for performance and scalability.

6. Manage data storage solutions using Azure Data Lake Storage (ADLS) and Delta Lake while leveraging Unity Catalog for data governance, security, and access control (see the Unity Catalog sketch after this list).

7. Develop and schedule Databricks notebooks and jobs for automated daily execution, implementing monitoring, alerting, and automated recovery processes for job failures (see the retry-and-alert sketch after this list).

8. Identify and resolve bottlenecks in existing code and follow best coding practices to improve performance and maintainability.

9. Use GitHub as the version control tool to manage code and collaborate effectively with other developers; build and maintain CI/CD pipelines for deployment and testing using Azure DevOps and GitHub.

10. Create comprehensive documentation for data architecture, ETL processes, and business logic.

11. Work closely with business stakeholders to understand project goals and architect scalable and efficient solutions.

12. Apply knowledge of user authentication in Unity Catalog and authorization across multiple systems, servers, and environments.

13. Ensure that programs are written to the highest standards (e.g., with unit tests) and meet technical specifications.

14. Collaborate on projects and work independently when required.
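For illustration only (not part of the client's job description): a minimal PySpark sketch of the Bronze-to-Silver step of the medallion architecture referenced in responsibilities 2 and 3, assuming a Databricks workspace with Unity Catalog enabled; the ADLS path and catalog/schema/table names are placeholders.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # supplied automatically on Databricks

# Bronze: land the raw files as-is in a Delta table to preserve an auditable copy.
bronze_df = spark.read.format("json").load(
    "abfss://raw@storageaccount.dfs.core.windows.net/orders/"  # placeholder ADLS path
)
bronze_df.write.format("delta").mode("append").saveAsTable("main.bronze.orders")

# Silver: enforce types, deduplicate, and apply basic data-quality rules.
silver_df = (
    spark.table("main.bronze.orders")
    .dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .filter(F.col("order_amount") >= 0)
)
silver_df.write.format("delta").mode("overwrite").saveAsTable("main.silver.orders")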
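Similarly, a minimal sketch of Unity Catalog access control for responsibility 6, reusing the spark session from the sketch above; the catalog, schema, table, and group names are placeholders, and the caller is assumed to hold the privileges needed to grant them.

# Grants are plain SQL in Unity Catalog; run from a notebook or SQL warehouse.
spark.sql("GRANT USE CATALOG ON CATALOG main TO `data_engineers`")
spark.sql("GRANT USE SCHEMA ON SCHEMA main.silver TO `analysts`")
spark.sql("GRANT SELECT ON TABLE main.silver.orders TO `analysts`")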
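And a sketch of the retry-and-alert pattern from responsibility 7, assuming it runs as a Databricks driver notebook (where dbutils is available); the child notebook path and webhook URL are hypothetical.

import time
import requests

ALERT_WEBHOOK = "https://hooks.example.com/data-alerts"  # hypothetical alerting endpoint

def run_with_retry(notebook_path, timeout_seconds=3600, retries=2):
    last_error = None
    for attempt in range(1, retries + 2):
        try:
            # dbutils.notebook.run executes a child notebook and returns its exit value.
            return dbutils.notebook.run(notebook_path, timeout_seconds)
        except Exception as exc:
            last_error = exc
            time.sleep(60 * attempt)  # simple linear backoff before the next attempt
    # Every attempt failed: post an alert so the failure is visible immediately.
    requests.post(ALERT_WEBHOOK, json={"notebook": notebook_path, "error": str(last_error)})
    raise RuntimeError(f"{notebook_path} failed after {retries + 1} attempts") from last_error

run_with_retry("/Repos/data/pipelines/silver_orders")  # hypothetical notebook path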

Qualifications:

1. 10+ years of prior experience as a developer in the required technologies (Azure Databricks, Python, PySpark, data warehouse design).

2. Solid organizational skills and the ability to multitask across different projects.

3. Experience with Agile methodologies.

4. Skilled at independently researching topics using all means available to discover relevant information.

5. Ability to work in a team environment.

6. Excellent verbal and written communication skills.

7. Self-starter with the ability to multitask and maintain momentum.
