Databricks Architect

Overview

Accepts corp to corp applications
Contract - 22 day(s)

Skills

SQL
PySpark
Python
Databricks
ETL

Job Details

Role : Databricks Architect

Location : Iselin NJ / New York, NY : Hybrid (2-3 Days)

Relocation is fine

Hire type : Contract (contract-to-hire preferred)

Experience level : 15+ years

Must have : Expertise in Databricks, PySpark, Python, and SQL

JD:

Strong knowledge of Databricks architecture and tools
Experience creating tasks and workflow jobs in Databricks
Deep understanding of distributed computing and how to use Spark for data processing
SQL and PySpark: strong command of database querying and proficiency in PySpark
Cloud platform: Azure preferred for Databricks deployment

Qualifications

Bachelor's degree in Computer Science, Engineering, or related field, or equivalent work experience

5+ years of experience in data engineering with Databricks and Spark

Proficient in SQL, Python, and PySpark

Experience with the Azure Databricks Medallion Architecture, including DLT and Iceberg

Experience in a financial/corporate banking context is a plus

Experience with data integration and ETL tools, such as Azure Data Factory

Experience with Azure cloud platform and services

Experience with data warehouse and data lake concepts and architectures

Nice to have: experience with big data technologies such as Kafka, Hadoop, and Hive

Strong analytical and problem-solving skills

Excellent communication and teamwork skills

Responsibilities

Design, develop, and maintain data pipelines using Databricks and Spark, and other cloud technologies as needed

Optimize data pipelines for performance, scalability, and reliability

Ensure data quality and integrity throughout the data lifecycle

Collaborate with data scientists, analysts, and other stakeholders to understand and meet their data needs

Troubleshoot and resolve data-related issues, and provide root cause analysis and recommendations

Document data pipeline specifications, requirements, and enhancements, and communicate them effectively to the team and management

Create new data validation methods and data analysis tools, and share best practices and learnings with the data engineering community

Implement ETL processes and data warehouse solutions, and ensure compliance with data governance and security policies


About NAAS Technologies