Data Architect Databricks

Overview

On Site
Hybrid
$60 - $70
Accepts corp to corp applications
Contract - W2
Contract - Independent
Contract - 12 Month(s)
25% Travel

Skills

Data Architect
ETL
Python
RDBMS
data warehouses
data lakes
XML
cloud infrastructure
JSON
SQL
Databricks
Spark
Azure

Job Details

Position : Data Architect Databricks

Location : Bloomington IL

Long Term

  • 15+ years of experience in IT/Technology, with at least 3+ years of experience in Azure/Databricks
  • Design and architect Databricks-based solutions that align with business objectives, ensuring scalability, performance, and security.
  • Design, develop, and implement a Data Mesh architecture using Delta Lake.
  • Provide technical leadership and guidance to the Databricks development team, ensuring best practices are followed throughout the project lifecycle.
  • Collaborate with cloud infrastructure teams to design and optimize the underlying infrastructure for Databricks workloads on platforms such as AWS or Azure.
  • Develop efficient data ingestion and ETL pipelines using Databricks, Apache Spark, and other relevant technologies.
  • Integrate Databricks with data lakes and data warehouses to ensure seamless data access and analytics.
  • Continuously monitor and optimize Databricks workloads for performance and cost-effectiveness.
  • Implement and maintain security measures, including access controls, encryption, and compliance standards, to protect sensitive data.
  • Create documentation and provide training to internal teams on Databricks best practices and usage.
  • Stay up-to-date with the latest developments in Databricks and related technologies to recommend and implement improvements.
  • Good knowledge of SQL queries and stored procedures.
  • Strong programming skills in PySpark and Python; experience writing complex queries in T-SQL.
  • Experience transferring data from RDBMS systems to Databricks using Azure Data Factory (ADF).
  • Expertise in using Spark SQL with various data sources such as JSON, Parquet, and XML.
  • Experience creating tables, partitioning, bucketing, loading, and aggregating data using Spark SQL/Python.
  • Experience in the insurance domain.

About Advent Global Solutions, Inc.