Senior Azure Databricks Engineer - Only Locals

Overview

On Site
Depends on Experience
Contract - Independent
Contract - W2
Contract - 12 Month(s)

Skills

ADF (Azure Data Factory)
ETL
Azure SQL
Python
Databricks

Job Details

Required skills:

  • 15+ years of experience in IT.
  • Heavy ADF (Azure Data Factory): Azure engineering, data warehousing, data pipelines, Data Lakehouse, Databricks, ETL, and strong database experience
  • Very technical, hands-on role; not looking for Architects
  • Must love ETL development and have solid database experience
  • Experience in the following domains is ideal: Banking, Insurance, Healthcare.
  • Experience with ADLS, Azure Databricks, Azure SQL DB, and data warehouses
  • Strong working experience implementing Azure cloud components using Azure Data Factory, Azure Data Analytics, Azure Data Lake, Azure Data Catalog, Logic Apps, and Function Apps
  • Knowledge of Azure Storage services (ADLS, Storage Accounts)
  • Expertise in designing and deploying data applications on cloud solutions on Azure
  • Hands-on experience in performance tuning and optimizing code running in the Databricks environment
  • Good understanding of SQL, T-SQL and/or PL/SQL
  • Should have experience working on Agile projects, with knowledge of Jira
  • Experience with data ingestion projects in an Azure environment is a plus
  • Demonstrated analytical and problem-solving skills, particularly as they apply to big data environments
  • Experience with Python scripting, Spark SQL, and PySpark is a plus

Responsibilities:

  • Build large-scale batch and real-time data pipelines with data processing frameworks on the Azure cloud platform
  • Design and implement highly performant data ingestion pipelines from multiple sources using Azure Databricks
  • Build data pipelines using Azure Data Factory and Databricks
  • Develop scalable, reusable frameworks for ingesting datasets
  • Lead the design of ETL, data integration, and data migration
  • Partner with architects, engineers, information analysts, and business and technology stakeholders to develop and deploy enterprise-grade platforms that enable data-driven solutions
  • Integrate the end-to-end data pipeline, taking data from source systems to target data repositories while ensuring data quality and consistency are maintained at all times
  • Work with event-based/streaming technologies to ingest and process data
  • Work with other members of the project team to support delivery of additional project components (API interfaces, search)
  • Evaluate the performance and applicability of multiple tools against customer requirements
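For illustration only (this sketch is not part of the posting's requirements): the "scalable, reusable frameworks for ingesting datasets" responsibility is often met with config-driven pipeline code. The minimal Python sketch below uses hypothetical names (`SourceConfig`, `ingest`) and stubbed I/O callables standing in for Spark or ADF readers and writers; it is a pattern sketch, not a definitive implementation.

```python
from dataclasses import dataclass

@dataclass
class SourceConfig:
    """Hypothetical per-source configuration for a reusable ingestion step."""
    name: str          # logical source name, e.g. "orders"
    fmt: str           # source format, e.g. "csv", "json", "parquet"
    path: str          # input location, e.g. an ADLS mount path
    target_table: str  # destination table, e.g. "bronze.orders"

def ingest(source: SourceConfig, read_fn, write_fn) -> int:
    """Read one configured source and land it in its target table.

    read_fn/write_fn are injected so the same step works with any
    reader/writer (in Databricks these would wrap Spark I/O).
    Returns the number of records written, for basic quality checks.
    """
    records = read_fn(source.path, source.fmt)
    if not records:
        raise ValueError(f"no data read from source {source.name!r}")
    write_fn(source.target_table, records)
    return len(records)

# Usage with stubbed I/O functions in place of real Spark readers/writers:
landed = {}
n = ingest(
    SourceConfig(name="orders", fmt="csv",
                 path="/mnt/raw/orders", target_table="bronze.orders"),
    read_fn=lambda path, fmt: [{"id": 1}, {"id": 2}],   # stub reader
    write_fn=lambda table, recs: landed.update({table: recs}),  # stub writer
)
```

Injecting the reader and writer keeps the ingestion logic testable outside the cluster and lets one framework cover many dataset configurations.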

Regards,

Raj Dakshinapu | Recruiter, Dotcom Team LLC


2023 Best Places to Work, Boston Business Journal

Minority Certified, Boston Business Journal's Top Ten 2022, 2023

Certified National Minority Supplier (NMSDC)
