Azure Data Lead (Databricks, ETL, Data Factory and SAP HANA) / San Jose, CA (Onsite)

Overview

On Site
$140,000 - $150,000
Full Time

Skills

Databricks
ETL
Data Factory
SAP HANA

Job Details

Hi

We are looking for an Azure Data Lead (Databricks, ETL, Data Factory and SAP HANA) / San Jose, CA (Onsite). Anyone interested can share their resume at

Title: Technical Lead

Location: San Jose, CA 95110 (Onsite)

Direct-Hire / Contract / CTH

Lead the design, development, and execution of scalable data engineering solutions and migration strategies. Ensure seamless data movement from legacy systems to modern platforms with minimal downtime while preserving data integrity. Deliver optimized data pipelines, enforce data governance standards, and enable analytics readiness. Drive technical excellence, mentor engineering teams, and collaborate with stakeholders to align data solutions with business goals.

Experience/Skills

  • 5+ years of experience in data engineering and data platform development, including lead/architecture experience
  • Must have led 3 or more end-to-end enterprise data projects using Databricks and Azure technologies
  • 5+ years of experience contributing to enterprise data projects involving Azure-based ETL solutions, Databricks, and SQL
  • 5+ years of experience building ETL pipelines using Azure Data Factory (ADF)
  • Experience using Fivetran to automate data pipeline builds; understanding of Databricks ML and analytics tools
  • Experience resolving networking/VPN issues related to data flow; familiarity with data governance, security, and compliance frameworks
  • Bachelor's degree (BS/MS) in Computer Science, Information Systems/CIS, or a related field
  • Prior experience in working on Agile/Scrum projects with exposure to tools like Jira/Azure DevOps

Key Responsibilities

  • Lead the design and implementation of ETL solutions using SAP Data Services and Azure Data Factory
  • Leverage Fivetran for automated data ingestion from SAP S4 source systems into the Bronze layer
  • Analyze and migrate stored procedures from SAP HANA using SQL / PL/SQL to Databricks-based logic
  • Guide and mentor team members on data engineering best practices
  • Develop and maintain complex ETL pipelines using Python (an illustrative sketch follows this list)
  • Identify and resolve performance bottlenecks and network-related issues
  • Ensure adherence to data governance and compliance standards across all data flows
  • Participate in performance tuning, issue resolution, and data validation tasks
  • Document data flows, pipeline logic, and lineage as part of project delivery
  • Provide regular updates and exercise proactivity and due diligence in carrying out responsibilities
  • Communicate effectively with internal and customer stakeholders through verbal discussions, email, and instant messages
  • Strong interpersonal skills to build and maintain productive relationships with team members
  • Provide constructive feedback during code reviews and be open to receiving feedback on your own code
  • Apply problem-solving and analytical thinking, with the capability to troubleshoot and resolve issues efficiently
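
To give a concrete sense of the Python/Databricks pipeline work described above, here is a minimal sketch of a Bronze-to-Silver step on Databricks using PySpark and Delta Lake. It is illustrative only and not part of the job requirements; all paths, schema names, table names, and columns (e.g., bronze.sap_sales_orders, order_id) are hypothetical placeholders.

    # Minimal PySpark sketch: land raw SAP S/4 extracts in a Bronze Delta table,
    # then promote lightly cleansed records to a Silver table.
    # All paths, table names, and columns are hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("sap_bronze_to_silver").getOrCreate()
    spark.sql("CREATE SCHEMA IF NOT EXISTS bronze")
    spark.sql("CREATE SCHEMA IF NOT EXISTS silver")

    # Bronze: store the raw extracts (e.g., files delivered by Fivetran) as-is,
    # adding only ingestion metadata.
    raw_df = (
        spark.read.format("parquet")
        .load("/mnt/landing/sap_s4/sales_orders/")  # hypothetical landing path
        .withColumn("_ingested_at", F.current_timestamp())
    )
    raw_df.write.format("delta").mode("append").saveAsTable("bronze.sap_sales_orders")

    # Silver: deduplicate and type the data so it is analytics-ready.
    silver_df = (
        spark.table("bronze.sap_sales_orders")
        .dropDuplicates(["order_id"])  # hypothetical business key
        .withColumn("order_date", F.to_date("order_date", "yyyyMMdd"))
        .filter(F.col("order_id").isNotNull())
    )
    silver_df.write.format("delta").mode("overwrite").saveAsTable("silver.sap_sales_orders")

In practice, a job like this would typically be scheduled and parameterized through Azure Data Factory or Databricks Workflows rather than run ad hoc.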

Regards,

Pinku

Talent Acquisition Radiansys Inc.

Direct: 510 790 2000 Ext 1006

Email:
