Databricks Data Engineer

Overview

Hybrid
Depends on Experience
Contract - Independent
Contract - W2
Contract - 12 Month(s)

Skills

B2B
Data Engineering
Databricks
MarTech
Salesforce
RESTful
Azure
Project Management

Job Details

Title: Databricks Data Engineer
Location: San Jose, CA
Duration of the Contract: 12+ months

Job Description
We are seeking a Databricks Data Engineer to design, build, and optimize scalable data solutions that empower B2B operations. This role involves working closely with business stakeholders, data scientists, and analysts to transform raw enterprise data into actionable insights. The ideal candidate will have strong expertise in Databricks, cloud data platforms, and modern data engineering practices, with a focus on enabling B2B workflows such as customer analytics, partner integrations, and operational intelligence.

Key Responsibilities:

  • Design, develop, and maintain scalable ETL/ELT pipelines using Databricks and cloud-native tools.
  • Ingest and process structured, semi-structured, and unstructured data from multiple B2B sources (APIs, CRM, ERP, partner feeds).
  • Optimize pipelines for performance, reliability, and cost efficiency in large-scale B2B environments.
  • Partner with business stakeholders to translate B2B requirements into technical solutions.
  • Work closely with data scientists and analysts to enable advanced analytics and machine learning use cases.
  • Provide technical guidance to cross-functional teams on Databricks best practices.

Additional Responsibilities:

  • Manage marketing technology and automation, with hands-on experience in Marketo administration, configuration, and production support.
  • Integrate marketing automation tools with platforms such as Salesforce or Dynamics.
  • Design, develop, automate, and maintain RESTful and GraphQL APIs for secure, scalable system integrations.
  • Collaborate with engineering and product teams to gather data and integration requirements, delivering robust solutions aligned with business needs.
  • Build and optimize cloud-native data pipelines and ETL processes for ingesting and transforming data from diverse sources.
  • Ensure data consistency, reliability, and performance across integrated systems to support analytics and operational workflows.
  • Implement comprehensive monitoring, logging, and alerting for integration services to ensure high availability and rapid issue resolution.
  • Document integration workflows, API specifications, and engineering processes to promote maintainability, scalability, and cross-team knowledge sharing.

Skills Required:

  • Primary: B2B, Data Engineering, Databricks
  • Secondary: Project Management, Stakeholder Management
  • Good to Have: MarTech / Marketo, Salesforce or Dynamics, API development (RESTful & GraphQL), Cloud Platforms (Azure/AWS/Google Cloud Platform)