SAP Databricks Engineer

Overview

Remote
Depends on Experience
Accepts corp to corp applications
Contract - Independent
Contract - W2
Contract - 12 Month(s)
No Travel Required
Unable to Provide Sponsorship

Skills

AI/ML workloads
SAP Databricks
Python
Spark
SQL
SAP Analytics Cloud (SAC)
SAP services

Job Details

SAP Databricks Engineer
Remote
12-month contract
AI/ML (SAP Business Data Cloud) Resource Needed

The SAP Databricks Engineer develops and runs AI and machine learning workloads within SAP Business Data Cloud (BDC) using SAP-managed Databricks. The role focuses on model development, feature processing, training, and inference on SAP-governed data exposed via BDC and Datasphere, enabling predictive and intelligent scenarios consumed through SAP Analytics Cloud (SAC) and other SAP services.

Responsibilities:
  • Build and execute AI/ML workloads on SAP Databricks using Python, Spark, and SQL (see the sketch after this list).
  • Perform feature engineering and large-scale data preparation on BDC/Datasphere datasets.
  • Train, evaluate, and run ML models using Spark ML and Python ML libraries.
  • Generate prediction and scoring outputs and publish results back to SAP BDC/Datasphere.
  • Support SAC predictive and planning scenarios with model outputs and derived measures.
  • Collaborate with analytics and business teams to operationalize AI use cases in SAP.

Required Skills:
  • Strong Python and Spark (PySpark/SQL) skills.
  • Hands-on or exploratory experience with SAP Databricks in the SAP BDC ecosystem.
  • Experience with ML frameworks (e.g., scikit-learn, Spark ML, MLflow).
  • Hands-on experience with, and deep understanding of, SAP Datasphere and SAP Analytics Cloud.
  • Understanding of SAP enterprise data concepts.
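
For context, the following is a minimal, hypothetical sketch of the kind of workload described above: PySpark feature preparation on a governed dataset, a simple Spark ML model, and publishing scores back to a table that downstream SAP consumers (such as SAC) could read. The table and column names (bdc.sales_orders, bdc.churn_labels, bdc.customer_churn_scores, churned) are illustrative placeholders, not SAP-specific APIs; in a real environment, access to BDC/Datasphere data goes through SAP-managed Databricks and its governed catalogs.

    from pyspark.sql import SparkSession, functions as F
    from pyspark.ml import Pipeline
    from pyspark.ml.feature import VectorAssembler
    from pyspark.ml.classification import LogisticRegression
    from pyspark.ml.evaluation import BinaryClassificationEvaluator
    from pyspark.ml.functions import vector_to_array

    spark = SparkSession.builder.appName("bdc-churn-scoring").getOrCreate()

    # 1. Feature engineering on a governed sales table (placeholder name).
    sales = spark.table("bdc.sales_orders")
    features = (
        sales.groupBy("customer_id")
             .agg(F.count("order_id").alias("order_count"),
                  F.avg("net_value").alias("avg_order_value"),
                  F.max("order_date").alias("last_order_date"))
             .withColumn("days_since_last_order",
                         F.datediff(F.current_date(), F.col("last_order_date")))
    )

    # 2. Train and evaluate a simple churn model with Spark ML.
    #    bdc.churn_labels is assumed to hold customer_id plus a 0/1 "churned" label.
    labeled = features.join(spark.table("bdc.churn_labels"), "customer_id")
    assembler = VectorAssembler(
        inputCols=["order_count", "avg_order_value", "days_since_last_order"],
        outputCol="features_vec",
    )
    lr = LogisticRegression(featuresCol="features_vec", labelCol="churned")
    train_df, test_df = labeled.randomSplit([0.8, 0.2], seed=42)
    model = Pipeline(stages=[assembler, lr]).fit(train_df)

    evaluator = BinaryClassificationEvaluator(labelCol="churned",
                                              metricName="areaUnderROC")
    print("test AUC:", evaluator.evaluate(model.transform(test_df)))

    # 3. Score all customers and publish results back to a governed table that
    #    downstream SAP consumers (e.g., SAC stories) can read.
    scored = (
        model.transform(features)
             .withColumn("churn_probability",
                         vector_to_array(F.col("probability"))[1])
             .select("customer_id", "churn_probability",
                     F.col("prediction").alias("churn_flag"))
    )
    scored.write.mode("overwrite").saveAsTable("bdc.customer_churn_scores")

This is a sketch under stated assumptions only; the actual datasets, model types, and publishing targets would be defined by the BDC/Datasphere data products and SAC scenarios in scope.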
 