Job Title: AI/ML Architect with Databricks
Location: Los Angeles, CA (Hybrid)
Type: Full-time
Role Overview:
We are seeking a skilled AI/ML Architect with hands-on experience in Databricks to join our team. The ideal candidate has strong analytical capabilities, experience building scalable data pipelines and machine learning models, and the ability to collaborate with cross-functional teams to support data-driven decision-making.
This role involves working with large datasets, advanced analytics, and modern data engineering and ML frameworks—primarily using Databricks on Azure/AWS.
Skills & Qualifications
Required
Bachelor’s degree or higher in Computer Science, Data Science, Mathematics, Statistics, Engineering, or a related field.
3+ years of experience in data science or machine learning roles.
Advanced knowledge of Databricks, including:
PySpark / Spark SQL
Databricks notebooks
Delta Lake
MLflow
Databricks Jobs & Workflows
Strong programming skills in Python (pandas, NumPy, scikit-learn).
Experience working with large-scale data processing.
Solid understanding of machine learning algorithms and statistical techniques.
Key Responsibilities
Data Science & Machine Learning
· Develop, train, and optimize machine learning and statistical models using Databricks, Python, PySpark, and MLflow.
· Perform exploratory data analysis (EDA) to identify trends, patterns, and insights in large datasets.
· Deploy ML models into production using Databricks MLflow, Delta Live Tables, or other MLOps pipelines.
· Design and run A/B tests and build forecasting, segmentation, anomaly detection, and recommendation solutions as required by the business.
Data Engineering & Databricks Platform
· Build scalable, high‑performance ETL/ELT pipelines using PySpark, SQL, and Databricks workflows.
· Use Delta Lake to ensure data quality, reliability, and query performance.
· Optimize cluster usage and job performance within the Databricks environment.
· Collaborate with data engineers to ensure high-quality data availability for modeling.
Business Collaboration
· Translate business problems into analytical solutions and present findings to non‑technical stakeholders.
· Partner with product, engineering, and business teams to drive data-informed decisions.
· Communicate complex statistical concepts in a clear and concise manner.
Preferred Qualifications
· Experience deploying models in production using MLOps frameworks.
· Knowledge of Azure Databricks or AWS Databricks environments.
· Understanding of CI/CD pipelines and DevOps concepts (Azure DevOps, GitHub Actions, etc.).
· Familiarity with deep learning frameworks (TensorFlow, PyTorch) is a plus.
Key Competencies
· Strong analytical and problem‑solving skills
· Ability to work in a fast-paced, collaborative environment
· Excellent communication and presentation skills
· Self-driven with high attention to detail