Databricks Architect

Overview

Accepts corp to corp applications
Contract - W2
Contract - Independent

Skills

Java
SQL
Database
Python
Data Warehousing
Migrations
Debug
Operations
Terraform
Ecosystem
Technical Documentation
Big Data
Machine Learning
Git
Data Architecture
Databricks
AWS
Data Analytics
Kafka
Real-Time
Hadoop
Translating
GCP
Streaming
Scala
Large-Scale
Use Cases
Mentoring
OLTP
Data Science
Risk Analysis
Google Cloud
Photon Engine
Technical Architecture
EDW
OLAP
Wins
Thought Leadership
Concrete
AWS Certified
Account Planning
Spark
Cloud

Job Details

Databricks Architect

100% Remote

Client - Databricks

Evaluation Area

Architecting & Leadership

  • Deep understanding of Lakehouse, Medallion, and Data Mesh architectures
  • Leadership / Mentoring capability

Data Engineering Skills

  • Python proficiency
  • Scala proficiency
  • Complex query tuning
  • SCD Types implementation (see the sketch after this list)
  • dbt familiarity
  • Structured Streaming experience
  • Iceberg, Parquet, Delta knowledge
  • Spark tuning and optimization
  • Git and CI/CD integration
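
For context on the kind of work these items refer to, here is a minimal PySpark sketch of a Delta Lake MERGE upsert (SCD Type 1); the table names and join key are illustrative placeholders rather than details of this role. An SCD Type 2 variant would close out the matched row and insert a new current version instead of updating in place.

    # Illustrative only: Delta MERGE upsert (SCD Type 1) on Databricks.
    # Table names and the join key are placeholders, not from this posting.
    from pyspark.sql import SparkSession
    from delta.tables import DeltaTable

    spark = SparkSession.builder.getOrCreate()  # supplied automatically on Databricks

    updates = spark.read.table("staging.customers")            # hypothetical staging table
    target = DeltaTable.forName(spark, "silver.dim_customer")  # hypothetical dimension table

    (
        target.alias("t")
        .merge(updates.alias("s"), "t.customer_id = s.customer_id")
        .whenMatchedUpdateAll()      # overwrite changed attributes in place (Type 1)
        .whenNotMatchedInsertAll()   # insert rows for new customers
        .execute()
    )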

Migration Experience

  • Large-scale migrations (>100 pipelines)
  • Experience with migration tools
  • EDW platforms (Snowflake, Redshift)

Databricks Technologies

  • Unity Catalog
  • Delta Live Tables (DLT)
  • Databricks SQL
  • Auto Loader (see the sketch after this list)
  • Photon engine
  • Cultural fit with Databricks ecosystem
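
As a point of reference for the Auto Loader, streaming, and Delta items above, here is a minimal sketch of incremental ingestion with Auto Loader into a Delta table; the paths and table name are hypothetical placeholders.

    # Illustrative only: Auto Loader incremental ingestion into a Delta table.
    # Paths and the target table name are hypothetical placeholders.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()  # supplied automatically on Databricks

    raw = (
        spark.readStream
        .format("cloudFiles")                 # Databricks Auto Loader source
        .option("cloudFiles.format", "json")  # format of the landed files
        .option("cloudFiles.schemaLocation", "/mnt/bronze/_schemas/orders")
        .load("/mnt/landing/orders/")         # cloud storage landing zone
    )

    (
        raw.writeStream
        .option("checkpointLocation", "/mnt/bronze/_checkpoints/orders")
        .outputMode("append")
        .toTable("bronze.orders")             # append into a managed Delta table
    )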

Cloud Technologies

  • Azure
  • AWS
  • Terraform

Consulting / Customer-Facing Skills

  • Consulting experience
  • Communication & Stakeholder Management

Technical Solutions Architect

Description:

As a Technical Solutions Architect at Databricks, you will collaborate with customers to design scalable data architectures utilizing Databricks technology and services. Leveraging your technical expertise and business acumen, you will navigate complex technology discussions, showcasing the value of the Databricks platform throughout the sales process. Working alongside Account Executives, you will engage with customers' technical leaders, including architects, engineers, and operations teams, aiming to become a trusted advisor who delivers concrete outcomes. You will liaise with teams across Databricks and executive leadership to advocate for your customers' needs and foster valuable engagements.

The impact you will have:
  • Develop Account Strategies: Work with Sales and other essential partners to develop strategies for your assigned accounts to grow their usage of the Databricks platform.
  • Development of Utilities & Tools: Build custom tools to assist in scoping and solution design, and collaborate with engineering and IT teams to integrate them with existing internal systems, ensuring scalability and ease of use.
  • Establish Architecture Standards: Establish the Databricks Lakehouse architecture as the standard data architecture for customers through excellent technical account planning.
  • Demonstrate Value: Build and present reference architectures, demo applications and guide technical validation to help prospects understand how Databricks can be used to achieve their goals and land new use cases.
  • Capture Technical Wins: Consult on big data architectures, data engineering pipelines, and data science/machine learning projects to prove out Databricks technology for strategic customer projects. Validate integrations with cloud services and other third-party applications.
  • Promote Open-Source Projects: Become an expert in and promote Databricks-inspired open-source projects (Spark, Delta Lake, MLflow) across developer communities through meetups, conferences, and webinars.
  • Sales Enablement & Thought Leadership: Develop and maintain up-to-date technical documentation, including case studies, whitepapers, and solution briefs, to support sales efforts and the development of Center of Excellence initiatives.

What we look for:

Minimum qualifications:

  • Educational Background: Degree in a quantitative discipline (Computer Science, Applied Mathematics, Operations Research).
  • Experience: 5+ years in a customer-facing pre-sales, technical architecture, or consulting role with expertise in at least one of the following technologies:
    • Big data engineering (e.g., Spark, Hadoop, Kafka)
    • Data Warehousing & ETL (e.g., SQL, OLTP/OLAP/DSS)
    • Data Science and Machine Learning (e.g., pandas, scikit-learn, HPO)
    • Data Applications (e.g., Logs Analysis, Threat Detection, Real-time Systems Monitoring, Risk Analysis)
  • Technical Expertise: Experience translating a customer's business needs into technology solutions, including establishing buy-in with key customer stakeholders at all levels of the business. Experienced at designing, architecting, and presenting data systems for customers, and at managing the delivery of those architectures into production.
  • SQL Proficiency: Fluent in SQL and database technology.
  • Development Languages: Debugging and development experience in at least one of the following languages: Python, Scala, Java, or R.
  • Cloud Experience (Desired): Built solutions with public cloud providers such as AWS, Azure, or Google Cloud Platform.

Preferred qualifications:

  • Certifications (Highly Preferred):
    • Databricks Professional Certifications:
      • Data Engineer (Professional), ML Engineer (Professional)
    • Technical certifications such as Azure Solutions Architect Expert, AWS Certified Data Analytics, DASCA Big Data Engineering and Analytics, AWS Certified Cloud Practitioner, AWS Certified Solutions Architect, and Google Cloud Professional certifications.
  • Specific Expertise (Highly Preferred): Technical expertise in at least one of the following areas:
    • Generative AI / AI/ML
    • Unity Catalog
    • Databricks Data Warehousing

About Engineersmind