Databricks Platform Architect


Overview

Remote
Accepts corp-to-corp applications
Contract - long term

Skills

Amazon Web Services
SQL
Visualization
Deployment
Performance Tuning
Networking
Continuous Integration/Delivery
GCP
Apache Spark
Large-Scale
Data Governance
Network Security
Data Pipelines
Data Lineage
Streaming
Identity and Access Management
Unity Catalog
AWS CodePipeline

Job Details

Databricks Platform Architect

Experience: 13+ years of overall experience

Location: Remote, with occasional travel to client sites as required

What's in It for You?

Join a role that lets you work with state-of-the-art cloud and data technologies, deepen your expertise in Databricks architecture, and drive performance and cost optimization across a scalable, global platform. This opportunity is ideal for sharpening your skills in data engineering, security, and multi-region deployments while collaborating with top industry professionals.

Job Description

  • Strong knowledge of Databricks architecture, including clusters, notebooks, jobs, and the compute and storage layers

  • Proven experience building and scaling Databricks as a global, multi-region platform

  • Expertise in Apache Spark, including Spark SQL, Spark Streaming, and MLlib

  • Familiarity with Delta Lake features such as ACID transactions and time travel

  • Skilled in Databricks SQL for data querying, analysis, and visualization

  • Ability to design and manage complex data pipelines and workflows using Databricks Jobs

  • Proficient in cluster configuration, autoscaling strategies, and performance tuning

  • Experience with Unity Catalog

  • In-depth understanding of AWS or Azure fundamentals, including storage, networking, IAM, and data security

  • Knowledge of VPC configurations, network security groups, and deployment architecture for Databricks

  • Ability to monitor, analyze, and optimize Databricks costs

Nice to Have

  • Cloud certifications (Azure, AWS, or Google Cloud Platform)

  • Experience with code repositories and CI/CD pipelines (e.g., AWS CodeBuild, CodePipeline, or equivalent)

  • Knowledge of data governance frameworks and data lineage tools

  • Experience delivering large-scale, multi-region programs

Educational Qualifications:
Bachelor's or Master's degree in Engineering (BE/ME/BTech/MTech/BSc/MSc)
Technical certifications in relevant technologies are a plus

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.