Databricks Data Architect

Overview

Remote
Depends on Experience
Contract - W2
Contract - 6 Month(s)

Skills

Access Control
Amazon Redshift
Amazon S3
Amazon Web Services
Apache Spark
Cloud Computing
Collaboration
Communication
Data Engineering
Data Flow
Data Governance
Data Lake
Databricks
ELT
Encryption
Extract, Transform, Load (ETL)
Google Cloud Platform
Microsoft Azure
Python
Real-time
Regulatory Compliance
SQL
Scala
Stakeholder Management
Storage
Unity Catalog
Virtual Private Cloud
Data Warehouse

Job Details

Role: Databricks Data Architect

Location: Remote
Type: Contract (W2, 6 months)
Certification Required: Databricks Certified Professional

Job Description:
We are seeking a Databricks Data Architect with proven expertise in designing, building, and maintaining modern data architectures using Databricks, Apache Spark, and cloud platforms (AWS, Azure, or Google Cloud Platform). This role demands a strategic thinker with hands-on experience in creating scalable ETL/ELT pipelines, ensuring robust data governance, and driving integration across a cloud-native ecosystem.

Key Responsibilities:

  • Design and maintain data architectures (data lakes, data warehouses, real-time systems) aligned with business objectives.

  • Develop efficient and scalable ETL/ELT pipelines using Databricks and Apache Spark.

  • Enforce data governance, security, and compliance using tools like Unity Catalog, IAM, encryption, and cloud-native features.

  • Integrate Databricks with cloud storage services such as Amazon S3, Azure Data Lake Storage, or Google Cloud Storage to ensure seamless data flow.

  • Collaborate cross-functionally with data scientists, analysts, and business stakeholders.

  • Stay updated with Databricks advancements (e.g., Delta Lake, Databricks SQL) and apply best practices in data engineering.

Required Skills & Qualifications:

  • Databricks Certified Professional (must-have)

  • Proficiency in Apache Spark, Python, Scala, and SQL

  • Strong experience with AWS, Azure, or Google Cloud Platform

  • Familiarity with Amazon Redshift, Azure Data Lake Storage, or similar data platforms

  • Excellent communication and stakeholder management skills

Preferred:

  • Experience with Unity Catalog, VPC configuration, and role-based access control

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.

About Tetra Computing