Job Details
Role: Databricks Data Architect
Location: [Insert Location or "Remote"]
Type: [Full-time / Contract]
Certification Required: Databricks Certified Professional
Job Description:
We are seeking a Databricks Data Architect with proven expertise in designing, building, and maintaining modern data architectures using Databricks, Apache Spark, and a major cloud platform (AWS, Azure, or Google Cloud Platform). This role demands a strategic thinker with hands-on experience in building scalable ETL/ELT pipelines, enforcing robust data governance, and integrating services across a cloud-native ecosystem.
Key Responsibilities:
Design and maintain data architectures (data lakes, data warehouses, real-time systems) aligned with business objectives.
Develop efficient and scalable ETL/ELT pipelines using Databricks and Apache Spark.
Enforce data governance, security, and compliance using tools like Unity Catalog, IAM, encryption, and cloud-native features.
Integrate Databricks with cloud storage services such as AWS S3, Azure Data Lake Storage, or Google Cloud Storage to ensure seamless data flow.
Collaborate cross-functionally with data scientists, analysts, and business stakeholders.
Stay updated with Databricks advancements (e.g., Delta Lake, Databricks SQL) and apply best practices in data engineering.
Required Skills & Qualifications:
Databricks Certified Professional (must-have)
Proficiency in Apache Spark, Python, Scala, and SQL
Strong experience with AWS, Azure, or Google Cloud Platform
Familiarity with Amazon Redshift, Azure Data Lake Storage, or similar data platforms
Excellent communication and stakeholder management skills
Preferred:
Experience with Unity Catalog, VPC configuration, and role-based access control