Job Title: Databricks Architect (Enterprise Data Platform)
Experience: 10+ Years
Location: NJ (Hybrid)
Engagement: C2C or W2
We are seeking a highly skilled Databricks Architect to lead the design and implementation of next-generation data platforms built on the Lakehouse paradigm.
This role goes beyond pipeline development: you will own the Databricks platform architecture end to end, driving scalability, governance, performance, and cost optimization across enterprise data ecosystems.
Databricks Platform Architecture
- Architect and implement enterprise-scale Databricks environments (dev/test/prod)
- Design secure, scalable Lakehouse architecture using Delta Lake
Data Engineering & Processing
- Build and optimize high-performance data pipelines using Apache Spark (PySpark / Scala)
- Implement Medallion Architecture (Bronze, Silver, Gold)
- Develop batch and real-time streaming pipelines using Structured Streaming
Governance, Security & Compliance
- Implement fine-grained access control using Unity Catalog
- Define enterprise-wide data governance, lineage, and auditing
Performance & Cost Optimization
- Optimize workloads using partitioning, caching, and Photon engine
- Design cost-efficient cluster strategies (autoscaling, spot instances, DBU optimization)
- Monitor and improve query and pipeline performance at scale
Cloud Integration
- Architect Databricks solutions on AWS / Azure / Google Cloud Platform
- Integrate with cloud-native services (AWS S3/Glue, Azure ADLS/ADF, etc.)
AI Skills
- Experience building AI-agent use cases at scale with Agentbricks
- Experience productionizing AI use cases and establishing guardrails
Must-Have Skills
- Advanced proficiency in SQL and Python
- Hands-on experience with Delta Lake, Medallion Architecture, and Unity Catalog
- Strong experience with at least one cloud platform (AWS / Azure / Google Cloud Platform)
- Experience with CI/CD tools (Azure DevOps, Jenkins, GitHub Actions)
- Knowledge of workflow orchestration tools (Airflow or equivalent)
Good to Have
- Databricks certifications (Associate / Professional)
- Experience with Databricks SQL & dashboards
- Exposure to ML pipelines (MLflow)
- Experience with Kafka or event-driven architectures
- Domain experience in Finance / Retail / Healthcare
What Makes You a Great Fit
- Strong architectural mindset with the ability to design scalable data platforms
- Deep understanding of performance tuning and cost optimization
- Ability to balance technical depth with business impact
- Excellent communication and stakeholder management skills