About the Team
Join a forward-thinking team focused on building and scaling an innovative, ephemeral, and immutable Data & Machine Learning platform on AWS. We use modern technologies like Databricks, Terraform, and Python to create a fully automated environment that powers data analytics and AI innovation organization-wide.
About the Role
We're seeking a skilled, hands-on Databricks Architect to serve as our top technical expert and lead practitioner for the Databricks Platform. This role combines architectural leadership with practical implementation. You'll define the architectural vision and develop core, reusable reference architectures and patterns that accelerate our teams. The ideal candidate is a Databricks master who leads by example through hands-on development and empowers teams to build robust, scalable data solutions.
What You'll Do
- Develop and maintain production-quality reference architectures and reusable patterns that showcase best practices and accelerate engineering teams.
- Architect and build proofs-of-concept for end-to-end solutions on the Databricks Lakehouse Platform, validating complex designs hands-on.
- Serve as the primary Databricks consultant, providing expert guidance beyond diagrams, including code, best practices, and hands-on support.
- Create and lead training sessions on new features like Unity Catalog, Delta Live Tables, and advanced MLOps capabilities.
- Define, document, and evangelize Databricks development standards for data modeling, performance tuning, security, and cost management.
- Mentor engineers through code reviews, paired programming, and design sessions, boosting technical proficiency across the organization.
What You'll Bring
Core Qualifications:
- Bachelor's degree in Computer Science or a related field.
- 10+ years of experience in data engineering, data warehousing, or software engineering, including significant experience as an architect.
- Proven track record as a hands-on technical architect and advisor on large-scale data projects.
- Excellent communication skills with the ability to influence and guide technical teams and stakeholders.
- Strategic thinker passionate about solving complex data challenges and driving business outcomes.
- Strong critical thinking and well-reasoned architectural decision-making skills.
Technical Expertise:
Databricks Mastery (including):
- Unity Catalog: data governance and security design and implementation.
- Delta Lake & Delta Live Tables: architecting scalable, reliable data pipelines.
- Performance & Cost Optimization: Spark job tuning, cluster optimization, cost management.
- MLOps: practical ML lifecycle management with MLflow.
- Databricks SQL: analytical workload design and optimization.
- Mosaic AI: designing and optimizing AI Agents.
Cloud & Infrastructure: deep AWS knowledge and expertise with Infrastructure as Code (Terraform, YAML).
Data Engineering & Programming: strong data modeling, ETL/ELT development, and advanced Python and SQL skills.
CI/CD & Automation: experience designing and implementing CI/CD pipelines (GitHub Actions preferred) for data and ML workloads.
Observability: familiarity with monitoring, logging, and alerting implementations for data platforms.
Automation: the platform is ephemeral, and all changes are made via Terraform and Python, so expertise in both is required.