Good afternoon,
We at Floga Technologies are a growing IT services organization delivering scalable technology solutions, and we are expanding our team with qualified talent.
We have an immediate opening for a Sr. Certified Databricks Architect (Professional certification or Champion status) with strong hands-on experience to support a long-term engagement with our client. The candidate must be able to work collaboratively with cross-functional teams to deliver high-quality solutions.
Job Title: Sr. Certified Databricks Architect (Professional Certification / Champion Status)
Location: Chicago, IL (Onsite)
Duration: Long Term
Experience: 13+ Years
Interview Process
3 rounds, conducted by Databricks.
Job Summary
We are seeking an experienced Data Architect with deep expertise in Databricks to design, architect, and implement scalable data platforms using the Lakehouse architecture. The ideal candidate will have strong hands-on experience in data engineering, data modeling, performance optimization, and enterprise data governance within Databricks environments.
The candidate must hold a valid Databricks Professional certification or Champion status and must be able to clear the Databricks Partner Professional (DPP) process.
Required Qualifications
Candidates must meet one of the following:
Databricks Professional Certification (e.g., Data Engineer Professional / Machine Learning Professional)
OR
Databricks Champion Status
Additionally:
Must be able to clear the Databricks Partner Professional (DPP) process.
Key Responsibilities
Design and implement scalable data architectures using Databricks Lakehouse platform.
Architect batch and real-time data pipelines using Spark, Delta Lake, and Structured Streaming.
Define enterprise data models (conceptual, logical, and physical) for large-scale analytics platforms.
Implement Medallion Architecture (Bronze, Silver, Gold layers).
Optimize performance and cost efficiency of Databricks workloads.
Establish data governance, security, and compliance frameworks.
Design data ingestion pipelines from multiple sources (APIs, RDBMS, SaaS platforms, streaming systems).
Implement CI/CD pipelines for Databricks notebooks, jobs, and workflows.
Collaborate with business stakeholders, data engineers, and analytics teams to translate requirements into scalable solutions.
Ensure data quality, lineage, and observability across enterprise data platforms.
Technical Requirements
- Strong expertise in:
  - Apache Spark (PySpark / Scala)
  - Delta Lake
  - Databricks Workflows & Unity Catalog
  - SQL and advanced data modeling
- Experience with cloud platforms (Azure / AWS / Google Cloud Platform)
- Hands-on experience with:
  - Data warehousing concepts
  - Lakehouse architecture
  - Performance tuning and optimization
  - Data governance and security frameworks
- Experience integrating Databricks with BI tools (Power BI, Tableau)
Interested candidates can reach me at the details below:
Likitha P | Floga Technologies.
E:
D: