Overview
Location: Remote
Compensation: Depends on Experience
Employment Type: Contract - Independent or Contract - W2
Duration: 12 Month(s)
Skills
Databricks, DevOps, AWS, Platform Architect
Job Details
This role offers candidates the opportunity to work with cutting-edge cloud and data technologies, gain expertise in Databricks architecture, optimize performance and costs, and contribute to building a scalable global platform. It's a great chance to enhance skills in data engineering, security, and multi-region deployments while collaborating with industry professionals.
Job Description
- Deep understanding of Databricks architecture, including clusters, notebooks, jobs, and the underlying compute and storage layers
- Experience building Databricks as a global platform across multiple regions
- Proficiency in Apache Spark, including its core components (Spark SQL, Spark Streaming, and MLlib)
- Knowledge of Delta Lake and its features (ACID transactions, time travel, etc.)
- Experience in using Databricks SQL for data querying, analysis, and visualization
- Ability to create and manage complex data pipelines and workflows using Databricks Jobs
- Understanding of cluster configurations, autoscaling, and performance optimization
- Unity Catalog experience
- Deep understanding of AWS or Azure cloud essentials, including Storage, Networking, Identity and Access Management, and Data Security
- Understanding of network configurations, VPCs, and security groups for Databricks deployments
- Ability to analyze and optimize Databricks costs
Nice-to-haves
- Certification in any of the major cloud platforms, such as Azure, AWS, or Google Cloud Platform
- Experience working with code repositories and continuous integration pipelines using AWS CodeBuild/CodePipeline or similar tools
- Experience in data governance and lineage implementation
- Multi-geo and distributed delivery experience in large programs
Educational Qualifications
- Engineering degree: BE/ME/BTech/MTech/BSc/MSc
- Technical certifications in multiple technologies are desirable
Must Have
- Databricks Lakehouse Platform configuration and implementation on AWS (which projects and what role they played): network settings, integration with Identity and Access Management, and defining an environment strategy for Dev, QA, UAT, and Prod
- Sizing and capacity planning for the Databricks platform, including cluster sizing
- Integration with Logging and Auditing Services
- Integration of Databricks with DevOps