Overview
- Location: Remote
- Compensation: Depends on Experience
- Employment Type: Contract - W2, Contract - Independent, Contract - 12 Month(s)
Skills
- Databricks
- PySpark
- CI/CD
Job Details
Job Title: Databricks Architect with Optimization Focus
Location: Remote
Duration: Long Term Contract
Pay Rate: DOE
Required Skills & Qualifications
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 10+ years of experience in Data Engineering/Big Data, with 4+ years in Databricks architecture and development.
- Expertise in Databricks optimization techniques:
  - Cluster sizing and auto-scaling strategies
  - Spark performance tuning (caching, partitioning, shuffle optimization)
  - Cost governance for compute and storage
- Proficiency in PySpark, Spark SQL, Delta Lake, and Databricks Workflows.
- Strong experience with cloud platforms (AWS/Azure/Google Cloud Platform) and their native data services (S3/ADLS/BigQuery/Redshift, etc.).
- Knowledge of data lakehouse architecture, ELT/ETL frameworks, and data modeling.
- Familiarity with CI/CD tools (Azure DevOps, GitHub Actions, Jenkins) for Databricks deployment.
- Hands-on experience in security, compliance, and governance frameworks in cloud data platforms.
- Strong problem-solving skills with the ability to troubleshoot performance bottlenecks in Spark jobs.
- Excellent communication and leadership skills to collaborate across teams and lead technical discussions.