Job Details
Job Title: Databricks Platform Administrator / Engineer
Location: Hybrid
Duration: 12+ Months
Job Summary: We are seeking a skilled Databricks Platform Administrator/Engineer with proven experience supporting modern data platforms in a Property & Casualty (P&C) insurance setting. The ideal candidate will be responsible for managing, configuring, optimizing, and maintaining the Databricks environment to support data engineering, analytics, and machine learning workloads in a secure and compliant manner.
Education: Bachelor's or Master's degree in computer science, engineering, information systems, or a related field.
Required Qualifications:
- 5+ years of experience in data platform engineering or system administration.
- 2+ years of hands-on experience with Databricks administration on Azure.
- Solid understanding of Apache Spark, Delta Lake, and Lakehouse architecture.
- Proficiency in scripting and automation using Python, Bash, PowerShell, or Terraform.
- Familiarity with data security, audit, and compliance frameworks relevant to P&C insurance (e.g., GDPR, NYDFS, SOC 2).
- Experience with Azure Data Services, ADLS, Key Vault, and Azure DevOps.
- Strong understanding of the P&C insurance data ecosystem (policy, claims, underwriting, actuarial, etc.).
Preferred Qualifications:
- Databricks Certified Associate / Professional (Administrator / Data Engineer) certification.
- Experience integrating Databricks with core systems like Guidewire, Duck Creek, or custom P&C platforms.
- Knowledge of industry standards like ACORD data models and ISO data feeds.
- Exposure to tools like Unity Catalog, Purview, Collibra, Informatica, or similar.
Soft Skills:
- Strong problem-solving and analytical skills.
- Ability to work cross-functionally in an agile team environment.
- Excellent communication and stakeholder management skills.
- Experience working in regulated enterprise environments with strong data governance standards.
Key Responsibilities:
- Platform Administration & Management:
  - Manage and administer Databricks workspaces, clusters, jobs, libraries, and notebooks.
  - Implement access controls using Unity Catalog, SCIM, and workspace-level RBAC.
  - Configure and monitor autoscaling, cluster policies, and compute usage to optimize cost and performance.
  - Support platform upgrades, patching, and operational automation.
- Security, Compliance & Governance:
  - Enforce security policies aligned with P&C insurance industry regulations.
  - Integrate with enterprise identity providers (Azure AD, Okta) and manage SSO.
  - Support data governance using tools like Unity Catalog, Immuta, or Collibra.
- DevOps & Automation:
  - Build CI/CD pipelines for notebooks, jobs, and ML workflows using tools like Azure DevOps, GitHub Actions, or Jenkins.
  - Automate platform provisioning and configuration using Infrastructure-as-Code (Terraform, ARM, etc.).
- Monitoring & Troubleshooting:
  - Set up logging, monitoring, and alerting using tools like Azure Monitor, Datadog, or Splunk.
  - Troubleshoot performance issues, job failures, and cluster errors.
- Collaboration & Enablement:
  - Work closely with data engineers, data scientists, and business analysts to ensure optimal platform usage.
  - Develop platform usage guidelines, onboarding documentation, and best practices.
  - Provide technical support and training to users.
Thanks and regards,
Amit Lakhotia