Databricks Manager

Remote • Posted 14 hours ago • Updated 9 hours ago
Contract Corp To Corp
Contract Independent
Contract W2
Able to Sponsor
Remote
Depends on Experience

Job Details

Skills

  • Databricks
  • Cloud
  • SQL
  • PySpark
  • Big Data
  • Delta Lake
  • Unity Catalog

Summary

Operational Lakehouse Strategy, Operations & Platform Management

  • 10+ years of management-level experience running senior big data platforms
  • Experience working across functional domains (application, infrastructure, security, compliance / audit, operations) and business domains
  • Strong communication and organizational skills
  • Support delivery and management of the enterprise lakehouse architecture and implementation on large-scale cloud data platforms (Databricks)
  • Experience with Databricks usage in hyperscaler environments (Azure and Google Cloud Platform)
  • Support and lead implementation of best practices standards for SQL/PySpark development and usage
  • Standardize data using industry frameworks to ensure IT-related data alignment (infrastructure-related information, infrastructure capacity, security-related data, application runtime data, IT monitoring-related information, and additional metadata)
  • Support and provide best practices on data mapping
  • Establish multi-zone / Medallion architecture to drive data and cost optimizations:
    • Bronze (raw telemetry)
    • Silver (cleaned/normalized)
    • Gold (aggregated/KPIs)
  • Design for 500TB+/day ingestion scale
  • Define standards for:
    • Delta Lake usage including Delta Tables / DLT
    • Table optimization (Z-ordering, partitioning)
    • Data lifecycle management
    • User workflows and use cases across various areas including line of business and IT
  • Knowledge of various Databricks capabilities including data engineering tools, Mosaic (AI/ML tools), Auto Loader, Unity Catalog, Delta Tables / DLT, the query builder, workspace / schema / table structures, LakeFlow, Genie, Databricks Workflows / Jobs, and additional Databricks components
  • Support FinOps activities (usage and capability cost controls), including management and optimization of compute, storage, and DBU usage
  • Support Unity Catalog buildout including IAM and RBAC
  • Provide and lead subject-matter expertise
  • Support user-related best practices including use cases across various stakeholder roles, governance, user support, SLO / SLA development, predictive alerting and anomaly detection
  • Support pattern development and optimizations for data ingestion including streaming, batch and incremental
  • Knowledge of and expertise in data pipeline approaches and platforms that ensure data quality, data optimization and reduction, ETL functions, data protection, and high throughput at low latency
  • Support and provide expertise on semantic models
  • Support schema
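
The multi-zone / Medallion layering described above can be sketched in Databricks SQL. This is a minimal illustration only — the catalog, table, and column names (`ops.bronze_telemetry`, `payload`, etc.) are hypothetical, not from the posting:

```sql
-- Bronze: raw telemetry landed as-is (schema illustrative)
CREATE TABLE IF NOT EXISTS ops.bronze_telemetry (
  ingest_ts TIMESTAMP,
  source    STRING,
  payload   STRING   -- raw JSON, kept unparsed
) USING DELTA;

-- Silver: cleaned and normalized records
CREATE TABLE IF NOT EXISTS ops.silver_telemetry USING DELTA AS
SELECT ingest_ts,
       source,
       from_json(payload, 'host STRING, metric STRING, value DOUBLE') AS rec
FROM ops.bronze_telemetry
WHERE payload IS NOT NULL;

-- Gold: aggregated KPIs for reporting
CREATE TABLE IF NOT EXISTS ops.gold_daily_kpis USING DELTA AS
SELECT date_trunc('DAY', ingest_ts) AS day,
       rec.metric                   AS metric,
       avg(rec.value)               AS avg_value
FROM ops.silver_telemetry
GROUP BY date_trunc('DAY', ingest_ts), rec.metric;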
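The table-optimization standards mentioned above (Z-ordering, partitioning, data lifecycle) typically reduce to a few recurring Delta Lake maintenance commands; the table name below is again hypothetical:

```sql
-- Compact small files and co-locate data by a common filter column
OPTIMIZE ops.silver_telemetry ZORDER BY (source);

-- Lifecycle management: remove unreferenced data files older than 7 days
VACUUM ops.silver_telemetry RETAIN 168 HOURS;

-- Partitioning, by contrast, is declared at table creation, e.g.:
-- CREATE TABLE ... USING DELTA PARTITIONED BY (ingest_date) ...
```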
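For the streaming and incremental ingestion patterns listed above, one common Databricks approach is Auto Loader exposed as a declarative streaming table; the landing path and table name here are assumptions for illustration:

```sql
-- Incremental file ingestion via Auto Loader (read_files)
CREATE OR REFRESH STREAMING TABLE ops.bronze_events AS
SELECT *, current_timestamp() AS ingest_ts
FROM STREAM read_files(
  '/Volumes/ops/landing/events/',   -- hypothetical landing path
  format => 'json'
);
```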
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: 90859492
  • Position Id: 8923806
