Overview
Remote
$60 - $70
Contract - W2
Contract - Independent
Contract - 12 Month(s)
Skills
Databricks
Job Details
Position: Senior Data Engineer
Core Responsibilities:
- Design and develop scalable, metadata-driven ETL frameworks using PySpark in Databricks.
- Implement and manage fine-grained access control using Unity Catalog, including Row-Level Security (RLS).
- Build and optimize data pipelines to support dimensional and operational data models.
- Collaborate with architecture and engineering teams to ensure data solutions align with enterprise standards.
- Apply and advocate for Unity Catalog best practices across data governance, access control, and catalog organization.
- Monitor and optimize resource consumption and cost efficiency in Azure-based environments.
Required Skills & Experience:
- Experience working in large-scale, cloud-native data environments (Azure preferred).
- Expert-level proficiency in Databricks, including SQL and PySpark.
- Hands-on experience migrating from Hive Metastore to Unity Catalog.
- Proven experience with Unity Catalog, especially in implementing Row-Level Security (RLS).
- Strong understanding of metadata-driven ETL design patterns.
- Deep knowledge of SQL and experience loading dimensional and operational data models.
- Familiarity with Azure cost tracking and resource consumption models.
Preferred Qualifications:
- Databricks certifications (e.g., Databricks Certified Data Engineer Associate/Professional).