Overview
Hybrid
Depends on Experience
Contract - W2
Contract - Independent
Contract - 12 Month(s)
Able to Provide Sponsorship
Skills
Cloud Computing
Unity
DevOps
Data Warehouse
Data Architecture
Data Integration
PySpark
Python
Microsoft Azure
Databricks
Data Engineering
Data Lake
Data Governance
Job Details
Sr. Data Engineer with Unity Catalog
Seattle-based client
Location: Seattle, WA (4 days per week in office)
Duration: 12 months
Core skills
Collaborate with cross-functional teams to support data governance using Databricks Unity Catalog
Must coordinate with the offshore team
10-12+ years of experience in data engineering or a related field
Expertise with programming languages such as Python/PySpark, SQL, or Scala
Experience working in a cloud environment (Azure preferred) with strong understanding of cloud data architecture
Hands-on experience with the Databricks cloud data platform is required
Should have experience migrating to Unity Catalog (a minimal governance sketch follows this list)
Experience with workflow orchestration (e.g., Databricks Jobs or Azure Data Factory pipelines) is required; a job-orchestration sketch follows the responsibilities list below
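For illustration only, a minimal sketch of the kind of Unity Catalog governance work this role supports: granting a group read access to a single table. The catalog, schema, table, and group names are hypothetical, and the snippet assumes a Databricks environment where a SparkSession (`spark`) is already available.

```python
# Minimal Unity Catalog governance sketch. All object and group names below
# ("main", "sales", "orders", "data-analysts") are hypothetical placeholders.
# Assumes a Databricks notebook or job where `spark` is already defined.

def grant_read_access(spark, catalog: str, schema: str, table: str, principal: str) -> None:
    """Grant the minimum privileges a principal needs to query one table."""
    spark.sql(f"GRANT USE CATALOG ON CATALOG {catalog} TO `{principal}`")
    spark.sql(f"GRANT USE SCHEMA ON SCHEMA {catalog}.{schema} TO `{principal}`")
    spark.sql(f"GRANT SELECT ON TABLE {catalog}.{schema}.{table} TO `{principal}`")

grant_read_access(spark, "main", "sales", "orders", "data-analysts")

# Review what the group can now do on that table.
spark.sql("SHOW GRANTS ON TABLE main.sales.orders").show(truncate=False)
```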
Responsibilities
Design, build, and deploy data extraction, transformation, and loading (ETL) processes and pipelines from various sources, including databases, APIs, and data files (a PySpark-to-Delta sketch follows this list).
Develop and support data pipelines within a Cloud Data Platform, such as Databricks
Build data models that reflect domain expertise, meet current business needs, and remain flexible as strategy evolves
Monitor and optimize Databricks cluster performance, ensuring cost-effective scaling and resource utilization
Communicate technical concepts to non-technical audiences in both written and verbal form.
Demonstrate a strong understanding of coding and programming concepts used to build data pipelines (e.g., data transformation, data quality, data integration).
Demonstrate a strong understanding of database and storage concepts (data lakes, relational databases, NoSQL, graph, data warehousing).
Implement and maintain Delta Lake for optimized data storage, ensuring data reliability, performance, and versioning
Automate CI/CD pipelines for data workflows using Azure DevOps
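The sketch below illustrates, under stated assumptions, the kind of PySpark ETL-to-Delta pipeline described above: extract from a JDBC database and a JSON file drop, transform, and load into a Delta table with routine maintenance. Connection details, paths, column names, and table names are hypothetical placeholders, and a Databricks (or Delta Lake-enabled) runtime is assumed.

```python
# Minimal PySpark ETL sketch: extract from a JDBC source and a JSON landing
# zone, transform, and load into a Delta table. All connection details, paths,
# columns, and table names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()  # supplied automatically on Databricks

# Extract: relational source over JDBC (driver assumed to be on the cluster).
orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://example-host:1433;database=sales")
    .option("dbtable", "dbo.orders")
    .option("user", "etl_user")
    .option("password", "<fetch from a secret scope in practice>")
    .load()
)

# Extract: semi-structured file drop.
customers = spark.read.json("/mnt/landing/customers/")

# Transform: join, filter, and stamp the load time.
curated = (
    orders.join(customers, "customer_id", "left")
    .filter(F.col("order_status") != "CANCELLED")
    .withColumn("load_ts", F.current_timestamp())
)

# Load: Delta Lake gives ACID reliability, performance tuning, and versioning.
curated.write.format("delta").mode("overwrite").saveAsTable("main.sales.curated_orders")

# Routine maintenance and time travel on the Delta table (Databricks / Delta 3.x).
spark.sql("OPTIMIZE main.sales.curated_orders")
previous = spark.sql("SELECT * FROM main.sales.curated_orders VERSION AS OF 0")
```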
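As a companion to the workflow-orchestration requirement, here is a sketch, assuming the databricks-sdk Python package, of registering a notebook version of the pipeline above as a scheduled Databricks Job. The notebook path, cluster ID, job name, and cron expression are hypothetical placeholders.

```python
# Sketch: creating a scheduled Databricks Job with the Databricks SDK for
# Python (pip install databricks-sdk). Names, paths, cluster ID, and schedule
# below are hypothetical placeholders.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import jobs

w = WorkspaceClient()  # picks up authentication from the environment / .databrickscfg

created = w.jobs.create(
    name="nightly-orders-etl",
    tasks=[
        jobs.Task(
            task_key="run_orders_etl",
            notebook_task=jobs.NotebookTask(notebook_path="/Repos/data/etl/orders_etl"),
            existing_cluster_id="0000-000000-example",
        )
    ],
    schedule=jobs.CronSchedule(
        quartz_cron_expression="0 0 2 * * ?",  # 2:00 AM daily
        timezone_id="America/Los_Angeles",
    ),
)
print(f"Created job {created.job_id}")
```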
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.