Data Engineer With Unity Catalog Experience

Overview

Hybrid
Depends on Experience
Contract - W2
Contract - Independent
Contract - 12 Month(s)
No Travel Required

Skills

Access Control
Analytics
Apache Spark
Cloud Computing
Computer Science
Continuous Delivery
Continuous Integration
Customer Relationship Management (CRM)
Data Architecture
Data Engineering
Data Governance
Data Lake
Data Modeling
Data Processing
Data Quality
Data Warehouse
Databricks
DevOps
DevSecOps
Documentation
Enterprise Resource Planning
FOCUS
Git
GitHub
IaaS
Management
Marketing
Mentorship
Metadata Management
Network
Operational Excellence
Orchestration
Provisioning
PySpark
Python
Recruiting
Regulatory Compliance
SAP
SQL
Scala
Storage
Supply Chain Management
Terraform
Testing
Unity
Version Control
Workflow
Unity Catalog
Airbyte

Job Details

3Core Systems, Inc is an SAP and SuccessFactors partner with employees located across the United States. Our organization is dedicated to customer and employee satisfaction. We provide high-quality, cost-efficient, and competitive solutions and resources.

3Core Systems is looking for a Data Engineer with Unity Catalog experience for one of our clients in PLANTATION, FLORIDA, 4 DAYS ONSITE.

ROLE: Data Engineer With Unity Catalog Experience

LOCATION: PLANTATION, FLORIDA, 4 DAYS ONSITE

DURATION: LONG TERM

  1. Data Platform Infrastructure & DevOps
  • Administer, optimize, and scale our Databricks Lakehouse environment, ensuring high performance, cost efficiency, and operational excellence.
  • Develop, maintain, and enhance our data platform infrastructure and security configurations using Terraform, including provisioning Databricks workspaces, SQL Endpoints, Unity Catalog objects, and network components.
  • Manage and enforce Unity Catalog for data governance, access control, and metadata management.
  • Implement and manage CI/CD pipelines for data pipelines, dbt projects, and infrastructure deployments using GitHub Actions.
  • Automate operational tasks, monitoring, and alerting for the data platform.
  • Implement and enforce DevSecOps principles, working closely with security teams to ensure compliance and manage/rotate credentials securely.
  2. Data Engineering & Pipeline Development
  • Design and implement data ingestion patterns into Databricks using Delta Lake, optimizing for large-scale data processing and storage.
  • Develop, optimize, and troubleshoot complex Spark Jobs (PySpark/Scala) for data processing and transformation within Databricks.
  • Manage and extend data ingestion pipelines using Airbyte (or similar modern tools like Fivetran, Stitch), including configuring connectors, monitoring syncs, and ensuring data quality and reliability from diverse source systems (e.g., ERP, CRM, marketing, supply chain).
  • Orchestrate and automate data pipelines and dbt models using Databricks Workflows and potentially integrating with other orchestration tools.
  3. Data Modeling & Collaboration
  • Collaborate with Analytics Engineers to translate business requirements into efficient and scalable data models using dbt (Data Build Tool).
  • Implement dbt best practices for modularity, testing, documentation, and version control, ensuring seamless integration with Databricks.
  • Partner effectively with Analytics Engineers, Data Scientists, and business stakeholders to deliver high-quality data solutions.
  • Provide technical guidance and mentorship to junior team members, and champion data engineering best practices, code quality, and documentation standards.
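To give candidates a feel for the Unity Catalog governance work described above, here is a minimal sketch of the kind of access-control automation involved. The helper renders GRANT statements against Unity Catalog's three-level namespace; all catalog, schema, and group names are hypothetical, and in practice such grants would be executed via Databricks SQL or managed declaratively in Terraform.

```python
# Sketch: rendering Unity Catalog GRANT statements for a group -> privileges map.
# All catalog/schema/group names below are hypothetical examples.

def grant_statements(catalog: str, schema: str, grants: dict) -> list:
    """Build GRANT statements against Unity Catalog's three-level
    namespace (catalog.schema.object), one per group."""
    stmts = []
    for group, privileges in sorted(grants.items()):
        privs = ", ".join(sorted(privileges))
        stmts.append(f"GRANT {privs} ON SCHEMA {catalog}.{schema} TO `{group}`")
    return stmts

for stmt in grant_statements(
    "main", "sales",
    {"analysts": ["SELECT"], "engineers": ["SELECT", "MODIFY"]},
):
    print(stmt)
```

Generating grants from a single map like this keeps access control reviewable in version control rather than applied ad hoc in notebooks.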
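Likewise, the Delta Lake ingestion work described above frequently reduces to idempotent upserts. The sketch below composes a Delta MERGE INTO statement of the kind a pipeline would pass to spark.sql on Databricks; the table and column names are hypothetical.

```python
# Sketch: composing an idempotent Delta Lake upsert as a MERGE INTO
# statement. Table and column names are hypothetical; on Databricks the
# resulting string would typically be executed with spark.sql(...).

def delta_merge_sql(target: str, source: str, keys: list, cols: list) -> str:
    """Render a MERGE that updates matching rows and inserts new ones."""
    on = " AND ".join(f"t.{k} = s.{k}" for k in keys)
    sets = ", ".join(f"t.{c} = s.{c}" for c in cols)
    ins_cols = ", ".join(keys + cols)
    ins_vals = ", ".join(f"s.{c}" for c in keys + cols)
    return (
        f"MERGE INTO {target} t USING {source} s ON {on} "
        f"WHEN MATCHED THEN UPDATE SET {sets} "
        f"WHEN NOT MATCHED THEN INSERT ({ins_cols}) VALUES ({ins_vals})"
    )

print(delta_merge_sql("main.sales.orders", "staging_orders",
                      ["order_id"], ["amount", "status"]))
```

Matching on stable business keys makes re-running a sync safe, which matters when upstream tools such as Airbyte replay batches.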

Qualifications & Skills

  • Education: Bachelor's degree in Computer Science, Data Engineering, or a related technical field required.
  • Experience: 5+ years of progressive experience as a Data Engineer, with a strong focus on cloud-based data platforms.
  • Deep Databricks Expertise: Extensive experience with Spark (PySpark/Scala), Delta Lake, Unity Catalog, Databricks SQL, and platform administration.
  • Data Modeling: Proven experience with dbt for data modeling, transformation, and testing.
  • Infrastructure as Code (IaC): Strong proficiency with Terraform for defining, provisioning, and managing cloud infrastructure and Databricks resources as code.
  • DevOps & CI/CD: Expertise in Git and GitHub Actions for version control and implementing robust CI/CD pipelines.
  • Programming: Proficiency in SQL and at least one programming language (Python strongly preferred, Scala is a plus).
  • Data Architecture: Solid understanding of data warehousing, data lake, and lakehouse architectures.

Please send your resumes to: Venu at 3coresystems.com

Contact: (630) 971-5271

Regards,

Recruiting Team

3Core Systems Inc

9101 Burnet Road, Suite 207, Austin, TX 78758

URL: www.3coresystems.com


Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.