Data Architect - Databricks with PySpark

Hybrid in Cincinnati, OH, US • Posted 4 hours ago • Updated 4 hours ago
Contract: Corp-to-Corp, Independent, or W2
50% Travel Required
Remote
$70 - $80/hr

Job Details

Skills

  • Databricks
  • Data Lake
  • ELT
  • Informatica
  • Migration
  • PySpark
  • Azure
  • SQL

Summary

Title: Databricks Architect

Location: Cincinnati, OH

Role Overview:

We are looking for a Databricks Architect to spearhead a strategic migration initiative from Informatica (or a similar ETL tool) to Azure Databricks. The ideal candidate will have deep expertise in Databricks integration, PySpark, and Unity Catalog, combined with strong data engineering fundamentals. This role requires working closely with multiple migration pods and bringing best practices to large-scale migration projects, ensuring a smooth transition from legacy ETL to a modern cloud-based platform.

Key Responsibilities:

  • Lead design and development of scalable data pipelines using Databricks and PySpark.
  • Integrate Databricks with enterprise systems and diverse data sources.
  • Implement Unity Catalog for governance, security, and lineage.
  • Drive migration from Informatica to Azure Databricks, ensuring minimal disruption and high data quality.
  • Collaborate with multiple migration pods to align technical solutions and timelines.
  • Introduce and enforce best practices for migration projects, including performance optimization and compliance.
  • Provide technical leadership and mentor team members on modern data engineering practices.

Must-Have Skills:

  • Hands-on experience in Databricks integration (focus on data engineering workflows, not platform setup).
  • Proficiency in PySpark for distributed data processing.
  • Familiarity with Unity Catalog for governance and security.
  • Strong Data Engineering background (data modeling, ETL/ELT, performance tuning).
  • Proven experience migrating from Informatica (or a comparable ETL tool) to Azure Databricks.
  • Expertise in Azure Data Services (Data Lake, etc.).
  • Strong SQL skills and understanding of big data concepts.
  • Ability to work with multiple migration pods and manage dependencies effectively.

Good-to-Have Skills:

  • Familiarity with Delta Lake and Lakehouse architecture.
  • Knowledge of CI/CD pipelines for data workflows.
  • Exposure to data security and compliance frameworks.
  • Experience with data cataloging and lineage tools beyond Unity Catalog.
  • Strong communication and stakeholder management skills.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: 91166542
  • Position Id: 8910520

Company Info

About Crea Services LLC

Our comprehensive suite of IT services encompasses managed IT solutions, web development, and hosting & cloud computing, ensuring that your business stays at the forefront of technology. With our managed IT solutions, we take the hassle out of maintaining and optimizing your IT infrastructure, allowing you to focus on your core business objectives.

