Senior Python Engineer - Data Ingestion & Databricks - Remote/Hybrid

Overview

Remote
Depends on Experience
Accepts corp to corp applications
Contract - Independent
Contract - W2
Contract - 6 Month(s)
No Travel Required

Skills

Azure
Python
Databricks
Data Lake
CI/CD

Job Details

Job Title: Senior Python Engineer - Data Ingestion & Databricks

Location: Remote / Hybrid

Duration: Contract to Hire

Looking for a candidate currently in MN for hybrid work.

Would also consider candidates from South Dakota, North Dakota, Iowa, or Wisconsin for remote work.

About the Role

We are seeking a strong Python engineer with a passion for clean, scalable code and a mindset rooted in metadata-driven architecture. This role is ideal for someone who thrives in a collaborative environment, enjoys setting engineering standards, and has hands-on experience with Databricks and Delta Lake, preferably in an Azure ecosystem.

You'll work closely with technical leads to shape the development culture, contribute to architectural decisions, and help build robust, reusable Python libraries that power our data platform.

Key Responsibilities

- Design, develop, and maintain Python libraries with a focus on packaging, distribution, and reusability.

- Champion metadata-driven development practices to build flexible and scalable systems.

- Collaborate with team leads to define and enforce coding standards, including code reviews and documentation.

- Implement and maintain CI/CD pipelines with tools like linters, type checkers (e.g., mypy), and automated testing frameworks (e.g., pytest).

- Develop and optimize data workflows on Databricks, leveraging Delta Lake and best practices for performance and scalability.

- Communicate technical decisions clearly and confidently to both technical and non-technical stakeholders.

- Mentor junior engineers and contribute to a culture of continuous improvement.

Required Qualifications

- Proven experience in packaging and distributing Python libraries (e.g., setuptools, poetry, pip, uv).

- Strong understanding of metadata-driven architecture and its application in software or data systems.

- Familiarity with CI/CD practices in Python projects, including automated testing, linting, and type checking.

- Experience with GitHub and GitHub Actions for CI/CD.

- Hands-on experience with Databricks and Delta Lake, ideally within an Azure environment.

- Excellent communication skills with the ability to explain and justify technical decisions.

- A collaborative, opinionated mindset with a drive to lead by example.

- Understanding of Scrum and Agile methodologies.

Nice to Have

- Experience with data ingestion frameworks or data pipeline orchestration tools (e.g., Airflow).

- Familiarity with containers (e.g., Docker, Kubernetes, Helm).

- Experience with CI/CD build and deploy architecture (e.g., Tekton, Argo).

- Familiarity with infrastructure-as-code (e.g., Terraform).

- Experience managing Databricks infrastructure as code (clusters, policies, etc.).

- Contributions to open-source Python projects or internal tooling libraries.

Best Regards,

Chetna


Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.