Databricks Technical Lead

Remote • Posted 12 hours ago • Updated 12 hours ago
Contract W2
No Travel Required
Remote
Depends on Experience

Job Details

Skills

  • Databricks Technical Lead
  • Databricks space
  • APIs
  • ETL/Teradata
  • Spark
  • data processing
  • batch processing
  • data transformation

Summary

Job: Databricks Technical Lead
Location: Saint Louis, MO (Remote)
Duration: 12-month contract
Interview Type: Video
Anticipated Start: 2 weeks

3-5 Must Haves

  • Technical expert in the Databricks space
  • Strong experience working with APIs
  • Very strong batch processing skills
  • A leader on the team who brings a strong executive presence
  • Strategic, able to coach others, and capable of guiding our senior engineers in their Databricks work

Key Responsibilities:

Technical Leadership and Strategy:

  • Serve as the Databricks technical authority for the team, setting direction, establishing standards, and guiding implementation decisions
  • Provide strategic leadership and coaching to senior engineers, accelerating team capability and delivery quality through mentorship, technical reviews, and best practices
  • Translate business objectives into technical execution plans and influence stakeholders through strong communication and executive-level storytelling
Databricks Engineering (Batch-First):

  • Design, build, and optimize batch processing pipelines in Databricks/Spark, emphasizing reliability, performance, and maintainability
  • Lead patterns for scalable data transformation and orchestration, including scheduling, monitoring, and recovery/reprocessing approaches
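To make the recovery/reprocessing expectation concrete, here is a minimal sketch of checkpoint-based idempotent batch runs, in plain Python rather than Spark (the function and file names are illustrative, not part of any Databricks API): completed partitions are recorded after each step, so a failed run can simply be re-submitted and resume where it left off.

```python
import json
from pathlib import Path

def run_batch(partitions, process, checkpoint_path):
    """Process partitions idempotently: partitions recorded in the
    checkpoint file are skipped, so re-running after a failure only
    reprocesses the work that never completed."""
    ckpt = Path(checkpoint_path)
    done = set(json.loads(ckpt.read_text())) if ckpt.exists() else set()
    for part in partitions:
        if part in done:
            continue  # finished in a previous (possibly failed) run
        process(part)  # e.g. one Spark job over a single date partition
        done.add(part)
        # commit progress after each partition so a crash loses at most one
        ckpt.write_text(json.dumps(sorted(done)))
    return done
```

In a real Databricks pipeline the same idea is usually delegated to the platform (job retries, Delta transaction guarantees); the sketch only shows the pattern the posting is asking candidates to reason about.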
API Integration and Data Ingestion:

  • Architect and implement robust ingestion patterns using APIs, including authentication, pagination, throttling, retries, error handling, and schema evolution
  • Partner with upstream/downstream teams to define API contracts and ensure dependable data delivery into analytics and operational workloads

ETL/Teradata Enablement:

  • Bring strong ETL/ELT discipline (requirements translation, mapping, transformations, and performance tuning), ensuring clean, governed, and well-documented pipelines
  • Leverage experience with Teradata data structures and extraction/loading concepts to support integration, migration, or coexistence strategies
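The ingestion bullets above (pagination, throttling, retries, error handling) can be sketched in a few lines of plain Python; `fetch_page` is a hypothetical callable standing in for whatever HTTP client the real pipeline would use, returning `(records, next_cursor)` with a `None` cursor on the last page.

```python
import time

def ingest_pages(fetch_page, max_retries=3, backoff_s=0.1):
    """Pull every page from a cursor-paginated API with retry/backoff.

    Transient IOErrors are retried with exponential backoff (a crude
    form of throttling); the final failure is re-raised so the caller's
    error handling can take over."""
    records, cursor = [], None
    while True:
        for attempt in range(max_retries):
            try:
                page, cursor = fetch_page(cursor)
                break
            except IOError:
                if attempt == max_retries - 1:
                    raise  # give up after the final retry
                time.sleep(backoff_s * 2 ** attempt)  # back off before retrying
        records.extend(page)
        if cursor is None:
            return records
```

A production version would add authentication headers and schema-evolution handling per the responsibilities above; the sketch shows only the pagination-plus-retry core.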


Operational Excellence:

  • Establish and improve engineering practices: code reviews, reusable frameworks, documentation, and production support playbooks
  • Ensure data solutions meet security, compliance, and reliability expectations through quality controls and strong operational ownership

Qualifications:

  • Demonstrated expertise in Databricks and distributed data processing (Spark), with a strong focus on batch processing
  • Strong hands-on experience integrating with APIs for ingestion and systems connectivity
  • Solid experience with Teradata and practical knowledge of working with Teradata-based environments (e.g., extraction, migration, or hybrid operations)
  • Strong ETL/ELT background with the ability to design end-to-end pipelines and troubleshoot complex data issues
  • Proven ability to lead through influence, with excellent communication skills and strong executive presence (this is a must)

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: RTX1cd78a
  • Position Id: 8920751
