Job: Databricks Technical Lead
Location: Saint Louis, MO (Remote)
Duration: 12-month contract
Interview Type (Phone, Video, Face to Face): Video
Anticipated Start (in weeks): 2
3-5 Must-Haves:
* Technical expert in the Databricks space
* Strong experience working with APIs
* Very strong batch processing skills
* Someone who can act as a leader on the team and bring a strong executive presence
* Strategic, able to coach others, and capable of guiding our senior engineers in their Databricks work
Key Responsibilities:

Technical Leadership and Strategy:
* Serve as the Databricks technical authority for the team, setting direction, establishing standards, and guiding implementation decisions
* Provide strategic leadership and coaching to senior engineers, accelerating team capability and delivery quality through mentorship, technical reviews, and best practices
* Translate business objectives into technical execution plans and influence stakeholders through strong communication and executive-level storytelling
Databricks Engineering (Batch-First):
* Design, build, and optimize batch processing pipelines in Databricks/Spark, emphasizing reliability, performance, and maintainability
* Lead patterns for scalable data transformation and orchestration, including scheduling, monitoring, and recovery/reprocessing approaches
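For illustration, the recovery/reprocessing pattern named above can be sketched in a few lines: a batch driver that records completed partitions in a checkpoint file so a failed run can be resumed without duplicating work. All names here (`run_batch`, the date-string partitions, the JSON checkpoint) are illustrative assumptions, not a Databricks API.

```python
import json
import os
import pathlib
import tempfile


def run_batch(dates, process, checkpoint_path):
    """Process each partition date exactly once, skipping dates already
    recorded in the checkpoint so a failed run resumes where it stopped."""
    path = pathlib.Path(checkpoint_path)
    done = set(json.loads(path.read_text())) if path.exists() else set()
    for d in dates:
        if d in done:
            continue  # already processed in an earlier run
        process(d)
        done.add(d)
        # Commit progress after each partition so a crash loses at most
        # the partition currently in flight.
        path.write_text(json.dumps(sorted(done)))


# Demo: two partitions processed, checkpoint written to a temp file.
ckpt = os.path.join(tempfile.mkdtemp(), "progress.json")
processed = []
run_batch(["2024-01-01", "2024-01-02"], processed.append, ckpt)
print(processed)  # -> ['2024-01-01', '2024-01-02']
```

In a real Databricks pipeline the same idea is usually expressed with idempotent partition overwrites or job-level retry configuration rather than a hand-rolled checkpoint file.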
API Integration and Data Ingestion:
* Architect and implement robust ingestion patterns using APIs, including authentication, pagination, throttling, retries, error handling, and schema evolution
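As a minimal sketch of the ingestion pattern above: the loop below pages through an API and retries transient failures with exponential backoff. `fetch_page` is a hypothetical callable standing in for a real authenticated HTTP client; throttling and schema evolution are out of scope here.

```python
import time
from typing import Callable, Iterator


def ingest_paginated(
    fetch_page: Callable[[int], dict],
    max_retries: int = 3,
    backoff_seconds: float = 0.1,
) -> Iterator[dict]:
    """Yield records from a paginated API, retrying transient failures.

    `fetch_page(page)` is assumed to return {"records": [...], "next": bool}.
    """
    page = 0
    while True:
        for attempt in range(max_retries):
            try:
                payload = fetch_page(page)
                break
            except ConnectionError:
                if attempt == max_retries - 1:
                    raise  # exhausted retries: surface the error
                # Exponential backoff before retrying the transient failure
                time.sleep(backoff_seconds * (2 ** attempt))
        yield from payload["records"]
        if not payload.get("next"):
            return
        page += 1


# Demo: a stand-in fetcher that fails once to exercise the retry path.
calls = {"count": 0}

def flaky_fetch(page: int) -> dict:
    calls["count"] += 1
    if calls["count"] == 1:
        raise ConnectionError("transient network error")
    return {"records": [page * 2, page * 2 + 1], "next": page < 1}

print(list(ingest_paginated(flaky_fetch)))  # -> [0, 1, 2, 3]
```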
* Partner with upstream/downstream teams to define API contracts and ensure dependable data delivery into analytics and operational workloads

ETL/Teradata Enablement:
* Bring strong ETL/ELT discipline (requirements translation, mapping, transformations, and performance tuning), ensuring clean, governed, and well-documented pipelines
* Leverage experience with Teradata data structures and extraction/loading concepts to support integration, migration, or coexistence strategies
Operational Excellence:
* Establish and improve engineering practices: code reviews, reusable frameworks, documentation, and production support playbooks
* Ensure data solutions meet security, compliance, and reliability expectations through quality controls and strong operational ownership
Required Qualifications:
* Demonstrated expertise in Databricks and distributed data processing (Spark), with a strong focus on batch processing
* Strong hands-on experience integrating with APIs for ingestion and systems connectivity
* Solid experience with Teradata and practical knowledge of working with Teradata-based environments (e.g., extraction, migration, or hybrid operations)
* Strong ETL/ELT background with the ability to design end-to-end pipelines and troubleshoot complex data issues
* Proven ability to lead through influence, with excellent communication skills and strong executive presence (this is a must)