Job Details
Job Title: Technical Program/Product Manager - Data Platform & Lakehouse
Location: Plano, TX / Mountain View, CA (Hybrid - 3 days onsite per week)
Duration: [Specify if known, e.g., 12-Month Contract]
Job Description:
We are seeking a highly technical and hands-on Technical Program/Product Manager to drive the end-to-end delivery of our modern data platform. In this role, you will own multi-team programs to design, build, and scale a secure, cost-efficient lakehouse architecture on AWS.
You will be the central hub coordinating roadmaps across data engineering, platform, security, and product teams to land critical capabilities in Databricks (Delta Lake, Unity Catalog, Workflows/Jobs, Delta Live Tables) and event-driven data products using technologies like AWS EventBridge and Kafka/Kinesis. This role spans real-time streaming and batch pipelines and requires a leader with deep technical fluency and strong delivery discipline.
Key Responsibilities:
Drive the end-to-end delivery of the modern data platform, managing complex, multi-team programs from conception to launch.
Own and coordinate roadmaps for data engineering, platform, and security teams to land capabilities in Databricks and event-driven architectures.
Proactively manage program dependencies and risks, and drive mitigation strategies.
Define and track program-level metrics, SLAs, and success criteria.
Create and maintain program-level dashboards and executive-level reporting decks.
Facilitate Agile ceremonies, lead roadmap prioritization efforts, and manage RFC (Request for Comments) processes.
Lead migrations from legacy ETL systems (e.g., Informatica) to modern lakehouse patterns.
Drive FinOps initiatives for data compute to optimize cloud costs and efficiency.
Apply AI-native product thinking and use modern prototyping tools (e.g., Cursor) with minimal hand-holding.
Required Skills & Experience:
Proven experience as a hands-on Technical Program or Product Manager for enterprise data platforms.
Deep technical fluency in Databricks, including hands-on knowledge of Delta Lake, Unity Catalog, and Workflows/Jobs.
Strong experience with event-driven architectures (e.g., AWS EventBridge, Kafka, Kinesis) for both real-time streaming and batch data processing.
Must have experience with Informatica or an equivalent enterprise ETL platform (e.g., IBM DataStage, Talend, SSIS).
Expertise in common data platform architectures, with particular depth in the Data Lakehouse pattern.
Solid understanding of analytical tooling and consumption planes, including semantic-layer modeling and implementation.
Strong background in Agile methodologies, stakeholder communication, and executive reporting.
Experience leading large-scale migrations from legacy ETL to modern data platforms.
Preferred Qualifications (Nice-to-Have):
Prior experience driving FinOps for data compute in a cloud environment (AWS preferred).
AI-native mindset with experience prototyping using GenAI agents and modern development tools.
Exposure to enterprise internal platforms.
If you are a results-driven technical program leader passionate about building modern data infrastructure, we encourage you to apply.