Job Details
Job Title: Product Owner Data Projects (Expert)
Alternate Titles: Technical Project Manager / Program Manager (Data)
Location: Oakland, CA (Hybrid; minimum 3 days onsite at the Oakland General Office)
Duration: 12+ Month Contract
Interview: Phone (may require onsite)
Local Candidates Only
Role Overview
The client is seeking an experienced Product Owner / Technical Project Manager (Expert) to lead and deliver complex data and analytics initiatives. This role bridges business and technology, ensuring data projects are delivered on time, within scope, and with measurable business impact.
The ideal candidate brings a strong foundation in data engineering and data platforms, combined with expert-level project management, Agile delivery, and stakeholder leadership.
Key Responsibilities
- Lead end-to-end delivery of data initiatives including data pipelines, platform migrations, and analytics dashboards
- Act as liaison between business stakeholders and technical teams, translating requirements into clear roadmaps and backlogs
- Partner with data engineers, analysts, product managers, and business teams to define scope, requirements, and success metrics
- Develop and manage project plans, timelines, RACI matrices, dependencies, and risk mitigation strategies
- Track progress, manage risks/issues, and provide regular status reporting to leadership
- Apply a strong understanding of data architecture concepts:
  - ETL / ELT
  - Data lakes
  - Medallion architecture
  - Data governance
  - Cloud data warehouses (Snowflake, BigQuery, Redshift, Databricks)
- Run Agile ceremonies including sprint planning, backlog grooming, daily stand-ups, and retrospectives
- Ensure data quality, documentation, and governance standards are enforced
- Align stakeholders across engineering, analytics, finance, marketing, and product teams
Required Qualifications
- 5-10+ years of experience as a Product Owner, Technical Project Manager, or Program Manager
- 3+ years managing data, analytics, or data platform projects
- Hands-on familiarity with:
  - Modern data platforms: Snowflake, BigQuery, Redshift, Databricks
  - ETL / ELT tools: Informatica, dbt, Airflow, Fivetran
- Strong understanding of Agile / Scrum and hybrid delivery models
- Experience using Jira, Confluence, Smartsheet, or similar tools
- Excellent communication, stakeholder management, and problem-solving skills
- Bachelor's degree in CS, Information Systems, Engineering, or related field (Master's a plus)
Preferred Qualifications
- PMP, CSM, or Agile certifications
- Experience working in federated data engineering teams
- Familiarity with data product frameworks