Overview
On Site
Up to $120,000
Full Time
Skills
Scrum Lead
ETL
Snowflake
PySpark
Agile
Job Details
Job Title: Scrum Lead
Position Summary:
We are looking for a skilled and delivery-focused Scrum Lead with a strong background in data engineering to lead Agile teams working on modern data platform projects. The ideal candidate brings hands-on experience with tools such as Snowflake, Rocket ETL, and PySpark, and can effectively guide cross-functional teams through Agile delivery cycles. This role requires a balance of team facilitation, stakeholder coordination, and practical understanding of data engineering practices.
Required Skills & Experience:
- 7+ years of experience in data engineering or analytics, with recent experience in a Scrum Master or Agile Lead role.
- Deep understanding of Agile/Scrum principles and frameworks.
- Strong working knowledge of:
  - Snowflake (data warehousing, transformations, optimization)
  - ETL/ELT tools, preferably Rocket ETL, Qlik Replicate, or similar
  - PySpark or other big data processing frameworks
  - CI/CD practices for data pipelines
- Experience with tools like Jira, Confluence, or Azure DevOps.
Nice to Have / Preferred Skills:
- Scrum certifications (CSM, PSM I, SAFe Scrum Master, or equivalent).
- Experience integrating structured and semi-structured data sources, including REST APIs.
- Exposure to YugabyteDB or other distributed SQL databases.
- Familiarity with data governance, data quality frameworks, and normalization practices.
- Basic scripting knowledge in Python or Java to support data platform automation.
Key Responsibilities:
- Facilitate all Scrum ceremonies, including sprint planning, daily stand-ups, retrospectives, backlog grooming, and sprint reviews.
- Act as a servant leader to one or more Agile teams delivering data integration, transformation, and reporting solutions.
- Work closely with Product Owners, Data Engineers, and Analysts to ensure clear backlog priorities and sprint commitments.
- Track team velocity, monitor sprint progress, and remove blockers to enable timely delivery.
- Promote Agile best practices and continuous improvement within the team.
- Ensure effective collaboration between data engineering, BI, and DevOps teams.
- Manage dependencies and risks associated with data pipelines, API integrations, and CI/CD processes.
- Maintain transparency with stakeholders through regular updates, dashboards, and reports.
- Support the implementation of data governance, data modeling, and Snowflake optimization.