Role Description
We are seeking a Supply Chain Data Engineer to join our squad. The primary objective of this role is to architect and build the foundational data layer within Google Cloud Platform (GCP). The Contractor will act as the bridge between raw enterprise data (SAP/FUZE) and the Data Scientists building predictive models (e.g., Project Survival Confidence Intervals, BOM Explosion).
3. Scope of Services
The Contractor shall perform the following duties:
Data Ingestion & Integration: Query, extract, and ingest large datasets from SAP S/4HANA (transactional data), SAP IBP (demand/supply plans), and FUZE (project milestones, BOMs) into BigQuery.
Reconcile data discrepancies between "Planned" vs. "Actual" dates and material requirements across systems.
Data Transformation & Wrangling: Clean, normalize, and join disparate datasets to create unified views for:
Current on-hand inventory levels vs. forecasted demand.
Historical project performance (Planned vs. Actual milestones) to support training of ML models.
Bill of Materials (BOM) explosion requirements linked to project schedules.
Schema Development: Develop "shovel-ready" schemas and data tables in Google Cloud Platform that serve as the single source of truth for downstream AI applications.
Pipeline Automation: Utilize SQL and Python to script automated data refresh pipelines, ensuring models have access to near real-time data.
4. Key Deliverables (Expected Outputs)
By the end of the contract term, the Contractor is expected to deliver:
1. Unified NSC Dataset: A curated, clean, and documented BigQuery dataset combining SAP IBP, S4, and FUZE data, specifically enabling the "Project Survival" and "Milestone Forecaster" use cases.
2. Automated ETL/ELT Pipelines: Functional code (committed to GitLab) that automates the extraction and transformation of the above data on a set cadence.
3. Data Dictionary: Comprehensive documentation defining field lineage (e.g., mapping a FUZE project ID to an SAP WBS element) to ensure reproducibility.
4. Gap Analysis Report: A technical summary identifying missing data elements or data quality issues within FUZE or S4 that may hinder future MVP deployment.
5. Required Qualifications
Data Engineering: 3+ years of experience with SQL (Advanced), Python, and cloud data warehouses (specifically BigQuery on Google Cloud Platform).
Supply Chain Domain: Proven experience working with supply chain data (Inventory Management, Demand Planning, Logistics).
SAP Expertise: Functional knowledge of SAP S/4HANA backend table structures and SAP IBP (Integrated Business Planning) modules.
DevOps: Experience with version control (GitLab) and CI/CD basics.