Detailed Job Description:
Title: | HR Data Engineer (People Analytics) |
Location: | Burlington ON L7T 4K1 |
Duration: | 02 Months |
Pay rate: | $100.00 - $120.00/HR on W2 without PTO (no paid holiday or sick leave). |
Work authorization: | Citizen / Permanent Resident / EAD |
Shift: | 1st |
Job ID: | IKEAJP00003393 |
On site in the Burlington service office 2 days per week
Job Description:
HR Data Engineer, People Analytics: intermediate SQL/Python, foundational Power Query, and required Power BI Fabric experience, capable of migrating business-critical HR analytics pipelines from legacy Access/VBA environments to modern cloud platforms. Thrives in constrained, transitional settings with sparse documentation. Demonstrates strong communication, disciplined documentation habits, and a commitment to maintainable solutions. Prior experience with HR/people data or other sensitive data handling is essential. Comfortable with ambiguity, pragmatic trade-offs, and close collaborative work during a fast-paced 2-month engagement with extension potential.
Role Overview
Purpose: Temporary, project-focused data engineering role within Location Operational Services (LOS), reporting to LOS Manager. Supports strategic transformation of local people analytics infrastructure following an organizational restructure that disrupted legacy analytics systems.
Duration: 2 months initially, extendable based on migration progress and budget availability.
Key Responsibilities & Day-to-Day Work
Primary Deliverable: Migrate the Mandatory Training Dashboard, which tracks compliance training completion across organizational units and is critical for audit/regulatory compliance. Current pipeline: manual Excel exports from Cornerstone (training data), an organization-controlled Power BI report (employee master data 1), and SuccessFactors (employee master data 2) → Access macros/VBA transformations → Power BI visualization and an Excel VBA email process.
Migration Goals:
Rebuild entire pipeline in Power Query (M language) on Power BI Fabric (gen2 dataflows, lakehouses)
Implement resilient source switching (seamless failover if semi-automated exports fail)
Simplify and document logic for non-technical maintainers
Parallel development (current solution remains operational during migration)
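The "resilient source switching" goal above can be sketched as a failover loader: try the primary semi-automated export first, then fall back to the next source if it is missing, unreadable, or empty. This is a minimal illustration only, not the actual pipeline; loader callables stand in for whatever Power Query or file reads the real solution uses.

```python
import pandas as pd


def load_first_available(loaders):
    """Try each loader in priority order; return the first non-empty frame.

    Each loader is a zero-argument callable (e.g. a lambda wrapping a read
    of a SharePoint-synced Excel export), so the failover logic stays
    independent of the storage details.
    """
    errors = []
    for load in loaders:
        try:
            df = load()
            if not df.empty:
                return df
        except Exception as exc:  # missing file, bad format, empty export
            errors.append(exc)
    raise RuntimeError(f"All sources failed: {errors}")
```

In the real migration the same ordering idea would live in a gen2 dataflow rather than Python, but the contract is identical: downstream steps never see which upstream source happened to succeed.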
Core Tasks:
Reverse-engineer legacy Access macros, VBA, SQL, Python to extract business logic
Design and build Power Query dataflows replicating/improving existing transformations
Implement data quality validation ensuring migrated solution matches legacy outputs
Configure multi-source ingestion with upstream failure handling
Document transformations (data dictionaries, process maps, logic documentation)
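The data quality validation task above (confirming the migrated solution matches legacy outputs) is typically a row-level reconciliation. A minimal sketch, assuming illustrative column names such as `employee_id` and `status` that are not specified in the posting:

```python
import pandas as pd


def reconcile(legacy: pd.DataFrame, migrated: pd.DataFrame, keys) -> pd.DataFrame:
    """Return rows that appear in only one of the two pipeline outputs.

    An outer merge with indicator=True flags each row as 'left_only'
    (legacy only), 'right_only' (migrated only), or 'both'; anything
    not in 'both' is a discrepancy to investigate before cutover.
    """
    diff = legacy.merge(migrated, on=list(keys), how="outer", indicator=True)
    return diff[diff["_merge"] != "both"]
```

An empty result from `reconcile` on the full key set is the sign-off condition for retiring the Access/VBA pipeline.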
Data Sources: SuccessFactors, Cornerstone, Global organization-owned Power BI report (all static Excel exports via SharePoint; no API connectors). Power BI datasets occasionally exported to Excel (fragile).
Must-Have Technical Skills
SQL (Intermediate): Write/optimize joins, aggregations, subqueries, CTEs; reverse-engineer complex queries; translate SQL logic to Power Query.
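As an illustration of the translation skill described above, here is a small SQL aggregation (invented for this sketch) restated as a dataframe pipeline; on the job the target would be Power Query M rather than pandas, but the step-by-step mapping is the same:

```python
import pandas as pd

# SQL being translated (illustrative only):
#   WITH per_unit AS (
#     SELECT unit, COUNT(*) AS completed
#     FROM training
#     WHERE status = 'Completed'
#     GROUP BY unit
#   )
#   SELECT * FROM per_unit WHERE completed >= 2;
def completed_per_unit(training: pd.DataFrame) -> pd.DataFrame:
    """Count completed trainings per unit, keeping units with 2 or more."""
    per_unit = (
        training[training["status"] == "Completed"]   # WHERE clause
        .groupby("unit", as_index=False)              # GROUP BY
        .size()                                       # COUNT(*)
        .rename(columns={"size": "completed"})
    )
    return per_unit[per_unit["completed"] >= 2]       # HAVING-style filter
```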
Python (Intermediate): Read/understand pandas data manipulation scripts; interpret existing Python ETL workflows. Does not need to write production Python.
Power Query (M Language - Foundational): Minimum awareness with willingness to learn quickly under guidance. Strong SQL background enables effective on-the-job learning. Must commit to building exclusively in Power Query (no SQL/Python/Access) for long-term maintainability.
Data Modeling (Strong Dimensional Modeling): Deep star schema understanding; experience with slowly changing dimensions, bridge tables, role-playing dimensions; design models optimized for Power BI/DAX.
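A minimal star-schema sketch of the modeling skill described above, with invented table and column names (the posting does not specify a schema): a training-completion fact table keyed to a small employee dimension, the shape that keeps Power BI/DAX measures simple.

```python
import pandas as pd

# Illustrative dimension table: one row per employee surrogate key.
dim_employee = pd.DataFrame({
    "employee_key": [1, 2],
    "employee_id": ["E01", "E02"],
    "unit": ["Logistics", "Sales"],
})

# Illustrative fact table: one row per employee/course completion status.
fact_completion = pd.DataFrame({
    "employee_key": [1, 1, 2],
    "course_id": ["C10", "C11", "C10"],
    "completed": [1, 1, 0],
})

# With this shape, a measure like completion rate per unit is one
# join plus one aggregate, which is exactly what DAX rewards.
rate_by_unit = (
    fact_completion.merge(dim_employee, on="employee_key")
    .groupby("unit")["completed"]
    .mean()
)
```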
Power BI Fabric (Required): Hands-on experience with Fabric workspaces, gen2 dataflows, and lakehouses. Understands Fabric-specific features and how they differ from gen1 environments. Non-negotiable given the 2-month delivery timeline.
Microsoft Access (Reverse-Engineering): Navigate/interpret Access databases (tables, queries, macros); understand Access design patterns; extract business logic for migration (no need to build new Access solutions).
SharePoint (Data Storage/File-Based ETL): Experience using SharePoint as analytics data layer; comfortable with file-based workflows (Excel export ingestion, file path dependencies, refresh schedules).
Data Quality & Documentation: Strong validation/reconciliation instincts; commitment to clear, accessible documentation for non-technical audiences.