ELT / EDI Lead Engineer (Supply Chain Data Platform, AI-Assisted) - Contract
Remote
Role Overview

We are seeking an experienced ELT / EDI Lead Engineer to lead the design and implementation of data pipelines for a supply chain transaction analytics platform. The role focuses on ingesting, normalizing, and modeling EDI-driven transaction data (purchase orders, invoices, shipment notices, acknowledgements) to enable reliable reporting and operational insights. You will play a critical role in defining how data from multiple trading partners and systems is standardized and interpreted, ensuring consistency in transaction lifecycle tracking, error handling, and KPI reporting. This is a hands-on leadership role: you will guide data engineers while actively contributing to pipeline development and data modeling.
Key Responsibilities

EDI & Supply Chain Data Ownership
- Lead ingestion and normalization of EDI/X12 transaction data (850, 855, 856, 810, 997) across multiple source systems
- Define consistent interpretation of transaction lifecycle states (received, processed, failed, delayed, acknowledged, etc.)
- Standardize data across trading partners with varying schemas and formats
- Work closely with business stakeholders to define supply chain KPIs (transaction success rates, processing delays, error patterns)
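For context on the transaction sets listed above: X12 payloads are delimiter-separated text, conventionally with "~" terminating segments and "*" separating elements (real interchanges declare their delimiters in the ISA envelope). A minimal, illustrative Python sketch using a hypothetical 850 (purchase order) fragment:

```python
# Hypothetical 850 fragment; delimiters are hard-coded here for illustration
# only. Production code would read them from the ISA header and use a
# dedicated EDI parser rather than naive string splitting.
raw_850 = "ST*850*0001~BEG*00*SA*PO12345**20240105~PO1*1*10*EA*9.99~SE*4*0001~"

segments = [s for s in raw_850.split("~") if s]   # "~" = segment terminator
parsed = [seg.split("*") for seg in segments]     # "*" = element separator

for elements in parsed:
    # First element is the segment ID (ST, BEG, PO1, SE, ...)
    print(elements[0], elements[1:])
```

This is only a sketch of the payload shape; normalization work in the role would additionally map segment elements onto canonical transaction fields per partner.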
ELT Pipeline & Data Modeling
- Design and implement ELT pipelines using Snowflake, Azure data services, or similar platforms
- Define and enforce Bronze (raw), Silver (cleaned), and Gold (analytics-ready) data layers
- Develop transformation logic for structured and semi-structured data (XML, JSON, EDI payloads)
- Ensure pipelines are scalable, reliable, and optimized for performance
- Guide data engineers on best practices for pipeline development and data modeling

AI-Assisted Development & Optimization
- Use AI tools such as Cursor and GitHub Copilot to accelerate SQL development, transformation logic, and pipeline design
- Leverage LLM-based tools to analyze EDI schemas, summarize structural differences, and assist in normalization design
- Use Snowflake Cortex (where applicable) for query optimization, classification (e.g., error grouping), and performance improvements
- Apply AI tools to rapidly prototype transformation logic, then refine through manual validation
- Ensure all AI-generated output is thoroughly reviewed for correctness, especially in business-critical transaction logic
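The Bronze-to-Silver responsibility above largely amounts to mapping partner-specific raw payloads onto one canonical schema. A minimal Python sketch, in which the partner names, field names, and mappings are all hypothetical:

```python
import json

# Hypothetical example: two trading partners send the same purchase-order
# fields under different names; the Silver layer maps both onto one
# canonical schema. Real mappings would be far larger and data-driven.
FIELD_MAP = {
    "partner_a": {"poNum": "po_number", "qty": "quantity"},
    "partner_b": {"PO_NUMBER": "po_number", "order_qty": "quantity"},
}

def to_silver(partner: str, bronze_record: str) -> dict:
    """Normalize one raw (Bronze) JSON record into the canonical Silver shape."""
    raw = json.loads(bronze_record)
    mapping = FIELD_MAP[partner]
    return {canonical: raw[src] for src, canonical in mapping.items()}

print(to_silver("partner_a", '{"poNum": "PO1", "qty": 5}'))
print(to_silver("partner_b", '{"PO_NUMBER": "PO2", "order_qty": 7}'))
```

In a Snowflake implementation the same idea would typically be expressed in SQL over VARIANT columns rather than in application code; the sketch only shows the normalization concept.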
Data Quality, Semantics & Governance
- Define data validation rules for transaction completeness, accuracy, and consistency
- Ensure alignment between source data and reporting outputs through reconciliation logic
- Establish semantic consistency across KPIs and reporting layers
- Identify and resolve issues related to schema inconsistencies, missing data, and transformation errors
Leadership & Collaboration
- Provide hands-on technical leadership to a team of data engineers
- Review code, transformation logic, and pipeline implementations
- Collaborate with QA, BI, and DevOps teams to ensure end-to-end data quality and delivery
- Act as a key technical point of contact for stakeholders and delivery leadership
Required Skills & Experience

Core Technical Skills
- 8+ years of experience in data engineering, with a strong focus on ELT/ETL pipelines
- Deep expertise in EDI/X12 transactions (850, 855, 856, 810, 997) and transaction lifecycles
- Strong hands-on experience with Snowflake, Azure SQL, or similar data warehouse platforms
- Advanced SQL skills for complex transformations and performance optimization
- Experience working with semi-structured data (XML, JSON, EDI formats)
- Strong understanding of data modeling and medallion architecture (Bronze/Silver/Gold)
AI-Assisted Engineering (Mandatory)
- Hands-on experience using AI tools such as:
  - Cursor (SQL and transformation development)
  - GitHub Copilot (code generation and optimization)
  - LLM-based tools (schema analysis, documentation, and design support)
- Ability to use AI tools for:
  - Accelerating SQL development and transformation logic
  - Analyzing complex EDI schemas and identifying patterns
  - Generating and refining pipeline logic
- Strong ability to validate and refine AI-generated outputs, especially for business-critical transaction data
- Experience working in environments where AI is used to improve productivity while maintaining strict quality standards