Overview
On Site
130k - 160k
Full Time
Skills
Inventory
Pricing
Real-time
Market Intelligence
Data Architecture
Customer Facing
Data Engineering
Software Engineering
Python
SQL
Cloud Computing
Data Warehouse
IaaS
Amazon S3
Amazon EC2
Communication
Workflow
Orchestration
Terraform
Docker
DevOps
Point Of Sale
Extract
Transform
Load
ELT
Data Modeling
Optimization
Debugging
Retail
Snowflake Schema
Reporting
Data Flow
Data Quality
Collaboration
Analytics
Amazon Web Services
Documentation
Professional Development
Job Details
An innovative data and insights organization operating within fast-evolving, highly regulated consumer markets is seeking a Data Engineer to join its growing platform team. This company builds large-scale data pipelines, ingesting millions of daily signals across retail, product inventory, pricing, and promotional activity to power real-time market intelligence products.
This role sits at the intersection of data architecture, pipeline design, and analytics engineering, focused on transforming fragmented, multi-source datasets into clean, reliable systems that directly drive customer-facing insights. You'll be joining a team that values curiosity, end-to-end ownership, and thoughtful engineering in an industry where accuracy, timeliness, and transparency are essential.
This is a full-time, hybrid role based in the Chicago Loop, offering significant opportunities to shape the underlying data ecosystem behind a market-leading analytics platform.
Required Skills & Experience
- 4-6 years of experience in data engineering or software engineering with a strong foundation in modern data technologies.
- Proficiency in Python and SQL, including experience building and scaling production-grade pipelines.
- Experience with dbt, Snowflake, or similar cloud-based data warehouses.
- Solid understanding of cloud infrastructure, preferably AWS (S3, EC2, Lambda).
- Experience working with large, complex datasets across multiple data sources.
- Ability to diagnose pipeline issues, analyze anomalies, and enforce data quality and lineage.
- Strong communication skills and a collaborative approach to working with cross-functional teams.
- Familiarity with workflow orchestration tools such as Prefect.
- Experience with Terraform, Docker, or other IaC/DevOps tools.
- Background with retail, point-of-sale, or other high-volume marketplace data.
- Exposure to scraping, ingestion frameworks, or high-throughput ETL/ELT pipelines.
- Interest in building internal tooling that improves experimentation, observability, and governance.
- Ability to thrive in fast-moving, product-driven environments.
60% pipeline development, data modeling, architecture, and system optimization
40% analysis, debugging, cross-functional collaboration, and tooling
Daily Responsibilities:
- Build and enhance scalable pipelines that aggregate data from diverse retail, market, and product sources.
- Design and maintain robust data models using dbt and Snowflake to support analytics, reporting, and product-led insights.
- Investigate data flows to identify inconsistencies, quality issues, or architectural gaps, then implement improvements.
- Develop tooling that increases visibility into pipeline health, data quality, and operational metrics.
- Collaborate with engineering, product, and analytics stakeholders to translate business needs into technical solutions.
- Evaluate new architectural patterns, AWS services, and ingestion strategies to improve efficiency and scale.
- Maintain strong documentation, lineage tracking, and monitoring frameworks.
Benefits:
- Medical, dental, and vision coverage options
- Competitive salary
- Flexible work hours with a hybrid schedule in the Chicago Loop
- Opportunities for professional development and continued technical growth
#LI-OP
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.