Data Analytics Engineer / Snowflake

Overview

On Site
100k - 170k
Full Time

Skills

FOCUS
Workflow
Reporting
Computer Science
Information Systems
Data Engineering
Data Analysis
Cloud Computing
Data Warehouse
SQL
Modeling
ELT
Version Control
Git
Problem Solving
Conflict Resolution
Communication
Logistics
Supply Chain Management
Python
Scripting
Data Governance
Tableau
Microsoft Power BI
Extract, Transform, Load (ETL)
Data Modeling
Snowflake Schema
Optimization
Business Intelligence
Analytics
Data Quality
Documentation

Job Details

Data Analytics Engineer - Logistics Industry

A logistics and supply chain technology firm is seeking a Data Analytics Engineer to design and maintain scalable data solutions that drive operational insights across their organization. This role will focus on building reliable data pipelines, integrating complex datasets, and enabling business intelligence through Snowflake and modern ETL workflows.

The ideal candidate will have a background in analytics engineering, strong SQL skills, and hands-on experience with cloud data platforms. You'll work cross-functionally with data analysts, engineers, and business stakeholders to ensure timely, accurate, and actionable reporting across multiple logistics functions.
Required Skills & Experience:
  • Bachelor's degree in Computer Science, Information Systems, Data Engineering, or a related field
  • 3-5 years of experience in data analytics engineering or a related role
  • Strong experience with Snowflake and cloud-based data warehousing
  • Proficiency in SQL for data transformation, modeling, and analysis
  • Hands-on experience designing and deploying ETL/ELT pipelines using tools like dbt, Fivetran, Airflow, or Matillion
  • Experience working with large, complex datasets in both structured and unstructured formats
  • Familiarity with version control and collaborative development practices (Git)
  • Strong problem-solving and communication skills
Desired Skills & Experience:
  • Experience in the logistics, supply chain, or transportation industries
  • Exposure to Python for scripting or lightweight data tasks
  • Familiarity with data governance, quality monitoring, and lineage tools
  • Experience working with BI tools like Looker, Tableau, or Power BI
What You Will Be Doing:
Tech Breakdown:
  • 60% ETL Pipeline Development and Maintenance
  • 25% Data Modeling and Snowflake Optimization
  • 15% Business Intelligence Integration

Daily Responsibilities:
  • 50% Building and Enhancing Data Pipelines
  • 30% Collaborating with Analytics and Operations Teams
  • 20% Maintaining Data Quality and Documentation
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.

About Motion Recruitment Partners, LLC