Job Details
Location: 5 days on-site in Irvine, CA
Duration: Direct-Hire Opportunity
About the Role:
We're looking for a Data Engineer who is equal parts builder and strategist: someone who can architect, optimize, and explain data flows with confidence. In this role, you'll develop scalable data pipelines using Snowflake and Matillion (or comparable tools), contribute to data modeling, and apply best practices across the data lifecycle.
This is your opportunity to turn raw data into trusted, intelligent insights and to be a key contributor in a growing, high-impact analytics function.
What You'll Be Doing
- Collaborate closely with Product Owners, Business Analysts, and Architects to translate user stories into effective and efficient data solutions.
- Design and implement scalable ETL/ELT pipelines using Matillion (or similar tools) and Snowflake, moving data from diverse sources into staging, warehouse, and reporting layers.
- Apply methodological thinking to pipeline development, explaining not just what you're doing, but why it's the right solution for performance, maintainability, and scalability.
- Write optimized SQL queries and stored procedures to transform raw data into trusted assets.
- Contribute to data modeling, helping design both normalized and denormalized structures tailored to business needs.
- Build reusable components for robust and recoverable data pipelines, with logging, error handling, and fault tolerance included.
- Partner with cross-functional stakeholders to communicate complex data flows in simple terms, and provide strategic input on data architecture decisions.
- Improve continuously, refining development patterns, advocating for best practices, and supporting an evolving data engineering roadmap.
What You Bring
- 3+ years of hands-on experience with Snowflake.
- Strong SQL development skills with demonstrated success in ETL/ELT pipeline design.
- Experience working in both OLTP and dimensional (data warehouse) environments.
- Experience using Matillion or similar tools (e.g., dbt, SSIS, Talend, Informatica).
- Strong understanding of data engineering methodologies and best practices, and the ability to explain technical decisions clearly.
- Experience working in Agile development teams and using version control (Bitbucket/GitHub).
- Exceptional communication skills; able to interface with technical and non-technical stakeholders alike.
- Integration experience with APIs and external vendor datasets.
- Python scripting skills for data transformation or orchestration.
- Background in performance tuning and query optimization in large datasets.
- Experience with Microsoft SSIS and familiarity with CI/CD data deployment pipelines.
Who You Are
- You don't just code; you engineer with intent and purpose.
- You have a consultative mindset and love helping others understand how data works under the hood.
- You balance speed with scalability and always keep an eye on long-term maintainability.
- You bring clarity to complexity and aren't afraid to challenge assumptions.
- You're detail-oriented, analytical, and deeply curious about what makes data systems efficient and impactful.