Data Engineer II

Overview

On Site
USD 63.00 - 68.00 per hour
Full Time

Skills

Streaming
Flat File
Database
Amazon RDS
Amazon S3
Performance Tuning
Agile
User Stories
Quality Assurance
Data Engineering
Extract, Transform, Load (ETL)
Training
Computer Science
Information Technology
Communication
Collaboration
Orchestration
Git
Python
SQL
Snowflake
ELT
Microsoft Azure
DevOps
Continuous Integration
Continuous Delivery
Amazon Web Services
BMC Control-M

Job Details

Location: Newport Beach, CA
Salary: $63.00 - $68.00 USD Hourly
Description: Our client is currently seeking a Data Engineer II.

About the Role

As a Data Engineer II, you will collaborate with technology and business stakeholders to design and implement scalable, reliable data pipelines and solutions. You'll play a key role in building data hubs and marts, transforming data from diverse sources, and centralizing it on Snowflake. This role requires hands-on experience with modern data engineering tools and practices, including DBT, SQL, Git, and Python.
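
For illustration only, the sketch below shows the kind of pipeline work this role involves: landing a flat file from S3 into Snowflake and running a simple ELT-style transformation in SQL from Python. All bucket, table, warehouse, and credential names are hypothetical placeholders, not details of this position or its codebase.

```python
# Minimal, illustrative sketch only: load a flat file from S3 into Snowflake,
# then run a simple ELT transformation. Names below are hypothetical placeholders.
import boto3
import snowflake.connector


def load_and_transform():
    # Pull a raw extract from S3 (hypothetical bucket and key).
    s3 = boto3.client("s3")
    s3.download_file("example-raw-bucket", "exports/orders.csv", "/tmp/orders.csv")

    # Connect to Snowflake; a real pipeline would read credentials from a
    # secrets manager rather than hardcoding placeholders like these.
    conn = snowflake.connector.connect(
        account="example_account",
        user="example_user",
        password="example_password",
        warehouse="LOAD_WH",
        database="RAW",
        schema="ORDERS",
    )
    try:
        cur = conn.cursor()
        # Stage the file to the table stage and copy it into a raw table.
        cur.execute("PUT file:///tmp/orders.csv @%ORDERS_RAW OVERWRITE = TRUE")
        cur.execute("COPY INTO ORDERS_RAW FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)")
        # A simple ELT step: build a deduplicated mart table from the raw load.
        cur.execute("""
            CREATE OR REPLACE TABLE ANALYTICS.ORDERS_CLEAN AS
            SELECT DISTINCT order_id, customer_id, order_ts, amount
            FROM ORDERS_RAW
        """)
    finally:
        conn.close()


if __name__ == "__main__":
    load_and_transform()
```

In practice this work would be orchestrated and modeled with tools such as DBT or Matillion rather than a standalone script; the sketch is only meant to show the shape of the ingest-and-transform pattern.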
Responsibilities
  • Partner with stakeholders to gather and understand data requirements.
  • Design and build scalable batch and streaming data pipelines.
  • Ingest and transform data from sources like flat files, SQL databases, AWS RDS, and S3.
  • Centralize data into Snowflake using ELT tools such as DBT and Matillion.
  • Write optimized, complex SQL queries and ensure performance tuning.
  • Develop unit and integration tests; implement CI/CD pipelines (a brief test sketch follows this list).
  • Participate in code reviews and enforce best practices.
  • Monitor and maintain production systems.
  • Contribute to Agile ceremonies and maintain user stories in the backlog.
  • Collaborate cross-functionally with product owners, analysts, architects, QA, and other engineers.
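
To illustrate the testing responsibility above, here is a small, hypothetical example of a unit test for a simple deduplication step, written with pandas and runnable under pytest in a CI pipeline. The function and column names are illustrative only.

```python
# Illustrative sketch only: a unit test for a small transformation step of the
# kind that would run in CI. Function and column names are hypothetical.
import pandas as pd


def deduplicate_orders(df: pd.DataFrame) -> pd.DataFrame:
    """Keep the latest record per order_id, a common cleanup step before loading a mart."""
    return (
        df.sort_values("order_ts")
          .drop_duplicates(subset="order_id", keep="last")
          .reset_index(drop=True)
    )


def test_deduplicate_orders_keeps_latest_record():
    raw = pd.DataFrame({
        "order_id": [1, 1, 2],
        "order_ts": ["2024-01-01", "2024-01-02", "2024-01-01"],
        "amount": [10.0, 12.0, 5.0],
    })
    clean = deduplicate_orders(raw)
    assert len(clean) == 2
    assert clean.loc[clean["order_id"] == 1, "amount"].item() == 12.0
```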
Minimum Qualifications
  • 5+ years of experience in data engineering.
  • 2+ years of hands-on ETL development using DBT.
  • 5+ years of experience with SQL and Snowflake.
  • 1+ year of hands-on experience with Git and Python (beyond training or POCs).
  • Bachelor's degree in Computer Science, Information Technology, Engineering, or related field.
  • Strong communication and collaboration skills.
Preferred Qualifications
  • Experience with Matillion for ELT development.
  • Proficiency in coding complex transformations in DBT.
  • Experience with Azure DevOps, including CI/CD pipelines.
  • Familiarity with AWS services and Control-M for job orchestration.
Technical Skills
Must-Have
  • Git & Python (1+ year hands-on)
  • DBT (2+ years hands-on)
  • SQL & Snowflake (2+ years hands-on)
Nice-to-Have
  • Complex DBT transformation logic
  • Matillion ELT development
  • Azure DevOps CI/CD
  • AWS & Control-M experience


By providing your phone number, you consent to: (1) receive automated text messages and calls from the Judge Group, Inc. and its affiliates (collectively "Judge") to such phone number regarding job opportunities, your job application, and for other related purposes. Message & data rates apply and message frequency may vary. Consistent with Judge's Privacy Policy, information obtained from your consent will not be shared with third parties for marketing/promotional purposes. Reply STOP to opt out of receiving telephone calls and text messages from Judge and HELP for help.

Contact:

This job and many more are available through The Judge Group. Please apply with us today!
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.

About Judge Group, Inc.