Lead Data Engineers

Overview

On Site
$110,000 - $120,000
Full Time

Skills

Snowflake
dbt
Python
SQL

Job Details

Job Role: Lead Data Engineers

Job location: Phoenix, Arizona

Job Type: Full Time

Job description:

The client is looking for Lead Data Engineers with expertise in Snowflake, Python, and SQL to lead the design, development, and optimization of scalable data pipelines and analytics solutions. The ideal candidate will have a strong background in cloud data platforms, data architecture, and team leadership, with a passion for solving complex data challenges in a collaborative environment. You'll collaborate with Product Owners, Data Engineers, Analysts, and other stakeholders to understand requirements and deliver solutions in an entrepreneurial culture where teamwork is encouraged, excellence is rewarded, and diversity is valued.

Candidate must be located within commuting distance of Phoenix, Arizona or be willing to relocate to the area.

Required Qualifications:

  • Bachelor's degree or foreign equivalent from an accredited institution required. Three years of progressive experience in the specialty will also be considered in lieu of every year of education
  • At least 7 years of Information Technology experience
  • Experience leading and architecting the design of scalable, reliable data pipelines and data models using Snowflake and dbt, following industry and enterprise best practices
  • Experience managing Snowflake features such as Snowpipe, STC, Time Travel, Streams, Tasks, Dynamic Tables, Cloning, and Stages.
  • Strong knowledge of Python/PySpark and SQL.
  • Experience with advanced transformations in dbt and Jinja.

Preferred Qualifications:

  • At least 6 years of experience working in Agile environments
  • At least 5 years of experience implementing robust data governance, security frameworks, role-based access control (RBAC), and data masking policies within the Snowflake environment, ensuring compliance and data protection.
  • Strong foundation in data modeling concepts, including medallion architecture (Bronze/Silver/Gold layers).
  • At least 4 years of experience with PySpark/Scala and Airflow
  • Familiarity with version control systems (GitLab preferred) and CI/CD pipelines for data engineering workflows.
  • Experience with CI/CD, Terraform, Git, and DevOps practices.
  • Solid understanding of data quality principles, testing frameworks, and monitoring strategies.
  • Working knowledge of AWS services relevant to data engineering (S3, Lambda, Glue, etc.).
  • Excellent communication and stakeholder management skills, with the ability to present technical concepts to both technical and non-technical audiences.
  • Comfortable working across distributed global teams in a fast-paced, client-focused environment.
  • SnowPro Advanced Architect certification is a plus.
  • Experience in project development lifecycle activities and maintenance/support.
  • Ability to translate requirements into technical solutions that meet quality standards.
  • Collaboration skills in diverse environments to identify and resolve data issues.
  • Strong communication and client-facing skills; 8+ years of IT experience
  • Strong communication, problem-solving and analytical abilities.
  • Experience in global delivery environments.
  • Commitment to staying current with industry trends in modern data warehousing.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.