Data Engineer

  • Irvine, CA
  • Posted 1 day ago | Updated 7 hours ago

Overview

On Site
$50 per hour
Contract - W2
Contract - 6+ month(s)

Skills

Data Engineer

Job Details

A recognized property management company in California is actively seeking a new Data Engineer to join their growing team. In this role, the Data Engineer will support the design, automation, and operation of modern data infrastructure.


Responsibilities:



  • Design, build, and maintain scalable and resilient CI/CD pipelines for data applications and infrastructure, with a strong focus on Snowflake, dbt, and modern data tooling

  • Implement and manage dbt projects on Snowflake, including developing dbt models, tests, and documentation, and integrating dbt workflows into CI/CD pipelines

  • Develop and manage Infrastructure as Code (IaC) using Terraform to provision and configure data infrastructure on Google Cloud Platform (GCP)

  • Automate the deployment, monitoring, and management of Snowflake data warehouse environments, ensuring optimal performance, security, reliability, and cost efficiency

  • Collaborate with data engineers and data scientists to understand requirements and deliver automated solutions for data ingestion, transformation, and delivery

  • Implement and maintain monitoring, logging, and alerting for data pipelines and infrastructure to ensure high availability and proactive issue resolution

  • Develop and maintain automation scripts and tooling using Python (primary) and Bash for operational tasks (see the sketch after this list)

  • Apply and maintain security best practices across data infrastructure, pipelines, and CI/CD processes

  • Troubleshoot and resolve issues related to data pipelines, infrastructure, and deployments

  • Participate in code reviews for Terraform, dbt models, and automation scripts

  • Create and maintain clear technical documentation for architectures, configurations, and operational processes

  • Perform other duties as needed
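
For illustration only, an operational script of the kind described in the Python/Bash bullet above might look like the minimal sketch below. The connection settings, environment-variable names, and the choice to suspend running warehouses are assumptions made for the example (the snowflake-connector-python package is assumed to be installed); they are not details of the employer's environment.

"""Illustrative sketch: suspend any Snowflake warehouse that is still running.

All connection details come from placeholder environment variables; real
tooling would also check recent query activity before suspending anything.
"""
import os

import snowflake.connector


def main() -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],   # placeholder variable names
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
    )
    try:
        cur = conn.cursor(snowflake.connector.DictCursor)
        cur.execute("SHOW WAREHOUSES")
        for wh in cur.fetchall():
            # SHOW WAREHOUSES reports each warehouse's state as STARTED or SUSPENDED.
            if wh["state"] == "STARTED":
                print(f"Suspending warehouse {wh['name']}")
                cur.execute(f"ALTER WAREHOUSE {wh['name']} SUSPEND")
    finally:
        conn.close()


if __name__ == "__main__":
    main()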


Qualifications:



  • 5+ years of experience in Data Engineering, Analytics Engineering, or DevOps for data platforms

  • Strong hands-on experience with Snowflake in production environments

  • Proven expertise with dbt, including model development, testing, and documentation

  • Experience building and maintaining CI/CD pipelines for data and cloud infrastructure

  • Hands-on experience with Terraform and Infrastructure as Code practices

  • Experience working with Google Cloud Platform (GCP) for data storage, processing, and analytics

  • Strong programming skills in Python and working knowledge of Bash/shell scripting

  • Experience implementing monitoring, logging, and alerting for data systems (see the sketch after this list)

  • Solid understanding of data pipeline architectures and modern data stack concepts

  • Strong troubleshooting and problem-solving skills

  • Ability to collaborate effectively with cross-functional technical teams
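
As a companion to the monitoring, logging, and alerting requirement above, a scheduled freshness check is one common shape such automation takes. The sketch below is illustrative only; the table name, lag threshold, and the hard-coded timestamp standing in for pipeline metadata are hypothetical.

"""Illustrative sketch: a scheduled freshness check for a data pipeline.

The table name, lag threshold, and the hard-coded timestamp standing in for
pipeline metadata are hypothetical; a real check would read the last-load time
from the warehouse or the orchestrator and route alerts to the paging system.
"""
import logging
from datetime import datetime, timedelta, timezone

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline_freshness")


def check_freshness(table: str, last_loaded_at: datetime, max_lag: timedelta) -> bool:
    """Return True if the table is fresh; emit an alert-worthy error log otherwise."""
    lag = datetime.now(timezone.utc) - last_loaded_at
    if lag > max_lag:
        # In a real setup this error would feed a log-based alert or pager integration.
        log.error("%s is stale: last load %s ago exceeds threshold %s", table, lag, max_lag)
        return False
    log.info("%s is fresh: last load %s ago", table, lag)
    return True


if __name__ == "__main__":
    check_freshness(
        "analytics.orders",                                        # hypothetical table
        last_loaded_at=datetime.now(timezone.utc) - timedelta(hours=3),
        max_lag=timedelta(hours=2),
    )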


Desired Skills:



  • Experience with data governance and data quality frameworks

  • Familiarity with cost optimization strategies in Snowflake and cloud environments (see the sketch after this list)

  • Experience supporting high-availability or mission-critical data platforms

  • Prior contract or consulting experience in fast-paced environments
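
Regarding the cost optimization item above, a recurring credit-usage report is one typical starting point. The sketch below is illustrative only; it assumes snowflake-connector-python, access to the SNOWFLAKE.ACCOUNT_USAGE share, and the same placeholder environment variables as the earlier sketch.

"""Illustrative sketch: report Snowflake credit usage by warehouse for the last 7 days.

Assumes snowflake-connector-python, access to the SNOWFLAKE.ACCOUNT_USAGE share,
and the same placeholder environment variables as the earlier sketch.
"""
import os

import snowflake.connector

QUERY = """
    SELECT warehouse_name, SUM(credits_used) AS credits
    FROM snowflake.account_usage.warehouse_metering_history
    WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
    GROUP BY warehouse_name
    ORDER BY credits DESC
"""


def main() -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],  # placeholder variable names
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
    )
    try:
        # Each row is (warehouse_name, total_credits) for the trailing week.
        for name, credits in conn.cursor().execute(QUERY):
            print(f"{name}: {credits:.1f} credits in the last 7 days")
    finally:
        conn.close()


if __name__ == "__main__":
    main()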




Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.