Job Details
Title: Data Engineer
Start Date: ASAP
Duration: 12+ months
Location: Cupertino (Hybrid) **Local candidates only**
Work Hours: 8 AM to 5 PM PST
Pay Rate: $60/hr (W2)
We are seeking a talented and motivated Data Engineer with 2-5 years of experience to join our growing data team. In this role, you will be instrumental in building and maintaining robust, scalable, and efficient data pipelines that power our business intelligence, analytics, and machine learning initiatives. You will be responsible for designing, developing, and deploying data transformations, orchestrating workflows, and ensuring data quality across our cloud-based data platform. You will work closely with data scientists, analysts, and Finance stakeholders to deliver impactful data solutions.
Responsibilities
- Design, develop, and maintain data pipelines using Python, dbt (Data Build Tool), and Apache Airflow (a minimal sketch of this stack follows this list).
- Develop and optimize data transformations using SQL within Snowflake.
- Build and manage data ingestion processes from various sources into Snowflake.
- Implement and maintain data quality checks and monitoring solutions.
- Collaborate with data scientists and analysts to understand data requirements and translate them into technical solutions.
- Contribute to the development and maintenance of our data infrastructure on Google Cloud Platform (GCP).
- Troubleshoot and resolve data pipeline issues in a timely manner.
- Document data pipeline architecture and processes.
- Participate in code reviews and contribute to best practices for data engineering.
- Stay up to date with the latest technologies and trends in data engineering.
- Contribute to the overall data strategy and architecture of the organization.
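As a concrete illustration of the stack above, here is a minimal sketch of the pipeline pattern these responsibilities describe: an Airflow DAG that runs dbt transformations and then a simple data-quality check against Snowflake. The connection credentials, dbt project path, and table name are hypothetical placeholders, not details from this role.

```python
# Minimal sketch: Airflow orchestrating dbt + a Snowflake row-count check.
# All names below (connection details, paths, tables) are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def check_row_count(**context):
    """Fail the run if the target table is unexpectedly empty (hypothetical check)."""
    import snowflake.connector  # assumes snowflake-connector-python is installed

    conn = snowflake.connector.connect(
        user="ETL_USER",        # placeholder credentials; use a secrets backend in practice
        password="...",
        account="my_account",
        warehouse="TRANSFORM_WH",
        database="ANALYTICS",
    )
    try:
        cur = conn.cursor()
        cur.execute("SELECT COUNT(*) FROM ANALYTICS.MARTS.FCT_ORDERS")  # hypothetical table
        (count,) = cur.fetchone()
        if count == 0:
            raise ValueError("Data quality check failed: FCT_ORDERS is empty")
    finally:
        conn.close()


with DAG(
    dag_id="daily_dbt_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # Run dbt transformations; assumes the dbt CLI and project live on the worker.
    run_dbt = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/analytics && dbt run",
    )

    quality_check = PythonOperator(
        task_id="row_count_check",
        python_callable=check_row_count,
    )

    run_dbt >> quality_check
```

In practice, credentials would come from an Airflow connection or a secrets backend rather than being inlined as they are here.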
Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 2-5 years of experience in data engineering or a related role.
- Strong proficiency in Python programming for data engineering tasks.
- Extensive experience with dbt for data transformation and modeling.
- Solid experience with Apache Airflow for workflow orchestration and scheduling.
- Deep understanding of SQL and experience working with cloud-based data warehouses, specifically Snowflake.
- Experience with Google Cloud Platform (GCP) services such as BigQuery, Cloud Storage, and Cloud Functions (see the ingestion sketch at the end of this section).
- Experience with version control systems (e.g., Git).
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration skills.
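As a concrete illustration of the GCP experience listed above, here is a hedged sketch of one common ingestion pattern: loading a file from Cloud Storage into BigQuery with the google-cloud-bigquery client. The project, bucket, and table names are hypothetical placeholders.

```python
# Hedged sketch of a Cloud Storage -> BigQuery load.
# Bucket, file, project, and table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()  # assumes application-default credentials are configured

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,  # skip the header row
    autodetect=True,      # let BigQuery infer the schema for this sketch
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/exports/orders.csv",  # hypothetical source file
    "my_project.staging.orders",               # hypothetical destination table
    job_config=job_config,
)
load_job.result()  # block until the load job completes, raising on failure

table = client.get_table("my_project.staging.orders")
print(f"Loaded {table.num_rows} rows into staging.orders")
```

A production version would pin an explicit schema rather than relying on autodetect.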