Overview
On Site
$68.18/hr
Full Time
Contract - Independent
Contract - W2
Contract - 9+ mo(s)
Skills
SNOWFLAKE
PYTHON
SHELL
BASH
UNIX
SQL
DATABASE
TABLEAU
GITHUB
GIT
SPARK
Job Details
Job Title: Data Engineer (SW)
Location: Cupertino, CA
Work schedule: Onsite (hybrid schedule); 40 hours worked per week
Duration: 9 months, with possible extension
Pay range: $58.00/hr. to $68.00/hr. (DOE)
Key Qualifications:
- 2-5 years of experience in data engineering, software engineering, or data analytics roles.
- Proficient in SQL and Python; comfortable with Bash or shell scripting.
- Hands-on experience with modern data tooling:
- Spark for large-scale data processing
- Airflow for workflow orchestration
- Snowflake and DBT for data transformation and modeling
- AWS S3 for data storage and movement
- Docker and Kubernetes for containerization and deployment workflows
- Jupyter Notebooks for collaborative data exploration and documentation
- Familiarity with Git-based CI/CD pipelines and collaborative code development.
- Solid understanding of data warehousing, data modeling, and working with big data ecosystems.
- Foundational knowledge of statistics, including mean, median, standard deviation, and variance.
- Strong problem-solving skills with the ability to break down complex issues into manageable components.
- Committed to good software engineering practices such as testing, documentation, and code quality checks.
- Able to clearly communicate technical concepts to both technical peers and non-technical stakeholders.
- Familiarity with battery systems or electrical engineering is a plus but not required.
Job Description:
As a Data Engineer, you will:
- Design, build, and maintain scalable ELT pipelines using SQL and Python.
- Work across the full data lifecycle from ingestion and transformation to model deployment and reporting.
- Collaborate with data scientists, engineers, and product managers to deliver clean, reliable, and well-documented data.
- Implement and manage workflows using Airflow, while ensuring traceability and version control via GitHub.
- Support transformation logic and data modeling using DBT, with data housed primarily in Snowflake.
- Use Jupyter Notebooks and ad-hoc analysis to support business questions and drive actionable insights.
- Build tools to monitor, validate, and test data pipelines, ensuring high availability and quality.
- Contribute to automation efforts, improving the team's efficiency and reducing manual work.
- Provide occasional support for urgent data reporting needs.
- Engage constructively with both technical and non-technical colleagues to ensure data solutions align with business goals.
Education
MS or Ph.D. in Computer Science, Software Engineering, Statistics, Electrical Engineering, Battery Engineering, or a related technical field.
Certifications in Six Sigma (CSSBB) or Quality Engineering (CQE) are a plus but not required.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.