"DATA ENGINEER"

  • Cupertino, CA
  • Posted 17 hours ago | Updated 1 hour ago

Overview

  • On Site
  • Accepts corp to corp applications
  • Contract - Independent
  • Contract - W2

Skills

Git
AWS
ETL
Data modeling
Cloud
Azure
Data lake
GCP
CI/CD
Redshift
Snowflake
Data engineering
Airflow
S3
Athena
Orchestration tools
dbt
Data pipelines
ELT
Prefect
BigQuery
Delta Lake
Version control

Job Details

Position: Data Engineer

Location: Remote

Duration: 6-month contract

Note: The client is looking for someone with strong SQL experience; only limited Tableau experience is needed.

Required Skills:

Data Engineering Expertise:

  • Experienced in building and maintaining data pipelines (ETL/ELT)
  • Proficient with orchestration tools (e.g., Airflow, dbt, Prefect)
  • Comfortable working with cloud platforms (e.g., AWS, Google Cloud Platform, Azure) and tools like BigQuery, Redshift, or Snowflake
  • Familiar with data lake and warehouse architecture (e.g., S3 + Athena, Delta Lake)
  • Strong Python skills for data manipulation (e.g., pandas, pyarrow, pyspark)
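
By way of illustration of the pipeline and Python skills listed above, here is a minimal extract-transform-load sketch using pandas and pyarrow. The file paths and column names are hypothetical placeholders, not details from this posting.

```python
# Minimal ETL sketch (illustrative only): extract a CSV, apply light
# transformations, and load the result as Parquet. The paths and the
# "order_date" column are hypothetical placeholders.
import pandas as pd

RAW_PATH = "raw/orders.csv"              # hypothetical source extract
CURATED_PATH = "curated/orders.parquet"  # hypothetical curated target


def run_pipeline(raw_path: str = RAW_PATH, curated_path: str = CURATED_PATH) -> pd.DataFrame:
    # Extract: read the raw file into a DataFrame.
    df = pd.read_csv(raw_path)

    # Transform: normalize column names, parse dates, drop exact duplicates.
    df.columns = [c.strip().lower() for c in df.columns]
    if "order_date" in df.columns:
        df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
    df = df.drop_duplicates()

    # Load: write a columnar file (pyarrow engine) for downstream lake/warehouse use.
    df.to_parquet(curated_path, index=False)
    return df


if __name__ == "__main__":
    run_pipeline()
```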

Data Infrastructure & Management:

  • Expertise in data modeling (star/snowflake schemas, normalization, dimensional modeling)
  • Skilled in maintaining data quality and integrity (data validation, deduplication, anomaly detection); see the illustrative sketch after this list
  • Familiar with version control and CI/CD practices for data workflows (e.g., Git, dbt Cloud)
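
As a rough sketch of the data-quality items above (validation, deduplication, anomaly detection), the example below uses pandas. The column names ("order_id", "amount") and the three-standard-deviation threshold are assumptions for illustration, not client requirements.

```python
# Minimal data-quality sketch (illustrative only): basic validation,
# deduplication, and a naive outlier check on a DataFrame.
import pandas as pd


def quality_checks(df: pd.DataFrame) -> pd.DataFrame:
    report = {}

    # Validation: the key column must be present and non-null.
    report["missing_order_id"] = int(df["order_id"].isna().sum())

    # Deduplication: drop exact duplicate rows and record how many were removed.
    before = len(df)
    df = df.drop_duplicates()
    report["duplicates_removed"] = before - len(df)

    # Anomaly detection: flag amounts more than 3 standard deviations from the mean.
    mean, std = df["amount"].mean(), df["amount"].std()
    df["is_outlier"] = (df["amount"] - mean).abs() > 3 * std
    report["outliers_flagged"] = int(df["is_outlier"].sum())

    print(report)
    return df


if __name__ == "__main__":
    sample = pd.DataFrame(
        {"order_id": [1, 2, 2, None, 4], "amount": [10.0, 12.0, 12.0, 11.0, 900.0]}
    )
    quality_checks(sample)
```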
