Data Engineer JD

Cupertino, CA, US • Posted 11 hours ago • Updated 11 hours ago
Contract Independent
Contract W2
No Travel Required
On-site
Depends on Experience

Job Details

Skills

  • Java
  • Python
  • Amazon Web Services
  • Continuous Integration/Delivery
  • Governance
  • Data Modeling
  • SQL
  • Metrics
  • ETL
  • Scala
  • Use Cases
  • Database Modeling
  • Kafka
  • Real-Time
  • GCP
  • Data Governance
  • Data Integrity
  • Tableau Software
  • Data Pipelines
  • Transactional Data
  • Payments
  • Dimensional Data
  • Online Analytical Processing (OLAP)

Summary

Role: Data Engineer 

Location: Cupertino, CA / Austin, TX

 

The Data Foundations Engineer designs and scales modern data architectures powering Wallet, Payments, and Commerce products. This role focuses on building high-performance data pipelines and enabling analytics and ML use cases, with strong fundamentals in data modeling and scalable systems.

KEY RESPONSIBILITIES

Data Engineering & Architecture
  • Design and implement scalable batch and near-real-time data pipelines.
  • Develop ETL/ELT workflows optimized for performance and cost.
  • Implement dimensional data models and standardize business metrics.
  • Instrument APIs and user journeys to capture behavioral and transactional data.

Data Governance & Quality
  • Ensure data integrity, governance, privacy, and compliance.
  • Maintain reliability and availability of mission-critical systems.

REQUIRED QUALIFICATIONS
  • 6+ years of experience in data engineering for analytics or ML systems.
  • Strong SQL proficiency.
  • Experience in Python, Scala, or Java.
  • Hands-on experience with Spark, Kafka, and Airflow (or similar).
  • Strong understanding of data modeling and lakehouse architectures (e.g., Iceberg).
  • Experience with AWS, Azure, or Google Cloud Platform.
  • Comfortable participating in a rotating on-call.
  • Experience with Snowflake, Databricks, Trino, OLAP/NRT systems, and Superset or Tableau.
  • Familiarity with CI/CD, data observability, and infrastructure-as-code.
  • Exposure to MLOps and GenAI/RAG pipelines.
  • Hands-on experience with LLMs (prompt engineering, fine-tuning, RAG).
  • Experience in the FinTech, Wallet, or Payments domain.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: 10448381
  • Position Id: 1353-20107-
  • Posted 11 hours ago
