Lead Data Engineer - Local to California Only - C2C and W2

Fremont, CA, US • Posted 20 hours ago • Updated 20 hours ago
Contract Independent
Contract W2
Travel Required
On-site
$50 - $60/hr

Job Details

Skills

  • ADF
  • Amazon Kinesis
  • Amazon Redshift
  • Amazon S3
  • Amazon Web Services
  • Apache Airflow
  • Apache Hive
  • Apache Kafka
  • Apache Spark
  • Banking
  • SQL Azure
  • PostgreSQL
  • Python
  • ELT
  • Data Architecture
  • Cloud Computing
  • Oracle
  • SQL
  • PySpark
  • Electronic Health Record (EHR)
  • Databricks

Summary

Position: Lead Data Engineer
Location: Fremont, CA (Onsite / Local Candidates Only)
Duration: 12+ Months

Note: Any visa status is acceptable with genuine experience.

Job Description:
Mandatory Skills: Azure Data Factory (ADF), Azure Databricks & PySpark, Azure Synapse, Azure SQL, Python, and Spark SQL
Accomplished Data Architect / Senior Data Engineer with 12+ years of experience designing and modernizing enterprise data platforms across banking, healthcare and retail domains.
Architected scalable ETL/ELT pipelines using Python, PySpark, Databricks, AWS Glue and Azure Data Factory, supporting high-volume transactional and regulatory data processing.
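The extract-transform-load pattern the role centers on can be sketched in a few lines. This is a minimal pure-Python illustration, not the PySpark/ADF implementation the posting describes; the record fields and validation rule are hypothetical.

```python
# Minimal ETL sketch in plain Python (hypothetical data; a production
# pipeline on this stack would use PySpark / Azure Data Factory).

def extract(rows):
    """Extract: yield raw transaction records (stand-in for a source read)."""
    yield from rows

def transform(records):
    """Transform: drop invalid rows and normalize amounts to cents."""
    for r in records:
        if r.get("amount") is None:
            continue  # skip records failing validation
        yield {"id": r["id"], "amount_cents": int(round(r["amount"] * 100))}

def load(records, sink):
    """Load: append cleaned records to the target store (a list here)."""
    for r in records:
        sink.append(r)

raw = [{"id": 1, "amount": 10.5}, {"id": 2, "amount": None}, {"id": 3, "amount": 0.99}]
warehouse = []
load(transform(extract(raw)), warehouse)
# warehouse now holds the two valid records in cleaned form
```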
Led enterprise-scale Data Architecture initiatives, defining logical and physical data models, governance standards and cloud-native platform blueprints across AWS and Azure environments.
Designed and implemented Medallion (Bronze/Silver/Gold) Lakehouse architectures using Delta Lake, S3, ADLS Gen2, Snowflake, Redshift and Synapse Analytics.
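The Medallion layering mentioned above follows a fixed shape: raw Bronze data is cleaned into Silver, then aggregated into business-ready Gold. A toy sketch with in-memory dicts, assuming made-up order records; real layers would be Delta Lake / ADLS tables.

```python
# Sketch of the Medallion (Bronze/Silver/Gold) layering in plain Python.
# Real layers live in Delta Lake / ADLS Gen2; dicts stand in here.

bronze = [  # raw, as-ingested records (may contain duplicates / bad rows)
    {"order_id": "A1", "region": "west", "total": "25.00"},
    {"order_id": "A1", "region": "west", "total": "25.00"},  # duplicate
    {"order_id": "B2", "region": "east", "total": "oops"},   # bad value
    {"order_id": "C3", "region": "east", "total": "40.00"},
]

def to_silver(rows):
    """Silver: deduplicate on the business key and enforce types."""
    seen, out = set(), []
    for r in rows:
        try:
            total = float(r["total"])
        except ValueError:
            continue  # quarantine rows that fail type checks
        if r["order_id"] in seen:
            continue
        seen.add(r["order_id"])
        out.append({**r, "total": total})
    return out

def to_gold(rows):
    """Gold: aggregate to a business-ready mart (revenue per region)."""
    revenue = {}
    for r in rows:
        revenue[r["region"]] = revenue.get(r["region"], 0.0) + r["total"]
    return revenue

silver = to_silver(bronze)
gold = to_gold(silver)
# gold -> {"west": 25.0, "east": 40.0}
```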
Engineered large-scale distributed processing workloads using Apache Spark, PySpark, Databricks, EMR, Hive and HDFS, processing billions of records for enterprise analytics.
Orchestrated complex data workflows using Apache Airflow, Databricks Workflows, AWS Step Functions and Azure Data Factory triggers, ensuring SLA-driven pipeline execution.
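At its core, the orchestration work above means running tasks in dependency order. This toy scheduler topologically sorts a task graph the way Airflow resolves a DAG, using only the standard library; the task names are hypothetical.

```python
# A DAG scheduler in miniature: resolve task dependencies with a
# topological sort (what Airflow / ADF triggers do under the hood).
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

dag = {  # task -> set of tasks it depends on
    "load_warehouse": {"transform"},
    "transform": {"extract_sales", "extract_refunds"},
    "extract_sales": set(),
    "extract_refunds": set(),
}
order = list(TopologicalSorter(dag).static_order())
# every task appears after all of its dependencies
```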
Strong hands-on experience in Advanced SQL, including complex joins, CTEs, window functions, stored procedures, indexing strategies, partitioning and execution plan optimization across Snowflake, PostgreSQL and Oracle.
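The "advanced SQL" items (CTEs, window functions) can be shown runnably with the stdlib `sqlite3` module, which supports both since SQLite 3.25; the table and columns here are made up for illustration.

```python
# CTE + window function example against an in-memory SQLite database.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE txn (account TEXT, ts INTEGER, amount REAL);
    INSERT INTO txn VALUES
        ('acct1', 1, 100.0), ('acct1', 2, 250.0),
        ('acct2', 1,  80.0), ('acct2', 2,  20.0);
""")
# The CTE filters large transactions; RANK() orders them per account.
rows = con.execute("""
    WITH large AS (
        SELECT account, ts, amount FROM txn WHERE amount >= 50
    )
    SELECT account, amount,
           RANK() OVER (PARTITION BY account ORDER BY amount DESC) AS rnk
    FROM large
""").fetchall()
```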
Built real-time streaming architectures using Apache Kafka, AWS Kinesis, Azure Event Hub and Service Bus, supporting fraud detection, claims monitoring and operational telemetry.
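A Kafka or Kinesis broker is not reproducible here, but the consume-and-flag loop a fraud-detection stream job runs can be simulated with a plain deque standing in for the topic; the threshold and field names are hypothetical.

```python
# Simulated streaming consumer: drain a "topic" and route high-value
# events to a fraud-alert sink (a deque stands in for Kafka/Kinesis).
from collections import deque

topic = deque([  # events as a producer would publish them
    {"card": "c1", "amount": 12.0},
    {"card": "c2", "amount": 9400.0},
    {"card": "c1", "amount": 7.5},
])

FRAUD_THRESHOLD = 5000.0  # hypothetical alerting rule
alerts = []

while topic:                  # consumer loop: poll until the topic drains
    event = topic.popleft()
    if event["amount"] > FRAUD_THRESHOLD:
        alerts.append(event)  # route to the fraud-alert sink
# alerts holds the single high-value event
```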

Phone:
E-mail:

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: 90557102
  • Position Id: incorpora_YAS
