Senior Data Engineer

Overview

Hybrid
Depends on Experience
Accepts corp-to-corp applications
Contract - W2

Skills

GCP
Google Cloud
Scala
Airflow
Dataflow
Spark
Spark Streaming
Python
Kafka
API development
BigQuery

Job Details

Position: Senior Data Engineer

Location: Sunnyvale, CA (local candidates only; relocation will not be considered)

Job Description:

  • Proficiency in managing and manipulating huge datasets on the order of terabytes (TB) is essential.
  • Expertise in big data technologies such as Hadoop, Apache Spark (Scala preferred), Apache Hive, or similar frameworks on the cloud (Google Cloud Platform preferred) to build batch data pipelines, with a strong focus on optimization, SLA adherence, and fault tolerance (a minimal batch-pipeline sketch follows this list).
  • Expertise in building idempotent workflows using orchestrators such as Automic, Airflow, or Luigi (the batch sketch after this list illustrates one such pattern).
  • Expertise in writing SQL to analyze, optimize, and profile data, preferably in BigQuery or Spark SQL (a sample profiling query also follows the list).
  • Strong data modeling skills are necessary for designing a schema that can accommodate the evolution of data sources and facilitate seamless data joins across various datasets.
  • Ability to work directly with stakeholders to understand data requirements and translate them into pipeline development / data solution work.
  • Strong analytical and problem-solving skills are crucial for identifying and resolving issues that may arise during the data integration and schema evolution process.
  • Ability to move at a rapid pace without sacrificing quality, and to start delivering with minimal ramp-up time, will be crucial to success in this initiative.
  • Effective communication and collaboration skills are necessary for working in a team environment and coordinating efforts among the different stakeholders involved in the project.
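
For illustration, here is a minimal sketch of the batch-pipeline and idempotency points above: a daily Spark job in Scala that overwrites only its own date partition, so orchestrator retries and backfills can safely re-run. The bucket, paths, column names, and job name are all hypothetical, not this team's actual pipeline.

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}
import org.apache.spark.sql.functions._

// Daily batch job sketch: the orchestrator (Automic/Airflow) passes the run
// date, and the job overwrites only that date's partition, so retries and
// backfills are idempotent. Buckets, paths, and columns are hypothetical.
object DailyOrdersJob {
  def main(args: Array[String]): Unit = {
    val runDate = args(0) // e.g. "2024-05-01", supplied by the scheduler

    val spark = SparkSession.builder()
      .appName(s"daily-orders-$runDate")
      // Overwrite only the partitions this run writes, not the whole table.
      .config("spark.sql.sources.partitionOverwriteMode", "dynamic")
      .getOrCreate()

    val orders = spark.read.parquet(s"gs://example-bucket/raw/orders/dt=$runDate")

    val daily = orders
      .filter(col("status") === "COMPLETED")
      .groupBy(col("customer_id"))
      .agg(sum("amount").as("total_amount"), count(lit(1)).as("order_count"))
      .withColumn("dt", lit(runDate))

    daily.write
      .mode(SaveMode.Overwrite)
      .partitionBy("dt")
      .parquet("gs://example-bucket/curated/daily_orders")

    spark.stop()
  }
}
```

Re-running the job for the same date replaces the same partition, which is one common way to keep scheduled pipelines idempotent.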
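And a quick profiling query of the kind the SQL bullet describes, written as Spark SQL against the hypothetical table above (the same shape works in BigQuery with minor dialect changes). It assumes a SparkSession named `spark` is in scope, as in spark-shell, and that the table is registered in the catalog.

```scala
// Profile the curated table: volume, key cardinality, null rate, date range.
val profile = spark.sql("""
  SELECT
    COUNT(*)                                              AS row_count,
    COUNT(DISTINCT customer_id)                           AS distinct_customers,
    AVG(CASE WHEN total_amount IS NULL THEN 1 ELSE 0 END) AS null_amount_rate,
    MIN(dt)                                               AS min_dt,
    MAX(dt)                                               AS max_dt
  FROM curated.daily_orders
""")
profile.show(truncate = false)
```
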
Nice to have:

  • Experience building complex near-real-time (NRT) streaming data pipelines using Apache Kafka, Spark Streaming, and Kafka Connect, with a strong focus on stability, scalability, and SLA adherence (a minimal streaming sketch follows this list).
  • Good understanding of REST APIs, with working knowledge of Apache Druid, Redis, Elasticsearch, GraphQL, or similar technologies; understanding of API contracts, building telemetry, stress testing, etc. (see the probe sketch after this list).
  • Experience developing reports/dashboards using Looker or Tableau.
  • Experience in the eCommerce domain.
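
A minimal sketch of the NRT item above: Kafka to Spark Structured Streaming to partitioned Parquet with checkpointing, in Scala. The broker addresses, topic, event schema, and paths are hypothetical, and the job needs the spark-sql-kafka connector on the classpath.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._
import org.apache.spark.sql.streaming.Trigger
import org.apache.spark.sql.types._

// NRT sketch: read JSON order events from Kafka, parse them against an
// explicit schema, and append micro-batches to Parquet. The checkpoint
// location is what lets the query recover exactly where it left off.
object OrdersStream {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("orders-stream").getOrCreate()

    val schema = new StructType()
      .add("order_id", StringType)
      .add("customer_id", StringType)
      .add("amount", DoubleType)
      .add("event_time", TimestampType)

    val parsed = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker1:9092,broker2:9092")
      .option("subscribe", "orders")
      .option("startingOffsets", "latest")
      .load()
      .select(from_json(col("value").cast("string"), schema).as("e"))
      .select("e.*")

    val query = parsed.writeStream
      .format("parquet")
      .option("path", "gs://example-bucket/stream/orders")
      .option("checkpointLocation", "gs://example-bucket/checkpoints/orders")
      .trigger(Trigger.ProcessingTime("1 minute"))
      .start()

    query.awaitTermination()
  }
}
```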
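And a tiny illustration of the API-telemetry point: call a REST endpoint and record status and latency. The endpoint URL is made up, and the sketch uses only the JDK's built-in HTTP client (Java 11+), not any particular service's API.

```scala
import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}

// Probe a (hypothetical) REST endpoint and log basic telemetry: HTTP status,
// round-trip latency, and response size. Run it in a loop to stress-test.
object ApiProbe {
  def main(args: Array[String]): Unit = {
    val client = HttpClient.newHttpClient()
    val request = HttpRequest.newBuilder(
      URI.create("https://api.example.com/v1/orders?limit=10")).GET().build()

    val start     = System.nanoTime()
    val response  = client.send(request, HttpResponse.BodyHandlers.ofString())
    val latencyMs = (System.nanoTime() - start) / 1e6

    println(f"status=${response.statusCode()} latency_ms=$latencyMs%.1f bytes=${response.body().length}")
  }
}
```
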

Tech stack: Google Cloud, HDFS, Spark, Scala, Python (optional), Automic/Airflow, BigQuery, Kafka, APIs, Druid