Skills
- Python
- PySpark
- Scala
- Spark
- GCP
- Airflow DAG
- Hive
- Hadoop
- MapReduce
- BigQuery
Job Description
Sr. Big Data Engineer
Need extensive experience with Python, PySpark (heavily used), Scala, Spark, Google Cloud Platform, Airflow DAGs, Hive, data pipeline technologies, and any NoSQL database.
Nice to have: Hadoop, MapReduce, BigQuery, COSMOS
Designs, develops, and implements Hadoop-ecosystem applications to support business requirements.
Follows approved life cycle methodologies, creates design documents, and performs program coding and testing.
Resolves technical issues through debugging, research, and investigation.
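As a toy illustration of the MapReduce model listed in the skills above, here is a word-count sketch in plain Python (standard library only, not actual Hadoop or Spark); the document list and function names are invented for the example:

```python
from collections import Counter
from itertools import chain

# Toy illustration of the MapReduce pattern in plain Python (no cluster).
# Map phase: each "document" is split into (word, 1) pairs.
# Reduce phase: counts are summed per word key.

def map_phase(doc):
    """Emit a (word, 1) pair for every word in one document."""
    return [(word.lower(), 1) for word in doc.split()]

def reduce_phase(pairs):
    """Sum counts per key across all mapper outputs."""
    counts = Counter()
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["spark and hive", "spark on hadoop"]
pairs = chain.from_iterable(map_phase(d) for d in docs)
word_counts = reduce_phase(pairs)
# word_counts == {'spark': 2, 'and': 1, 'hive': 1, 'on': 1, 'hadoop': 1}
```

In a real Hadoop or PySpark job the map and reduce steps run in parallel across partitions, with a shuffle grouping pairs by key between the two phases; the control flow, however, mirrors this sketch.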