Overview
Skills
Job Details
Must Have: Scala, Spark, Google Cloud Platform, Python, PySpark, SQL, Data Migration, and Coding.
Google Cloud Platform Experience
* 4+ years of recent Google Cloud Platform experience
* Experience building data pipelines in Google Cloud Platform
* Google Cloud Platform Dataproc, GCS, and BigQuery experience
* 12+ years of hands-on experience developing data warehouse solutions and data products.
* 6+ years of hands-on experience developing a distributed data processing platform with Hadoop, Hive, or Spark, plus Airflow or another workflow orchestration solution.
* 5+ years of hands-on experience modeling and designing schemas for data lakes or RDBMS platforms.
* Experience with programming languages: Python, Java, Scala, etc.
* Experience with scripting languages: Perl, Shell, etc.
* Experience working with, processing, and managing large data sets (multi-TB/PB scale).
* Exposure to test-driven development and automated testing frameworks.
* Background in Scrum/Agile development methodologies.
* Capable of delivering on multiple competing priorities with minimal supervision.
* Excellent verbal and written communication skills.