Google Cloud Platform Data Engineer/Developer

Overview

Remote
Depends on Experience
Contract - W2
Contract - Independent
Contract - 12 Month(s)

Skills

Google Cloud
Google Cloud Platform
Apache Hadoop
Apache Hive
Apache Spark
Big Data
Cloud Computing
Communication
Data Flow
Data Warehouse
ETL (Extract, Transform, Load)
Cloud Data Fusion
MapReduce
Meta-data Management
Modeling
ODS
Python
Research
SQL
Scripting
Snowflake Schema
Stored Procedures
Streaming
Unix
Workflow
Writing
YAML
Data Lake

Job Details

  • Experience building and optimizing big data data pipelines, architectures and data sets.
  • Experience with data pipeline and workflow management
  • Understanding of Big Data technologies and solutions (Spark, Hadoop, Hive, MapReduce) and multiple scripting and configuration languages (Python, YAML).
  • Understanding of Google Cloud Platform (GCP) technologies in the big data and data warehousing space (BigQuery, Cloud Data Fusion, Dataproc, Dataflow, Data Catalog).
  • Experience writing SQL queries against Snowflake and developing scripts (Unix shell, Python, etc.) to extract, load, and transform data.
  • Hands-on experience with Snowflake utilities and features such as SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, and stored procedures.
  • In-depth understanding of Data Warehouse/ODS and ETL concepts and data modeling principles.
  • Experience supporting and working with cross-functional teams in a dynamic environment.
  • Excellent verbal and written communication skills with the ability to effectively advocate technical solutions to research scientists, engineering teams and business audiences.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.