Google Cloud Platform Data Engineer

Overview

Remote
Depends on Experience
Contract - Independent
Contract - W2
Contract - 12 Month(s)

Skills

Stream/Batch Processing
Python
Java
GCP and GCP managed services
Pub/Sub
Docker
Kubernetes

Job Details

Streaming/Batch Infrastructure (Google Cloud Platform Data Engineer)

Duration: 6+ months

Remote

Resources will need to be onsite for a client meeting once per quarter, whenever required, at their own expense.

8+ years of professional experience with stream/batch processing systems at scale.

Strong programming skills in Java and Python.

Public cloud experience is a must; experience with Google Cloud Platform and its managed services is a strong plus.

i. Experience with cloud messaging/stream processing systems such as Pub/Sub, Kafka, Kinesis, Dataflow, Flink, etc., and/or

ii. Experience with batch processing systems such as Hadoop, Pig, Hive, and Spark. Experience with Dataproc is a strong plus.

Knowledge of DevOps principles and tools (e.g., CI/CD, IaC/Terraform).

Strong understanding of containerization technologies (e.g., Docker, Kubernetes).

Strong problem-solving and critical-thinking skills.

Strong written and verbal communication skills, with the ability to thrive in a remote work environment.

(For senior leads/architects) Ability to explore new areas and problems, and to design and architect scalable stream/batch processing solutions. Ability to provide technical leadership for a team of engineers on a project or component.

About Cyma Systems Inc