Senior Big Data Engineer/Architect

Overview

On Site
Depends on Experience
Accepts corp-to-corp applications
Contract - W2

Skills

GCP
Big Data Engineer
GCS
Dataproc
Airflow
Datastore
BigQuery
Google Cloud Platform
Hadoop stack (HDFS, Hive, Spark, HBase, Kafka, NiFi, Oozie, Splunk, etc.)
JIRA
GitHub
Jenkins
Nexus
Artifactory

Job Details


JOB TITLE: Senior Big Data Engineer/Architect

Duration: 12-month contract


We are looking for a Senior Big Data Engineer/Architect on Google Cloud Platform (GCP) to help strategize, architect, and implement solutions for migrating data hosted on our on-prem platform to GCP. The architect will design and implement the enterprise infrastructure and platforms required to set up data engineering pipelines using the tools available on GCP. You will work on advanced data engineering products using Google big data technologies such as GCS, Dataproc, Airflow, Datastore, and BigQuery.
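For illustration only (not part of the posting requirements): a minimal sketch of the kind of GCS → Dataproc → BigQuery pipeline this role would build, assuming Airflow 2.4+ (e.g., Cloud Composer 2) with the apache-airflow-providers-google package installed. Every resource name below (project, region, bucket, cluster, dataset, table) is a hypothetical placeholder.

```python
# Hypothetical Airflow DAG: transform raw files on Dataproc, then load into BigQuery.
# All identifiers (project, bucket, cluster, table) are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

PROJECT_ID = "example-project"       # placeholder GCP project
REGION = "us-central1"               # placeholder region
BUCKET = "example-landing-bucket"    # placeholder GCS bucket

with DAG(
    dag_id="onprem_to_gcp_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Run a PySpark transform job on an existing Dataproc cluster.
    transform = DataprocSubmitJobOperator(
        task_id="spark_transform",
        project_id=PROJECT_ID,
        region=REGION,
        job={
            "placement": {"cluster_name": "example-etl-cluster"},
            "pyspark_job": {"main_python_file_uri": f"gs://{BUCKET}/jobs/transform.py"},
        },
    )

    # Load the curated Parquet output from GCS into a BigQuery table.
    load = GCSToBigQueryOperator(
        task_id="load_to_bigquery",
        bucket=BUCKET,
        source_objects=["curated/*.parquet"],
        source_format="PARQUET",
        destination_project_dataset_table=f"{PROJECT_ID}.analytics.events",
        write_disposition="WRITE_TRUNCATE",
    )

    transform >> load  # transform first, then load
```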

·         Very strong leadership and communication skills, exhibiting the right negotiating posture with customer and program teams to make the right decisions.

·         Experience leading one or more of the following areas of a cloud transformation journey: strategy, design, and application migration planning and implementation for any private or public cloud.

·         Cloud foundation design, build, and implementation

·         Cloud Transformation & Migration

·         Cloud managed services (IaaS and PaaS)


MUST-HAVE SKILLS (Most Important):

·         Google Cloud Certified Professional Cloud Data Engineer

·         Bachelor's degree with 3-5 years of experience on Google Cloud, with deep understanding of, and design and development experience with, Google Cloud Platform products across Infrastructure, Data Management, Application Development, Smart Analytics, Artificial Intelligence, Security, and DevOps

·         Extract, Transform, and Load (ETL) & big data tools: BigQuery, Cloud Dataflow, Cloud Dataproc, Cloud Pub/Sub, Cloud Composer, Google Data Studio, Google Cloud Storage

·         NoSQL databases: Cloud Bigtable, Cloud Firestore, Firebase Realtime Database, Cloud Memorystore

·         Search technologies: Lucene and Elasticsearch

·         Relational databases: Cloud Spanner, Cloud SQL


DESIRED SKILLS:

·         Strong knowledge of Google Cloud Storage data lifecycle management

·         Strong knowledge of BigQuery slot management

·         Cost optimization for Dataproc workload management

·         Experience designing, building, and deploying production-level data pipelines using tools from the Hadoop stack (HDFS, Hive, Spark, HBase, Kafka, NiFi, Oozie, Splunk, etc.)

·         Development and deployment technologies (e.g., JIRA, GitHub, Jenkins, Nexus, Artifactory)

·         Software development background with a solid understanding of, and experience in, the software development life cycle (SDLC), DevOps, and CI/CD. At least 2 years of experience architecting in enterprises using Agile methodologies.

·         Experience with data visualization tools such as Kibana, Grafana, and Tableau, and their associated architectures.