Sr. Data Engineer With strong Google Cloud Platform Background

Overview

$60
Accepts corp to corp applications
Contract - Long Term

Skills

Python
Database
DevOps
Governance
Tableau Software
SQL
ETL
Business Intelligence
Machine Learning
Git
Continuous Integration/Delivery
Apache
GitHub
Problem Solving
Terraform
GCP
Kafka
Hive
Real-Time
Pipeline
Version Control
Hadoop
BigQuery
Google Analytics
Control Systems
Digital Marketing
Digital Campaign
Batch Processing

Job Details

Title: Google Cloud Platform Data Engineer
Experience: 8+ Years
Location: Remote
Note: eCommerce domain experience required
Primary Skills: PySpark, Spark, Python, Big Data, Google Cloud Platform, Apache Beam, Dataflow, Airflow, Kafka, and BigQuery
Good to Have: GFO, Google Analytics
Job Description:
  • 7-10 years of experience as a data warehouse engineer/architect designing and deploying data systems in a startup environment
  • Mastery of database and data warehouse methodologies and techniques, from transactional databases to dimensional data modeling to wide denormalized data marts
  • Deep understanding of SQL-based Big Data systems and experience with modern ETL tools
  • Expertise in designing data warehouses using Google BigQuery
  • Experience developing data pipelines in Python
  • A firm believer in data-driven decision-making, with extensive experience developing highly scalable, elastic, 24x7x365 high-availability digital marketing or e-commerce systems
  • Hands-on experience with data computing, storage, and security components, and with Big Data technologies provided by cloud platforms (preferably Google Cloud Platform)
  • Hands-on experience with real-time stream processing as well as high-volume batch processing, and skilled in advanced SQL, Google Cloud Platform BigQuery, Apache Kafka, data lakes, etc.
  • Hands-on experience with Big Data technologies (Hadoop, Hive, and Spark) and an enterprise-scale Customer Data Platform (CDP)
  • Experience in at least one programming language (Python strongly preferred), cloud computing platforms (e.g., Google Cloud Platform), big data tools such as Spark/PySpark, columnar datastores (BigQuery preferred), DevOps processes/tooling (CI/CD, GitHub Actions), infrastructure-as-code frameworks (Terraform), BI tools (e.g., DOMO, Tableau, Looker), and pipeline orchestration (e.g., Airflow)
  • Fluency in data science/machine learning basics (model types, data prep, training process, etc.)
  • Experience using version control systems (Git is strongly preferred)
  • Experience with data governance and data security
  • Strong analytical, problem-solving, and interpersonal skills, a hunger to learn, and the ability to operate in a self-guided manner in a fast-paced, rapidly changing environment

About Aroha Technologies