Google Cloud Platform ML Data Architect

Overview

On Site
$120,000 - $130,000
Full Time
No Travel Required

Skills

Dataflow
Google Cloud
Google Cloud Platform
Kubernetes
Data Warehouse
IaaS
Machine Learning (ML)
Machine Learning Operations (ML Ops)
Python
Cloud Storage
Artificial Intelligence
Extract, Transform, Load (ETL)
Data Modeling
BigQuery
Infrastructure as Code (IaC)

Job Details

Role: Google Cloud Platform ML Data Architect

Location: Chaska, MN (100% onsite)

Hire type: FTE (no H-1B transfer)

Detailed JD:

Responsible for designing, implementing, and managing data and machine learning solutions on Google Cloud Platform.

Key Responsibilities:

Design end-to-end data solutions, including data ingestion, storage, processing, and analysis pipelines, as well as machine learning model development, deployment, and monitoring pipelines.

Design and implement scalable, secure, and cost-optimized cloud infrastructure using Google Cloud Platform services such as BigQuery, Dataflow, Dataproc, Cloud Storage, and Google Kubernetes Engine (GKE).
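
As a rough sketch of the kind of pipeline this covers, a minimal Apache Beam job in Python (the SDK that Dataflow executes) might read files from Cloud Storage and append rows to BigQuery as below; the project, bucket, table, and schema names are placeholder assumptions, not details from the posting.

    # Minimal Apache Beam pipeline: read CSV lines from Cloud Storage,
    # parse them, and append the rows to a BigQuery table.
    # Project, bucket, dataset, and schema are placeholder assumptions.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def parse_line(line):
        # Assumed layout: "user_id,event,amount" per line.
        user_id, event, amount = line.split(",")
        return {"user_id": user_id, "event": event, "amount": float(amount)}

    options = PipelineOptions(
        runner="DataflowRunner",        # use "DirectRunner" for local testing
        project="my-project",
        region="us-central1",
        temp_location="gs://my-bucket/tmp",
    )

    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://my-bucket/events/*.csv")
            | "Parse" >> beam.Map(parse_line)
            | "Write" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                schema="user_id:STRING,event:STRING,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )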

Design and implement data models, ensuring data consistency, accuracy, and accessibility for various applications and users.

Establish MLOps practices, enabling the automation of machine learning model training, deployment, and monitoring.
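
One common shape for that automation is to compile the training/deployment workflow into a Vertex AI Pipelines spec and submit it with the google-cloud-aiplatform SDK; the sketch below assumes placeholder project, bucket, template path, and parameter names.

    # Sketch of automating model training/deployment as a Vertex AI pipeline run.
    # The project, region, bucket, and compiled pipeline spec are placeholders.
    from google.cloud import aiplatform

    aiplatform.init(
        project="my-project",
        location="us-central1",
        staging_bucket="gs://my-bucket/pipelines",
    )

    # The pipeline spec (e.g. compiled from a Kubeflow Pipelines definition)
    # encodes the training, evaluation, and deployment steps.
    job = aiplatform.PipelineJob(
        display_name="train-and-deploy",
        template_path="gs://my-bucket/pipelines/train_and_deploy.json",
        parameter_values={"training_data": "bq://my-project.analytics.training_rows"},
        enable_caching=True,
    )

    # In an MLOps setup this is typically triggered from CI/CD or a scheduler
    # rather than run by hand.
    job.submit()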

Ensure that all data solutions adhere to security and compliance standards, implementing access controls, encryption, and other security measures.

Monitor and optimize the performance of data and machine learning systems, ensuring they meet business requirements and SLAs.

Develop and implement strategies for managing and optimizing cloud costs, ensuring efficient resource utilization.

Provide technical guidance and mentorship to other team members, fostering a culture of best practices and continuous improvement.

Key Skills:

10+ years of experience designing and developing production-grade data architectures using Google Cloud data services and solutions.

Proficiency in BigQuery, Dataflow, Dataproc, Cloud Storage, Pub/Sub, Google Kubernetes Engine, and other relevant Google Cloud Platform services.

Strong experience with data warehousing, ETL processes, data modeling, and data pipeline development.
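
A minimal ELT sketch with the BigQuery Python client, assuming placeholder project, dataset, table, and bucket names: load raw files from Cloud Storage into a staging table, then materialize a modeled table with SQL.

    # Simple ELT step with the BigQuery Python client.
    # Project, dataset, table, and bucket names are placeholder assumptions.
    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")

    # Load raw CSV files into a staging table.
    load_job = client.load_table_from_uri(
        "gs://my-bucket/raw/orders_*.csv",
        "my-project.staging.orders_raw",
        job_config=bigquery.LoadJobConfig(
            source_format=bigquery.SourceFormat.CSV,
            skip_leading_rows=1,
            autodetect=True,
            write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
        ),
    )
    load_job.result()  # wait for the load to finish

    # Transform the staging data into a curated, modeled table.
    query_job = client.query(
        """
        CREATE OR REPLACE TABLE `my-project.warehouse.daily_order_totals` AS
        SELECT order_date, customer_id, SUM(amount) AS total_amount
        FROM `my-project.staging.orders_raw`
        GROUP BY order_date, customer_id
        """
    )
    query_job.result()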

Strong hands-on experience with Python and SQL.

Strong experience with model development, deployment, and monitoring using Vertex AI.
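
As a sketch of Vertex AI deployment, registering a trained model artifact and serving it from an online endpoint might look like the following; the artifact path, serving container image, machine type, and feature vector are placeholder assumptions.

    # Register a trained model and deploy it to a Vertex AI endpoint
    # for online prediction. Paths and container image are placeholders.
    from google.cloud import aiplatform

    aiplatform.init(project="my-project", location="us-central1")

    model = aiplatform.Model.upload(
        display_name="churn-classifier",
        artifact_uri="gs://my-bucket/models/churn/",
        serving_container_image_uri=(
            "us-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.1-3:latest"
        ),
    )

    endpoint = model.deploy(
        machine_type="n1-standard-4",
        min_replica_count=1,
        max_replica_count=2,
    )

    # Online prediction call; the feature vector shape depends on the model.
    prediction = endpoint.predict(instances=[[0.2, 1.5, 3.0]])
    print(prediction.predictions)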

Good experience with LLMs, agents and agentic AI, and Agentspace, plus hands-on RAG (retrieval-augmented generation) experience.
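
A toy retrieval-augmented generation sketch on Vertex AI, purely illustrative: embed a few documents, retrieve the closest one to a question, and ground the generation on it. The embedding and Gemini model names, the documents, and the in-memory "index" are assumptions for the example.

    # Toy RAG loop: embedding-based retrieval over a tiny in-memory corpus,
    # then a grounded generation call. Model names are assumptions.
    import numpy as np
    import vertexai
    from vertexai.generative_models import GenerativeModel
    from vertexai.language_models import TextEmbeddingModel

    vertexai.init(project="my-project", location="us-central1")

    docs = [
        "Invoices are archived in BigQuery after 90 days.",
        "Raw event files land in the gs://my-bucket/raw/ prefix.",
    ]
    question = "Where do raw event files land?"

    embedder = TextEmbeddingModel.from_pretrained("text-embedding-004")
    doc_vecs = np.array([e.values for e in embedder.get_embeddings(docs)])
    q_vec = np.array(embedder.get_embeddings([question])[0].values)

    # Cosine-similarity retrieval over the tiny corpus.
    scores = doc_vecs @ q_vec / (
        np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(q_vec)
    )
    context = docs[int(scores.argmax())]

    llm = GenerativeModel("gemini-1.5-flash")
    answer = llm.generate_content(
        f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    )
    print(answer.text)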

Experience with cloud computing concepts, including infrastructure as code (IaC), scalability, security, and cost optimization.
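
The posting does not name an IaC tool; as one possible illustration, a Pulumi program in Python (Terraform/HCL would be an equally common choice) declaring a data-lake bucket and a BigQuery dataset could look like this, with all resource names and locations as placeholders.

    # Illustrative infrastructure-as-code sketch with Pulumi's Python SDK.
    # Tool choice, resource names, and locations are assumptions.
    import pulumi
    import pulumi_gcp as gcp

    # Data-lake bucket for raw files.
    raw_bucket = gcp.storage.Bucket(
        "raw-data",
        location="US",
        uniform_bucket_level_access=True,
    )

    # BigQuery dataset for the curated warehouse layer.
    warehouse = gcp.bigquery.Dataset(
        "warehouse",
        dataset_id="warehouse",
        location="US",
    )

    pulumi.export("raw_bucket_name", raw_bucket.name)
    pulumi.export("warehouse_dataset", warehouse.dataset_id)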
