Google Cloud Platform Architect Jobs in Salt Lake City, UT

41 - 44 of 44 Jobs

Data Analytics Lead

Bahwan CyberTek Inc.

Remote

Third Party, Contract

Job Title: Data Analytics Lead Engineer
Location: Remote (PST hours)
Duration: 6+ months
Key Skills: Data Vault, Canonical Modeling, and Databricks
Position Overview: As a Data Solution Engineer/Architect - Advanced Analytics within our professional services practice, you will lead the design and delivery of cloud-native analytics architectures that solve complex data challenges for enterprise clients. This is a highly technical role that also requires strong consultative and business communication …

Data Analytics Lead Engineer

Bahwan CyberTek Inc.

Remote

Contract

Job Title: Data Analytics Lead Engineer
Location: Remote (PST hours)
Duration: 3+ months
Pay Rate: $75/hr on C2C
Note: Candidate needs to use their own laptop.
Key Skills: Data Vault, Canonical Modeling, and Databricks
Position Overview: As a Data Solution Engineer/Architect - Advanced Analytics within our professional services practice, you will lead the design and delivery of cloud-native analytics architectures that solve complex data challenges for enterprise clients. This is a highly technical role that also requires strong consultative and business communication …

Data Analytics Lead Engineer - C2C - Remote - No OPT & CPT

Shiro Technologies

Remote

Contract

Key Skills: Architecture, Cloud, Analytics, Engineering, Consulting, Communication, Leadership, Ingestion, Modeling, Visualization, Pipelines, Streaming, Warehousing, SQL, Python, Spark, Databricks, Airflow, Snowflake, Kafka, Terraform, Kubernetes, Compliance, Governance, Mentoring, DevOps, Security, MLOps, Strategy, Scoping
Note: Candidate needs to use their own laptop.
Key Skills: Data Vault, Canonical Modeling, and Databricks
Position Overview: As a Data Solution Engineer/Architect - Advanced Analytics …

Google Cloud Platform Data Engineer with Python & PySpark

TekDallas

Remote

Contract

Position: Data Engineer
Location: 100% Remote (EST time zone)
Contract Duration: 6+ months, W2 only
Must-Have Tech Stack:
Python & PySpark (Spark SQL): 3+ years
Airflow (or any orchestration tool): 2+ years
Google Cloud Platform (BigQuery, GCS, Pub/Sub, Cloud Run, Functions, Cloud SQL): 3+ years
Real-time data ingestion (Kafka, webhooks, file-based): 2+ years
API integration (REST/webhooks): 2+ years
Kubernetes (GKE preferred): 1-2 years
BigQuery SQL & PostgreSQL: 2+ years
YAML/config-driven pipeline …
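
The TekDallas stack above centers on YAML/config-driven PySpark jobs that land data in BigQuery. The following is a minimal sketch of what such a pipeline might look like, not the employer's actual code; it assumes the spark-bigquery connector is available on the cluster, and the bucket path, table name, and config keys are hypothetical.

```python
# Minimal sketch of a YAML/config-driven PySpark job (illustrative only).
# Assumes the spark-bigquery connector jar is on the classpath; all paths,
# table names, and config keys below are hypothetical examples.
import yaml
from pyspark.sql import SparkSession

EXAMPLE_CONFIG = """
source_path: gs://example-bucket/raw/events/*.json   # hypothetical GCS path
target_table: example-project.analytics.events       # hypothetical BigQuery table
write_mode: append
"""

def run(config: dict) -> None:
    spark = SparkSession.builder.appName("config-driven-pipeline").getOrCreate()

    # Read raw JSON events from Cloud Storage as declared in the config
    df = spark.read.json(config["source_path"])

    # Write to BigQuery using the connector's direct write method,
    # which avoids staging through a temporary GCS bucket
    (df.write.format("bigquery")
        .option("writeMethod", "direct")
        .mode(config["write_mode"])
        .save(config["target_table"]))

if __name__ == "__main__":
    run(yaml.safe_load(EXAMPLE_CONFIG))
```

Driving the job from a config file like this keeps source paths and target tables out of the code, which is typically what "YAML/config-driven pipeline" requirements in listings like these are getting at.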