GCS - Security Architecture Jobs in San Francisco, CA

1 - 5 of 5 Jobs

BigQuery Data Architect

FourSteps Solutions

San Ramon, California, USA

Contract, Third Party

Position Overview: We are seeking an experienced BigQuery Data Architect to join our team for a 4-month project to implement a Predictive Cash Flow solution. The role will focus on designing and managing data pipelines in Google BigQuery, integrating data from Oracle Fusion (AP, AR, Payables, Receivables, SCM), Workday Active Payroll, Salesforce, Trovata, and Oracle EPM. The BigQuery Data Architect will ensure efficient data ingestion, transformation, and export to support real-time cash flow for ...
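
For orientation only, here is a minimal sketch of the ingest, transform, and export flow this role describes, using the google-cloud-bigquery Python client. The project, dataset, table, and bucket names are hypothetical placeholders, not details from the posting.

    # Minimal sketch: ingest, transform, and export with BigQuery.
    # All resource names below are invented for illustration.
    from google.cloud import bigquery

    client = bigquery.Client()  # uses application-default credentials

    # 1. Ingest: load a source extract (e.g. AR invoices staged in GCS)
    #    into a staging table, replacing the previous load.
    load_job = client.load_table_from_uri(
        "gs://example-bucket/oracle_fusion/ar_invoices.csv",
        "example-project.cashflow_staging.ar_invoices",
        job_config=bigquery.LoadJobConfig(
            source_format=bigquery.SourceFormat.CSV,
            skip_leading_rows=1,
            autodetect=True,
            write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
        ),
    )
    load_job.result()  # block until the load finishes

    # 2. Transform: aggregate expected receivables by due date.
    client.query(
        """
        CREATE OR REPLACE TABLE `example-project.cashflow.daily_inflows` AS
        SELECT due_date, SUM(amount_due) AS expected_inflow
        FROM `example-project.cashflow_staging.ar_invoices`
        GROUP BY due_date
        """
    ).result()

    # 3. Export: write the reporting table back to GCS for downstream tools.
    client.extract_table(
        "example-project.cashflow.daily_inflows",
        "gs://example-bucket/exports/daily_inflows-*.csv",
    ).result()

The same three-step pattern would repeat per source feed (Oracle Fusion, Workday, Salesforce, Trovata, Oracle EPM), each with its own staging table and transformation query.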

BigQuery Data Architect

Spiceorb

San Ramon, California, USA

Contract, Third Party

Position Title: BigQuery Data Architect. Location: San Ramon, CA - Onsite. Qualifications - Education: Bachelor's degree in Computer Science, Data Engineering, or a related field (Master's preferred). Experience: 5+ years of experience in data architecture or data engineering, with at least 3 years focused on Google BigQuery. Proven experience designing and implementing data pipelines in Google Cloud Platform (GCP). Hands-on experience with ETL processes for financial or ERP data (e.g., ...
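
As a hedged illustration of the BigQuery design work these qualifications point at, the sketch below creates a date-partitioned, clustered table for ERP transaction data; the project, dataset, and field names are invented for the example.

    # Minimal sketch: a partitioned, clustered BigQuery table for ERP data.
    # Resource and column names are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client()

    table = bigquery.Table(
        "example-project.erp.gl_transactions",
        schema=[
            bigquery.SchemaField("txn_id", "STRING", mode="REQUIRED"),
            bigquery.SchemaField("ledger", "STRING"),
            bigquery.SchemaField("txn_date", "DATE", mode="REQUIRED"),
            bigquery.SchemaField("amount", "NUMERIC"),
        ],
    )

    # Partition by transaction date and cluster by ledger so typical
    # period-close and reporting queries scan only the data they need.
    table.time_partitioning = bigquery.TimePartitioning(
        type_=bigquery.TimePartitioningType.DAY, field="txn_date"
    )
    table.clustering_fields = ["ledger"]

    client.create_table(table, exists_ok=True)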

Sr. Google Cloud Platform Data Engineer

Spiceorb

Remote

Full-time

Hello, Role: Sr Google Cloud Platform Engineer. Locations: Remote. Qualifications: Security in Google Cloud Platform; Networking in Google Cloud Platform; GKE and overall compute in Google Cloud Platform, including GCE; Google Cloud Platform Storage - Spanner, Cloud SQL, Memory Store, GCS, etc.; Automation and operations in Google Cloud Platform; Google Cloud Platform Observability, GKE Observability, and Google Cloud Platform Logs/Metrics/Traces; Datadog, New Relic, and Synthetic Testing; Bachelor's degree, or ...
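
As one small, hedged example of the observability items listed (Google Cloud Platform logs/metrics/traces), the sketch below writes a structured entry with the google-cloud-logging client; the log name and payload fields are hypothetical.

    # Minimal sketch: structured logging to Cloud Logging.
    # Log name and payload fields are invented for illustration.
    from google.cloud import logging as cloud_logging

    client = cloud_logging.Client()
    logger = client.logger("payments-service")

    # Structured payloads are queryable in Logs Explorer and can feed
    # log-based metrics and alerting policies.
    logger.log_struct(
        {
            "message": "order processed",
            "order_id": "ord-123",
            "latency_ms": 42,
        },
        severity="INFO",
    )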

Senior/Principal Google Cloud Platform Data Engineer with strong SQL and Python

ChaTeck Incorporated

Remote or Atlanta, Georgia, USA

Third Party

Role: Senior/Principal Google Cloud Platform Data Engineer with strong SQL and Python. Location: Remote. Duration: 6 months C2H. Openings: 6 positions. Steer clear: no one with < 3 years of experience on Google Cloud Platform; no one with only AWS and Azure experience. Data Engineering requirements - Programming: SQL, Python, Java (optional). Google Cloud Platform: BigQuery, Dataflow (Apache Beam), Cloud Composer (Airflow), GCS, GKE, Dataform (optional to dbt). Tools: dbt / Dataform (on Google Cloud Pl...
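
As a hedged sketch of the listed stack working together, the example below is an Apache Beam pipeline (runnable locally or on Dataflow) that reads CSV lines from GCS and writes rows to BigQuery. The bucket, table, schema, and column names are placeholders, not details from the posting.

    # Minimal sketch: GCS -> transform -> BigQuery with Apache Beam.
    # All resource names and the CSV layout are hypothetical.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    def parse_line(line):
        # Assumes a simple "event_id,amount" CSV layout.
        event_id, amount = line.split(",")
        return {"event_id": event_id, "amount": float(amount)}


    def run():
        options = PipelineOptions()  # pass --runner=DataflowRunner etc. for Dataflow
        with beam.Pipeline(options=options) as p:
            (
                p
                | "ReadFromGCS" >> beam.io.ReadFromText(
                    "gs://example-bucket/events/*.csv", skip_header_lines=1
                )
                | "Parse" >> beam.Map(parse_line)
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    "example-project:analytics.events",
                    schema="event_id:STRING,amount:FLOAT",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                )
            )


    if __name__ == "__main__":
        run()

A pipeline like this could be scheduled from Cloud Composer (Airflow), with the resulting BigQuery tables modeled further in dbt or Dataform.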

Remote Data Architect with AI/ML, Databricks, Data Governance, Security, Model, ETL-C2C- Ch

Empower Professionals

Remote

Third Party, Contract

Role: Data Architect. Location: Remote. Duration: 12+ months. Note: Candidate must have hands-on experience in Databricks. Requirements - Data Strategy & Architecture Development: Define and implement the data architecture and data strategy aligned with business goals. Design scalable, cost-effective, and high-performance data solutions using Databricks on AWS, Azure, or Google Cloud Platform. Establish best practices for Lakehouse Architecture and Delta Lake for optimized data storage, processing, an...
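
As a hedged illustration of the Lakehouse/Delta Lake pattern referenced here, the minimal PySpark sketch below lands raw data into a Delta table on Databricks. Paths and table names are invented, and spark is assumed to be the session a Databricks notebook provides.

    # Minimal sketch: raw files -> bronze Delta table (Lakehouse ingestion).
    # The mount path and table name are hypothetical; assumes a bronze
    # schema already exists and spark is the Databricks-provided session.
    from pyspark.sql import functions as F

    raw = spark.read.format("json").load("/mnt/example/raw/orders/")

    # Stamp each record with its ingestion time before landing it.
    bronze = raw.withColumn("ingested_at", F.current_timestamp())

    # Delta adds ACID writes, schema enforcement, and time travel on top of
    # cheap object storage -- the core of the Lakehouse approach.
    bronze.write.format("delta").mode("append").saveAsTable("bronze.orders")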