Google Cloud Data Architect - IAM Data Modernization

Dallas, TX, US • Posted 10 hours ago • Updated 50 minutes ago
Contract Independent
Contract W2
On-site
$65 - $69/hr

Job Details

Skills

  • GCP
  • data engineer
  • IAM
  • Data Architect
  • BigQuery
  • migration

Summary

Job Title: Google Cloud Data Architect - IAM Data Modernization

Duration: 6-Month Contract (possibility of extension)

Location: Dallas, TX 75024 (4 days onsite, 1 day remote)

Pay Range: $65 - $69 /hr W2

Description:

Project/Program

Identity & Access Management (IAM) Data Modernization: migration of an on-premises SQL data warehouse to a target-state Data Lake on Google Cloud Platform (GCP), enabling metrics & reporting, advanced analytics, and GenAI use cases (natural-language querying, accelerated summarization, cross-domain trend analysis).

About Program/Project

The IAM Data Modernization project involves migrating an on-premises SQL data warehouse to a target-state Data Lake in the Google Cloud Platform (GCP) environment. Key highlights include:

  • Integration Scope: 30+ source system data ingestions and multiple downstream integrations
  • Capabilities: Metrics, reporting, and Gen AI use cases with natural language querying, advanced pattern/trend analysis, faster summarizations, and cross-domain metric monitoring
  • Benefits:
    • Scalability and access to advanced cloud functionality
    • Highly available and performant semantic layer with historical data support
    • Unified data strategy for executive reporting, analytics, and Gen AI across cyber domains

This modernization establishes a single source of truth for enterprise-wide data-driven decision-making.

Required Skills:

Data Lake Architecture & Storage

  • Proven experience designing and implementing data lake architectures (e.g., Bronze/Silver/Gold or layered models)
  • Strong knowledge of Cloud Storage (GCS) design, including bucket layout, naming conventions, lifecycle policies, and access controls
  • Experience with Hadoop/HDFS architecture, distributed file systems, and data locality principles
  • Hands-on experience with columnar data formats (Parquet, Avro, ORC) and compression techniques
  • Expertise in partitioning strategies, backfills, and large-scale data organization
  • Ability to design data models optimized for analytics and BI consumption
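To make the partitioning and bucket-layout expectations above concrete, here is a minimal, hedged sketch of the kind of Hive-style, date-partitioned GCS object layout a layered lake commonly uses. The bucket, layer, and table names are purely illustrative, not taken from this posting.

```python
from datetime import date

def partition_path(bucket: str, layer: str, table: str, run_date: date) -> str:
    """Build a Hive-style, date-partitioned object path for a lake layer.

    All names here are hypothetical examples of a common convention:
    gs://<bucket>/<layer>/<table>/dt=<YYYY-MM-DD>/<file>.
    """
    return (
        f"gs://{bucket}/{layer}/{table}/"
        f"dt={run_date.isoformat()}/part-00000.parquet"
    )

print(partition_path("my-lake", "silver", "iam_events", date(2025, 1, 15)))
# gs://my-lake/silver/iam_events/dt=2025-01-15/part-00000.parquet
```

A `dt=` partition key like this lets engines such as BigQuery external tables or Spark prune partitions by date, and makes backfills a matter of rewriting a single date prefix.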

Qualifications

  • Experience: 10-14+ years in data engineering/architecture, 5+ years designing on Google Cloud Platform at scale; prior on-premises-to-cloud migration a must.
  • Education: Bachelor's/Master's in Computer Science, Information Systems, or equivalent experience.
  • Certifications: Google Cloud Professional Cloud Architect (required or within 3 months). Plus: Professional Data Engineer, Security Engineer.

Data Ingestion & Orchestration

  • Experience building batch and streaming ingestion pipelines using Google Cloud Platform-native services
  • Knowledge of Pub/Sub-based streaming architectures, event schema design, and versioning
  • Strong understanding of incremental ingestion and CDC patterns, including idempotency and deduplication
  • Hands-on experience with workflow orchestration tools (Cloud Composer / Airflow)
  • Ability to design robust error handling, replay, and backfill mechanisms
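The idempotency and deduplication requirement above can be sketched in miniature: CDC records carry a key and a monotonically increasing version (e.g., an LSN or update timestamp), and an older or equal version never overwrites a newer one, so replays and duplicate deliveries are harmless. Field names and the in-memory state dict are illustrative assumptions, not this project's design.

```python
def apply_cdc(state: dict, events: list[dict]) -> dict:
    """Apply CDC events idempotently: last-writer-wins by version."""
    for ev in events:
        key, ver = ev["key"], ev["version"]
        current = state.get(key)
        if current is not None and current["version"] >= ver:
            continue  # duplicate or replayed event: skip (idempotent)
        if ev.get("op") == "delete":
            state[key] = {"version": ver, "deleted": True}  # soft delete marker
        else:
            state[key] = {"version": ver, "value": ev["value"]}
    return state

events = [
    {"key": "a", "version": 1, "op": "upsert", "value": 10},
    {"key": "a", "version": 2, "op": "upsert", "value": 20},
]
state = apply_cdc({}, events)
print(state["a"]["value"])   # 20
apply_cdc(state, events)     # replaying the same batch changes nothing
```

The same version-comparison idea underlies `MERGE`-based upserts into a warehouse table: replaying a batch is a no-op, which is what makes backfills and error-recovery replays safe.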

Data Processing & Transformation

  • Experience developing scalable batch and streaming pipelines using Dataflow (Apache Beam) and/or Spark (Dataproc)
  • Strong proficiency in BigQuery SQL, including query optimization, partitioning, clustering, and cost control
  • Hands-on experience with Hadoop MapReduce and ecosystem tools (Hive, Pig, Sqoop)
  • Advanced Python programming skills for data engineering, including testing and maintainable code design
  • Experience managing schema evolution while minimizing downstream impact
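On managing schema evolution with minimal downstream impact: one common pattern is to project every incoming row onto the current target schema, so newly added nullable columns default to null and columns dropped upstream don't break consumers. The sketch below is a hedged illustration of that idea; the column names are invented for the example.

```python
def conform_to_schema(rows: list[dict], schema: list[str]) -> list[dict]:
    """Project rows onto a target schema (additive evolution).

    Missing columns (added after the rows were written) become None;
    columns absent from the schema are simply not emitted.
    """
    return [{col: row.get(col) for col in schema} for row in rows]

old_rows = [{"user_id": 1}]              # written before "email" existed
schema_v2 = ["user_id", "email"]         # additive change: new nullable column
print(conform_to_schema(old_rows, schema_v2))
# [{'user_id': 1, 'email': None}]
```

Restricting schema changes to additive, nullable columns is what lets old Parquet files and new files coexist under one table definition without rewriting history.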

Analytics & Data Serving

  • Expertise in BigQuery performance optimization and data serving patterns
  • Experience building semantic layers and governed metrics for consistent analytics
  • Familiarity with BI integration, access controls, and dashboard standards
  • Understanding of data exposure patterns via views, APIs, or curated datasets

Data Governance, Quality & Metadata

  • Experience implementing data catalogs, metadata management, and ownership models
  • Understanding of data lineage for auditability and troubleshooting
  • Strong focus on data quality frameworks, including validation, freshness checks, and alerting
  • Experience defining and enforcing data contracts, schemas, and SLAs
  • Familiarity with audit logging and compliance readiness

Cloud Platform Management

  • Strong hands-on experience with Google Cloud Platform (GCP), including project setup, environment separation, billing, quotas, and cost controls
  • Expertise in IAM and security best practices, including least-privilege access, service accounts, and role-based access
  • Solid understanding of VPC networking, private access patterns, and secure service connectivity
  • Experience with encryption and key management (KMS, CMEK) and security auditing

DevOps, Platform & Reliability

  • Proven ability to build CI/CD pipelines for data and infrastructure workloads
  • Experience managing secrets securely using Google Cloud Secret Manager
  • Ownership of observability, SLOs, dashboards, alerts, and runbooks
  • Proficiency in logging, monitoring, and alerting for data pipelines and platform reliability

Good to have

Security, Privacy & Compliance

  • Hands-on experience implementing fine-grained access controls for BigQuery and GCS
  • Experience with VPC Service Controls and data exfiltration prevention
  • Knowledge of PII handling, data masking, tokenization, and audit requirements

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: 91120142
  • Position Id: 2026-17332

Company Info

About VeridianTech

Veridian Tech Solutions is a Houston-based Cloud Consulting and IT Services firm specializing in Enterprise Mobility, Information Management, and Cloud-based solutions.

Veridian Tech Solutions was founded in 2013 to address the growing need for technology that enables businesses to be mobile and flexible in managing their applications and employees. As the market transitioned into the Digital Era, Veridian recognized that companies needed specific technology expertise to provide their customers with new tools and services, create better ways for their employees to do their jobs, and do so in a modern, cost-effective way.

With decades of industry experience, Veridian's leadership team brings extensive expertise in enterprise systems execution and project management across numerous industries, and, most importantly, knowledge of what is required for a consultant to be effective.

Veridian builds its team by only selecting resources who have deep consulting experience, proven technical skills, and are highly proficient in specific sought-after areas of cutting-edge Mobility, Cloud, and Analytics technologies.

We have often found that these cutting-edge skills are the gaps at organizations across North America, from mid-tier to Fortune 500, and at Big 4 consulting firms as well. We work with our clients to understand their project requirements and use our experience to present the right solutions and the right resources.

We provide onshore services to North America, coordinated from our headquarters in Houston, and we provide offshore services from our Delivery center in New Delhi, India.
