Overview
On Site
Depends on Experience
Accepts corp to corp applications
Contract - W2
Contract - 12 Month(s)
Unable to Provide Sponsorship
Skills
Apache Kafka
Artificial Intelligence
Data Warehouse
Data Governance
Regulatory Compliance
Warehouse
Teradata
Google Cloud
Analytical Skill
Continuous Delivery
Cloud Computing
Java
Python
Scripting
Workflow
Google Cloud Platform
Git
Migration
Data Engineering
Data Quality
Data Flow
Job Details
Position: Data Architect with Google Cloud Platform
Location: Hartford, CT / Irving, TX
Duration: 12-month contract
Job Description:
The goal is to migrate data warehouse assets and ETL pipelines from Teradata to Google Cloud Platform (GCP). The role involves architecting, hands-on development, testing, and optimization of data pipelines and warehouse structures in GCP, ensuring minimal disruption and maximum performance.
Key Responsibilities:
Analyze and define the architecture landscape, recommending industry-standard solutions.
Lead and execute migration of data and ETL workflows from Teradata to GCP-based services such as BigQuery, Cloud Storage, Dataflow, Dataproc, and Composer (Airflow).
Analyze and map existing Teradata workloads to appropriate GCP equivalents.
Rewrite SQL logic, scripts, and procedures in GCP-compliant formats (e.g., standard SQL for BigQuery).
Collaborate with divisional architects and business stakeholders to define migration strategies, validate data quality, and ensure compliance.
Develop automated workflows for data movement and transformation using GCP-native tools and/or custom scripts (Python); a minimal sketch follows this list.
Optimize data storage, query performance, and costs in the cloud environment.
Implement monitoring, logging, and alerting for all migration pipelines and production workloads.
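
For orientation, the sketch below illustrates the kind of automated workflow described above. It is a minimal, hypothetical example (not part of the engagement) using the google-cloud-bigquery client library: it stages a Cloud Storage extract into BigQuery, then runs a filter rewritten from Teradata date arithmetic into BigQuery standard SQL. The project, bucket, dataset, table, and column names are illustrative assumptions.

    # Hypothetical sketch -- project, bucket, dataset, and column names are placeholders.
    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")

    # Stage a CSV extract landed in Cloud Storage into a BigQuery staging table.
    load_job = client.load_table_from_uri(
        "gs://example-bucket/extracts/claims_*.csv",
        "example-project.staging.claims",
        job_config=bigquery.LoadJobConfig(
            source_format=bigquery.SourceFormat.CSV,
            skip_leading_rows=1,
            autodetect=True,
            write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
        ),
    )
    load_job.result()  # block until the load completes

    # Teradata:  SEL * FROM claims WHERE load_dt > CURRENT_DATE - 30;
    # Equivalent filter in BigQuery standard SQL:
    query = """
        SELECT *
        FROM `example-project.staging.claims`
        WHERE load_dt > DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    """
    for row in client.query(query).result():
        pass  # downstream validation / transformation would go here
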
Required Skills:
4 to 6+ years of experience in data architecture, with at least 2 years on GCP.
4 to 6+ years of experience in data engineering, with at least 2 years on GCP.
Strong hands-on experience in Teradata data warehousing, BTEQ, and complex SQL.
Solid knowledge of GCP services: BigQuery, Dataflow, Cloud Storage, Pub/Sub, Composer, and Dataproc (see the Composer sketch after this list).
Experience building ETL/ELT pipelines with custom scripts (Python/Java).
Proven ability to refactor and translate legacy logic from Teradata to GCP.
Familiarity with CI/CD, Git, Argo CD, and DevOps practices in cloud data environments.
Strong analytical, troubleshooting, and communication skills.
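
As a rough, non-authoritative sketch of Composer orchestration for a pipeline like the one above (assuming Airflow 2.4+ and the apache-airflow-providers-google package; the DAG ID, bucket, and table names are assumed placeholders):

    # Hypothetical sketch -- identifiers are placeholders, not a real deployment.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
    from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

    with DAG(
        dag_id="teradata_offload_daily",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        # Stage the daily extract from Cloud Storage into BigQuery.
        stage = GCSToBigQueryOperator(
            task_id="stage_claims",
            bucket="example-bucket",
            source_objects=["extracts/claims_*.csv"],
            destination_project_dataset_table="example-project.staging.claims",
            skip_leading_rows=1,
            write_disposition="WRITE_TRUNCATE",
        )
        # Run the rewritten standard-SQL transformation.
        transform = BigQueryInsertJobOperator(
            task_id="transform_claims",
            configuration={
                "query": {
                    "query": (
                        "SELECT * FROM `example-project.staging.claims` "
                        "WHERE load_dt > DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)"
                    ),
                    "useLegacySql": False,
                }
            },
        )
        stage >> transform  # load first, then transform
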
Preferred Qualifications:
Google Cloud Platform certification.
Exposure to Apache Kafka, Cloud Functions, or AI/ML pipelines on GCP.
Experience working in the healthcare domain.
Knowledge of data governance, security, and compliance in cloud ecosystems.