Google Cloud Platform Data Engineers and Google Cloud Platform Architects - Atlanta (Hybrid). Locals and nearby candidates only. F2F interview is mandatory. Day-1 onsite. Must have strong experience in BigQuery, Cloud Storage, Cloud Run, Dataflow, Cloud SQL, AlloyDB, Cloud Load Balancing, Pub/Sub, IAM, Logging and Monitoring, SQL, Python and Linux scripting, and ETL tools such as DataStage, Informatica, and SSIS. Google Cloud Platform Professional Data Engineer or Cloud Architect certification is a plus.

  • Atlanta, GA
  • Posted 2 hours ago | Updated 2 hours ago

Overview

On Site
Depends on Experience
Accepts corp to corp applications
Contract - Independent
Contract - W2

Skills

Agile
Analytical Skill
Analytics
Cloud Computing
Cloud Storage
Clustering
Collaboration
Conflict Resolution
Continuous Delivery
Continuous Integration
Data Architecture
Data Engineering
Dataflow
Data Modeling
Data Quality
Data Warehouse
Datastage
DevOps
ELT
Effective Communication
Encryption
Extract, Transform, Load (ETL)
Git
Google Cloud
Google Cloud Platform
IBM InfoSphere DataStage
Informatica
Linux
Management
Microsoft SSIS
Network
Optimization
Performance Tuning
Problem Solving
Python
Regulatory Compliance
Reporting
SQL
Scripting
Snowflake Schema
Terraform
Version Control

Job Details

Need Google Cloud Platform Data Engineers and Google Cloud Platform Architects
Location: Atlanta (Hybrid). Locals and nearby candidates only.
F2F interview is mandatory. Day-1 onsite.

Job Summary:
We are seeking a skilled Google Cloud Platform (GCP) Data Engineer to design, build, and optimize data pipelines and analytics solutions in the cloud. The ideal candidate must have hands-on experience with Google Cloud Platform data services, strong ETL/ELT development skills, and a solid understanding of data architecture, data modeling, data warehousing, and performance optimization.
Key Responsibilities:
Develop ETL/ELT processes to extract data from various sources, transform it, and load it into BigQuery or other target systems.
Build and maintain data models, data warehouses, and data lakes for analytics and reporting.
Design and implement scalable, secure, and efficient data pipelines on Google Cloud Platform using tools such as Dataflow, Pub/Sub, Cloud Run, Python, and Linux scripting.
Optimize BigQuery queries, manage partitioning and clustering, and handle cost optimization.
Integrate data from on-premise and cloud systems using Cloud Storage and APIs.
Work closely with DevOps teams to automate deployments using Terraform, Cloud Build, or CI/CD pipelines.
Ensure security and compliance by applying IAM roles, encryption, and network controls.
Collaborate with data analysts, data scientists, and application teams to deliver high-quality data solutions.
Implement best practices for data quality, monitoring, and governance.
Required Skills and Experience:
Bachelor's degree in Computer Science, Information Technology, or a related field.
Minimum 8 years of experience in data engineering, preferably in a cloud environment.
Minimum 3 years of hands-on and strong expertise in Google Cloud Platform services:
o BigQuery, Cloud Storage, Cloud Run, Dataflow, Cloud SQL, AlloyDB, Cloud Load Balancing, Pub/Sub, IAM, Logging and Monitoring.
Proficiency in SQL, Python and Linux scripting.
Prior experience with ETL tools such as DataStage, Informatica, and SSIS.
Familiarity with data modeling (star/snowflake) and data warehouse concepts.
Understanding of CI/CD, version control (Git), and Infrastructure as Code (Terraform).
Strong problem-solving and analytical mindset.
Effective communication and collaboration skills.
Ability to work in an agile and fast-paced environment.
Google Cloud Platform Professional Data Engineer or Cloud Architect certification is a plus.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.

About Keylent