Job Details
Hi All,
Good Morning,
We have an urgent position for the below-mentioned requirement.
Title: Google Cloud Platform Architect
Location: Atlanta (Hybrid); only local and nearby candidates will be considered.
F2F interview is mandatory. Day-1 onsite.
Duration: Long Term
Job Summary:
We are seeking a skilled Google Cloud Platform (GCP) Data Engineer to design, build, and optimize data pipelines and analytics solutions in the cloud. The ideal candidate must have hands-on experience with Google Cloud Platform data services, strong ETL/ELT development skills, and a solid understanding of data architecture, data modeling, data warehousing, and performance optimization.
Key Responsibilities:
• Develop ETL/ELT processes to extract data from various sources, transform it, and load it into BigQuery or other target systems.
• Build and maintain data models, data warehouses, and data lakes for analytics and reporting.
• Design and implement scalable, secure, and efficient data pipelines on Google Cloud Platform using tools such as Dataflow, Pub/Sub, Cloud Run, Python and Linux scripting.
• Optimize BigQuery queries, manage partitioning and clustering, and handle cost optimization.
• Integrate data from on-premises and cloud systems using Cloud Storage and APIs.
• Work closely with DevOps teams to automate deployments using Terraform, Cloud Build, or CI/CD pipelines.
• Ensure security and compliance by applying IAM roles, encryption, and network controls.
• Collaborate with data analysts, data scientists, and application teams to deliver high-quality data solutions.
• Implement best practices for data quality, monitoring, and governance.
Required Skills and Experience:
• Bachelor’s degree in Computer Science, Information Technology, or related field.
• Minimum 8 years of experience in data engineering, preferably in a cloud environment.
• Minimum 3 years of hands-on experience and strong expertise in Google Cloud Platform services:
· BigQuery, Cloud Storage, Cloud Run, Dataflow, Cloud SQL, AlloyDB, Cloud Load Balancing, Pub/Sub, IAM, and Cloud Logging and Monitoring.
• Proficiency in SQL, Python and Linux scripting.
• Prior experience with ETL tools such as DataStage, Informatica, or SSIS.
• Familiarity with data modeling (star/snowflake) and data warehouse concepts.
• Understanding of CI/CD, version control (Git), and Infrastructure as Code (Terraform).
• Strong problem-solving and analytical mindset.
• Effective communication and collaboration skills.
• Ability to work in an agile and fast-paced environment.
• Google Cloud Platform Professional Data Engineer or Cloud Architect certification is a plus.