Job Details
We are seeking an experienced Big Data Engineer to design, build, and optimize scalable data pipelines and big data solutions on Google Cloud Platform. The ideal candidate has strong hands-on expertise with Google Cloud Platform services, big data technologies, and data processing frameworks.
Key Responsibilities
Design, develop, and maintain large-scale data pipelines using Google Cloud Platform services.
Work with BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage.
Develop batch and real-time data processing solutions.
Build ETL/ELT pipelines for structured and semi-structured data.
Optimize query performance and cost in BigQuery.
Collaborate with data scientists, analysts, and business teams.
Ensure data quality, governance, and security best practices.
Monitor and troubleshoot production data pipelines.
Required Skills
Strong experience with Google Cloud Platform.
Hands-on experience with Big Data technologies such as:
Apache Spark
Hadoop
Kafka
Expertise in BigQuery, Dataflow, Dataproc, and Cloud Composer.
Strong programming skills in Python, Scala, or Java.
Experience with SQL and data warehousing concepts.
Knowledge of CI/CD pipelines and Git.
Experience with containerization (Docker/Kubernetes) is a plus.
Preferred Qualifications
Google Cloud Professional Data Engineer Certification (good to have).
Experience with data streaming and real-time analytics.
Knowledge of Airflow, Terraform, and Infrastructure as Code.
Experience working in Agile/Scrum environments.
Nice to Have
Experience with ML pipelines on Google Cloud Platform.
Knowledge of cloud security and IAM.
Exposure to multi-cloud platforms.