Job Details
Job Summary:
We are seeking a skilled Google Cloud Platform Big Data Engineer with strong Java development expertise to design, develop, and maintain large-scale data processing solutions on Google Cloud Platform. In this role, you will build and optimize data pipelines, integrate data from multiple sources, and enable scalable, real-time analytics to support business needs. You'll collaborate with cross-functional teams to design cloud-native solutions that leverage Google Cloud Platform services, Big Data frameworks, and Java-based processing.
Key Responsibilities:
Design and implement scalable data processing pipelines using Apache Beam, Dataflow, BigQuery, and other Google Cloud Platform services (an illustrative pipeline sketch follows this list).
Develop backend components and microservices using Java for data ingestion, transformation, and delivery.
Migrate and modernize legacy data solutions to cloud-native architectures on Google Cloud Platform.
Integrate structured and unstructured data sources into cohesive, analytics-ready datasets.
Optimize performance and cost-efficiency of data workflows and queries.
Work closely with data analysts, data scientists, and DevOps teams to ensure reliable and secure data access.
Ensure compliance with security, privacy, and governance standards for cloud-based data solutions.
Leverage Big Data technologies such as Hadoop, Spark, Kafka, and Google Cloud Platform-native tools.
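For illustration only, the sketch below shows the kind of pipeline the first responsibility describes, written with the Apache Beam Java SDK: it reads messages from a Pub/Sub topic, wraps each payload in a BigQuery TableRow, and appends the rows to a BigQuery table. It can run on Dataflow by passing --runner=DataflowRunner. All resource names (my-project, topics/events, analytics.events) are placeholders, not part of this posting.

import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.TypeDescriptor;

public class PubSubToBigQuery {
  public static void main(String[] args) {
    // Pipeline options come from the command line (e.g. --runner=DataflowRunner --project=...).
    Pipeline pipeline = Pipeline.create(
        PipelineOptionsFactory.fromArgs(args).withValidation().create());

    pipeline
        // Ingest raw messages from a placeholder Pub/Sub topic.
        .apply("ReadFromPubSub", PubsubIO.readStrings()
            .fromTopic("projects/my-project/topics/events"))
        // Transform each message payload into a BigQuery row.
        .apply("ToTableRow", MapElements.into(TypeDescriptor.of(TableRow.class))
            .via(payload -> new TableRow().set("payload", payload)))
        .setCoder(TableRowJsonCoder.of())
        // Deliver rows to an existing placeholder BigQuery table.
        .apply("WriteToBigQuery", BigQueryIO.writeTableRows()
            .to("my-project:analytics.events")
            .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_NEVER)
            .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));

    pipeline.run();
  }
}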
Qualifications:
Strong programming skills in Java (8+), with experience in multithreaded and distributed systems.
3+ years of hands-on experience with Google Cloud Platform, especially its Big Data and analytics services.
Experience with Big Data frameworks such as Apache Beam, Spark, Kafka, or Hadoop.
Proficiency in SQL and Google Cloud Platform tools like BigQuery, Pub/Sub, Dataflow, and Cloud Storage.
Familiarity with CI/CD pipelines, Terraform, or Cloud Deployment Manager is a plus.
Excellent problem-solving, communication, and collaboration skills.