Job Role: Google Cloud Platform Data Engineer
Job Location: Danbury, CT
Experience: 10 years
"NOTE" : Google Cloud Platform certification mandatory
Job Description:
API Development Experience:
1. Designing or building REST/GraphQL APIs
2. Working with API frameworks (FastAPI, Flask, Express, Spring Boot)
3. API Gateway configuration, rate limiting, or traffic management
4. Authentication/authorization implementation (OAuth 2.0, JWT); see the sketch after this list
5. API documentation (Swagger/OpenAPI)
6. Container-based API deployment (Cloud Run, GKE, Docker)
7. API testing (Postman, Newman, contract testing)
8. Microservices design patterns
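To illustrate items 1, 2, 4, and 5: a minimal sketch of a FastAPI endpoint protected by JWT bearer authentication. The endpoint, secret, and token payload are placeholders, not part of this posting; FastAPI serves the generated Swagger/OpenAPI documentation at /docs automatically.

    import jwt  # PyJWT
    from fastapi import Depends, FastAPI, HTTPException
    from fastapi.security import HTTPAuthorizationCredentials, HTTPBearer

    SECRET_KEY = "change-me"  # placeholder; load from Secret Manager in practice
    app = FastAPI(title="Orders API")  # Swagger UI auto-generated at /docs
    bearer = HTTPBearer()

    def verify_token(creds: HTTPAuthorizationCredentials = Depends(bearer)) -> dict:
        # Decode and validate the bearer token; reject invalid/expired ones.
        try:
            return jwt.decode(creds.credentials, SECRET_KEY, algorithms=["HS256"])
        except jwt.PyJWTError:
            raise HTTPException(status_code=401, detail="Invalid or expired token")

    @app.get("/orders/{order_id}")
    def get_order(order_id: int, claims: dict = Depends(verify_token)):
        # Protected resource: reachable only with a valid JWT.
        return {"order_id": order_id, "requested_by": claims.get("sub")}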
Data Engineering Responsibilities:
• Design, build, and maintain scalable data pipelines using Cloud Dataflow, Apache Beam, Apache Spark, or BigQuery (see the pipeline sketch after this list).
• Develop ETL/ELT workflows for data ingestion, transformation, and processing using Cloud Composer (Airflow), TIDAL, Dataform, or custom scripts.
• Optimize BigQuery performance through partitioning, clustering, and query tuning.
• Work with Cloud Storage, Pub/Sub, NiFi, Cloud SQL, and Bigtable for real-time and batch data processing.
• Monitor and troubleshoot data pipeline performance, failures, and cost efficiency.
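A minimal sketch of the kind of pipeline described above, assuming the Apache Beam Python SDK and streaming JSON events from Pub/Sub into BigQuery on the Dataflow runner; the project, topic, bucket, and table names are placeholders.

    import json
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(
        streaming=True,
        runner="DataflowRunner",
        project="my-project",                # placeholder
        region="us-east1",
        temp_location="gs://my-bucket/tmp",  # placeholder
    )

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/events")  # placeholder topic
            | "ParseJson" >> beam.Map(json.loads)
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",  # placeholder table
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )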
Required Skills:
• Strong expertise in Google Cloud Platform services (BigQuery, Dataflow, Cloud Storage, Pub/Sub, Bigtable, Firestore, etc.).
• Proficiency in SQL, Python, and Java for data processing and automation (see the BigQuery sketch after this list).
• Experience with ETL/ELT workflows using Cloud Composer, Dataflow, or Dataform.
• Strong understanding of data modeling, warehousing, and distributed computing.
• Experience with real-time and batch processing architectures.
• Understanding of security and compliance standards (IAM, encryption, GDPR, HIPAA, etc.).
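A short sketch of the BigQuery partitioning and clustering called out above, assuming the google-cloud-bigquery client; the dataset, schema, and query are hypothetical.

    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")  # placeholder project

    # Partition on the event date and cluster on common filter columns so
    # queries scan only the partitions and blocks they need.
    client.query("""
        CREATE TABLE IF NOT EXISTS analytics.events (
          event_ts   TIMESTAMP,
          user_id    STRING,
          event_name STRING
        )
        PARTITION BY DATE(event_ts)
        CLUSTER BY user_id, event_name
    """).result()

    # Filtering on the partition column prunes the scan to matching partitions.
    rows = client.query("""
        SELECT event_name, COUNT(*) AS n
        FROM analytics.events
        WHERE DATE(event_ts) = CURRENT_DATE()
        GROUP BY event_name
    """).result()
    for row in rows:
        print(row.event_name, row.n)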
Strong API Skills:
• Strong Core Java & Spring Boot
• Apigee & API security patterns
• Swagger/OpenAPI design
• Microservice Architecture and Patterns (see the sketch after this list)
• Google Cloud Platform certifications (e.g., Professional Data Engineer, Associate Cloud Engineer).
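For illustration, one common microservice resilience pattern (a circuit breaker), sketched in Python to match the examples above; a Spring Boot stack would typically get the same behavior from a library such as Resilience4j.

    import time

    class CircuitBreaker:
        # Stop calling a failing downstream service until a cooldown elapses.

        def __init__(self, max_failures=3, reset_after=30.0):
            self.max_failures = max_failures
            self.reset_after = reset_after
            self.failures = 0
            self.opened_at = None  # timestamp when the circuit opened

        def call(self, fn, *args, **kwargs):
            if self.opened_at is not None:
                if time.monotonic() - self.opened_at < self.reset_after:
                    raise RuntimeError("circuit open: downstream unavailable")
                self.opened_at = None  # half-open: allow one trial call
            try:
                result = fn(*args, **kwargs)
            except Exception:
                self.failures += 1
                if self.failures >= self.max_failures:
                    self.opened_at = time.monotonic()
                raise
            self.failures = 0  # success closes the circuit
            return result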