Key Responsibilities
Architect and deploy batch and real-time data pipelines using Google Cloud Platform (Pub/Sub, Dataflow, BigQuery, Cloud Functions, Datastream).
Design secure, scalable serverless data platforms for structured and unstructured data.
Build AI/ML workflows using Vertex AI, BigQuery ML, and modern MLOps practices.
Develop Infrastructure-as-Code using Terraform or Deployment Manager.
Manage Google Cloud Platform environments including IAM, VPC networking, cost control, and hybrid connectivity.
Integrate semantic search frameworks with vector databases (FAISS, Pinecone, Vertex AI Vector Search).
Implement retrieval-augmented generation (RAG) pipelines to improve LLM performance.
Collaborate with engineering, security, legal, and business stakeholders as a senior technical advisor.
Mentor junior engineers, ensuring best practices in data governance, observability, and AI infrastructure.
Required Qualifications
15+ years in data engineering, architecture, or AI/ML development.
Advanced Python programming with ML libraries (TensorFlow, PyTorch, scikit-learn).
Hands-on expertise with Google Cloud Platform services (Vertex AI, BigQuery, Cloud Storage).
Strong experience with APIs, JSON messaging, IaC, Docker, Kubernetes, networking.
Experience with MLOps pipelines, DevOps workflows, and model lifecycle management.
Knowledge of SAP integration and Apigee API Gateway is a plus.
Bachelor's in CS/Engineering (Master's preferred).