Overview
Skills
Job Details
Must Have:
AWS services: Bedrock, SageMaker, ECS, and Lambda
Demonstrated experience with AWS Organizations and policy guardrails (SCPs, AWS Config)
Experience implementing RAG architectures and using ML frameworks and tooling such as Transformers, PyTorch, TensorFlow, and LangChain
Experience with Infrastructure as Code best practices, including building Terraform modules for AWS
Fine-tuning large language models, building datasets and deploying ML models to production
Git-based version control, code reviews, and DevOps workflows
Nice To Have:
AWS or relevant cloud certifications
Data privacy and compliance best practices (e.g., PII handling, secure model deployment)
Data science background or experience working with structured/unstructured data
Exposure to FinOps and cloud cost optimization
Experience with Hugging Face or Node.js
Policy as Code development (e.g., Terraform Sentinel)
DUTIES AND RESPONSIBILITIES:
This is a hands-on role using AWS services (Lambda, Bedrock, SageMaker, Step Functions, DynamoDB, S3).
Implement AWS cloud services, including infrastructure, machine learning, and artificial intelligence platform services.
Build LLM-based applications, including Retrieval-Augmented Generation (RAG), using LangChain and other frameworks.
Develop cloud-native microservices, APIs, and serverless functions to support intelligent automation and real-time data processing.
Collaborate with internal stakeholders to understand business goals and translate them into secure, scalable AI systems.
Own the software release lifecycle, including CI/CD pipelines, GitHub-based SDLC, and infrastructure as code (Terraform).
Support the development and evolution of reusable platform components for AI/ML operations.
Create and maintain technical documentation for the team to reference and share with our internal customers.
Communicate clearly in English, both verbally and in writing.
MINIMUM KNOWLEDGE, SKILLS, AND ABILITIES REQUIRED:
7 years of hands-on software engineering experience with a strong focus on Python.
Experienced with AWS services, especially Bedrock or SageMaker.
Familiar with fine-tuning large language models, building datasets, and/or deploying ML models to production.
Demonstrated experience with AWS Organizations and policy guardrails (SCPs, AWS Config).
Solid experience implementing RAG architectures with LangChain.
Demonstrated experience with Infrastructure as Code best practices, including building Terraform modules for AWS.
Strong background in Git-based version control, code reviews, and DevOps workflows.
Demonstrated success delivering production-ready software with release pipeline integration.
Nice-to-Haves:
AWS or relevant cloud certifications.
Policy as Code development (e.g., Terraform Sentinel).
Experience with Hugging Face, Golang, or Node.js.
Exposure to FinOps and cloud cost optimization.
Data science background or experience working with structured/unstructured data.
Awareness of data privacy and compliance best practices (e.g., PII handling, secure model deployment).