KPI Partners is a global consulting firm focused on strategy, technology, and digital transformation. We help companies tackle their most ambitious projects and build new capabilities, providing solutions in the Cloud, Data, Application Development, and BI spaces.
Title: Senior Software Engineer (RAG/LLM)
Location: 100% Remote, PST hours (8 AM to 5 PM PST)
Job Type: Contract, 12+ months
Travel: Required on-site at the Fremont office on the first Tuesday and Wednesday of every month
About KPI Partners
KPI Partners is a five-time Gartner-recognized data, analytics, and AI consulting company. We are leaders in data engineering on Azure, AWS, Google Cloud, Snowflake, and Databricks. Founded in 2006, KPI has over 400 consultants and has successfully delivered over 1,000 projects for our clients. We are looking for skilled data engineers who want to work with the best team in data engineering.
About the Role:
We are seeking a highly skilled and experienced Software Engineer to join our team. The ideal candidate will focus on enhancing our information retrieval and generation capabilities, with specific experience in Azure AI Search, data processing for RAG, multimodal data integration, and familiarity with Azure services. In this role, you will build a robust indexing framework that transforms unstructured, semi-structured, and structured data for consumption by GenAI applications. This framework will ensure seamless integration and accessibility of data, which will be consumed by an LLM-based chatbot (among other applications) to optimize and enhance semiconductor manufacturing processes.
Key Responsibilities:
- Framework Development: Play a key role in developing a foundational indexing framework to accelerate the onboarding of Strategic Data Assets.
- Multimodal Data Integration: Integrate and manage various data types (e.g., text, images, videos) to enhance retrieval and generation capabilities.
- Cross-functional Collaboration: Work closely with cross-functional teams to support data integration into our data retrieval ecosystem, ensuring seamless functionality and performance.
- Scalability and Reliability: Ensure the scalability, reliability, and performance of data retrieval in production environments.
- Data Security: Ensure robust data security measures are in place to protect access to sensitive information.
- Automation: Develop and implement automation strategies to streamline data onboarding and processing workflows.
- Performance Monitoring: Monitor and analyze the performance of data pipelines and retrieval systems, making necessary adjustments to optimize efficiency.
- Innovation: Stay updated with the latest advancements in AI to drive innovation and maintain a competitive edge.
Must-Have Skills & Qualifications:
- Master's or Bachelor's degree in Computer Science, Data Science, or a related field.
- Approximately 8 years of experience, primarily in software engineering with some experience in developing ETL pipelines.
- Proficiency in Python and FastAPI.
- Proven experience in software development, with an emphasis on building and deploying RAG pipelines or similar information retrieval systems.
- Familiarity with processing multimodal data (e.g., text, images) for retrieval and generation tasks.
- Strong understanding of database systems (both SQL and NoSQL) and data warehousing solutions.
- Proficiency in Azure AI, Databricks, and other relevant tools and technologies.
- Experience with Azure services, including Azure Durable Functions and Azure Kubernetes Service (AKS).
- Excellent problem-solving skills and the ability to work both independently and collaboratively in a team environment.
- Strong communication skills to effectively convey technical concepts to non-technical stakeholders.
Good-to-Have Skills:
- Experience working in a fast-paced environment, demonstrating adaptability, innovation, and the ability to thrive in dynamic settings.
- Experience with Generative AI (GenAI), including large language model (LLM) orchestration and evaluation.
- Experience with Helm charts for managing Kubernetes applications, including the ability to define, install, and upgrade complex Kubernetes applications using Helm.