What Working at Hexaware offers:
Hexaware is a dynamic and innovative IT organization committed to delivering cutting-edge solutions to our clients worldwide. We pride ourselves on fostering a collaborative and inclusive work environment where every team member is valued and empowered to succeed.
Hexaware provides access to a vast array of tools that enhance, revolutionize, and advance your professional profile. We complete the circle with excellent growth opportunities, chances to collaborate with highly visible customers, the opportunity to work alongside bright minds, and a healthy work-life balance.
With an ever-expanding portfolio of capabilities, we delve deep into and identify the source of our motivation. Although technology is at the core of our solutions, it is still the people and their passion that fuel Hexaware's commitment towards creating smiles.
At Hexaware, we encourage you to challenge yourself to achieve your full potential and propel your growth. We trust and empower you to disrupt the status quo and innovate for a better future. We foster an open and inspiring culture that promotes learning and brings talented, passionate, and caring people together.
We are always interested in, and want to support, both the professional and the personal you. We offer a wide array of programs to help you expand your skills and supercharge your career. We help you discover your passion: the driving force that makes you smile and innovate, create, and make a difference every day.
The Hexaware Advantage: Your Workplace Benefits
- Excellent health benefits with low-cost employee premiums.
- Wide range of voluntary benefits such as Legal, Identity Theft, and Critical Care coverage.
- Unlimited training and upskilling opportunities through Udemy and Hexavarsity.
Sr. Data & AI Engineer
Remote
Full-time with Hexaware
Your key responsibilities
- Design, build, and maintain efficient, reusable, and scalable ETL/ELT pipelines for structured and unstructured data.
- Develop and optimize data workflows to support advanced analytics, machine learning models, and reporting tools.
- Collaborate with data scientists, analysts, and business stakeholders to gather requirements and ensure data quality and availability.
- Work with cloud and on-prem data platforms (e.g., AWS, Azure, Google Cloud Platform, Hadoop, or on-prem SQL/NoSQL systems).
- Develop and maintain LLM- and RAG-based AI agents using Azure OpenAI, LangChain/Semantic Kernel, and Azure ML.
- Design, build, and maintain data and AI pipelines in production.
- Ensure data integrity, governance, and security best practices across pipelines and data lakes/warehouses.
- Troubleshoot and resolve data issues and performance bottlenecks in real-time and batch pipelines.
- Monitor job performance and implement automation and alerting for data operations.
- Contribute to documentation, code reviews, and development best practices.
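To illustrate the kind of ETL/ELT work the responsibilities above describe, here is a minimal, hedged sketch of an extract-transform-load step using only the Python standard library. The table name, column names, and cleansing rules are hypothetical placeholders, not part of any actual Hexaware system:

```python
import csv
import io
import sqlite3

def run_etl(csv_text: str, conn: sqlite3.Connection) -> int:
    """Extract rows from CSV text, transform them, and load into SQLite.

    Transform step: drop rows missing an id, trim whitespace, and
    normalize name/country casing. Returns the number of rows loaded.
    """
    conn.execute(
        "CREATE TABLE IF NOT EXISTS customers "
        "(id INTEGER PRIMARY KEY, name TEXT, country TEXT)"
    )
    loaded = 0
    for row in csv.DictReader(io.StringIO(csv_text)):
        if not row.get("id"):  # data-quality rule: id is required
            continue
        conn.execute(
            "INSERT INTO customers (id, name, country) VALUES (?, ?, ?)",
            (int(row["id"]), row["name"].strip().title(), row["country"].strip().upper()),
        )
        loaded += 1
    conn.commit()
    return loaded

conn = sqlite3.connect(":memory:")
sample = "id,name,country\n1, alice smith ,us\n,bob,uk\n2,carol jones,de\n"
count = run_etl(sample, conn)
print(count)  # 2 (the row with no id is filtered out)
```

In production this same pattern is typically expressed in an orchestration framework (e.g., a Spark or Databricks job scheduled by an orchestrator), but the extract/validate/normalize/load structure is the same.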
Skills and attributes for success
- 8+ years of experience in data engineering or related roles.
- Advanced programming skills in Python and experience with frameworks such as TensorFlow/PyTorch/Scikit-learn.
- Solid understanding of data modelling, data warehousing, and cloud platforms (AWS/Azure/Google Cloud Platform).
- Hands-on experience with lakehouse/warehouse platforms such as Databricks.
- Experience with LLMs and generative AI, especially OpenAI models.
- Good understanding of different data architecture patterns (e.g., Data Lake, Data Mesh, Data Warehouse, Data Lakehouse).
- Experience with data visualization and BI tools such as Power BI and/or Tableau.
- Familiarity with structured output tooling (e.g., Pydantic, LangChain) and integrating with LLM pipelines.
- Exposure to CI/CD, version control (Git), and agile development practices.
- Experience with machine learning workflows and AI-driven analytics.
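The "structured output" skill mentioned above amounts to validating an LLM's free-text response against a typed schema before it enters a pipeline. In practice this is done with Pydantic or LangChain's output parsers; the sketch below shows the same idea using only stdlib dataclasses, with a hypothetical support-ticket schema invented for illustration:

```python
import json
from dataclasses import dataclass, fields

@dataclass
class Ticket:
    # Hypothetical schema for a ticket-extraction task.
    summary: str
    priority: str
    tags: list

def parse_structured(raw: str) -> Ticket:
    """Validate a raw LLM response (assumed JSON) against the Ticket schema."""
    data = json.loads(raw)
    expected = {f.name: f.type for f in fields(Ticket)}
    missing = expected.keys() - data.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    for name, typ in expected.items():
        if not isinstance(data[name], typ):
            raise TypeError(f"field {name!r} should be {typ.__name__}")
    return Ticket(**{k: data[k] for k in expected})

# Stand-in for a model response; a real pipeline would get this from the LLM.
llm_output = '{"summary": "VPN down", "priority": "high", "tags": ["network"]}'
ticket = parse_structured(llm_output)
print(ticket.priority)  # high
```

Pydantic adds coercion, nested models, and JSON-schema generation on top of this pattern, which is why LLM frameworks standardize on it.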
To qualify for the role, you must have
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Familiarity with at least one of the following: Data Warehouse, Data Lake, Lakehouse, Delta Lake, etc.
- Knowledge of streaming data frameworks (Kafka, Spark Streaming, Flink) is a plus.
- Certifications in AWS/Azure/Google Cloud Platform or specific data tools.
Privacy Statement:
The information you provide will be used in accordance with the terms of our privacy policy and specifically for the business/processing purposes of this event. You should be aware that we may share your details with our approved vendors so that this event can be handled successfully.