Overview
On Site
$50 - $60
Contract - W2
Contract - Independent
Contract - 12 Month(s)
No Travel Required
Skills
Generative Artificial Intelligence (AI)
Database
Demand Planning
Evaluation
Forecasting
Incident Management
Agile
Application Lifecycle Management
Artificial Intelligence
Automated Testing
Selenium
Snowflake
Software Implementation
Test Estimation
Reporting
Resource Management
SQL
Scripting
System Integration Testing
Product Development
Prompt Engineering
PyTorch
Python
Quality Management
EAS
JIRA
LangChain
Leadership
Portfolio Management
Process Improvement
Testing
Vector Databases
Writing
Job Details
Job Title: Automation & AI Test Engineer
Job Location: Dallas, TX / Tampa, FL / Jersey City, NJ (Onsite)
Job Type: Contract
Job Duration: 12+ Months
Experience: 8+ Years
Job Description:
- Responsible for system integration testing of newly developed or enhanced applications
- Requires in-depth knowledge of the software implementation lifecycle
- Possesses strong testing process knowledge and a detailed understanding of Agile methodology
- Works with project teams during early stages to establish plans, standards, and procedures that will add value to the testing effort and satisfy the constraints of the project
- Ensures on-time delivery of work, including monitoring of external and internal dependencies, tracking of progress, and monitoring of project milestone accomplishments, by:
  - Recording work status information and generating status reports
  - Tracking and reporting actual versus planned completion
  - Ensuring all activity is accurately recorded and reported
- Drives the education of Business and Development teams on the testing model, methodology, and processes, particularly end-to-end testing, and its role in quality assurance
- Demonstrates Leadership characteristics while handling Product, Development, EAS, and other stakeholders
- Demonstrates proficiencies in Project and Quality management, Issue management, and Communication
- Assists the Testing Manager in reviewing Portfolio Management and Resource Management forecast reports to optimize the team's supply/demand planning and maximize resource utilization
- Monitors adherence to quality standards during the testing of production applications and identifies areas of strength and weakness
- Identifies process improvement opportunities and communicates to the project coordinator
- Experience with Jira and ALM
Domain Knowledge:
- Ability to work closely with business and development subject matter experts to continually improve depth and breadth of knowledge of the assigned applications/systems
Automation:
- Experience in writing and executing application tests using industry-standard automated testing tools such as Selenium (a minimal example is sketched below)
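For illustration only, a minimal sketch of the kind of check this bullet describes, using Selenium WebDriver with pytest; the URL, page title, and element locator are placeholders, not details of the actual applications:

import pytest
from selenium import webdriver
from selenium.webdriver.common.by import By

@pytest.fixture
def driver():
    # Headless Chrome keeps the check runnable in a CI pipeline
    options = webdriver.ChromeOptions()
    options.add_argument("--headless=new")
    drv = webdriver.Chrome(options=options)
    yield drv
    drv.quit()

def test_login_page_loads(driver):
    driver.get("https://example.com/login")          # placeholder URL
    assert "Login" in driver.title                   # basic page-load assertion
    assert driver.find_element(By.NAME, "username")  # key element is present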
AI Specific Testing:
- Hands-on experience testing LLMs and RAG architectures, using vector databases or embeddings to validate retrieval accuracy in RAG-based models (see the retrieval-accuracy sketch after this list)
- Implements automated testing frameworks for generative AI applications, such as DeepEval and RAGAS, for evaluating and testing LLMs (see the DeepEval sketch after this list)
- Strong knowledge of Python and experience with AI testing tools and libraries (e.g., LangChain, pytest, PyTorch)
- Develops custom validation scripts for generative AI model output evaluation, covering accuracy, relevancy, faithfulness, and performance benchmarks (see the validation-script sketch after this list)
- Tests prompt engineering techniques to optimize responses
- Familiarity with Snowflake and with executing SQL queries to validate database integrity and application responses (see the SQL sketch after this list)
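Retrieval-accuracy sketch for the RAG bullet above: a simple hit-rate check over a small labeled query set. The search function, module name, document IDs, and acceptance threshold are hypothetical placeholders for whatever vector store the application actually uses:

LABELED_QUERIES = [
    # (query, document IDs a correct retrieval should surface) -- sample data only
    ("How do I reset my password?", {"kb-017"}),
    ("What is the refund window?", {"kb-102", "kb-103"}),
]

def hit_rate_at_k(search, k=5):
    # Fraction of labeled queries whose expected document appears in the top-k results
    hits = 0
    for query, expected_ids in LABELED_QUERIES:
        retrieved_ids = set(search(query, k))
        if retrieved_ids & expected_ids:
            hits += 1
    return hits / len(LABELED_QUERIES)

def test_retrieval_hit_rate():
    from my_rag_app import search  # hypothetical wrapper around the vector database query
    assert hit_rate_at_k(search, k=5) >= 0.9  # example acceptance threshold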
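DeepEval sketch for the framework bullet above, assuming a recent deepeval release; these metrics use an LLM judge, so model/API-key configuration is required, and the question, answer, and context are sample values:

from deepeval import assert_test
from deepeval.metrics import AnswerRelevancyMetric, FaithfulnessMetric
from deepeval.test_case import LLMTestCase

def test_rag_answer_quality():
    # Sample question, generated answer, and retrieved context
    test_case = LLMTestCase(
        input="What is the refund window?",
        actual_output="Refunds are available within 30 days of purchase.",
        retrieval_context=["Our policy allows refunds within 30 days of purchase."],
    )
    # Fails the test if either metric scores below its threshold
    assert_test(test_case, [
        AnswerRelevancyMetric(threshold=0.7),
        FaithfulnessMetric(threshold=0.7),
    ])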
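Validation-script sketch for the output-evaluation and prompt-testing bullets above: a parametrized pytest check over prompt variants, with keyword and latency assertions standing in for fuller accuracy/faithfulness metrics. The generate function, module name, prompts, expected terms, and latency budget are hypothetical:

import time
import pytest

PROMPT_VARIANTS = [
    "Summarize the refund policy in one sentence.",
    "In one sentence, what is the refund policy?",
]

REQUIRED_TERMS = {"refund", "30 days"}  # sample ground-truth facts

@pytest.mark.parametrize("prompt", PROMPT_VARIANTS)
def test_output_contains_expected_facts(prompt):
    from my_llm_app import generate  # hypothetical call into the application's LLM
    start = time.perf_counter()
    answer = generate(prompt).lower()
    latency = time.perf_counter() - start

    missing = {term for term in REQUIRED_TERMS if term not in answer}
    assert not missing, f"Answer is missing expected facts: {missing}"
    assert latency < 5.0  # example performance benchmark, in seconds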
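SQL sketch for the Snowflake bullet above: a referential-integrity check run through the Snowflake Python connector. Connection parameters, table and column names, and the expected count are placeholders:

import snowflake.connector

def test_no_orphan_order_rows():
    conn = snowflake.connector.connect(
        account="my_account",  # placeholder connection details
        user="my_user",
        password="my_password",
        warehouse="my_wh",
        database="my_db",
        schema="public",
    )
    try:
        cur = conn.cursor()
        cur.execute("""
            SELECT COUNT(*)
            FROM orders o
            LEFT JOIN customers c ON o.customer_id = c.id
            WHERE c.id IS NULL
        """)
        orphan_count = cur.fetchone()[0]
        assert orphan_count == 0  # every order row must reference a valid customer
    finally:
        conn.close()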