Hybrid - 2 days onsite at their Jersey City office.
Job Description:
Key Experiences:
9+ years of proven experience in the software development industry, working in a team environment.
5+ years of experience in UI automation using tools such as Selenium WebDriver.
5+ years of proven experience in testing web services such as RESTful APIs.
3+ years of proven experience in performance testing using tools such as Apache JMeter.
2+ years of experience working in cloud-based development and production environments such as AWS.
3+ years of proven experience with programming languages such as Java or C#, with solid coding proficiency in at least one. Familiarity with Python is a plus.
Proficient with both SQL and NoSQL databases.
Experienced working in Continuous Integration/Continuous Deployment (CI/CD) environments.
Knowledge of AWS/cloud technologies and AI/ML-powered applications is a big plus.
Key Responsibilities:
Design and execute functional and automation testing for applications.
Develop and execute automated test scripts using testing tools; document and summarize results.
Build, update, and maintain UI automation test cases using Selenium with C# (flexibility to learn C# if not already proficient); see the sketch after this list.
Perform manual API testing using tools such as Postman or Insomnia.
Perform API automation testing using industry-standard frameworks and tools.
Participate actively in Agile ceremonies (daily stand-ups, sprint planning, retrospectives).
Collaborate closely with Development and Business teams to ensure project success.
Quickly adapt to new technologies and tools (e.g., Amazon Web Services).
Work with SQL and NoSQL databases for backend validation and robust data testing.
Create and document comprehensive end-to-end test plans, test specifications, test case templates, and detailed test cases.
Facilitate defect tracking, reporting, and resolution management.
Review requirements (Functional Specifications, User Stories, Change Requests) to ensure full coverage.
Identify and manage test data; prepare automation-ready test cases.
Conduct system testing, regression testing, and support User Acceptance Testing (UAT).
Apply test case design techniques such as boundary value analysis and equivalence partitioning (see the example after this list).
Implement shift-left testing practices by participating in early requirement and design reviews, writing acceptance criteria up front, and integrating automated tests into CI/CD pipelines.
Be actively involved in the deployment process, collaborating with DevOps and engineering teams to ensure smooth releases.
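For illustration, a minimal sketch of the kind of Selenium WebDriver UI test case this role builds and maintains, written in C# with NUnit. The URL, element locator, and class names are placeholders, not details from this posting.

using NUnit.Framework;
using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;

[TestFixture]
public class LoginPageTests
{
    private IWebDriver _driver;

    [SetUp]
    public void StartBrowser()
    {
        // Requires a chromedriver compatible with the installed Chrome version.
        _driver = new ChromeDriver();
    }

    [Test]
    public void LoginButton_IsVisible_OnLoginPage()
    {
        _driver.Navigate().GoToUrl("https://example.com/login"); // placeholder URL
        IWebElement loginButton = _driver.FindElement(By.Id("login-button")); // placeholder locator
        Assert.That(loginButton.Displayed, Is.True);
    }

    [TearDown]
    public void StopBrowser()
    {
        _driver.Quit();
    }
}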
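Likewise, a small sketch of boundary value analysis expressed as a parameterized NUnit test. The age-validation rule (valid range 18-65) is a hypothetical example, not a requirement from this posting; the point is that each edge of the range is exercised just below, on, and just above the boundary.

using NUnit.Framework;

public static class AgeValidator
{
    // Hypothetical business rule used only to illustrate boundary value analysis.
    public static bool IsValidAge(int age) => age >= 18 && age <= 65;
}

[TestFixture]
public class AgeValidatorTests
{
    [TestCase(17, false)]  // just below lower boundary
    [TestCase(18, true)]   // on lower boundary
    [TestCase(19, true)]   // just above lower boundary
    [TestCase(64, true)]   // just below upper boundary
    [TestCase(65, true)]   // on upper boundary
    [TestCase(66, false)]  // just above upper boundary
    public void IsValidAge_ReturnsExpectedResult_AtBoundaries(int age, bool expected)
    {
        Assert.That(AgeValidator.IsValidAge(age), Is.EqualTo(expected));
    }
}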
Nice to Have:
Experience with application monitoring tools (e.g., New Relic, Datadog, CloudWatch).
Working knowledge of Python for scripting and automation.
Familiarity with AWS services for cloud-based testing and deployment.
Hands-on experience with performance and load testing using Apache JMeter.
Experience with testing AI/ML-powered applications, including validating model outputs, creating test datasets, and evaluating model performance.