Performance Test Engineer

Overview

On Site
Depends on Experience
Contract - W2
Contract - 30 week(s)

Skills

Selenium

Job Details

One of CEI's largest Healthcare clients is seeking a Performance Test Engineer to join their growing organization!

Client/Industry: Healthcare Information Technology / Health Insurance Systems
Job Title: Digital Performance Analyst (Performance Load Testing)
Location: Onsite (Highly preferred) | Columbia, SC 29203
Work Schedule/Shift: Mon-Fri 8:30am to 5:00pm | Minimum 40 work hours per week; possible overtime required
Duration/Length of Assignment: 7 Month Contract with potential for long-term engagement (extension or hire)
Pay Rate: $49.00 per hour
Additional Information: U.S. Citizenship required; must be eligible to obtain CMS security clearance

*Must be able to convert to a full-time employee without sponsorship, restrictions, or an additional employer*
  1. W2 Employment Only – No Corp to Corp / C2C arrangements.
  2. Optional benefits available during the contract (Medical, Dental, Vision, and 401k).


Position Overview:
This role supports a large-scale cloud migration initiative focused on modernizing enterprise healthcare applications hosted within a CMS-regulated cloud environment. The position sits within a cloud migration program responsible for ensuring performance, scalability, and reliability of mission-critical applications that support healthcare operations and member-facing portals.

The analyst works as part of a collaborative performance testing and quality engineering team aligned to an Agile delivery model, partnering closely with development, quality assurance, business analysis, and infrastructure teams. The team operates within a structured reporting environment and contributes directly to ensuring cloud-hosted systems meet regulatory, security, and operational standards.

From a high-level perspective, the role is responsible for validating how applications behave under real-world usage conditions, identifying performance risks before production deployment, and supporting continuous improvement across cloud-based platforms. The analyst is expected to plan, execute, and evaluate performance testing efforts, analyze results, communicate findings to stakeholders, and contribute to system tuning and optimization efforts. The position also includes mentoring junior team members and supporting broader testing governance, documentation, and compliance-related activities across the cloud migration initiative.

Required Skills/Experience/Qualifications:
  • Bachelor’s degree in Computer Science, Information Technology, or a related field, or equivalent experience (4 years of related experience, or 2 years of related experience plus an associate degree)
  • 6 years of application development or testing experience, including at least 4 years focused on performance testing and tuning
  • Experience testing cloud-hosted applications in enterprise environments
  • Ability to design, develop, and execute performance test strategies, test plans, and automated test scripts
  • Hands-on experience with performance and test automation tools including Apache JMeter, BlazeMeter, LoadRunner Cloud, Selenium WebDriver, Selenium Grid, JUnit, TestNG, SoapUI, Postman, and Insomnia
  • Strong understanding of performance testing concepts, key performance indicators, metrics, and testing types
  • Experience identifying, tracking, and managing defects using TestRail, JIRA, Confluence, or ALM tools
  • Knowledge of application and performance monitoring tools, including AWS CloudWatch Synthetics
  • Experience supporting lift-and-shift, re-platforming, or cloud migration testing initiatives
  • Strong knowledge of REST and SOAP API testing, SQL, and test automation frameworks
  • Experience integrating automated test suites into CI/CD pipelines using tools such as Jenkins, GitHub, or AWS CodePipeline
  • Hands-on experience with AWS services including EC2, IAM, S3, RDS/Aurora, Lambda, API Gateway, CloudWatch, CloudTrail, and CodeBuild

Preferred Skills (Not Required):
  • Understanding of DevOps practices related to cloud-based analytics pipelines and deployments
  • Experience with GitHub Actions, version control workflows, and automated deployment processes
  • Familiarity with additional AWS services such as DynamoDB, SNS, SQS, Step Functions, and Artifact

Day to Day/Responsibilities:
  • Begin the day by reviewing current sprint priorities, active test cycles, and open defects through JIRA and TestRail to align testing activities with development timelines
  • Participate in daily stand-ups with developers, quality engineers, and business analysts to discuss testing progress, risks, and upcoming release needs
  • Analyze business and technical requirements to design appropriate load, stress, and performance test scenarios that reflect real-world user behavior
  • Design, develop, and maintain automated performance test scripts using tools such as Apache JMeter, BlazeMeter, or LoadRunner Cloud, ensuring test coverage across critical application workflows
  • Execute performance, load, and stress tests against cloud-hosted applications running in AWS, validating speed, stability, scalability, and system resilience
  • Monitor application behavior during test execution using AWS CloudWatch, CloudWatch Synthetics, and other monitoring tools to identify latency issues, resource constraints, and architectural bottlenecks
  • Evaluate system performance across multiple layers, including APIs, databases, and infrastructure components, to understand interdependencies and performance impact
  • Document test execution results, performance trends, and pass/fail outcomes in TestRail, linking findings to historical performance data
  • Log, track, and manage defects and performance issues in JIRA, collaborating with development teams to support root cause analysis and resolution
  • Provide recommendations for system tuning, capacity planning, and architectural improvements based on test results and monitoring data
  • Support CI/CD workflows by integrating automated performance test suites into pipelines using Jenkins, GitHub, or AWS CodePipeline
  • Maintain detailed test documentation, execution notes, and release-related artifacts in Confluence to support auditability and compliance requirements
  • Validate application updates against CMS security and performance standards prior to deployment into higher environments
  • Perform data analysis and statistical review of performance metrics to support continuous improvement and software process evaluation
  • Mentor junior performance testing team members by reviewing test designs, providing technical guidance, and supporting skill development
  • Contribute to administrative and lead activities related to cloud testing governance, standards, and team coordination