Overview
Remote
$60 - $65
Contract - W2
Contract - 12 Month(s)
Skills
AWS
Azure
Lead
ALB/NLB
Load Balancer
Application Gateway
CI/CD
QA
Dev
JMeter
Locust
HTTP/HTTPS
SSL
TCP/IP
Terraform
CloudFormation
CLI
AWS CLI
Jenkins
GitHub
GitLab
Job Details
Job Title: Data Architect - AWS & Azure Load Balancer Testing
Location: Remote
W2 Contract
Summary:
Design and implement scalable load balancing architectures on AWS and Azure. Lead performance and load testing to ensure high availability, optimal traffic distribution, and system resilience for enterprise data platforms.
Key Responsibilities:
Architect load balancing solutions using AWS ALB/NLB and Azure Load Balancer/Application Gateway.
Plan and execute distributed load testing to validate load balancer performance under real-world traffic.
Monitor and analyze load balancer health and traffic patterns, and optimize configurations.
Collaborate with Dev, QA, and Infra teams to enforce best practices in routing, security, and scaling.
Automate load balancer deployment and testing within CI/CD pipelines.
Document architecture, test plans, and performance reports.
Required Skills:
Strong experience with AWS and Azure load balancer services.
Expertise in load testing tools like JMeter, Locust, or AWS/Azure native load testing.
Deep networking knowledge: HTTP/HTTPS, SSL, TCP/IP, routing algorithms.
Scripting and automation skills (Terraform, CloudFormation, Azure CLI, AWS CLI).
---
Load Testing Strategy & Experience
Built a performance team from scratch for major restaurant clients (Buffalo Wild Wings, Jimmy John's, Subway) at Capgemini.
Additional experience with TJ Maxx and Walmart (real estate).
Approach: Analyze customer base and peak hours to simulate realistic load scenarios.
Industry standards: target response times (e.g., 500ms for microservices, 1-1.5s with third-party dependencies).
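The 500ms and 1-1.5s targets above can be expressed as a simple SLA check; this is an illustrative sketch (the threshold values come from the notes, but the function name, result format, and sample timings are hypothetical, not any client's actual standard):

```python
# Sketch: classify measured response times against the target SLAs
# described above (500ms for pure microservice calls, 1.5s when a
# third-party dependency is in the call path). All sample data below
# is made up for illustration.

SLA_MS = {"microservice": 500, "third_party": 1500}

def meets_sla(response_ms: float, call_type: str) -> bool:
    """Return True when the measured time is within its target."""
    return response_ms <= SLA_MS[call_type]

results = [
    ("login-api", "microservice", 420),
    ("payment-gateway", "third_party", 1700),
]
breaches = [name for name, kind, ms in results if not meets_sla(ms, kind)]
print(breaches)  # the payment-gateway call exceeds its 1.5s target
```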
Baseline & Load Testing
Start with single-user baseline tests to assess application behavior.
Investigate response times >2 seconds, even for single users, using tools like Dynatrace, AppDynamics, and New Relic.
Troubleshoot network, database, JVM, or code issues as needed.
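The baseline step above reduces to flagging any transaction whose single-user time crosses the 2-second investigation threshold; a minimal sketch (transaction names and timings are invented; real numbers would come from an APM tool such as Dynatrace, AppDynamics, or New Relic):

```python
# Sketch: flag transactions whose single-user baseline exceeds the
# 2-second investigation threshold mentioned in the notes. The sample
# timings are illustrative only.

BASELINE_THRESHOLD_S = 2.0

baseline_timings = {          # transaction -> single-user response (s)
    "home_page": 0.8,
    "search": 2.6,
    "checkout": 1.4,
}

to_investigate = sorted(
    name for name, secs in baseline_timings.items()
    if secs > BASELINE_THRESHOLD_S
)
print(to_investigate)  # these go to network/DB/JVM/code triage
```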
Multi-Year / Multi-Team Projects
Identify critical business flows (e.g., payment gateways).
Test backend services (APIs) before UI-level testing.
Centralized performance team manages intake from multiple dev/business teams via request forms.
Testing is based on non-functional requirements and workload modeling.
Workload Modeling & Acceptance
Maintain non-functional templates for UI and APIs.
After load tests, business-accepted risks (e.g., slow login page) are tracked for future sprints.
Cross-Team Coordination
Performance testers collaborate with developers, DBAs, test data managers, and architects.
Lead collects and analyzes reports to identify bottlenecks.
Major releases involve stakeholder meetings (dev, infra, DB, product, architects) to resolve issues.
Database tuning may involve adjusting read/write limits or connection pools.
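For the connection-pool side of that tuning, one widely cited starting point is the sizing formula from the HikariCP documentation; this sketch assumes that guideline and should be treated as a baseline to tune under load, not a fixed rule:

```python
# Sketch: starting-point pool size per the formula popularized by the
# HikariCP docs: pool_size = (core_count * 2) + effective_spindle_count.
# Actual sizing should be validated under representative load.

def suggested_pool_size(core_count: int, effective_spindles: int = 1) -> int:
    """Baseline DB connection pool size for a given database host."""
    return core_count * 2 + effective_spindles

print(suggested_pool_size(8))  # 17 connections for an 8-core DB host
```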
Load Balancing & Data Centers
Load balancer type and multi-region deployment impact latency.
Data center location affects user experience; rerouting to the nearest center can reduce latency.
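The nearest-center rerouting idea above amounts to picking the region with the lowest measured latency; a hedged sketch (region names and millisecond figures are illustrative, and real latency-based routing would be done by the DNS or load balancing service, not application code):

```python
# Sketch: choose the data center with the lowest measured round-trip
# latency for a given user. Probe values below are made up.

def nearest_region(latencies_ms: dict[str, float]) -> str:
    """Return the region with the smallest measured latency."""
    return min(latencies_ms, key=latencies_ms.get)

probes = {"us-east-1": 32.0, "us-west-2": 78.0, "eu-west-1": 110.0}
print(nearest_region(probes))  # us-east-1
```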
Multi-Cloud Experience
Experience with AWS and Azure for hosting applications.
Performance strategy is similar across clouds, but differs from on-prem environments.
AWS: Elastic Load Balancing & CloudWatch; Azure: Application Gateway & Application Insights.
Consistency & CI/CD Integration
Centralized repository for sharing performance reports.
Compare results across releases to monitor performance trends.
JMeter scripts integrated with Jenkins/GitHub/GitLab for automated performance regression in CI/CD.
Jenkins enforces pass/fail criteria (e.g., error rate, response time) to gate build promotion.
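A quality gate like the Jenkins pass/fail criteria described above can be sketched as a small check over a test-run summary; the thresholds and summary format here are assumptions for illustration (a real pipeline would parse the JMeter aggregate report and wire the result into the build step):

```python
# Sketch: CI quality gate that promotes a build only if error rate and
# p95 response time stay within agreed limits. Threshold values and the
# run-summary shape are illustrative.

GATES = {"max_error_rate": 0.01, "max_p95_ms": 1500}

def build_passes(summary: dict) -> bool:
    """True when the load-test summary satisfies every gate."""
    return (summary["error_rate"] <= GATES["max_error_rate"]
            and summary["p95_ms"] <= GATES["max_p95_ms"])

run = {"error_rate": 0.004, "p95_ms": 1320}
print("PROMOTE" if build_passes(run) else "BLOCK")
```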
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.