Role Overview:
We are seeking a highly skilled Senior Solution Architect with deep expertise in designing and delivering modern, customer-facing web applications on AWS, leveraging React.js, Node.js, PostgreSQL, and scalable architectural patterns. The ideal candidate will excel at building high-volume OLTP systems, implementing load-balanced, resilient architectures, and ensuring secure, performant, cloud-native delivery using AWS best practices.
This role demands strong architectural leadership, hands-on technical depth, and a proven ability to collaborate across product, engineering, security, and DevOps teams to deliver enterprise-grade digital platforms.
Key Responsibilities:
Architecture & Solution Design
- Lead end-to-end architecture for high-traffic, customer-facing web applications.
- Design scalable microservices-based and event-driven architectures using Node.js, REST APIs, and GraphQL.
- Architect single-page applications (SPA) using React.js, modular component design, and optimized client-side performance.
- Define relational and non-relational data models optimized for high-volume OLTP transactions on PostgreSQL.
- Create architecture blueprints covering application layers, data flows, integration, observability, and deployment topology.
Cloud Architecture on AWS
- Design distributed, fault-tolerant, and secure solutions using key AWS services:
  - VPC, subnets, NACLs, and Security Groups for network isolation
  - ALB/NLB, Auto Scaling, ECS/EKS, and Lambda for compute
  - RDS PostgreSQL, Aurora, and ElastiCache for data
  - API Gateway, CloudFront, and S3 for global-scale delivery
- Ensure reliability, scalability, and high availability for systems processing large transaction volumes.
- Review and optimize cloud costs, architecture fitness, and performance.
Application Performance & Scalability
- Design solutions with built-in load balancing, auto-scaling, queueing, caching, and failover patterns.
- Architect for low latency, high throughput, and predictable system behavior under peak load.
- Ensure appropriate use of CDN, caching layers (Redis/Memcached), and async processing.
Security & Compliance
- Own the security architecture across application, data, and network layers:
  - IAM design and RBAC
  - JWT/OAuth2 authentication
  - Data encryption in transit and at rest
  - Secure VPC patterns
- Work with InfoSec teams on threat modeling, vulnerability resolution, and cloud security posture.
Technical Leadership
- Guide engineering teams on architecture patterns, design reviews, best practices, and code quality.
- Collaborate with product owners, UX, DevOps, and QA to ensure cohesive delivery.
- Mentor developers and junior architects; evangelize modern engineering and cloud-native principles.
DevOps, Observability & Delivery
- Partner with DevOps to design CI/CD pipelines, IaC (Terraform/CloudFormation), and automated release processes.
- Ensure full-stack observability using CloudWatch, X-Ray, ELK/Datadog/New Relic.
- Champion SRE practices: error budgets, performance baselines, and incident readiness.
Required Skills & Experience:
- 10-15+ years of experience in architecture and engineering for enterprise-grade web applications.
- Strong hands-on expertise with:
  - React.js, JavaScript/TypeScript (frontend)
  - Node.js, Express/Nest.js (backend)
  - PostgreSQL, schema design, performance tuning
  - AWS cloud architecture (compute, network, security, data, serverless)
- Proven experience designing high-volume OLTP systems and distributed back-end architectures.
- Strong understanding of load balancing, caching, async patterns, CI/CD, containers, and cloud-native technologies.
- Expertise with VPC, network segmentation, API security, and zero-trust principles.
- Deep knowledge of microservices, domain-driven design (DDD), and event-driven integration patterns.
Preferred Qualifications:
- AWS Certifications (Solutions Architect Professional highly preferred)
- Experience with:
  - Docker, Kubernetes (EKS)
  - Redis/RabbitMQ/Kafka
  - GraphQL
  - Infrastructure-as-Code (Terraform/CloudFormation)
- Experience working with global delivery teams in Agile environments.
Job Duties:
- Define and implement a cloud-native data architecture that supports analytics, real-time processing, and advanced AI/ML workloads.
- Develop a comprehensive modernization roadmap, balancing speed, risk, cost, and business continuity.
- Partner with data engineering, analytics, infrastructure, and data science teams to design scalable pipelines and governance frameworks.
- Define data ingestion, transformation, and integration strategies across diverse data sources.
- Optimize performance, scalability, and cost within Snowflake and Databricks environments.
- Establish best practices for DevOps, CI/CD, and Infrastructure as Code (e.g., Terraform).
- Provide technical leadership and mentorship to data engineers and architects.
- Stay ahead of evolving AI, ML, and LLM capabilities and ensure the platform is future-ready for those workloads.
Qualifications:
- 10+ years of experience in data engineering, data architecture, or platform engineering roles.
- Proven track record migrating large-scale on-prem data warehouses (Oracle, Teradata, etc.) to Snowflake.
- Demonstrated experience migrating Hadoop environments to Databricks, including Spark, Hive, and HDFS.
- Deep expertise in SQL, Spark, Python, and distributed data processing frameworks.
- Strong understanding of cloud architectures (AWS, Azure, or Google Cloud Platform) including storage, compute, networking, and security.
- Experience with data governance, lineage, and cataloging tools (Purview, Collibra, Alation, etc.).
- Familiarity with real-time data streaming (Kafka, Kinesis, or Pub/Sub) and orchestration frameworks (Airflow, dbt).
- Proven leadership in cross-functional collaboration and migration program management.
- Excellent communication skills, with the ability to articulate technical concepts to both technical and non-technical stakeholders.
Nice to Have:
- Certifications in Snowflake, Databricks, or major cloud providers (AWS, Azure, Google Cloud Platform).
- Experience with machine learning and AI pipelines on Databricks or similar platforms.
- Prior experience in data platform modernization or transformation programs at scale (multi-petabyte preferred).
- Knowledge of modern data mesh, lakehouse, and data product principles.