Redshift Data Architect

Overview

Remote
Depends on Experience
Accepts corp-to-corp applications
Contract - Independent
Contract - W2
Contract - 12 Month(s)
No Travel Required

Skills

Amazon Redshift
Amazon S3
Amazon Web Services
Analytical Skill
Benchmarking
Business Intelligence
Cloud Computing
Collaboration
Continuous Delivery
Continuous Integration
Dashboard
Data Analysis
Data Compression
Data Engineering
Data Loading
Data Modeling
Data Warehouse
Database
Database Administration
DevOps
Dimensional Modeling
Distribution
Electronic Health Record (EHR)
Extract, Transform, Load (ETL)
Management
Performance Tuning
Python
Quality Assurance
Reporting
SQL
Scalability
Scripting
Scripting Language
Snowflake Schema
Stored Procedures
Tableau
Terraform
WLM
Writing
Claims Data
Insurance Data

Job Details

Redshift Data Architect

Location: Remote

Key Responsibilities
  • Act as the subject matter expert for AWS Redshift performance.
  • Conduct deep-dive analysis on slow-running queries, data loading times, and concurrent workload bottlenecks.
  • Proactively review, rewrite, and optimize complex SQL queries, stored procedures, and ETL logic to minimize execution time and resource consumption.
  • Manage and fine-tune Redshift cluster configurations, including:
    • Implementing and optimizing Workload Management (WLM) and Short Query Acceleration (SQA) policies.
    • Advising on optimal Distribution Keys, Sort Keys, and Column Compression encodings.
    • Managing and ensuring timely execution of VACUUM and ANALYZE operations.
  • Collaborate with Data Engineers and Architects on data modeling (Star/Snowflake schema design) and physical table design to ensure future scalability and analytical performance.
  • Configure and manage advanced monitoring tools (e.g., CloudWatch, Redshift system views such as SVL_QUERY_SUMMARY, custom dashboards) to identify performance trends, anomalies, and resource-intensive queries.
  • Analyze Redshift cluster usage and recommend cost-saving measures through elastic resize, reserved nodes, and query efficiency improvements without compromising performance.
  • Develop scripts and automation (primarily Python/Lambda) for performance diagnostics, reporting, and automated tuning tasks.
  • Design and execute stress tests, load tests, and benchmarking exercises on the data platform to validate performance under peak load conditions.
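As a minimal sketch of the performance-diagnostics automation described above (the `flag_disk_spills` helper is hypothetical; the input rows mirror the `query`, `step`, `is_diskbased`, and `workmem` columns of Redshift's SVL_QUERY_SUMMARY system view, with fetching from a live cluster out of scope):

```python
# Hypothetical diagnostics helper: flag query steps that spilled to disk.
# Input rows mimic Redshift's SVL_QUERY_SUMMARY system view; in practice
# they would be fetched from the cluster (e.g., via the Redshift Data API).

def flag_disk_spills(rows, min_workmem_bytes=0):
    """Return (query, step) pairs whose steps spilled to disk.

    A step with is_diskbased = 't' exceeded its working-memory
    allocation -- a common target for WLM or query tuning.
    """
    return [
        (r["query"], r["step"])
        for r in rows
        if r["is_diskbased"] == "t" and r["workmem"] >= min_workmem_bytes
    ]

# Example rows shaped like SVL_QUERY_SUMMARY output (values illustrative):
sample = [
    {"query": 101, "step": 3, "is_diskbased": "t", "workmem": 250_000_000},
    {"query": 101, "step": 4, "is_diskbased": "f", "workmem": 80_000_000},
    {"query": 102, "step": 1, "is_diskbased": "t", "workmem": 512_000_000},
]

print(flag_disk_spills(sample))  # → [(101, 3), (102, 1)]
```

In a real deployment this kind of check would run on a schedule (e.g., from Lambda) and feed the reporting dashboards the role calls for.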

Required Qualifications & Skills

  • 5+ years of experience in Data Engineering, Database Administration, or a dedicated Performance/Tuning role within a large-scale data environment.
  • Expert-level knowledge of AWS Redshift internals, architecture, and best practices for performance tuning.
  • Mastery of SQL for writing highly optimized, complex queries and interpreting query execution plans.
  • Proven experience with Redshift WLM, distribution styles (Key, All, Even), and sort key (Compound, Interleaved) selection.
  • Strong proficiency in at least one scripting language for automation and data analysis (Python preferred).
  • Solid understanding of AWS services relevant to data pipelines: S3, Glue, Athena, and EMR.
  • Experience with dimensional data modeling and its direct impact on query performance.
  • AWS Certified Data Analytics – Specialty or AWS Certified Database – Specialty certification.
  • Experience with performance tuning on other cloud data warehouses (e.g., Snowflake, Google BigQuery).
  • Familiarity with BI tools (e.g., Tableau, Looker, QuickSight) and how they interact with Redshift at the query level.
  • Experience in a DevOps or DataOps environment, utilizing CI/CD and Infrastructure as Code (e.g., Terraform).
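To illustrate the WLM experience called for above, here is a hypothetical manual-WLM configuration in the JSON shape Redshift's `wlm_json_configuration` parameter expects, with a small check that queue memory allocations sum to 100 percent (queue names, concurrency values, and percentages are illustrative, not from the posting):

```python
import json

# Illustrative manual-WLM setup: two user-defined queues plus the default
# queue, using keys from Redshift's wlm_json_configuration parameter.
# All values are hypothetical.
wlm_config = [
    {"name": "etl", "query_group": ["etl"],
     "query_concurrency": 3, "memory_percent_to_use": 40},
    {"name": "bi", "query_group": ["dashboards"],
     "query_concurrency": 10, "memory_percent_to_use": 40},
    {"query_concurrency": 5, "memory_percent_to_use": 20},  # default queue
]

def memory_allocated(queues):
    """Total memory_percent_to_use across all queues; should be <= 100."""
    return sum(q.get("memory_percent_to_use", 0) for q in queues)

assert memory_allocated(wlm_config) == 100
print(json.dumps(wlm_config, indent=2))
```

A sanity check like this fits naturally into the CI/CD and Infrastructure-as-Code workflow the role describes, catching misallocated queue memory before a parameter-group change is applied.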
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.