AWS Data Architect

Overview

Remote
$60 - $70
Contract - Independent
Contract - W2

Skills

AWS
GLUE
PYSPARK
PYTHON

Job Details

Position - AWS Data Architect
Duration - 12 months
Location - Remote

Must-have skills - AWS Redshift, AWS Glue, AWS Lake Formation, PySpark

Key Responsibilities:

  • Design and implement end-to-end cloud-native data architectures on AWS.
  • Build and optimize data lakes, data warehouses, and analytics platforms using AWS services.
  • Architect and manage data pipelines with AWS Glue, Apache Airflow, and PySpark.
  • Implement advanced data formats and technologies including Apache Iceberg and Delta Lake.
  • Apply strong programming skills in Python, PySpark, and SQL.
  • Apply in-depth knowledge of Apache Iceberg and Delta Lake table formats.
  • Utilize Lake Formation, Redshift, Athena, DynamoDB, and other AWS services to deliver high-performance solutions.
  • Apply medallion architecture principles to ensure robust, scalable, and maintainable data flows.
  • Develop reusable components, frameworks, and automation tools to support data engineering efforts.
  • Ensure solutions meet security, compliance, and cost optimization best practices in cloud environments.
  • Implement CI/CD pipelines and leverage Infrastructure-as-Code (IaC) tools such as Terraform.
  • Monitor and troubleshoot performance and reliability issues in data systems and infrastructure.
  • Collaborate with cross-functional teams and lead data architecture discussions and decisions.
  • Mentor and support junior developers and architects, fostering a culture of continuous learning and improvement.

About Talent Glide