AWS Data Architect (W2 Only)

Overview

Hybrid
Depends on Experience
Contract - W2
Contract - 12 Month(s)
No Travel Required

Skills

AWS
SQL
Python
AWS Glue

Job Details

Data Architect Contractor (15+ Years Experience)

Long Term Contract

Richmond, VA

Role Overview

We're looking for an experienced Senior Data Architect Contractor to help drive enterprise data modernization. In this high-impact role, you'll contribute to the design and implementation of scalable, cloud-native data platforms.

The ideal candidate has a strong background in enterprise data architecture, deep hands-on AWS experience, and a proven track record of influencing engineering teams and business stakeholders.

Key Responsibilities

Design and implement robust, scalable data architectures using AWS-native services: Glue, Redshift, Lake Formation, EMR, Athena, Kinesis, S3, Lambda, and Step Functions.

Collaborate with Capital One engineering and product teams to define technical patterns and standards across ETL/ELT pipelines, data lakes, and streaming platforms.

Contribute to domain-oriented data product strategies, including semantic modeling and metadata management.

Help enforce data governance and compliance requirements, ensuring architecture aligns with enterprise and regulatory standards.

Support AI/ML initiatives by designing data pipelines that serve both real-time and batch model needs.

Participate in technical design reviews, provide architectural guidance, and create documentation and artifacts.

Conduct proofs of concept and technical evaluations of emerging AWS tools, data frameworks, and open-source solutions.

Required Qualifications

Technical Expertise

15+ years of experience in data architecture, engineering, or analytics roles, with recent focus on cloud data platforms.

Strong AWS background, including hands-on implementation with services like Glue, Redshift, S3, Lake Formation, Athena, Kinesis, DynamoDB, and EMR.

Deep knowledge of data integration, data modeling, metadata management, and distributed data systems.

Experience with stream processing (Kafka or Kinesis), big data tools (Spark, Hadoop), and data formats like Parquet or Avro.

Proficient in SQL and Python, with experience in automated testing and CI/CD for data pipelines.

Familiarity with data governance tools such as Collibra, Alation, or AWS Glue Data Catalog.

Leadership & Soft Skills

Strong ability to collaborate with cross-functional teams, including data engineers, analysts, and product owners.

Excellent communication skills for technical and non-technical audiences.

Experience working in agile environments with matrixed teams.
