Enterprise Data Architect (2 Openings)

Overview

Remote
Depends on Experience
Contract - Independent
Contract - W2
Contract - 12 Month(s)

Skills

API
Amazon Kinesis
ADO
Amazon S3
Continuous Integration
Amazon Redshift
Data Structure
Finance
Kubernetes
DevOps
Data Engineering
Continuous Delivery
Optimization
GitHub
Python
Servers
SQL
Step Functions
Terraform
Amazon Web Services

Job Details

Job Title: Enterprise Data Architect (2 Openings)
Location: Remote (must support CST hours; team based in St. Louis)
Duration: 3-6 months, with a high likelihood of extension and the possibility of conversion to FTE

Overview:
We are seeking a highly skilled Enterprise Data Architect to join our ongoing Enterprise Data Optimization (EDO) initiative. This project, in development for two years, aims to enhance our data structuring capabilities and significantly improve data retrieval speeds across the organization. We are looking for a versatile, senior-level professional with a strong background in enterprise data engineering and architecture. The work will be split 50/50 between engineering and architecture.
Key Responsibilities:
Collaborate with existing architects to split responsibilities equally between engineering and architecture.
Engage in hands-on work, including coding in Python and SQL, to optimize data structures and API services.
Develop and implement solutions to enable data retrieval speeds that are 10 times faster than current capabilities.
Work closely with various teams to ensure seamless data access and integration across the organization.
Provide thought leadership within the EDO space and act as a true consultant for best practices.
Qualifications:
10+ years of experience in Data Engineering/Architecture.
Proven, broad expertise across all areas of Enterprise Data Engineering.
Proficiency in AWS, Terraform, Kubernetes, Python, and SQL.
Experience implementing and automating Kubernetes, with the ability to build Kubernetes environments independently (without relying on AWS Glue).
Extensive experience with AWS services, including S3, Redshift, Aurora Postgres, Glue, Lambda, Step Functions, Lake Formation, and Kinesis.
Strong background in AWS with Terraform, particularly in automating infrastructure delivery.
Experience with CI/CD processes using Azure DevOps (ADO) and GitHub.
Strong knowledge of Python and PostgreSQL.
Some experience with MCP servers is a plus.
Preferred Experience:
A background in finance is advantageous but not required.
