City: Austin
State: Texas
Neos is seeking a Data Architect for a contract role with our client in Austin, TX.
***REMOTE - ONLY CANDIDATES CURRENTLY RESIDING IN AUSTIN, TEXAS WILL BE CONSIDERED***
No calls or emails, please; respond directly to the "apply" link with your resume and contact details.
Data Architect - Enterprise Data Modernization (D2I Initiative)
Environment: Higher Education | Enterprise Data | Hybrid Mainframe + Cloud
Position Overview
The Data Architect will support UT Austin's enterprise Data to Insights (D2I) modernization initiative. This role will design and guide the transition from legacy mainframe-based data environments to a modern cloud-based data platform leveraging Databricks, while supporting downstream analytics tools including Tableau and Cognos.
This is a high-visibility, enterprise-level architecture role focused on data integration, governance, scalability, and performance across academic and administrative domains.
Key Responsibilities
The Data Architect will play a key role in designing, implementing, and scaling cloud-based data architectures within our AWS environment. As we implement Databricks, this role will focus on modernizing our unified data platform to enable advanced analytics, machine learning, and real-time data processing.
This position will collaborate closely with Data Engineering, DevOps, Data Modeling, Analytics, and Metadata teams to ensure scalable, efficient, and well-governed data solutions. Partnering with the Chief Enterprise Data Architect, you will drive data strategy, architecture, and implementation that align with our business goals.
Responsibilities - Cloud Data Architecture
Design and implement robust, scalable, and secure AWS-based data architectures with a focus on Databricks adoption.
Partner with the Analytics and Data Modeling Group to ensure alignment on data modeling standards, schema design, and integration with data pipelines.
Architect efficient ETL/ELT pipelines for data ingestion, transformation, and delivery, supporting operational and analytical workloads.
Develop and maintain comprehensive data strategies that align with enterprise goals, enabling real-time and batch data processing.
Create technical artifacts, standards, and architectural frameworks to address current and future business requirements.
Ensure data quality, governance, and compliance throughout the data lifecycle.
Databricks Implementation and Leadership
Lead the implementation and optimization of Databricks for advanced data engineering, analytics, and machine learning workloads.
Drive the adoption of Delta Lake architectures for high-performance data pipelines.
Develop and operationalize scalable architectures for collaborative notebooks, machine learning workflows, and real-time processing.
Collaboration and Innovation
Collaborate with Data Engineering, DevOps, Data Modeling, Analytics, and Metadata teams to align on architectural decisions, standards, and processes.
Serve as a technical advisor to stakeholders, effectively communicating complex data concepts to leadership and cross-functional teams.
Foster a culture of collaboration, innovation, and continuous improvement within the team.
Required Qualifications
Bachelor's degree in Computer Science, Information Systems, Engineering, or equivalent experience.
Proven experience designing and implementing enterprise-scale data architectures in AWS environments.
Strong expertise in data modeling, schema design, and database structures, with experience working closely with modeling teams.
Hands-on experience with Databricks for big data processing, analytics, and machine learning.
Proficiency in building ETL/ELT pipelines and working with data integration tools (e.g., AWS Glue, Informatica).
Deep understanding of SQL, NoSQL, and data storage technologies (e.g., Redshift, RDS, S3).
Experience ensuring data governance, quality, and compliance.
Strong troubleshooting skills and ability to optimize performance of cloud data solutions.
Collaborative team player with excellent communication and leadership skills.
Relevant education and experience may be substituted as appropriate.
Preferred Qualifications
Master's degree in a relevant field.
Certifications in AWS (e.g., AWS Solutions Architect) and Databricks (e.g., Databricks Certified Professional).
Expertise in Delta Lake architecture design and implementation.
Familiarity with Agile development methodologies and tools (e.g., JIRA, Confluence).
Proven experience leading data modernization initiatives across cross-functional teams.
#DICE
#LI-IC