Job Details
Job Title: Senior Data Solutions Architect (AWS & Snowflake)
Work Location: (once-a-month travel to Lincoln, RI is required)
Duration: Long Term
Role Overview
We are seeking a highly skilled Senior Data Architect to support enterprise data initiatives and serve as a key technical partner to the architecture team. This role will work closely with the Information Systems Officer / Enterprise Architect to drive data architecture decisions, perform solution reviews, and guide the design and implementation of scalable, cloud-based data platforms.
The ideal candidate will have deep expertise in data warehousing, SQL, and cloud data architecture (AWS and Snowflake), along with a strong ability to balance hands-on engineering with high-level architectural thinking.
Key Responsibilities
- Act as a technical right hand to the Enterprise Architect, supporting architecture reviews, design decisions, and strategic planning.
- Design and implement scalable data warehouse and analytics solutions on AWS and Snowflake.
- Develop and optimize SQL, ETL/ELT pipelines, and data models to support reporting and analytics.
- Collaborate with cross-functional teams (data engineering, application development, infrastructure) to align on architecture best practices and ensure consistency across solutions.
- Evaluate and recommend technologies, tools, and frameworks to improve data processing efficiency and reliability.
- Provide guidance and mentorship to data engineering teams, enforcing data governance, quality, and security standards.
- Troubleshoot complex data and performance issues and propose long-term architectural solutions.
- Support capacity planning, cost optimization, and environment management within AWS/Snowflake ecosystems.
Required Skills and Experience
- 15+ years of experience in data engineering and architecture, with a strong focus on data warehouse design and optimization.
- Expert-level proficiency in SQL and relational database design.
- Hands-on experience with AWS cloud services (Redshift, S3, Glue, Lambda, etc.) and Snowflake.
- Strong understanding of data modeling (star/snowflake schemas), ETL/ELT frameworks, and best practices for data integration.
- Experience in designing data pipelines using Python, Spark, or similar technologies is a plus.
- Excellent communication and documentation skills, with the ability to translate technical concepts for both business and technical audiences.
- Ability to operate independently, make architectural decisions, and act as a trusted advisor on technology direction.