Snowflake Data Architect
Location: Berkeley Heights, NJ
Experience Level: Senior / Principal
Industry Focus: Financial Services, Healthcare, Manufacturing, Fortune 500 Enterprise
Position Overview
We are seeking a highly experienced Data Architect to design, build, and optimize enterprise-grade data solutions in complex, large-scale environments. This role requires deep expertise in Snowflake, cloud data platforms, and modern ELT architectures, combined with strong business acumen and leadership capabilities.
You will play a key role in architecting scalable data warehouse solutions, leading data migration initiatives, and driving measurable business outcomes for enterprise clients.
This is not a purely hands-on developer role; we are looking for a strategic technical leader who can translate complex business problems into high-impact, scalable data solutions.
Key Responsibilities
Architecture & Engineering
- Design and implement enterprise data warehouse solutions using Snowflake.
- Architect and optimize large-scale ETL/ELT pipelines.
- Lead data migration initiatives from legacy systems to modern cloud platforms.
- Develop and maintain scalable data models using SQL, Python, and dbt.
- Implement performance optimization strategies that deliver measurable improvements.
- Troubleshoot and resolve production data issues in mission-critical environments.
Cloud & Platform Strategy
- Design and implement data solutions across Azure, AWS, or Google Cloud Platform.
- Define best practices for cloud data architecture and governance.
- Ensure scalability, reliability, and security of enterprise data platforms.
Leadership & Stakeholder Engagement
- Translate complex business requirements into technical data solutions.
- Partner with executive stakeholders and cross-functional teams.
- Lead and mentor junior data engineers.
- Drive delivery across multi-stakeholder enterprise environments.
- Demonstrate measurable ROI through cost savings, performance gains, or SLA improvements.
Required Qualifications
- 7–10 years of relevant data engineering experience (a minimum of 5 years may be considered for exceptional candidates).
- Expert-level SQL proficiency.
- Expert-level Snowflake experience.
- Strong Python skills in a data engineering context.
- Experience with at least one major cloud platform: Azure, AWS, or Google Cloud Platform.
- Hands-on experience with dbt.
- Enterprise data warehouse design experience.
- Large-scale ETL/ELT pipeline implementation.
- Data migration project leadership experience.
- Experience mentoring or leading engineering teams.
- Bachelor’s degree in Computer Science, Engineering, or related technical field.
Preferred Qualifications
- Snowflake SnowPro Core certification (strongly preferred).
- Snowflake SnowPro Advanced: Architect certification.
- Microsoft Azure certification (Azure Data Engineer Associate or Azure Solutions Architect Expert).
- dbt certification.
- Master’s degree (MBA, M.S. in Data Science, Information Science, or related field).
- Experience in Financial Services, Healthcare, Manufacturing, or other Fortune 500 enterprise environments.
- Demonstrated performance optimization results in the 30–90% improvement range.
- Proven record of measurable business impact (cost savings, system performance, SLA improvements).