Client: IBM
Location: Boston, MA
Engagement Type: C2C, W2, 1099
Rate: Best rate
Role Overview
The client is seeking an experienced Enterprise Data Warehouse Architect to lead the architecture and delivery of enterprise-scale data warehouse solutions. This role requires hands-on Snowflake production experience, strong expertise in big data platforms, and the ability to provide technical leadership, mentorship, and architectural guidance to delivery teams.
The client wants architects who can mentor junior engineers, conduct code reviews, and guide teams on standards, with a strong emphasis on technical guidance. Strong stakeholder collaboration to translate requirements into solutions is key, along with experience in Agile/DevOps delivery and cloud platforms (AWS/Azure/Google Cloud Platform).
Responsibilities
Data Architecture & Delivery
Architect, design, and implement enterprise data warehouse solutions using Snowflake (production environments)
Define data architecture standards, best practices, and reusable patterns
Lead data migration strategies from legacy/on-prem platforms to Snowflake
Ensure high availability, scalability, and cost-efficient Snowflake implementations (see the illustrative sketch below)
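For orientation only, the following Snowflake SQL sketch shows one common cost-efficiency pattern this responsibility touches on: a right-sized virtual warehouse with aggressive auto-suspend plus a resource monitor that caps credit spend. The object names (reporting_wh, reporting_rm) and the credit quota are hypothetical placeholders, not client specifics.

    -- Right-sized virtual warehouse that suspends itself when idle
    CREATE WAREHOUSE IF NOT EXISTS reporting_wh
      WAREHOUSE_SIZE = 'MEDIUM'
      AUTO_SUSPEND = 60          -- suspend after 60 seconds of inactivity
      AUTO_RESUME = TRUE
      INITIALLY_SUSPENDED = TRUE;

    -- Monitor that notifies at 90% and suspends the warehouse at 100% of the credit quota
    CREATE OR REPLACE RESOURCE MONITOR reporting_rm
      WITH CREDIT_QUOTA = 100
      TRIGGERS ON 90 PERCENT DO NOTIFY
               ON 100 PERCENT DO SUSPEND;

    ALTER WAREHOUSE reporting_wh SET RESOURCE_MONITOR = reporting_rm;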
Data Engineering & Performance
Collaborate with data engineers and analytics teams to design and optimize ETL/ELT pipelines
Drive performance tuning, query optimization, and workload management in Snowflake
Integrate Snowflake with big data technologies such as Spark and/or Kafka
Support batch and near-real-time data processing use cases (see the illustrative sketch below)
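As a rough sketch of the near-real-time ELT pattern implied above, the snippet below uses a Snowflake stream plus a scheduled task to merge newly landed rows (for example, rows arriving from a Kafka-fed landing table) into a curated table. The table, stream, task, and warehouse names (raw_orders, raw_orders_stream, merge_orders_task, orders_curated, etl_wh) are hypothetical placeholders.

    -- Change-tracking stream over the landing table
    CREATE OR REPLACE STREAM raw_orders_stream ON TABLE raw_orders;

    -- Task that runs every 5 minutes, but only when the stream has new rows
    CREATE OR REPLACE TASK merge_orders_task
      WAREHOUSE = etl_wh
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('RAW_ORDERS_STREAM')
    AS
      MERGE INTO orders_curated t
      USING raw_orders_stream s
        ON t.order_id = s.order_id
      WHEN MATCHED THEN UPDATE SET t.amount = s.amount, t.updated_at = s.updated_at
      WHEN NOT MATCHED THEN INSERT (order_id, amount, updated_at)
        VALUES (s.order_id, s.amount, s.updated_at);

    -- Tasks are created suspended; resume to start the schedule
    ALTER TASK merge_orders_task RESUME;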
Governance, Security & Compliance
Implement and enforce data governance, security, and compliance controls within Snowflake
Design role-based access control (RBAC), data masking, auditing, and monitoring solutions (see the illustrative sketch after this list)
Ensure compliance with enterprise and regulatory data standards
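To illustrate the kind of controls this responsibility covers, the sketch below combines Snowflake RBAC grants with a dynamic data masking policy on a PII column. The role, database, schema, table, and column names (analyst_role, PII_ADMIN, edw.sales.customers, email) are hypothetical placeholders rather than actual client objects.

    -- Read-only functional role scoped to one schema
    CREATE ROLE IF NOT EXISTS analyst_role;
    GRANT USAGE ON DATABASE edw TO ROLE analyst_role;
    GRANT USAGE ON SCHEMA edw.sales TO ROLE analyst_role;
    GRANT SELECT ON ALL TABLES IN SCHEMA edw.sales TO ROLE analyst_role;

    -- Column-level dynamic masking: only privileged roles see raw email addresses
    CREATE OR REPLACE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val ELSE '*** MASKED ***' END;

    ALTER TABLE edw.sales.customers
      MODIFY COLUMN email SET MASKING POLICY email_mask;

Pairing role-scoped grants with column-level masking is a typical way to keep access decisions auditable in Snowflake while still letting analysts query shared tables.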
Technical Leadership (Consulting Focus)
Conduct code reviews and ensure adherence to IBM and client engineering standards
Provide technical mentorship and hands-on guidance to junior and mid-level engineers
Act as a trusted technical advisor to IBM and client stakeholders
Stakeholder Collaboration
Engage with business and technical stakeholders to gather requirements and translate them into technical designs
Work within Agile/Scrum and DevOps delivery models
Clearly communicate architecture decisions, risks, and trade-offs
Required Skills & Experience
15+ years of experience in data warehousing, data architecture, or analytics platforms
Strong Snowflake production experience (mandatory)
Hands-on expertise in Snowflake performance tuning and optimization
Strong experience with big data platforms: Apache Spark and/or Kafka
Experience designing enterprise-grade ETL/ELT pipelines
Solid understanding of data governance, security, and compliance
Experience working in Agile/DevOps environments
Strong consulting, communication, and stakeholder management skills
Cloud & Technology Stack (Preferred)
Cloud platforms: AWS / Azure / Google Cloud Platform
Data platforms: Snowflake
Big data tools: Spark, Kafka
Programming: SQL, Python
CI/CD & DevOps tools