Job Description
This is a senior-level technical architect role responsible for defining the target architecture and leading the design and build of an end-to-end enterprise data platform, including a lakehouse on Databricks or AWS-native services and a governed data warehouse on Snowflake.
Operating with broad autonomy and enterprise-level influence, this role sets platform standards and makes high-impact technical decisions across ingestion, lakehouse storage/compute, Snowflake warehousing, transformation, governance, and consumption. Architectural standards and patterns defined by this role serve as the technical foundation that engineering delivery teams build against, ensuring consistency and alignment across the platform. The architect partners closely with functional executives, cross-functional leaders, and select external vendors, while remaining hands-on through architecture reviews, POCs, and critical-path build activities.
This position is accountable for solving ambiguous, complex, and high-risk platform problems in a greenfield environment where choices around lakehouse patterns, Snowflake modeling, data movement, and governance establish long-term direction and affect multiple processes, functions, and enterprise outcomes.
Responsibilities:
Architect and lead delivery of an end-to-end enterprise data platform with a lakehouse (Databricks or AWS native) and a Snowflake data warehouse, including ingestion, transformation, serving, and consumption through BI tools and a governed semantic layer.
Design lakehouse patterns (e.g., bronze/silver/gold, Delta/Apache Iceberg where applicable), including data quality controls and master data management (MDM) integration, and define the approach into Snowflake (ELT, CDC, data sharing, and performance-optimized loading).
Ensure data products are scalable, trusted, secure, and performant by defining SLAs/SLOs, data quality controls, lineage, and operational monitoring across the lakehouse and warehouse.
Influence functional executives and cross-functional leaders as the strongest technical voice in the room.
Build and validate solutions hands-on through POCs, reference implementations, architecture reviews, and direct contribution to Databricks/AWS pipeline patterns and Snowflake models.
Define platform guardrails for security and governance (e.g., encryption, IAM integration, RBAC, PII handling, auditability) across the lakehouse and warehouse.
Identify and introduce architectural standards, patterns, and tools that measurably improve platform speed, reliability, cost efficiency, or security - translating architectural decisions into tangible operational and financial outcomes.
Drive outcomes that affect enterprise operations, financials, and executive reporting trust.
Define platform processes, patterns, and standards for data ingestion/CDC, transformation (ELT), orchestration, CI/CD, testing, and release management.
Establish architecture standards and guardrails for Databricks/AWS lakehouse and Snowflake warehouse patterns, including workload separation, cost management, and performance tuning.
Operate effectively in ambiguous, fast-moving conditions - challenge established patterns, evaluate emerging tools and approaches (including AI-native capabilities), and adapt platform direction as the technology landscape shifts.
Design for AI-native platform capabilities where they improve operational efficiency, including intelligent monitoring, automated anomaly detection, data quality remediation, and stewardship automation - while keeping the core focus on building a reliable, well-governed data platform.
Ensure the semantic layer, catalog, and metadata architecture are designed to support AI-enabled analytics capabilities.
Lead major platform initiatives and influence work across teams based on expertise.
Provide input into technical hiring and capability development.
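The ingestion/CDC and ELT patterns named above can be sketched at a very small scale. The following is an illustrative example only, not part of the role description: a plain-Python upsert/delete apply loop standing in for what a production pipeline would express as a Delta Lake or Snowflake MERGE. All table and field names (customers, op, key, row) are hypothetical.

```python
# Minimal sketch of applying CDC change events to a keyed target table,
# analogous in spirit to a MERGE on Databricks or Snowflake.
# Hypothetical names; a real pipeline would use MERGE INTO on Delta
# tables or Snowflake streams + tasks, not in-memory dicts.

def apply_cdc(target: dict, events: list) -> dict:
    """Apply insert/update/delete change events to a keyed table."""
    for event in events:
        op, key, row = event["op"], event["key"], event.get("row")
        if op in ("insert", "update"):
            target[key] = row          # upsert: insert or overwrite
        elif op == "delete":
            target.pop(key, None)      # idempotent delete
        else:
            raise ValueError(f"unknown CDC operation: {op}")
    return target

customers = {1: {"name": "Acme", "tier": "gold"}}
changes = [
    {"op": "update", "key": 1, "row": {"name": "Acme", "tier": "platinum"}},
    {"op": "insert", "key": 2, "row": {"name": "Globex", "tier": "silver"}},
    {"op": "delete", "key": 3},        # deleting a missing key is a no-op
]
customers = apply_cdc(customers, changes)
```

The idempotent-delete and upsert semantics shown here are the properties a CDC apply step typically has to guarantee regardless of the engine underneath.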
Qualifications - Required
10+ years designing and building enterprise-scale data platforms, including lakehouse and data warehouse architectures.
Deep hands-on experience with Databricks and/or AWS native lakehouse services, and strong working knowledge of Snowflake (architecture, modeling, performance, and security).
Strong command of BI ready data modeling (dimensional and/or data vault), semantic layer design/management, and enabling BI tools on top of Snowflake and/or curated lakehouse gold layers.
Experience implementing data quality frameworks (rules, monitoring, SLAs) and partnering on master data management (MDM) to deliver trusted, consistent dimensions across domains.
Proven track record of influencing senior leaders without formal authority.
Comfortable operating in ambiguity, making decisions that last, and balancing scalability, governance, performance, and cost across the platform.
Active, habitual use of AI development tools (e.g., GitHub Copilot, Claude Code, Cursor, or similar) to accelerate personal delivery of POCs, reference implementations, and architectural artifacts.
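The "rules, monitoring, SLAs" data quality framework mentioned above can be sketched as a small rule-evaluation loop. This is a hypothetical illustration, not a required tool or approach; all rule names, thresholds, and fields are assumptions.

```python
# Minimal sketch of a rule-based data quality check: each named rule
# is a predicate over a row, and the report counts failing rows per
# rule. All names and sample data are hypothetical.

def run_quality_checks(rows, rules):
    """Evaluate each rule against the rows and report pass/fail."""
    results = {}
    for name, predicate in rules.items():
        failures = [r for r in rows if not predicate(r)]
        results[name] = {"passed": not failures,
                         "failing_rows": len(failures)}
    return results

orders = [
    {"order_id": 1, "amount": 120.0, "country": "US"},
    {"order_id": 2, "amount": -5.0, "country": "US"},   # fails non-negative rule
    {"order_id": 3, "amount": 40.0, "country": None},   # fails completeness rule
]
rules = {
    "amount_non_negative": lambda r: r["amount"] >= 0,
    "country_not_null": lambda r: r["country"] is not None,
}
report = run_quality_checks(orders, rules)
```

In practice the same shape (named rules, per-rule pass/fail, failure counts) is what feeds SLA dashboards and alerting, whichever quality framework is chosen.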
Qualifications - Preferred
Experience with enterprise data catalog and lineage platforms (e.g., DataHub, Atlan, OpenLineage or similar) spanning multi-engine environments.
Familiarity with formal data contract frameworks (e.g., ODCS) for codifying producer-consumer agreements.
Hands-on experience with MDM platforms (e.g., Profisee or comparable tools).
Experience with data security and tokenization tooling (e.g., Protegrity, DataBolt, or similar) for sensitive data handling in regulated environments.
Experience designing AI-enabled platform capabilities such as intelligent data quality monitoring, automated anomaly detection, or AI-assisted data stewardship and operations.
Background in financial services or other complex regulated industries.
#LI-Onsite
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
- Dice Id: RTX1a6d2c
- Position Id: 6257