Director, Senior Information Architect
Position Purpose
The Senior Information Architect will design, build, and maintain scalable, well-structured data models that power analytics, reporting, and operational workflows across CIM's $35B+ portfolio spanning real estate equity, infrastructure, and private credit.
This role establishes data modeling standards from the ground up, creating enterprise-grade models within the Databricks Lakehouse platform and integrating with Snowflake and MongoDB where appropriate.
As a key contributor to CIM's developing data architecture function, the Senior Information Architect will work closely with business teams, including Fund Accounting, FP&A, Investor Relations, Sales, and Investments, to translate complex requirements into high-performing, future-ready data structures that support analytics, governance, and operational needs across Azure and modern data ecosystems.
Responsibilities
Business Partnership & Requirements Discovery
- Partner with business stakeholders across Fund Accounting, FP&A, Global Client Group, and Investments to gather data needs, understand pain points, and define use cases.
- Translate business requirements into clear, scalable data models, validating assumptions and confirming alignment before building solutions.
- Collaborate with data analysts, data scientists, MLOps engineers, and application developers to confirm downstream requirements.
- Build trust by clearly explaining technical concepts and demonstrating measurable business value.
Data Model Design & Development
- Develop conceptual, logical, and physical data models for data warehouses, data lakes, operational data stores, and transactional systems.
- Create optimized dimensional models (star and snowflake schemas) for analytics and BI (an illustrative star-schema sketch follows this list).
- Apply a variety of modeling techniques (Kimball, Inmon/3NF, Data Vault, NoSQL patterns) depending on the initiative.
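For illustration only, a minimal star-schema sketch in PySpark against Databricks SQL; the gold schema, the dim_fund / dim_date / fct_positions tables, and their columns are hypothetical examples, not an actual CIM model.

```python
# Hypothetical star schema: two conformed dimensions and one fact table.
# Assumes a Databricks notebook where `spark` is the active SparkSession.
star_schema_ddl = [
    """CREATE TABLE IF NOT EXISTS gold.dim_fund (
         fund_key  BIGINT GENERATED ALWAYS AS IDENTITY,  -- surrogate key
         fund_id   STRING,
         fund_name STRING,
         strategy  STRING)""",
    """CREATE TABLE IF NOT EXISTS gold.dim_date (
         date_key       INT,                             -- e.g. 20240331
         calendar_date  DATE,
         fiscal_quarter STRING)""",
    """CREATE TABLE IF NOT EXISTS gold.fct_positions (
         fund_key     BIGINT,                            -- FK to dim_fund
         date_key     INT,                               -- FK to dim_date
         market_value DECIMAL(18,2),
         units        DECIMAL(18,6))""",
]
for ddl in star_schema_ddl:
    spark.sql(ddl)
```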
Databricks Lakehouse Architecture
- Architect medallion-layer structures (bronze/silver/gold) and define modeling standards within the Lakehouse environment (a brief sketch follows this list).
- Optimize Delta Lake tables leveraging ACID transactions, time travel, schema evolution, Z-ordering, and liquid clustering.
- Design balanced partitioning strategies to improve performance without over segmenting data.
- Implement Unity Catalog organization (catalogs, schemas, tables) for governance and multi-domain data management.
- Support MLOps teams by designing models for ML feature stores and GenAI/RAG workloads.
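As a rough illustration of the medallion and Delta Lake concepts above, a minimal bronze-to-silver sketch in PySpark; the lakehouse catalog, the bronze/silver schemas, and the fund_transactions tables and columns are assumptions, and liquid clustering (CLUSTER BY) requires a recent Databricks runtime.

```python
from pyspark.sql import functions as F

# Silver table declared with liquid clustering rather than fixed partitions
# (hypothetical Unity Catalog names: lakehouse.silver.fund_transactions).
spark.sql("""
    CREATE TABLE IF NOT EXISTS lakehouse.silver.fund_transactions (
        fund_id      STRING,
        trade_date   DATE,
        amount       DECIMAL(18,2),
        _ingested_at TIMESTAMP
    ) CLUSTER BY (fund_id, trade_date)
""")

# Bronze -> silver: select, lightly cleanse, deduplicate, and append.
bronze = spark.table("lakehouse.bronze.fund_transactions_raw")
(bronze
    .select("fund_id", "trade_date", "amount")
    .where("amount IS NOT NULL")
    .dropDuplicates(["fund_id", "trade_date", "amount"])
    .withColumn("_ingested_at", F.current_timestamp())
    .write.format("delta").mode("append")
    .saveAsTable("lakehouse.silver.fund_transactions"))
```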
Multi-Platform Data Architecture
- Build and maintain relational schemas for Snowflake and other RDBMS systems, integrating them with Lakehouse patterns.
- Design NoSQL structures for MongoDB, including document design, indexing, and query optimization (see the MongoDB sketch after this list).
- Establish a framework for deciding whether Databricks, Snowflake, or MongoDB is the correct platform based on workload characteristics.
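For the MongoDB side, a minimal document-design and indexing sketch with PyMongo; the connection string, database, collection, and field names are purely illustrative.

```python
from pymongo import MongoClient, ASCENDING, DESCENDING

# Hypothetical connection and namespace.
client = MongoClient("mongodb://localhost:27017")
positions = client["investments"]["fund_positions"]

# Embed the small, frequently co-read attributes in one document per fund/date.
positions.insert_one({
    "fund_id": "FUND-001",
    "as_of_date": "2024-03-31",
    "holdings": [
        {"asset_id": "A-100", "market_value": 1_250_000.00},
        {"asset_id": "A-200", "market_value": 830_500.00},
    ],
})

# Compound index matching the dominant query pattern (latest positions per fund).
positions.create_index([("fund_id", ASCENDING), ("as_of_date", DESCENDING)])
```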
Performance Optimization
- Recommend and refine query optimization, indexing, and partitioning strategies.
- Identify and resolve issues such as data skew, excessive small files, and inefficient Spark joins.
- Partner with engineering teams on Delta Lake maintenance processes (OPTIMIZE, VACUUM, ANALYZE); a brief maintenance sketch follows this list.
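A minimal sketch of the maintenance and join tactics above, assuming a Databricks notebook and hypothetical gold-layer table names; retention windows and Z-order columns would follow team policy.

```python
from pyspark.sql import functions as F

# Routine Delta Lake maintenance: compaction, old-file cleanup, and statistics.
spark.sql("OPTIMIZE gold.fct_fund_transactions ZORDER BY (fund_id, trade_date)")
spark.sql("VACUUM gold.fct_fund_transactions RETAIN 168 HOURS")  # keep 7 days of history
spark.sql("ANALYZE TABLE gold.fct_fund_transactions COMPUTE STATISTICS FOR ALL COLUMNS")

# Skewed joins against small dimensions can often be rewritten as broadcast joins.
facts = spark.table("gold.fct_fund_transactions")
funds = spark.table("gold.dim_fund")
enriched = facts.join(F.broadcast(funds), "fund_id")
```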
ETL/ELT Collaboration & Pipeline Alignment
- Work closely with Data Engineers to ensure models integrate smoothly with ETL/ELT pipelines built with Auto Loader, Delta Live Tables, and Spark workflows.
- Provide guidance on data mapping, transformation logic, schema evolution, and load patterns.
- Design SCD implementations using Delta Lake MERGE operations and Change Data Feed (a simplified sketch follows this list).
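A simplified SCD Type 2 sketch using the Delta Lake Python API; gold.dim_fund, fund_id, fund_name, and updates_df are hypothetical, and a production version would also cover surrogate keys, late-arriving records, and Change Data Feed consumption.

```python
from delta.tables import DeltaTable
from pyspark.sql import functions as F

# updates_df: latest source snapshot with fund_id and fund_name (hypothetical).
dim = DeltaTable.forName(spark, "gold.dim_fund")
current = spark.table("gold.dim_fund").where("is_current").alias("t")

# Keep only new keys or rows whose tracked attribute changed.
changed = (updates_df.alias("s")
    .join(current, F.col("s.fund_id") == F.col("t.fund_id"), "left")
    .where("t.fund_id IS NULL OR s.fund_name <> t.fund_name")
    .select("s.*"))

# 1) Expire the superseded current rows.
(dim.alias("t")
    .merge(changed.alias("s"), "t.fund_id = s.fund_id AND t.is_current")
    .whenMatchedUpdate(set={"is_current": "false", "end_date": "current_date()"})
    .execute())

# 2) Append the new current versions (schema assumed to match the dimension).
(changed
    .withColumn("start_date", F.current_date())
    .withColumn("end_date", F.lit(None).cast("date"))
    .withColumn("is_current", F.lit(True))
    .write.format("delta").mode("append").saveAsTable("gold.dim_fund"))
```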
Data Governance & Standards
- Establish modeling standards, naming conventions, metadata practices, and governance frameworks.
- Implement row- and column-level security with Unity Catalog for sensitive investor and fund data (a brief sketch follows this list).
- Define and maintain end-to-end data lineage from ingestion through reporting.
- Help build a comprehensive metadata repository and data dictionary.
- Ensure compliance with regulatory and ethical data handling standards.
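A minimal governance sketch, assuming Unity Catalog with row filters and column masks enabled; the group names, function names, and tables (gold.fund_positions, gold.investors) are hypothetical.

```python
# Row filter: only members of a hypothetical group may see EMEA rows.
spark.sql("""
    CREATE OR REPLACE FUNCTION governance.security.emea_row_filter(fund_region STRING)
    RETURNS BOOLEAN
    RETURN is_account_group_member('emea_analysts') OR fund_region <> 'EMEA'
""")
spark.sql("""
    ALTER TABLE gold.fund_positions
    SET ROW FILTER governance.security.emea_row_filter ON (fund_region)
""")

# Column mask: redact investor tax IDs outside Fund Accounting.
spark.sql("""
    CREATE OR REPLACE FUNCTION governance.security.mask_tax_id(tax_id STRING)
    RETURNS STRING
    RETURN CASE WHEN is_account_group_member('fund_accounting') THEN tax_id ELSE '***' END
""")
spark.sql("""
    ALTER TABLE gold.investors
    ALTER COLUMN tax_id SET MASK governance.security.mask_tax_id
""")
```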
Documentation & Knowledge Sharing
- Create and maintain ERDs, data dictionaries, lineage diagrams, and model documentation.
- Document Lakehouse design patterns and platform best practices for internal knowledge sharing.
- Identify data quality issues early in the modeling process and collaborate on remediation strategies.
Education & Experience Requirements
Required
- Bachelor's or Master's in Computer Science, Engineering, Information Systems, or similar.
- 10+ years of experience in Data Modeling or Data Architecture.
- Expertise in dimensional modeling, 3NF, Data Vault, and NoSQL methodologies.
- Strong SQL capabilities with advanced query development experience.
- Proficiency with data modeling tools (ER/Studio, Erwin, DataGrip, SQL Developer, etc.).
Databricks Platform Requirements
- Hands-on experience designing medallion architectures in Databricks.
- Deep understanding of Delta Lake internals (ACID, time travel, schema evolution, Z-ordering, liquid clustering).
- Experience with Unity Catalog governance and lineage.
- Knowledge of Spark optimization and partitioning strategies.
- Familiarity with Databricks SQL Warehouses and Delta Live Tables.
Multi-Platform
- Experience modeling data in Databricks Lakehouse (required).
- Strong understanding of Azure (ADLS, Azure Databricks).
- Experience with ingestion patterns such as Kafka, Event Hubs, CDC, and APIs.
Preferred
- Experience in financial services, private equity, or alternative investments.
- Background building data architecture functions from scratch.
- Familiarity with investment data structures (fund hierarchies, NAV, capital calls, etc.).
- Experience supporting ML/AI use cases including feature engineering and RAG/GenAI data models.
Desirable Certifications
- Databricks Data Engineer (Associate/Professional)
- Microsoft Azure Data Engineer or Enterprise Data Analyst
- TOGAF (Enterprise Architecture)
All qualified applicants will receive consideration for employment without regard to race, color, national origin, age, ancestry, religion, sex, sexual orientation, gender identity, gender expression, marital status, disability, medical condition, genetic information, pregnancy, or military or veteran status. We consider all qualified applicants, including those with criminal histories, in a manner consistent with state and local laws, including the California Fair Chance Act, City of Los Angeles' Fair Chance Initiative for Hiring Ordinance, and Los Angeles County Fair Chance Ordinance.