Technical Architect Data and Analytics

Newport Beach, CA, US • Posted 3 hours ago • Updated 3 hours ago
Full Time
No Travel Required
On-site
Depends on Experience

Job Details

Skills

  • Amazon S3
  • Artificial Intelligence
  • Data Engineering
  • Budget
  • DevOps
  • Git
  • Health Care
  • Informatica

Summary

Technical Architect — Data and Analytics

Location: Newport Beach, CA

Duration: Full Time

Job Description:

Tagline: Platform Architecture | Snowflake Data Mesh | AI-Augmented Delivery | Enterprise-Scale Modernization

Role Summary

We are looking for a seasoned Technical Architect to own end-to-end solution architecture for a Fortune 500 insurance and financial services enterprise's large-scale, multi-wave data modernization program. You will design and govern a Snowflake + AWS S3 + Matillion + dbt platform built on medallion architecture principles, define the ACORD-based enterprise data model, and set the technical standards that all delivery workstreams will follow. This is a hands-on architecture role: you will be deeply embedded in the delivery team, not a distant reviewer.

Why This Role Matters

This engagement is a generational transformation: migrating 372 production database instances and 80+ source systems into a unified, governed, cloud-native data platform over 18–24 months. The architecture you design will directly enable three of the client's most critical strategic programs: real-time operations transformation, new business underwriting modernization, and the Finance Oracle Cloud implementation. Getting the foundation right in Wave 1 determines whether Waves 2 and 3 can scale without replatforming.

What You'll Do

Architecture & Solution Design

  • Architect and deliver the enterprise data platform on Snowflake + AWS S3 using a medallion (Bronze–Silver–Gold) architecture, supporting 80+ source systems and a 7-year historical migration
  • Design the ACORD Life & Annuity-based enterprise data model, customized for insurance domains: Policy, Claims, Finance, Actuarial, Agent/Distribution, Customer/Party
  • Define the data mesh architecture with federated governance, domain ownership boundaries, and self-serve platform patterns for multi-wave delivery
  • Establish reusable ingestion templates (Matillion), dbt transformation frameworks, and Snowflake-native quality patterns (Data Metric Functions) as cross-program standards
  • Govern architecture decisions across all five parallel workstreams: Ingestion, Transformation, Data Quality/DRE, Consumption, and Governance/MDM

Data Platform & Engineering Standards

  • Design the data ingestion strategy for structured (SQL Server CDC, Oracle, SFTP, APIs) and unstructured (EHR, APS notes, underwriting documents) source systems
  • Define the dbt project structure, modular macro patterns, and Git-integrated version control standards for all transformation logic
  • Architect the Collibra integration strategy: automated catalog, end-to-end lineage (Matillion → dbt → Snowflake → Tableau/Power BI), business glossary, and certification workflows
  • Specify the Profisee MDM integration architecture: bi-directional Snowflake Silver-layer synchronization, golden record publishing, and insurance-specific match/merge patterns

AI-Driven Delivery & Acceleration

  • Embed WinWire's WinAIDM AI accelerator framework into the delivery model: AI-powered ingestion, quality validation, dbt transformation, and test case generation
  • Guide adoption of Snowflake Cortex AI capabilities (AI_EXTRACT, AI_CLASSIFY, AI_COMPLETE) for unstructured data processing within the platform's security perimeter
  • Define the CI/CD quality gate architecture: automated dbt tests, reconciliation validation (99.9%+ match), performance benchmarks, and lineage completeness checks
  • Champion AI-augmented engineering practices (GitHub Copilot, LLM-based accelerators) to drive a 40–50% reduction in development effort across the team

Technical Leadership & Governance

  • Lead technical design reviews, architecture decision records (ADRs), and code standards across all engineering workstreams
  • Mentor senior engineers, technical leads, and data modelers, building a high-performing delivery team capable of sustaining the platform post-engagement
  • Drive the Data Reliability Engineering (DRE) framework: SLO/SLI definition, error budgets, automated monitoring, and incident response patterns for all certified data products
  • Participate in the three-tier governance model, representing architecture at the Program Management and Steering Committee levels

Client Engagement

  • Translate complex technical architecture into clear, decision-ready recommendations for client technology and business stakeholders
  • Proactively surface trade-offs (performance vs. cost, speed vs. governance) and recommend options, acting as a trusted guide, not just an executor
  • Collaborate with the customer's data engineering, infrastructure, and domain SME teams to align platform decisions with business outcomes

Tech Stack Snapshot

Snowflake, AWS S3, Matillion, dbt (data build tool), Collibra, Profisee MDM, Python, SQL, Snowflake Cortex AI, WinAIDM, SnowConvert AI, Tableau, Power BI, CI/CD (Azure DevOps), Git, ACORD Data Model

Must-Have Skills

  • 12+ years in data engineering and analytics, with 3+ years in a solution/technical architect role on enterprise-scale programs
  • Deep, hands-on expertise in Snowflake: query optimization, clustering, Data Metric Functions, Snowpipe, Streams, and native AI capabilities
  • Proven experience designing medallion/lakehouse architectures with AWS S3 as the raw data lake layer
  • Strong command of dbt: project structure, macros, testing frameworks, and CI/CD integration
  • Experience architecting data governance solutions using Collibra: catalog, lineage, business glossary, and certification workflows
  • Demonstrated ability to lead multi-wave, multi-workstream data modernization programs in a regulated (insurance, healthcare, or financial services) environment
  • Hands-on experience migrating legacy ETL (Informatica or SSIS) to modern dbt/Matillion pipelines

Good to Have

  • Familiarity with ACORD Life & Annuity data standards and insurance domain concepts (Policy, Claims, Actuarial, Reinsurance)
  • Experience with Profisee MDM or equivalent enterprise MDM platforms
  • Exposure to Snowflake Cortex AI, AI-assisted development tools (GitHub Copilot, Azure OpenAI), or LLM-based data engineering accelerators
  • SnowPro Advanced certification (Data Engineer or Architect)

Best Regards,

Abdul Samad

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: 10122703
  • Position Id: 8931693