Job Title: Senior Data Modeler
Location: Erie, PA – Onsite
Long Term
Must have deep expertise in Property & Casualty (P&C) Insurance
Job Summary: We are seeking a highly skilled Senior Data Modeler with deep expertise in Property & Casualty (P&C) Insurance and strong capabilities in data analysis, data modeling, governance, and schema management. The role requires hands-on experience analyzing complex data structures, ensuring data quality, and designing scalable data models aligned with industry standards such as ACORD and with enterprise canonical data models. Strong familiarity with policy administration platforms such as OneShield and other Policy Management Systems (PMS) is also required.
Key Responsibilities:
- Work closely with business stakeholders to translate P&C insurance domain requirements (policy/quote/rating lifecycle, endorsements, claims, billing) into scalable data solutions.
- Perform in-depth data analysis and profiling to assess data quality, completeness, and consistency across multiple systems.
- Design and implement robust data models, including:
  - Relational and Denormalized Data Modeling
  - Dimensional Modeling (Star/Snowflake schemas)
  - Data Vault Modeling (Hubs, Links, Satellites)
- Apply Slowly Changing Dimensions (SCD Types 1, 2, 3, etc.), with a clear understanding of when to use each based on business requirements; a minimal Type 2 sketch follows this list.
- Support Change Data Capture (CDC) processes to enable efficient incremental data ingestion and synchronization.
- Conduct data balancing and auditing to ensure reconciliation between source and target systems.
- Analyze and process XML and JSON data structures, including troubleshooting and transformation.
- Maintain and enhance XSDs and Java-based schemas, addressing:
  - Schema evolution and schema drift
  - Backward/forward compatibility
- Work with and map data to industry-standard data models (e.g., ACORD) and design/maintain enterprise canonical data models for system interoperability.
- Integrate and analyze data from policy administration systems such as OneShield and other PMS platforms, ensuring alignment with downstream data models and reporting layers.
- Collaborate with engineering teams to ensure proper schema versioning, canonical modeling, and governance practices.
- Define and enforce data governance standards, including metadata management, lineage, and data quality rules.
- Support regulatory and reporting requirements specific to the insurance industry.
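
For context on the SCD expectation above, here is a minimal Type 2 sketch in PostgreSQL-flavored SQL. The tables (stg_policy, dim_policy) and all column names are hypothetical, used only to show the pattern: Type 1 would overwrite in place, while Type 2, shown here, preserves history by expiring the old row and inserting a new current version.

```sql
-- Hypothetical staging and dimension tables; names/columns are illustrative only.
-- Step 1: expire the current dimension row when a tracked attribute changed.
UPDATE dim_policy AS d
SET    effective_to = CURRENT_DATE,
       is_current   = FALSE
FROM   stg_policy AS s
WHERE  d.policy_id  = s.policy_id
  AND  d.is_current = TRUE
  AND (d.status         IS DISTINCT FROM s.status
   OR  d.annual_premium IS DISTINCT FROM s.annual_premium);

-- Step 2: insert a new "current" version for changed and brand-new policies.
INSERT INTO dim_policy (policy_id, status, annual_premium,
                        effective_from, effective_to, is_current)
SELECT s.policy_id, s.status, s.annual_premium,
       CURRENT_DATE, DATE '9999-12-31', TRUE
FROM   stg_policy AS s
LEFT JOIN dim_policy AS d
       ON d.policy_id  = s.policy_id
      AND d.is_current = TRUE
WHERE  d.policy_id IS NULL;  -- no live row: either new, or just expired in Step 1
```

The same incremental pattern is what a CDC feed typically drives, with the staging table holding only changed records per load.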
Required Skills & Qualifications:
- Strong experience with Data Analysis and Data Profiling tools and techniques
- Expertise in Dimensional Modeling, Data Vault 2.0, and Canonical Data Modeling, with hands-on experience using tools such as erwin Data Modeler and Hackolade
- Solid understanding of Change Data Capture (CDC) frameworks and tools
- Deep knowledge of SCD types and practical implementation strategies
- Hands-on experience with XML, XSD, and JSON data structures, using XML IDEs such as Altova XMLSpy
- Deep understanding of mainframe data structures (e.g., segments, copybooks)
- Experience in schema management, including schema evolution and drift handling
- Proficiency in data auditing, reconciliation, and balancing techniques (a minimal balancing sketch follows this list)
- Strong SQL and XQuery skills and experience working with large datasets
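
To illustrate the balancing expectation above: a minimal source-to-target reconciliation sketch (PostgreSQL-flavored; src_claims, dw_claims, and all column names are assumed). It surfaces any load date where row counts or paid amounts disagree between source and warehouse.

```sql
-- Hypothetical source and warehouse tables; returns only out-of-balance dates.
SELECT COALESCE(s.load_date, t.load_date)        AS load_date,
       s.row_cnt                                 AS source_rows,
       t.row_cnt                                 AS target_rows,
       COALESCE(s.total_paid, 0)
         - COALESCE(t.total_paid, 0)             AS paid_amount_variance
FROM  (SELECT load_date, COUNT(*) AS row_cnt, SUM(paid_amount) AS total_paid
       FROM   src_claims GROUP BY load_date) AS s
FULL JOIN
      (SELECT load_date, COUNT(*) AS row_cnt, SUM(paid_amount) AS total_paid
       FROM   dw_claims GROUP BY load_date) AS t
  ON  s.load_date = t.load_date
WHERE s.row_cnt    IS DISTINCT FROM t.row_cnt
   OR s.total_paid IS DISTINCT FROM t.total_paid;
```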
Data Standards, Systems & Integration:
- Experience working with industry-standard data models such as ACORD
- Ability to map, transform, and align source-system data to canonical data models (illustrated in the sketch after this list)
- Strong understanding of policy administration systems, specifically OSPAS and PMS
- Knowledge of policy/quote/rating lifecycle data flows (quote → bind → issue → endorsement → renewal → cancellation)
- Understanding of data integration patterns across upstream PMS and downstream analytics platforms
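
To make the canonical-mapping expectation concrete, here is a sketch of a view that reshapes a hypothetical PMS extract into ACORD-flavored canonical fields. All table and column names are assumptions, and the ACORD-style target names are indicative only, not an authoritative mapping.

```sql
-- Canonical policy view over a hypothetical upstream PMS extract.
CREATE VIEW canonical_policy AS
SELECT pol_no                 AS policy_number,          -- cf. ACORD PolicyNumber
       lob_code               AS lob_cd,                 -- cf. ACORD LOBCd
       eff_dt                 AS contract_effective_dt,
       exp_dt                 AS contract_expiration_dt,
       CASE pol_status                                   -- normalize source codes
            WHEN 'A' THEN 'InForce'
            WHEN 'C' THEN 'Cancelled'
            ELSE 'Unknown'
       END                    AS policy_status_cd,
       written_prem / 100.0   AS written_premium_amt     -- assumed: source stores cents
FROM   pms_policy_extract;
```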
Data Governance & Quality:
- Experience implementing data governance frameworks
- Knowledge of data lineage, cataloging, and metadata management
- Familiarity with data quality tools and validation strategies
Domain Expertise:
- Strong experience in Property & Casualty (P&C) Insurance, including:
  - Policy, Claims, Billing, and Underwriting data domains
  - Regulatory and reporting requirements
Preferred Qualifications:
- Experience with modern data platforms (Snowflake, Redshift, etc.)
- Familiarity with ETL/ELT tools (AWS Glue, Informatica, dbt, etc.)
- Programming experience (Python) for schema and data processing tasks
- Knowledge of event-driven architectures and streaming data (Kafka, etc.)
- Prior experience implementing ACORD-based integrations or canonical data hubs
- Hands-on experience working with OneShield data structures, APIs, or data extracts