Data Modelling Architect

Overview

On Site
$65
Accepts corp to corp applications

Skills

Underwriting
Python
IronPython
R
JavaScript
PL/SQL
Stored Procedures
Performance Tuning
Agile
Scrum
Automated Testing
Windows PowerShell
Microsoft Visual Studio
Git
Microsoft Azure
Microsoft TFS
Compensation Management
Insurance
Financial Reporting
Data Modeling
Data Marts
Extraction
Extract, Transform, Load (ETL)
Business Intelligence
Dashboard
TIBCO Spotfire
Analytics
Data Extraction
Data Warehouse
Data Science
Software Development Methodology
Project Management
Enterprise Architecture
IT Governance
Documentation

Job Details

Job Title: Data Modelling Architect - Job ID: HBITS-07-14522
Location: Albany, NY 12205 - Hybrid
Duration: 30 Months
Experience: 12+ Years (Passport Copy Mandatory)
Mandatory Qualifications:
  • 72 months of experience in data modelling for underwriting and claims data marts and enterprise data warehouses within the insurance industry
  • 72 months of experience developing and maintaining predictive systems using R or Python with TIBCO Spotfire
  • 72 months of experience using TIBCO Spotfire Analyst, Web Player, and Business Author with IronPython, R, and JavaScript
  • 72 months of experience designing, documenting, and maintaining ETL pipelines using IBI Data Migrator
  • 72 months of experience in Oracle PL/SQL development, including stored procedures, triggers, and performance tuning
  • 72 months of experience working in Agile/Scrum environments
  • 72 months of experience with automated testing, build, and deployment tools such as TFS, PowerShell, and Visual Build Pro
  • 72 months of experience with development tools including Visual Studio, Visual Studio Code, Git, Azure, and TFS
  • 72 months of experience in workers' compensation, disability benefits, insurance systems, and financial reporting
  • Bachelor's Degree
Day-to-Day Responsibilities:
  • Provide recommendations, guidance, and hands-on implementation for new and existing data marts
  • Participate in all phases of data warehouse and business intelligence project lifecycles
  • Establish and document best practices and standards for data modelling and architecture
  • Create and maintain documentation for data warehouses, data marts, and ETL processes
  • Design, develop, test, and maintain source system extraction processes
  • Develop and maintain ETL pipelines and BI dashboards
  • Build and support Spotfire analytics, prediction models, and automation services
  • Automate data extraction and loading processes into the enterprise data warehouse
  • Operationalize predictive models developed by data science teams
  • Adhere to NYSIF standards for SDLC, project management, enterprise architecture, and IT governance
Required Documentation:
  • Resume
  • Copy of Candidate Identification (Driver's License, Visa, or Passport, as applicable)
  • Supporting documents for qualifications such as certifications or degrees
Thanks and regards,
Pavan || Lead Technical Recruiter

About Maagsoft Inc.