Data Engineer with Banking Experience

Overview

Hybrid
$50 - $60
Contract - W2
Able to Provide Sponsorship

Skills

Reinsurance

Job Details

Role: Data Engineer
Location: Princeton, NJ (hybrid, onsite 3 days a week)
Duration: Long Term
Job Description:
The client is looking for candidates with Reinsurance and Actuarial experience.
Technical Skills:
Business Intelligence: Azure Data Factory (ADF), Azure Databricks, Azure Analysis Services (SSAS), Azure Data Lake Analytics, Azure Data Lake Store (ADLS), Azure Integration Runtime, Azure Event Hubs, Azure Stream Analytics, DBT
Database Technologies: Azure SQL, MongoDB, PySpark
Data Engineering (Must Have)
Experience implementing Microsoft BI/Azure BI solutions such as Azure Data Factory, Azure Databricks, Azure Analysis Services, SQL Server Integration Services, and SQL Server Reporting Services. Strong understanding of Azure big data technologies such as Azure Data Lake Analytics, Azure Data Lake Store, and Azure Data Factory, and of moving data from flat files and SQL Server using U-SQL jobs.
  • Expertise in data warehouse development from inception through implementation and ongoing support, with a strong understanding of BI application design and development principles using normalization and de-normalization techniques. Experience developing the staging zone and the bronze, silver, and gold layers of data.
  • Good knowledge of implementing business rules for data extraction, transformation, and loading (ETL) between homogeneous and heterogeneous systems using Azure Data Factory (ADF).
  • Experience developing Databricks notebooks that move data from the raw zone to the stage zone and then to the curated zone (see the PySpark sketch after this list).
  • Experience developing complex Azure Analysis Services tabular models, deploying them in Microsoft Azure, and scheduling cube processing through an Azure Automation Runbook (a refresh example also follows this list).
  • Extensive experience developing tabular and multidimensional SSAS cubes, aggregations, KPIs, measures, and cube partitions, building data mining models, and deploying and processing SSAS objects.
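For orientation, here is a minimal PySpark sketch of the raw-to-curated (bronze/silver/gold) movement described above, assuming Delta Lake on Databricks; all paths, column names, and cleansing rules are hypothetical placeholders, not the client's actual pipeline.

```python
# Minimal medallion sketch (bronze -> silver -> gold) on Databricks.
# All paths, table names, and rules are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: land the raw extract as-is to preserve source fidelity.
raw = spark.read.option("header", "true").csv("/mnt/raw/policies/")
raw.write.format("delta").mode("append").save("/mnt/bronze/policies")

# Silver: cleanse and conform -- dedupe, type columns, drop bad keys.
silver = (
    spark.read.format("delta").load("/mnt/bronze/policies")
    .dropDuplicates(["policy_id"])
    .withColumn("premium", F.col("premium").cast("decimal(18,2)"))
    .filter(F.col("policy_id").isNotNull())
)
silver.write.format("delta").mode("overwrite").save("/mnt/silver/policies")

# Gold: aggregate into a reporting-ready shape.
gold = silver.groupBy("line_of_business").agg(
    F.sum("premium").alias("total_premium")
)
gold.write.format("delta").mode("overwrite").save("/mnt/gold/premium_by_lob")
```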
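The Runbook-scheduled cube processing mentioned above is often implemented against the Azure Analysis Services asynchronous refresh REST API; the sketch below shows one hedged way to trigger it from Python, with the region, server, and model names as placeholders.

```python
# Hedged sketch: trigger an Azure Analysis Services model refresh via the
# asynchronous refresh REST API -- the kind of call an Automation Runbook
# would schedule. Region, server, and model names are placeholders.
import requests
from azure.identity import DefaultAzureCredential

REGION = "eastus"           # placeholder: AAS server region
SERVER = "myaasserver"      # placeholder: Analysis Services server name
MODEL = "ReinsuranceModel"  # placeholder: tabular model name

# Token for the Azure Analysis Services resource (managed identity,
# environment credentials, or Azure CLI login, per DefaultAzureCredential).
token = DefaultAzureCredential().get_token(
    "https://*.asazure.windows.net/.default"
).token

url = f"https://{REGION}.asazure.windows.net/servers/{SERVER}/models/{MODEL}/refreshes"
resp = requests.post(
    url,
    json={"Type": "Full", "CommitMode": "transactional", "MaxParallelism": 2},
    headers={"Authorization": f"Bearer {token}"},
)
resp.raise_for_status()  # 202 Accepted on success
print("Poll for status at:", resp.headers.get("Location"))
```

A production Runbook would typically authenticate with a managed identity or service principal and poll the returned Location header until the refresh completes.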
Domain Knowledge (Preferred)
Experience with actuarial tools or insurance is preferred. The intent is familiarity with data terminology and the hierarchy of data in the insurance domain, specifically in the areas below:
  • Familiarity with reinsurance broking data, including placements, treaty structures, client hierarchies, and renewal workflows.
  • Understanding of actuarial rating inputs and outputs, including exposure and experience data, layers, tags, and program structures.
  • Experience building data pipelines that support actuarial analytics, pricing tools, and downstream reporting for brokers and clients (a minimal pipeline sketch follows this list).
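As a hedged illustration of such a pipeline, the sketch below rolls claims experience up to a hypothetical excess-of-loss treaty layer with PySpark; the schema, paths, and layer bounds are invented for the example, not taken from the client's systems.

```python
# Illustrative only: roll claims experience up to a hypothetical
# excess-of-loss treaty layer as an input to actuarial pricing.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("layer-experience-sketch").getOrCreate()

# Hypothetical claims experience table produced by an upstream pipeline.
claims = spark.read.format("delta").load("/mnt/silver/claims")

ATTACHMENT, LIMIT = 5_000_000, 5_000_000  # hypothetical layer: 5M xs 5M

# Loss ceded to the layer: min(max(loss - attachment, 0), limit).
layered = claims.withColumn(
    "ceded_loss",
    F.least(
        F.greatest(F.col("gross_loss") - F.lit(ATTACHMENT), F.lit(0)),
        F.lit(LIMIT),
    ),
)

# Aggregate to the grain an actuarial pricing tool might consume.
experience = layered.groupBy("treaty_id", "underwriting_year").agg(
    F.sum("ceded_loss").alias("layer_loss"),
    F.count("*").alias("claim_count"),
)
experience.write.format("delta").mode("overwrite").save("/mnt/gold/layer_experience")
```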
Team Skills
  • Team builder with strong analytical and interpersonal skills, good knowledge of the Software Development Life Cycle (SDLC), and proficiency in technical writing.
  • Experience with Agile software development and the Scrum methodology.
  • Ability to work independently and as part of a team to accomplish critical business objectives, with good decision-making skills in high-pressure, complex scenarios.
