Job Details
Role: Data Engineer
Location: Princeton (onsite 3 days a week)
Technical Skills:
Azure Data Factory (ADF), Azure Databricks, Azure Analysis Services (SSAS), Azure Data Lake Analytics, Azure Data Lake Store (ADLS), Azure Integration Runtime, Azure Event Hubs, Azure Stream Analytics, DBT, Azure SQL, MongoDB, PySpark
Experience implementing Microsoft BI/Azure BI solutions such as Azure Data Factory, Azure Databricks, Azure Analysis Services, SQL Server Integration Services, and SQL Server Reporting Services. Strong understanding of Azure big data technologies such as Azure Data Lake Analytics, Azure Data Lake Store, and Azure Data Factory, including moving data from flat files and SQL Server using U-SQL jobs.
Good knowledge of data Extraction, Transformation, and Loading (ETL) using Azure Data Factory (ADF).
Extensive experience developing tabular and multidimensional SSAS cubes, including aggregations, KPIs, measures, cube partitioning, and data mining models, and deploying and processing SSAS objects. Strong understanding of BI application design and development principles, including normalization and de-normalization techniques.
Understanding of actuarial rating inputs and outputs, including exposure and experience data, layers, tags, and program structures. Experience building data pipelines that support actuarial analytics, pricing tools, and downstream reporting for brokers and clients.
Experience with Agile software development and the Scrum methodology. Ability to work independently and as part of a team to accomplish critical business objectives, with sound decision-making skills in high-pressure, complex scenarios.