Job Details
Job Title: ETL/Data Architect
Location: Jersey City, New Jersey (Day 1 onsite)
Duration: Long Term
Primary Skills: Informatica PowerCenter, Azure Data Factory (ADF)
Good to have: Guidewire
About the Role:
We are seeking an experienced Data Architect to lead and design modern data solutions for a Property & Casualty (P&C) customer undergoing a major data modernization initiative involving Guidewire Claim Data Access (CDA). The ideal candidate will possess strong technical expertise, hands-on experience, and excellent communication skills to deliver enterprise-grade data solutions using Azure and Informatica. This role requires a proactive problem solver who can troubleshoot and optimize complex data pipelines and workflows for maximum efficiency and reliability.
Key Responsibilities:
Architect and implement enterprise metadata-driven data pipelines using ETL tools such as Azure Data Factory (ADF) and Informatica.
Design and develop an Operational Data Store (ODS) sourced from Azure Data Lake, ensuring robust, scalable, and high-performing architecture.
Collaborate with stakeholders to integrate and optimize Guidewire Data (CDA) into the data lake architecture, enabling advanced analytics and reporting.
Troubleshoot and resolve issues in data pipelines, workflows, and related processes to ensure reliability and data accuracy.
Continuously monitor and optimize current workflows for performance, scalability, and cost-efficiency, adhering to best practices.
Develop and maintain custom processes using Python, T-SQL, and Spark, tailored to business requirements.
Leverage Azure Functions to design serverless compute solutions for event-driven and scheduled data workflows.
Optimize data workflows and resource usage to ensure cost-efficiency in Azure Cloud environments.
Provide leadership and guidance for implementing Hadoop-based big data solutions where applicable.
Develop a comprehensive understanding of P&C domain data, ensuring alignment with business objectives and compliance requirements.
Communicate technical solutions effectively with cross-functional teams, stakeholders, and non-technical audiences.
Required Qualifications:
13+ years of experience in data architecture, data engineering, and/or ETL development roles.
Proven experience with Azure Cloud Services, including Azure Data Lake, Azure Data Factory, and SQL Server.
Proven ability to leverage Informatica for robust ETL workflows, data integration, and metadata-driven pipeline automation to streamline data processing.
Experience building end-to-end metadata-driven frameworks and continuously optimizing existing workflows for improved performance, scalability, and efficiency.
Expertise in troubleshooting and optimizing data pipelines and workflows for enhanced reliability and performance.
Proficiency in scripting and programming with Python, T-SQL, and Spark for custom data workflows.
Hands-on expertise in building and managing ODS systems from data lakes.
Demonstrated ability to design solutions for Azure Cloud Cost Optimization.
Excellent communication skills to engage with technical and business stakeholders effectively.
Torque Technologies LLC is an Equal Opportunity Employer (EOE). Qualified applicants are considered for employment without regard to age, race, color, religion, sex, national origin, sexual orientation, disability, or veteran status.