Data Architect

New York, NY, US • Posted 1 hour ago • Updated 1 hour ago
Contract Independent
Contract W2
On-site
Depends on Experience
Job Details

Skills

  • Data Architect
  • Azure
  • Azure Data Factory
  • Azure Synapse
  • Data Governance
  • AI Governance

Summary

Primary Skills: SQL, Python, and cloud platforms. Experience in the property and casualty insurance industry or with industry-standard data quality tools is a plus.
Secondary Skills:
Description:

The Data Architect will work closely with the Data Governance Data Quality team to design the optimal architecture for data governance and to support data engineering needs, including managing GitHub repositories, building frameworks that automate and enable data quality assessments, and performing data quality evaluations across multiple systems. The candidate will also design robust data architectures to support scalable and integrated solutions, ensuring high-quality data for business analytics and operational processes. The ideal candidate will demonstrate strong problem-solving ability, adaptability to evolving requirements, and hands-on skills with SQL, Python, and cloud platforms. Experience in the property and casualty insurance industry or with industry-standard data quality tools is a plus.
Requirements
1. Data Architecture for Data Governance and AI Governance
o Proven experience designing and implementing scalable, robust, and secure enterprise data architectures that support data governance and AI governance initiatives.
o Hands-on expertise in data governance tools such as Microsoft Purview (or similar) for metadata management, data cataloging, and lineage tracing.
o Deep understanding of data modeling, schema design, and database normalization best practices.
o Ability to translate business, compliance, and AI governance requirements into technical data architectures.
o Experience developing and enforcing data management and AI governance policies and standards.
o Skilled in designing systems for data lineage visualization and impact analysis to support compliance and responsible AI use.
2. Python Experience:
o Solid experience developing data pipelines and ETL processes using Python.
o Familiarity with Python data processing libraries such as pandas and NumPy.
o Ability to write modular, well-documented, and maintainable code.
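To illustrate the kind of pipeline work this section describes, here is a minimal, hedged sketch of a pandas transform step; the dataset, column names, and cleaning rules are hypothetical examples, not part of this role's actual systems.

```python
import pandas as pd

def clean_policies(raw: pd.DataFrame) -> pd.DataFrame:
    """Illustrative ETL transform: normalize and validate a policy extract."""
    df = raw.copy()
    # Standardize column names for downstream joins
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    # Drop exact duplicates and rows missing the business key
    df = df.drop_duplicates().dropna(subset=["policy_id"])
    # Coerce types; invalid values become NaT/NaN instead of raising
    df["effective_date"] = pd.to_datetime(df["effective_date"], errors="coerce")
    df["premium"] = pd.to_numeric(df["premium"], errors="coerce")
    return df

raw = pd.DataFrame({
    "Policy ID": ["P-1", "P-1", None, "P-2"],
    "Effective Date": ["2024-01-01", "2024-01-01", "2024-02-01", "not a date"],
    "Premium": ["1200", "1200", "900", "x"],
})
clean = clean_policies(raw)
```

Modular, typed functions like this are what "maintainable code" tends to mean in practice: each step is testable on its own and safe to rerun.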
3. Cloud Platform Experience (Azure/AWS/Databricks):
o Hands-on experience with one or more cloud platforms: Microsoft Azure, Amazon Web Services, and/or Databricks.
o Familiarity with cloud-based data services (e.g., Azure Data Factory, Azure Synapse).
o Understanding of deploying, monitoring, and managing data infrastructure in the cloud.
4. SQL (Advanced Level):
o Proficiency in writing complex SQL queries for data extraction, transformation, and analysis.
o Experience with query performance tuning and optimizing large-scale data operations.
o Knowledge of relational database management systems (e.g., PostgreSQL, SQL Server).
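As a sketch of the analytical SQL level intended here, the query below uses a window function to pick the latest premium per policy. The table and data are hypothetical; SQLite (via Python's built-in sqlite3) stands in for the relational systems named above.

```python
import sqlite3

# In-memory database with a hypothetical premiums table
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE premiums (policy_id TEXT, eff_date TEXT, premium REAL);
    INSERT INTO premiums VALUES
        ('P-1', '2024-01-01', 1200),
        ('P-1', '2024-06-01', 1250),
        ('P-2', '2024-02-01', 900);
""")

# Latest premium per policy: rank rows within each policy by date, keep rank 1
rows = conn.execute("""
    SELECT policy_id, premium
    FROM (
        SELECT policy_id, premium,
               ROW_NUMBER() OVER (
                   PARTITION BY policy_id ORDER BY eff_date DESC
               ) AS rn
        FROM premiums
    )
    WHERE rn = 1
    ORDER BY policy_id
""").fetchall()
# rows -> [('P-1', 1250.0), ('P-2', 900.0)]
```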
5. GitHub and Version Control:
o Proficiency with version control systems, especially Git, and practical experience using GitHub for code collaboration and project management.
o Familiarity with branching strategies, pull requests, code reviews, and resolving merge conflicts.
o Experience managing code repositories, tracking issues, and contributing to or maintaining collaborative projects.
6. Data Quality Assessment:
o Experience implementing data validation, data cleaning, and quality checks within data pipelines.
o Familiarity with tools or frameworks for data profiling and data quality monitoring.
o Strong understanding of data integrity, accuracy, completeness, and consistency concepts.
o Capable of designing and executing data quality assessments in various systems.
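The quality dimensions listed above (completeness, uniqueness, and so on) can be made concrete with a small profiling sketch; the metric names and the claims dataset are illustrative assumptions, not a prescribed framework.

```python
import pandas as pd

def profile_quality(df: pd.DataFrame, key: str) -> dict:
    """Compute simple completeness and key-uniqueness metrics for a table."""
    n = len(df)
    return {
        "row_count": n,
        # Fraction of non-null values per column (completeness dimension)
        "completeness": {c: float(df[c].notna().mean()) for c in df.columns},
        # Distinct key values over total rows (uniqueness dimension)
        "key_uniqueness": float(df[key].nunique(dropna=True) / n) if n else 1.0,
    }

claims = pd.DataFrame({
    "claim_id": ["C1", "C2", "C2", "C4"],   # C2 duplicated
    "amount": [100.0, None, 250.0, 80.0],   # one missing amount
})
report = profile_quality(claims, key="claim_id")
```

Emitting metrics like these from each pipeline run is one common way to turn one-off checks into ongoing data quality monitoring.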
7. Good Communication Skills:
o Ability to effectively communicate technical concepts to both technical and non-technical stakeholders.
o Experience translating complex data findings and issues into clear, actionable insights.
o Collaborative team player able to gather requirements and present solutions clearly.
o Strong problem-solving skills and adaptability to changing business or technical needs.
o Ability to foster a culture of data quality and continuous improvement.

Additional Preferred Skills (Optional but Recommended):
o Knowledge of workflow orchestration tools (e.g., Azure Data Factory pipelines, AWS Step Functions).
o Basic experience creating reports and dashboards in Power BI.
o Strong problem-solving skills and keen attention to detail.
o Experience working in the property and casualty insurance industry.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: 91163673
  • Position Id: 8966874