Requirements
1. Data Architecture for Data Governance and AI Governance:
o Proven experience designing and implementing scalable, robust, and secure enterprise data architectures that support data governance and AI governance initiatives.
o Hands-on expertise in data governance tools such as Microsoft Purview (or similar) for metadata management, data cataloging, and lineage tracing.
o Deep understanding of data modeling, schema design, and database normalization best practices.
o Ability to translate business, compliance, and AI governance requirements into technical data architectures.
o Experience developing and enforcing data management and AI governance policies and standards.
o Skilled in designing systems for data lineage visualization and impact analysis to support compliance and responsible AI use.
2. Python Experience:
o Solid experience developing data pipelines and ETL processes using Python.
o Familiarity with Python data processing libraries such as pandas and NumPy.
o Ability to write modular, well-documented, and maintainable code.
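By way of illustration, a minimal sketch of the kind of pandas-based transformation step this requirement implies (the table and column names are invented for the example; a real pipeline would be schema-driven):

```python
import pandas as pd

def transform_orders(raw: pd.DataFrame) -> pd.DataFrame:
    """Clean a hypothetical orders extract: coerce types, drop unparseable rows."""
    df = raw.copy()
    # errors="coerce" turns unparseable values into NaT/NaN instead of raising
    df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
    # Drop rows where either key field failed to parse
    return df.dropna(subset=["order_date", "amount"])

raw = pd.DataFrame({
    "order_date": ["2024-01-05", "not-a-date", "2024-02-10"],
    "amount": ["100.50", "75.00", "oops"],
})
clean = transform_orders(raw)
print(len(clean))  # only the first row survives both parses
```

Keeping each step in a small, documented function like this is what "modular, well-documented, and maintainable" typically means in practice: each transformation can be unit-tested in isolation.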
3. Cloud Platform Experience (Azure/AWS/Databricks):
o Hands-on experience with one or more cloud platforms: Microsoft Azure, Amazon Web Services, and/or Databricks.
o Familiarity with cloud-based data services (e.g., Azure Data Factory, Azure Synapse).
o Understanding of deploying, monitoring, and managing data infrastructure in the cloud.
4. SQL (Advanced Level):
o Proficiency in writing complex SQL queries for data extraction, transformation, and analysis.
o Experience with query performance tuning and optimizing large-scale data operations.
o Knowledge of relational database management systems (e.g., PostgreSQL, SQL Server).
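As a sketch of the "complex SQL" level intended here, the query below uses a window function to rank rows within groups — run via Python's built-in sqlite3 for self-containment (window functions require SQLite 3.25+; the claims table is hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE claims (id INTEGER, region TEXT, paid REAL);
    INSERT INTO claims VALUES
        (1, 'east', 100.0), (2, 'east', 300.0),
        (3, 'west', 50.0),  (4, 'west', 250.0);
""")
# Rank claims by paid amount within each region (PARTITION BY = per-group window)
rows = conn.execute("""
    SELECT region, paid,
           RANK() OVER (PARTITION BY region ORDER BY paid DESC) AS rnk
    FROM claims
    ORDER BY region, rnk
""").fetchall()
print(rows)
```

The same pattern (window functions, CTEs, careful indexing) carries over to PostgreSQL and SQL Server, where performance tuning usually starts with reading the query plan.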
5. GitHub and Version Control:
o Proficiency with version control systems, especially Git, and practical experience using GitHub for code collaboration and project management.
o Familiarity with branching strategies, pull requests, code reviews, and resolving merge conflicts.
o Experience managing code repositories, tracking issues, and contributing to or maintaining collaborative projects.
6. Data Quality Assessment:
o Experience implementing data validation, data cleaning, and quality checks within data pipelines.
o Familiarity with tools or frameworks for data profiling and data quality monitoring.
o Strong understanding of data integrity, accuracy, completeness, and consistency concepts.
o Capable of designing and executing data quality assessments across diverse source systems.
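A minimal sketch of the completeness and validity checks this section describes, using only plain Python (field names and the non-negative-premium rule are invented for illustration):

```python
def quality_report(records, required=("policy_id", "premium")):
    """Summarize completeness and a basic validity rule for dict records."""
    total = len(records)
    report = {}
    for field in required:
        present = sum(1 for r in records if r.get(field) is not None)
        report[field] = {"completeness": present / total if total else 0.0}
    # Hypothetical validity rule: premiums must be non-negative numbers
    invalid = [r for r in records
               if not isinstance(r.get("premium"), (int, float))
               or r["premium"] < 0]
    report["invalid_premium_rows"] = len(invalid)
    return report

rows = [
    {"policy_id": "P1", "premium": 120.0},
    {"policy_id": "P2", "premium": -5.0},   # fails the validity rule
    {"policy_id": None, "premium": 80.0},   # fails completeness
]
print(quality_report(rows))
```

Embedding checks like these in a pipeline — and failing loudly when a threshold is breached — is the core of the integrity, accuracy, completeness, and consistency concepts listed above.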
7. Good Communication Skills:
o Ability to communicate technical concepts effectively to both technical and non-technical stakeholders.
o Experience translating complex data findings and issues into clear, actionable insights.
o Collaborative team player able to gather requirements and present solutions clearly.
o Strong problem-solving skills and adaptability to changing business or technical needs.
o Ability to foster a culture of data quality and continuous improvement.
Additional Preferred Skills:
o Knowledge of workflow orchestration tools (e.g., Azure Data Factory pipelines, AWS Step Functions).
o Basic experience creating reports and dashboards in Power BI.
o Keen attention to detail.
o Experience working in the property and casualty insurance industry.