Job Summary:
We are seeking a Technical Data Engineer with strong expertise in Python to design, develop, and maintain enterprise-grade data solutions supporting regulatory and risk reporting. This role involves building scalable data pipelines, developing microservices and APIs, and designing data models to support analytics and reporting platforms. The ideal candidate will work with Snowflake, PostgreSQL, and Power BI while ensuring data quality, performance, and compliance.

Key Responsibilities:
- Analyze and profile enterprise data stored in Snowflake, ensuring data quality, lineage, and accuracy.
- Design and develop ETL/ELT pipelines for data movement and transformation across systems.
- Build and maintain Python-based microservices and Flask APIs for data ingestion, processing, and integration.
- Implement business rules, transformation logic, and regulatory reporting computations.
- Design and optimize PostgreSQL data models, schemas, stored procedures, and queries.
- Develop and support reporting solutions, including Power BI dashboards and semantic models.
- Ensure data quality, validation, and integrity across data pipelines and reporting layers.
- Optimize pipelines and reporting systems for performance, scalability, and reliability.
- Integrate CI/CD pipelines for automated build, testing, and deployment.
- Collaborate with cross-functional teams, including architects, analysts, QA, and domain experts.
- Support data governance, metadata management, and compliance with regulatory requirements.
- Leverage AI tools to enhance development, testing, and data analysis processes.
- Participate in technical discussions, code reviews, and adoption of best practices.

Required Qualifications:
- 5+ years of experience in technical leadership roles.
- 5+ years of experience working with Snowflake for data analysis, profiling, and quality assurance.
- 5+ years of experience designing and implementing ETL/ELT pipelines across Snowflake, PostgreSQL, or similar platforms.
- 7+ years of experience developing Python-based microservices and REST APIs (Flask preferred).
- Strong experience with PostgreSQL, including schema design, stored procedures, and performance tuning.
- Experience implementing authentication, authorization, logging, and error handling in APIs.
- 5+ years of experience with Power BI, including data modeling, DAX, and dashboard development.
- Strong SQL and data analysis skills.
- Experience with CI/CD pipelines and DevOps practices.
- Strong analytical, problem-solving, and communication skills.

Preferred Qualifications:
- Experience ensuring data accuracy, validation, and reconciliation in complex data environments.
- Experience using AI tools such as Copilot, Claude, or similar for development and analysis.
- Experience supporting regulatory or risk reporting environments.

Education:
- Bachelor's degree
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
- Dice Id: compun
- Position Id: BHADC5779334