Overview
Skills
Big data ecosystem: good understanding of Hadoop and Cloudera, with exposure to Hive, Impala, YARN, and Kafka.
Job Details
Architect, implement, and optimize end-to-end solutions for large-scale datasets, ensuring scalability and efficiency.
Drive cross-functional collaboration, bridging technical teams and business stakeholders.
Oversee data pipeline design, model development, and deployment with a focus on scalability and accuracy.
Drive compliance with policies/standards (e.g., data quality) and adoption of data governance capabilities.
Provide mentorship to teams, fostering innovation and excellence in delivery.
Influence and drive the adoption of front-end solutions to support data visualization and user experiences.
Qualifications:
12+ years of experience.
Advanced degree in Computer Science, Data Science, or a related field.
Extensive hands-on experience with Java, Python, and Hadoop big data frameworks.
Proven expertise in designing and managing large-scale data ecosystems, data integration, and data mapping.
Proficiency in front-end development for data visualization or user-centric tools.
Familiarity with cloud platforms (AWS, Azure, or Google Cloud Platform) and modern data infrastructure.
Exceptional leadership, communication, and strategic thinking skills to manage cross-functional teams and deliver impactful solutions.
Ability to thrive in fast-paced, high-pressure financial environments.