The Company is a leading global financial services firm providing a wide range of investment banking, securities, investment management and wealth management services. The Firm's employees serve clients worldwide, including corporations, governments and individuals, from more than 1,200 offices in 43 countries. As a market leader, the talent and passion of our people are critical to our success. Together, we share a common set of values rooted in integrity, excellence and a strong team ethic. The Company can provide a superior foundation for building a professional career - a place for people to learn, to achieve and to grow.
A philosophy that balances personal lifestyles, perspectives and needs is an important part of our culture. The mission of the Global Technology division is to provide a highly reliable and commercial technology platform, which supports the Firm's strategy, delivered by an innovative, world-class team of professionals.
The Enterprise Systems Management (ESM) team is responsible for engineering enterprise systems and tools for systems management across Technology and Data. These systems include:
*Data Management: asset and configuration management
*Service Management: automation, orchestration, service catalog, and problem, incident, change, capacity, and performance management
*Event Management: monitoring, log collection, correlation, and analysis
*Visualization: analytics and user interfaces
ESM is a global organization with team members spanning three continents. Our clients are the Company's internal users.
The candidate will be part of the global Enterprise Monitoring team, which focuses on building the next generation of the Plant Monitoring tools. The candidate will be involved in the development of new system features and components of the real-time data pipeline, in which logs, events, metrics and reference data are streamed and processed.
The features range from integration with external data sources (such as AWS) to real-time aggregation, alerting, transport to third-party consumers, and archiving.
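To give a flavor of the real-time aggregation and alerting work described above, here is a minimal, illustrative sketch in Python of a sliding-window aggregator that fires an alert when the average of recent metric samples crosses a threshold. The class name, window size, and threshold are hypothetical examples, not the Firm's actual tooling; in production such logic would typically run inside a stream-processing framework.

```python
from collections import deque

class MetricWindow:
    """Sliding-window aggregator: alerts when the window average exceeds a threshold.

    Illustrative only -- names and parameters are assumptions, not the
    team's real monitoring pipeline.
    """

    def __init__(self, size, threshold):
        self.threshold = threshold
        self.size = size
        self.values = deque(maxlen=size)  # oldest sample is evicted automatically

    def ingest(self, value):
        """Add one metric sample; return True if a full window's average alerts."""
        self.values.append(value)
        avg = sum(self.values) / len(self.values)
        return len(self.values) == self.size and avg > self.threshold

# Example: CPU-utilization samples with a 3-sample window and a 90% threshold.
window = MetricWindow(size=3, threshold=90.0)
alerts = [window.ingest(v) for v in [85, 95, 99, 40]]
print(alerts)  # → [False, False, True, False]
```

The third sample alerts because the window [85, 95, 99] averages 93; the fourth does not, since the deque evicts 85 and the new window [95, 99, 40] averages 78.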
The role requires someone who is self-motivated, a quick learner and comfortable working across numerous technologies, and who can take ownership of critical problems and work through the full project lifecycle, from problem analysis to successful, timely delivery of the solution.
The candidate should expect to work in a global virtual team, sometimes across multiple time zones. The ideal candidate is a self-motivated team player, committed to continuous delivery in an agile development environment and ready to learn new technologies and apply them to real business needs.
*Java or Python programming experience, including streaming and messaging
*Experience with high-throughput, low-latency platforms for handling real-time data feeds, such as Apache Kafka
*Excellent problem-solving, design, development, and debugging skills
*Excellent written and verbal communication skills
*Ability to rapidly learn new things
*Bachelor’s, master’s, or Ph.D. in computer science, human-computer interaction, design, statistics, or a related field
*Experience with event-processing frameworks, such as Spark, Kafka Streams, or Storm
*Experience with AWS logging/monitoring tooling
*Experience with data-processing systems such as Splunk, Hadoop, or Elasticsearch