Sr. Pipeline and Big Data Engineer
Remote role – 12+ months
Skills: Java, Python, Spring, REST microservices, Scala/Spark/Airflow
You will work with a team of Software Engineers to create next-generation security products that enhance the automatic detection and investigation of security breaches within large organizations. You will work with Architects and other Cloud Infrastructure Engineers to understand designs, validate features, and deliver and integrate solutions in an AWS environment. If you are passionate about building security solutions that keep millions of enterprise customers safe with an outstanding user experience, this position might be a perfect opportunity for you. You will report to the Sr. Manager, Software Engineering and will be based in San Jose, CA.
Organizations and governments around the world may have different priorities and transformation initiatives, but all face growing risk, advanced threats, and complex environments. Today, cybersecurity strategies are critical to long-term success, and Enterprise is here to provide the industry’s only comprehensive, proactive cloud security platform. Our technology is designed to protect the people, hybrid infrastructure, IP, and reputation of your business through actionable threat intelligence and world-class solutions. With a cloud-native portfolio that spans from device to cloud edge and multi-cloud, you can stay ahead of threats by predicting, preventing, detecting, and correcting them. Backed by our 30+ year history, you can trust that we have the focus, experience, and expertise needed to continually innovate, and that we are committed to helping you protect what matters most.
About the Role:
• Help develop our next-generation microservices to enhance automatic detection of breaches and other security concerns.
• Use your knowledge of Java, Go or Python to create new features in the cloud.
• Work with other software developers and take ownership of the whole solution, including product design and uptime.
• Work with other developers to debug issues, fix bugs, and drive them to resolution.
• Understand and influence logging to support our Data Flow.
• Pioneer a new way of thinking about data pipelines, orchestration, and configuration.
About You:
• 5+ years of hands-on experience with Big Data technologies (Kafka, Spark, or Airflow).
• Successful track record of developing and automating large-scale, high-performance data processing systems (batch and streaming).
• Experience designing data models for optimal storage and retrieval to meet critical product requirements.
• Experience with both scripting and systems programming languages (Python, Go, and Scala).
• Experience with microservices, including defining and testing APIs.
Intelliswift Software, Inc. is a premier software solutions and services company headquartered in Silicon Valley, with offices across the United States and India. The company has a proven track record of delivering results through its global delivery centers and flexible engagement models for over 450 brands ranging from Fortune 100 to growing companies. Intelliswift provides a variety of services including Enterprise Applications, Software Product Development, Mobility & Collaboration, Big Data/BI, Cloud Solutions, and Team Augmentation.