Skills
- Angular
- ClickHouse
- Collaborate
- Druid
- Java
- Kafka
- NiFi
- React.js
- SDET
- Scala
Job Description
NO H-1B sponsorship
W-2 ONLY
1. Looking for hands-on developers, not architects. Must be passionate about Scala programming.
2. An object-oriented design (OOD) programming background is important, but it should be big-data related.
3. Data analysts, data scientists, and researchers are NOT the right candidates.
Responsibilities:
Build components of a large-scale data platform for real-time and batch processing, and own features of big data applications to fit evolving business needs
Build next-gen, cloud-based big data infrastructure for batch and streaming data applications, and continuously improve its performance, scalability, and availability
Contribute to engineering best practices, including the use of design patterns, CI/CD, code review, and automated testing
Contribute to ground-breaking innovation and apply state-of-the-art technologies
As a key member of the team, contribute to all aspects of the software lifecycle: design, experimentation, implementation, and testing
Collaborate with program managers, product managers, SDETs, and researchers in an open and innovative environment
Skills:
Bachelor's degree or above in Computer Science or EE
4+ years of professional programming experience in Scala
3+ years of big data development experience with technology stacks such as Spark, Flink, SingleStore, Kafka, NiFi, and AWS big data services
Experience with Java and Python
Knowledge of system and application design and architecture
Experience building industry-grade, highly available, and scalable services
Passion for technology, and openness to interdisciplinary work
Preferred skills:
Experience processing petabyte-scale data
Demonstrated ability with cloud infrastructure technologies, including Terraform, Kubernetes (K8s), Spinnaker, IAM, ALB, etc.
Experience with ClickHouse, Druid, Snowflake, Impala, Presto, Kinesis, etc.
Experience with widely used web frameworks (React.js, Vue.js, Angular, etc.) and good knowledge of the web stack: HTML, CSS, Webpack