Overview
Hybrid: 2 days onsite per week
Depends on Experience
Contract - W2
Contract - 6 Months
Skills
Java
Python
Scala
API
GraphQL
Hadoop
Hive
Spark
Vertex AI
Kubernetes
Microsoft Azure
Data Lake
Airflow
Job Details
Required:
- 5+ years of relevant experience in building highly resilient, highly scalable systems.
- Experience with multiple stack technologies: Java, Python, Scala
- Hands-on experience in API development, GraphQL, and Node.js
- Hands-on experience with Hadoop, Hive, and Spark (using Scala), as well as Vertex AI, Presto/Trino, Kubernetes, cloud platforms, Automic, Airflow, and data lake concepts.
- Solid knowledge of complex software design, distributed system design, design patterns, data structures, and algorithms.
- Skilled in data modeling and data migration protocols.
- Familiarity with public cloud technologies such as Azure or Google Cloud Platform.
- Knowledge of Kafka Connect, Druid, BigQuery, and Looker is an added advantage.
- Excellent technical debugging and production support skills.
- Extensive experience in the design, development, and delivery of software products with a large user base.
- Track record in an architect role developing large-scale, data-backed services and applications.
- Ability to balance conflicting interests in a complex and fast-paced environment.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.