This position will develop new analytics, enhance existing analytics, and maintain and improve the applications that support these analytics using Big Data tools and the Azure, PySpark, and Scala technology stack.
You’ll enjoy the flexibility to telecommute* from anywhere within the U.S. as you take on some tough challenges.
- Design, code, test, document, and maintain high-quality, scalable Azure, Scala, Spark, Big Data, and cloud solutions
- Design, develop, and implement analytics rules engines
- Research, evaluate, and deploy new tools, frameworks, and patterns to build a sustainable Big Data platform
- Identify gaps and opportunities for improvement of existing solutions
- Define and develop APIs for integration with various data sources in the enterprise
- Analyze and define customer requirements
- Assist in defining product technical architecture
- Make accurate development effort estimates to assist management in project and resource planning
- Create prototypes and proofs of concept; conduct design and code reviews
- Collaborate with management, quality assurance, architecture, and other development teams
- Write technical documentation and participate in production support
You’ll be rewarded and recognized for your performance in an environment that will challenge you and give you clear direction on what it takes to succeed in your role as well as provide development for other roles you may be interested in.
- Undergraduate degree and at least 12 years of IT work experience
- 5+ years of development experience with Scala, Spark, Python, or PySpark
- 4+ years of experience with the Azure cloud platform
- 2+ years of experience with Big Data tools such as Hadoop, MapReduce, HDFS, Spark, Kafka Streaming, Docker, and Kubernetes
- 2+ years of experience with MySQL and NoSQL databases (Cassandra / HBase preferred)
- Previous experience with Agile / Scrum methodology and best practices
- Thorough understanding of service-oriented architecture (SOA) concepts
- Healthcare industry experience (preferred)