Java Developer

API, Agile, Apache Flume, Apache HBase, Apache Hadoop, Apache Hive, Apache Kafka, Apache Spark, Apache Sqoop, Architecture, Best practices, Big data, Data structure, Data integration, Communication skills, Chef, Cloud Foundry, Data cleansing, Docker, Microservices, Jenkins, Oracle, Performance tuning, Pivotal, Planning, SDLC, SQL, Scala, Scrum, TDD, Test-driven development, Python, Software, Software development, Spring, Production, Negotiations, RESTful, QA, Problem solving, Revision control
Full Time
Depends on Experience
Work from home available
Travel not required

Job Description

Preferred Qualifications:

• 2+ years of software development utilizing industry-standard design patterns in common languages such as Java
• Demonstrated experience with test-driven development techniques (TDD, JUnit, mocks); see the sketch after this list
• Familiarity with 12-factor microservice development patterns
• Familiarity with multiple interface patterns including RESTful APIs or event-based messaging
• Applied experience with scripting languages (e.g., Python, Groovy, PowerShell, JavaScript)
• Applied experience with Unix/Linux shell scripting
• Applied experience with iterative and incremental development on product-focused teams practicing code reviews
• Knowledge of version control systems (Git, Bitbucket) and modern version control practices used in continuous deployment
• Competency in writing basic SQL queries. Oracle and/or Postgres experience a plus. NoSQL experience with Cassandra a plus.
• Excellent verbal and written communication skills and the ability to effectively translate feedback, needs, and solutions
• Strong teamwork focus and the ability to foster collaboration within and across teams
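As a point of reference for the test-driven development bullet above, here is a minimal sketch of a unit test written first with JUnit 5 and Mockito; the OrderService and PriceRepository names are hypothetical illustrations, not part of the posting.

```java
// Minimal TDD-style sketch (hypothetical names): a unit test that drives the
// design of a small service, using JUnit 5 and a Mockito mock for a collaborator.
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import org.junit.jupiter.api.Test;

class OrderServiceTest {

    // Hypothetical collaborator interface the service depends on.
    interface PriceRepository {
        double priceOf(String sku);
    }

    // Hypothetical class under test: totals an order by looking up unit prices.
    static class OrderService {
        private final PriceRepository prices;
        OrderService(PriceRepository prices) { this.prices = prices; }
        double total(String sku, int quantity) {
            return prices.priceOf(sku) * quantity;
        }
    }

    @Test
    void totalMultipliesUnitPriceByQuantity() {
        // Mock the repository so the test isolates the service's own logic.
        PriceRepository prices = mock(PriceRepository.class);
        when(prices.priceOf("ABC-1")).thenReturn(2.50);

        OrderService service = new OrderService(prices);

        assertEquals(7.50, service.total("ABC-1", 3), 0.0001);
    }
}
```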

• Experience developing 12-factor microservices utilizing continuous integration, build, and delivery with Spring and Spring Boot; see the sketch after this list
• Experience building and maintaining highly automated CI and CD pipelines leveraging technologies such as Azure DevOps Server (formerly TFS), Jenkins, Maven, Artifactory, Black Duck, Chef, SonarQube
• Experience in designing and implementing container technologies like Docker, Kubernetes and Helm
• Experience utilizing platform and infrastructure-as-a-service technologies and capabilities and their corresponding services (object store, configuration management, service registries, etc.). Pivotal Cloud Foundry experience strongly preferred.
• Experience with Big Data technologies and developing in the Hadoop ecosystem, e.g., Hadoop, HBase, Hive, Scala, Spark, Sqoop, Flume, Kafka, Python
• Experience with the ELK stack and dashboarding within Kibana
• Experience supporting applications in production
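For context on the 12-factor microservice and RESTful API bullets, the following is a minimal sketch of a stateless Spring Boot endpoint that reads its configuration from the environment; the application class, route, and GREETING_PREFIX variable are illustrative assumptions, not requirements from the posting.

```java
// Minimal sketch of a 12-factor-style Spring Boot REST endpoint:
// configuration comes from the environment and handlers hold no local state.
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
@RestController
public class CustomerApplication {

    // 12-factor config: read from the environment rather than hard-coding,
    // with a default value for local development.
    @Value("${GREETING_PREFIX:Hello}")
    private String greetingPrefix;

    @GetMapping("/customers/{id}/greeting")
    public String greeting(@PathVariable String id) {
        // Stateless handler: no session or on-disk state kept in the process.
        return greetingPrefix + ", customer " + id;
    }

    public static void main(String[] args) {
        SpringApplication.run(CustomerApplication.class, args);
    }
}
```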

Responsibilities: 

• Experience in large scale data integration implementations
• Strong problem solver with the ability to evaluate a challenge and understand solution patterns
• Experience in data aggregation and manipulation of large data sets; a Spark sketch follows this list
• Knowledge of data quality, data cleansing, data wrangling, and data standards.
• Experience with industry-leading large-scale data platforms (e.g., Hadoop, Greenplum, Oracle)
• Experience with open-source languages and technologies including Python, Spark, Scala, Kubernetes, and Docker
• Ability to rapidly learn innovative technologies and apply them to solving business problems
• Experience writing performant queries
• Experience with DataStage and ETL patterns
• Experience with optimization and performance tuning
• Experience building re-usable data integration frameworks and patterns
• Knowledge of database modeling and data structure principles, techniques and best practices.
• Demonstrated knowledge in multiple software development disciplines (e.g., Agile, Scrum, SDLC)
• In-depth understanding of good architecture and design patterns, programming practices, development standards and QA/testing processes
• Experience negotiating scope, timing, and resource needs in planning initiatives
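To illustrate the data aggregation work described above, here is a minimal sketch using the Apache Spark Java API to total transaction amounts per customer per day; the input path, column names, and output layout are hypothetical.

```java
// Minimal sketch of large-scale aggregation with the Spark Java API:
// group raw transaction events by customer and day, and sum the amounts.
import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.sum;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class DailyTotalsJob {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("daily-totals")
                .getOrCreate();

        // Hypothetical input: Parquet files of transaction events.
        Dataset<Row> events = spark.read().parquet("hdfs:///data/events/");

        // Aggregate: total amount per customer per day.
        Dataset<Row> totals = events
                .groupBy(col("customer_id"), col("event_date"))
                .agg(sum(col("amount")).alias("total_amount"));

        // Write the result partitioned by date for downstream queries.
        totals.write().partitionBy("event_date").parquet("hdfs:///data/daily_totals/");

        spark.stop();
    }
}
```

Partitioning the output by date is one common way to keep downstream queries performant, in line with the optimization and performance-tuning bullet above.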

Dice Id : 10116994
Position Id : 7107222
Originally Posted : 2 months ago