We're searching for Senior- and Lead-level Big Data Developers. We need technologists with strong experience in Kafka and Scala. These positions can be located in Irving, TX or Chicago, IL. Salaries are up to 160K. Our client offers relocation assistance and amazing benefits. If you are above the suggested salary range, please let us know.
Primary Duties and Responsibilities:
Actively participates in and leads requirement analysis and reviews to identify missing or incomplete requirements. Always looks for assumptions made in the models and validates those assumptions.
Actively participates in the design of high-performing, scalable, secure, reliable, and cost-optimized solutions.
Primary responsibility is the design and development of big data applications that meet business requirements within the agreed architecture framework and an Agile environment.
Thoroughly analyzes requirements, then develops, tests, and documents software to ensure proper, high-quality implementation.
Follows agreed-upon SDLC procedures to ensure that all information system products and services meet explicit and implicit quality standards, end-user functional requirements, architectural standards, performance requirements, audit requirements, and security rules, and that external-facing reporting is properly represented.
Works with the Scrum Master, Product Owner, and team to groom the backlog, estimate level of effort, and identify and add dependencies.
Performs thorough code reviews held to high engineering standards.
Writes unit and integration tests based on chosen DevOps frameworks.
Performs application and project risk analysis and recommends quality improvements.
Assists Production Support by providing advice on system functionality and fixes as required.
Communicates all time delays or defects in the software clearly, concisely, and immediately to the appropriate team members and management.
Assists with departmental and new-employee training.
Experience integrating modeling libraries is required. Experience with risk analytics technology implementations is a plus.
The requirements listed are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the primary functions.
8+ years of experience in building large-scale, data-centric Java-based solutions.
Java 8+ experience required.
3+ years of experience with high-speed distributed computing frameworks such as AWS EMR, Hadoop, HDFS, S3, MapReduce, Apache Spark, Apache Hive, Kafka Streams, Apache Flink, etc.
3+ years of experience with distributed message brokers such as Kafka, RabbitMQ, ActiveMQ, Amazon Kinesis, etc.
Experience with cloud technologies and migrations. Experience with foundational AWS services such as VPCs, security groups, EC2, RDS, S3 ACLs, KMS, the AWS CLI, and IAM is preferred.
Experience developing and delivering technical solutions using public cloud service providers like Amazon, Google, etc.
Experience writing unit and integration tests with testing frameworks such as JUnit and Citrus.
Experience following Git workflows.
1770 N. Park St, Suite 100, Naperville, IL 60563