Scala Developers

Overview

Remote
Depends on Experience
Contract - W2
Contract - Independent
Contract - 8 Month(s)
No Travel Required

Skills

Big Data
Spark
Scala
EMR
AWS

Job Details

Role: Scala Developers

Experience Required: 12+ Years

Location: Remote

Duration: 7+ Months

Interview Process: Test followed by two rounds of technical discussion.

Mandatory Skills:

Spark, Scala, EMR, and AWS

Please submit only candidates who have hands-on experience.

Job Description

Bachelor's degree, preferably in Computer Science, Engineering, or another quantitative field

6+ years of related experience designing and implementing enterprise applications using big data technologies

5+ years of experience in a senior-level engineering role mentoring other engineers, including engineering best practices, unblocking, code reviews, unit testing, managing deployments, technical guidance, and system design

5+ years of experience working with large-scale data and developing SQL queries

Advanced experience with scripting languages (e.g., Python, Bash, Node.js) and programming languages (e.g., SQL, Java, Scala) to design, build, and maintain complex data processing and ETL (Extract, Transform, Load) tasks and AWS automation.

5+ years of hands-on experience with AWS cloud services and related technologies, such as Apache Spark with Scala, AWS EMR, Airflow, and Redshift

4+ years of experience with RESTful APIs and web services

Excellent communication and soft skills, including discussing technical concepts, conducting pair-programming sessions, and explaining development concepts

In-depth understanding of the Spark framework, along with scripting languages (e.g., Python, Bash, Node.js) and programming languages (e.g., SQL, Java, Scala), to design, build, and maintain complex data processing and ETL (Extract, Transform, Load) tasks and AWS automation.

A firm understanding of unit testing. In-depth knowledge of AWS services and data engineering tools, specifically AWS EMR for big data processing, to diagnose and solve complex issues efficiently.

In-depth understanding of Git or other distributed version control systems.

Excellent communication, essential to performing at maximum efficiency within the team.

Collaborative attitude; this role is part of a larger, dynamic team that nurtures collaboration.

Strong technical, process, and problem-solving proficiency.

Thorough understanding of transforming complex data structures, such as nested JSON, XML, Avro, or Parquet, into structured formats suitable for analysis, and of working with large datasets (100 GB or more).

Advanced skills in data cleansing, deduplication, and quality validation to maintain high-quality data in the transformed datasets.

Experience in the healthcare industry or another highly regulated field is a plus

 

About People Force Consulting Inc