Sr Hadoop Engineer with migration to AWS
Location - Remote
Please make sure to submit profiles in proper format
Need profiles within 2 hours only
Must have:
Experience with migration to AWS.
Scala Spark development, Hadoop ecosystem, AWS (EMR, S3, Glue, Lambda), data pipeline architecture, CI/CD best practices, healthcare/payment systems experience.
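For illustration of the kind of Scala Spark batch work this stack implies, a minimal sketch is below, assuming an EMR-hosted job that reads raw data from S3 and writes curated Parquet back; the bucket names, paths, and column filter are placeholders, not details from this posting.

```scala
import org.apache.spark.sql.SparkSession

// Minimal sketch of a Spark batch job on EMR; all S3 locations are placeholders.
object ClaimsToParquet {
  def main(args: Array[String]): Unit = {
    // On EMR the cluster supplies the master, so it is not hardcoded here.
    val spark = SparkSession.builder()
      .appName("claims-to-parquet")
      .getOrCreate()

    // Read raw CSV landed in S3.
    val claims = spark.read
      .option("header", "true")
      .csv("s3://example-raw-bucket/claims/")

    // Apply a trivial filter and write the result back to S3 as Parquet.
    claims
      .filter(claims("status") === "PAID")
      .write
      .mode("overwrite")
      .parquet("s3://example-curated-bucket/claims_paid/")

    spark.stop()
  }
}
```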
Job Description
Preferably in Dallas, TX, or Atlanta, GA, but open to any US location. Experience with migration to AWS.
The Software Engineer is responsible for the design and development of software applications and reusable software components for the Payment BU, in compliance with predefined coding standards and technical design. Working closely with other architects and senior engineers, this individual will help assess existing applications, produce detailed designs for Hadoop components, and drive the implementation of scalable, secure, and reusable solutions on Hadoop and AWS.
Responsibilities:
- Design and develop high-quality, maintainable software modules for the Payment BU product.
- Conduct unit and integration testing using appropriate methodologies and techniques.
- Evaluate and integrate complementary technologies where appropriate (e.g., AWS cloud technologies such as Step Functions, EMR, and Glue).
- Review the software engineering approach to proposed solutions to ensure adherence to best practices.
- Provide technical mentorship, conduct code/design reviews, and promote a culture of engineering excellence.
- Complete all responsibilities as outlined in the annual Performance Plan.
- Complete all special projects and other duties as assigned.
- Must be able to perform duties with or without reasonable accommodation.
Qualifications:
- Bachelor’s Degree in Computer Science, Information Technology, Information Systems, or a related field.
- 5-10 years of experience with enterprise and application design patterns such as GoF.
- 5-10 years of experience with data pipeline design patterns for batch, streaming, lambda, data mesh, and lakehouse architectures.
- 5 years of proficiency with Scala Spark and a scripting language such as Python, PySpark, or Linux shell scripting.
- Working knowledge of RDBMSs such as SQL Server, MySQL, and Oracle, with SQL programming skills.
- 4-5 years of experience with software engineering best practices such as continuous integration, unit testing, refactoring, and code reviews.
- Strong understanding of agile development practices, preferably SAFe.
- Ability to deliver in a dynamic, fast-paced environment within estimated timelines.
- Strong analytical, organizational, and interpersonal skills.
- Good written and verbal communication skills.
Additional skills:
- Demonstrated proficiency in developing programs using Scala.
- Hands-on experience with Hadoop ecosystem components such as Hive, YARN, Sqoop, Oozie, and Groovy.
- 4-5 years of experience with platform engineering tools such as Bitbucket, Gradle, Octopus, and Artifactory.
- 4-5 years of experience building complex components using Spark SQL, MySQL, and HiveQL.
- 2-3 years of experience with AWS cloud technologies complementing the Hadoop stack, including hands-on development experience with S3, Lambda functions, API Gateway, and EMR.
- Working knowledge of AWS components such as IAM, KMS, Glue, and Athena.
Preferred Qualifications:
- AWS Solutions Architect Associate certification or higher.
- Background in Healthcare IT, especially payer systems or payment integrity solutions.
- Experience developing software following cloud-native principles.
Working Conditions and Physical Requirements:
- Remaining in a stationary position, often standing or sitting, for prolonged periods.
- Communicating with others to exchange information.
- Repeating motions that may include the wrists, hands, and/or fingers.
- Assessing the accuracy, neatness, and thoroughness of the work assigned.
- No adverse environmental conditions expected.
- Must be able to provide a dedicated, secure work area.
- Must be able to provide a reliable high-speed internet connection (minimum download speed of 100 Mbps and upload speed of 50 Mbps) and maintain an appropriate office setup.