Job Description
Required Qualifications:
All applicants authorized to work in the United States are encouraged to apply
At least 4 years of experience with Information Technology
At least 2 years of experience in PySpark
Strong understanding of distributed computing principles and big data technologies
At least 2 years of experience working with Apache Spark and Spark SQL
Knowledge of data serialization formats such as Parquet, Avro, or ORC
Familiarity with data processing and transformation techniques
Preferred Qualifications:
Experience with data lakes, data warehouses, and ETL processes
Good understanding of Agile software development frameworks
Experience in Banking domain
Strong communication and analytical skills
Ability to work in teams in a diverse, multi-stakeholder environment comprising Business and Technology teams
Experience and desire to work in a global delivery environment