Overview
On Site
$60 - $65
Contract - W2
Contract - Independent
Contract - 12 Month(s)
No Travel Required
Skills
Scala
Spark
Kafka
Banking
Job Details
Job Title: Scala Developer
Location: New York, NY (Onsite)
Duration: 12+ Month Contract
Interview Type: In-person
Key Responsibilities
- Design, develop, and maintain Scala-based applications and microservices to support banking and financial workflows.
- Collaborate with business analysts and domain experts to translate banking requirements (e.g., payments, risk, compliance, credit, or trading data) into technical specifications.
- Develop and optimize ETL pipelines for large-scale data processing, warehousing, and analytics across multiple systems.
- Integrate structured and unstructured data sources including Teradata, PostgreSQL, MongoDB, Snowflake, and Redshift.
- Implement data solutions in cloud environments (Azure), ensuring scalability, security, and regulatory compliance.
- Work with Apache Spark, Kafka, Databricks, and Airflow to build and manage high-performance, distributed data processing workflows.
- Ensure data quality, governance, and compliance in line with banking regulations (e.g., Basel, CCAR, AML, KYC).
- Collaborate with cross-functional teams to support real-time transaction processing, risk analytics, and reporting systems.
- Perform code reviews, performance tuning, and troubleshooting to ensure robust, secure, and efficient applications.
- Stay updated with industry trends, particularly in fintech and banking innovations, to continuously improve systems and processes.
Key Qualifications
- Proficiency in Scala with experience in functional and reactive programming paradigms.
- Hands-on experience with Java for backend integration and legacy system support.
- Strong knowledge of SQL and NoSQL databases (Teradata, PostgreSQL, MongoDB).
- Proven expertise in ETL pipeline development and data warehousing solutions (Snowflake, Redshift).
- Familiarity with cloud infrastructure (Azure) including deployment, monitoring, and scaling.
- Solid experience with Apache Spark, Kafka, Databricks, and Airflow for large-scale data engineering.
- Prior experience in the banking/financial services domain is required.
- Strong problem-solving, analytical, and communication skills.
Educational Qualification
- Bachelor's degree in Computer Science, Engineering, or a related field (Master's preferred).