What Working at Hexaware Offers:
Hexaware is a dynamic and innovative IT organization committed to delivering cutting-edge solutions to our clients worldwide. We pride ourselves on fostering a collaborative and inclusive work environment where every team member is valued and empowered to succeed.
Hexaware provides access to a broad set of tools that help you build and advance your professional profile. We round this out with excellent growth opportunities, the chance to collaborate with high-profile customers, the opportunity to work alongside brilliant minds, and a healthy work-life balance.
Role: Azure Databricks / PySpark Developer with Business Analysis (Banking Domain)
Role Level: Mid to Senior
Department: Banking Technology / Data Engineering
Location: Mount Laurel, NJ
Employment Type: Full-time / Contract
Mode of work: Hybrid, 4 days onsite per week
Role Summary
We are seeking a mid-to-senior-level Data Engineer with strong hands-on expertise in Azure Databricks, PySpark, Azure Data Factory, and Synapse Analytics, combined with solid Business Analysis skills.
This is a hybrid techno-functional role in which you will develop scalable data pipelines and translate banking business requirements into technical designs.
You will collaborate with business users, data architects, and cross-functional technology teams to deliver high-quality data solutions supporting regulatory reporting, risk analytics, fraud detection, customer insights, and core banking operations.
Key Responsibilities
Technical Responsibilities: Data Engineering & Development
- Design, build, and optimize ETL/ELT data pipelines using Azure Databricks (PySpark), ADF, and Synapse.
- Develop scalable data ingestion frameworks for batch and near-real-time data.
- Implement Delta Lake for ACID-compliant, performant data workflows.
- Build data transformation logic using PySpark for cleansing, validation, and enrichment.
- Optimize Spark jobs for performance, reliability, cost, and parallel execution.
- Integrate data across multiple banking systems (CBS, Loans, Cards, Payments, AML, Risk, Regulatory platforms).
- Develop reusable modules, notebooks, parameterized pipelines, and CI/CD-ready components.
- Ensure data quality, profiling, governance, lineage, and audit requirements are met.
- Work with ADO/Git for version control, branching, and deployment.
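To give a flavor of the cleansing and validation logic described above, here is a minimal sketch in plain Python. In the actual pipelines these rules would typically be expressed as PySpark DataFrame expressions or UDFs on Databricks; the field names and rules below are hypothetical, chosen only for illustration.

```python
# Hypothetical cleansing/validation rules for a banking transaction feed.
# In production these would be PySpark DataFrame operations (withColumn,
# filter, etc.); plain Python keeps this sketch self-contained.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Txn:
    txn_id: str
    account: str
    amount_raw: str  # raw string as received from the source system
    currency: str

def cleanse(txn: Txn) -> Optional[dict]:
    """Return a cleansed record, or None if the row fails validation."""
    amount_str = txn.amount_raw.strip().replace(",", "")
    try:
        amount = float(amount_str)
    except ValueError:
        return None  # reject non-numeric amounts
    if not txn.txn_id.strip() or not txn.account.strip():
        return None  # mandatory keys must be present
    return {
        "txn_id": txn.txn_id.strip(),
        "account": txn.account.strip(),
        "amount": round(amount, 2),
        "currency": txn.currency.strip().upper(),
    }
```

Rejected rows would normally be routed to a quarantine table rather than silently dropped, so that data-quality and audit requirements can be met.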
Business Analysis Responsibilities
- Interact with business stakeholders: Risk, Finance, Payments, Operations, Compliance, Treasury, etc.
- Gather, analyze, and document business/stakeholder requirements.
- Translate banking use cases into technical specifications and data transformation rules.
- Perform data mapping, source-to-target documentation, and business glossary creation.
- Conduct gap analysis, feasibility studies, and propose data solutions.
- Support UAT, perform data validation, and work closely with end users for sign-off.
- Identify data issues, troubleshoot root causes, and propose process improvements.
- Communicate insights and updates to both technical and non-technical teams.
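The data-mapping and source-to-target work described above can be sketched as a small, table-driven mapper. All source and target field names here are hypothetical; in practice the mapping table is derived from the source-to-target documentation agreed with business stakeholders.

```python
# Hypothetical source-to-target mapping for a loans feed.
# Each entry: target field -> (source field, transformation).

STM = {
    "loan_id":     ("LN_NO",     str.strip),
    "customer_id": ("CUST_REF",  str.strip),
    "principal":   ("ORIG_AMT",  lambda v: round(float(v), 2)),
    "status_code": ("LN_STATUS", str.upper),
}

def map_record(source_row: dict) -> dict:
    """Apply the mapping table to one source row."""
    return {
        target: transform(source_row[src])
        for target, (src, transform) in STM.items()
    }
```

Keeping the mapping as data rather than code makes gap analysis and sign-off easier: the same table can be reviewed by business users and executed by the pipeline.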
Required Skills & Qualifications
- 8-10+ years of experience in Data Engineering, Analytics, or Development.
- Strong hands-on experience with:
- Azure Databricks (PySpark)
- Azure Data Factory (ADF)
- Azure Synapse Analytics
- Delta Lake & ADLS Gen2
- Solid proficiency in SQL, performance tuning, and data modeling.
- Experience gathering and translating requirements in banking environments.
- Ability to work with large, complex, multisource financial datasets.
- Understanding of banking processes such as:
- Retail/Corporate Banking
- Loans & Credit
- Risk & Compliance
- Payments
- Regulatory Reporting (e.g., Basel, AML, KYC)
- Strong communication and stakeholder management skills.
- Experience working in Agile/Scrum frameworks.