Big Data Architect - INDIA

Overview

Remote
Depends on Experience
Contract - Independent
Contract - W2
Contract - 6 Month(s)

Skills

Apache Hadoop
Apache Hive
Apache Kafka
Apache Spark
Backbone.js
Banking
Big Data
MapReduce
HaaS
HDFS
LDAP
Performance Tuning
Proposal Writing
Encryption
Data Integrity
Continuous Integration
Hadoop
Spark
Hive
Data Engineer
Big Data Architect

Job Details

Remote work requirement for 6 months to 1 year, India-based: Big Data Architect (Hadoop).

  • We need your help in finding a Big Data Architect (Hadoop) for our banking client: a one-person army who can deliver end to end
  • Work to be split between India (base) and Singapore (visiting), preferably
  • 6-month to 1-year engagement
  • Some travel between India and Singapore needed
  • Ability to hold initial consultative conversations with the client (on solution and proposal), then implement the architecture you recommend in Hadoop

We're Hiring: Senior Hadoop Architect, Pre-Sales & Delivery (Global Banking Client | Singapore & India)

Are you a Hadoop expert with deep experience in enterprise-grade, regulated environments? Our global banking client is launching a mission-critical on-premises Hadoop implementation, and we're looking for a seasoned architect to lead the charge from pre-sales solutioning to hands-on delivery.

This is a high-impact, individual contributor role for someone who thrives in complex, multi-region environments and can operate independently across Singapore and India.

Your Mission

  • Lead pre-sales solutioning: scope work, estimate effort, and define deliverables for banking-grade deployments
  • Architect and implement secure, scalable, high-performance Hadoop clusters in on-prem environments
  • Migrate workloads from Hadoop-as-a-Service (HaaS) to on-premises infrastructure with zero data loss
  • Integrate Hadoop with Kafka, MapReduce, and LDAP for enterprise application interoperability
  • Ensure compliance with banking regulations, data governance, and auditability standards
  • Own the full lifecycle: architecture, design, configuration, implementation, performance tuning, and support

What You Bring

  • 10+ years of hands-on Hadoop architecture and implementation experience in regulated industries
  • Proven ability to deliver independently, with no hand-holding and no dependencies
  • Deep expertise in HDFS, YARN, Hive, Spark, Kafka, Ranger, Atlas, and security frameworks
  • Experience with LDAP integration, Kerberos, encryption, and role-based access control
  • Strong background in infrastructure planning, CI/CD, and monitoring tools (Ambari, Grafana)
  • Familiarity with banking-grade SLAs, data residency, and risk mitigation
  • Willingness to travel between Singapore and India

Why This Role Matters

  • You'll be the technical backbone of a strategic data platform for a global bank
  • Your work will directly impact regulatory compliance, data integrity, and operational resilience
  • You'll collaborate with cross-functional teams across APAC and global delivery centers
  • Competitive compensation, high visibility, and long-term growth opportunities

If you're ready to architect the future of big data in banking, we want to hear from you. Apply now or tag someone who fits this role.

#Hadoop #BigData #BankingTech #Kafka #Spark #ArchitectJobs #SingaporeJobs #IndiaJobs #OnPremHadoop #DataGovernance #Compliance #HiringNow

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.