W2 Only
The following requirement is open with our client.
Client : TCS
Title : Data Engineer with Java, Spark
Location : New York (candidate must be local, as a face-to-face interview will be required)
Duration : 12 Months
Rate : $56/hr on W2
Relevant Experience : 10+ years
Detailed Job Description:
We are seeking a skilled Data Engineer with strong Java microservices, Apache Spark, and Databricks/Delta Lake experience to build high-performance platforms supporting trade lifecycle processing, risk analytics, market data pipelines, and post-trade reporting.
The role involves building scalable, low‐latency, high‐throughput systems used across trading desks, risk teams, and operations.
Job Responsibilities:
Capital Markets Data Engineering
· Build and optimize Spark/Databricks pipelines for the following (see the batch sketch after this list):
· Trade capture & enrichment
· Market data ingestion (real‐time & batch – equities, FI, FX, derivatives)
· Intraday risk & PnL calculations
· Settlements and clearing data flows
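For illustration, a minimal sketch of the kind of batch enrichment job this covers: joining captured trades with instrument reference data and writing a Delta table. The table and column names (raw.trade_capture, refdata.instruments, etc.) are hypothetical placeholders, not client systems or a prescribed design.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

import static org.apache.spark.sql.functions.col;

// Illustrative only: all table and column names are hypothetical.
public class TradeEnrichmentJob {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("trade-enrichment")
                .getOrCreate();

        // Raw trade captures landed by an upstream feed (hypothetical Delta table).
        Dataset<Row> trades = spark.read().table("raw.trade_capture");

        // Instrument reference data used to enrich each trade (hypothetical table).
        Dataset<Row> instruments = spark.read().table("refdata.instruments");

        // Keep booked trades and enrich them with instrument attributes.
        Dataset<Row> enriched = trades
                .filter(col("status").equalTo("BOOKED"))
                .join(instruments, "instrument_id")
                .select("trade_id", "instrument_id", "isin", "quantity", "price", "trade_date");

        // Persist the enriched set as a Delta table partitioned by trade date.
        enriched.write()
                .format("delta")
                .mode("overwrite")
                .partitionBy("trade_date")
                .saveAsTable("curated.trades_enriched");
    }
}
```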
· Implement Structured Streaming for near real-time consumption of the following (see the streaming sketch after this list):
· Quotes, ticks, orders
· Trades and confirmations
· Risk engine outputs
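A minimal Structured Streaming sketch, assuming tick data arrives as JSON on a Kafka topic and that the Kafka connector and Delta Lake are available on the cluster (as on Databricks). The broker, topic, schema, and checkpoint path are placeholders.

```java
import java.util.concurrent.TimeoutException;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.streaming.StreamingQuery;
import org.apache.spark.sql.streaming.StreamingQueryException;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.StructType;

import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.from_json;

// Illustrative only: topic, schema, and paths are hypothetical.
public class TickIngestStream {
    public static void main(String[] args) throws StreamingQueryException, TimeoutException {
        SparkSession spark = SparkSession.builder()
                .appName("tick-ingest")
                .getOrCreate();

        // Minimal schema for a quote/tick message; real feeds carry many more fields.
        StructType tickSchema = new StructType()
                .add("symbol", DataTypes.StringType)
                .add("bid", DataTypes.DoubleType)
                .add("ask", DataTypes.DoubleType)
                .add("event_time", DataTypes.TimestampType);

        // Consume JSON ticks from a Kafka topic (broker and topic names are placeholders).
        Dataset<Row> ticks = spark.readStream()
                .format("kafka")
                .option("kafka.bootstrap.servers", "broker:9092")
                .option("subscribe", "market.ticks")
                .load()
                .select(from_json(col("value").cast("string"), tickSchema).alias("tick"))
                .select("tick.*");

        // Append parsed ticks to a Bronze Delta table, with checkpointing for recovery.
        StreamingQuery query = ticks.writeStream()
                .format("delta")
                .outputMode("append")
                .option("checkpointLocation", "/mnt/checkpoints/market_ticks")
                .toTable("bronze.market_ticks");

        query.awaitTermination();
    }
}
```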
Backend Engineering (Java + Spring)
· Develop microservices to support the following (see the controller sketch at the end of this section):
· Trade booking & validation
· Reference data lookups (instruments, prices, curves)
· Workflow/exception management for trade breaks
· Risk exposure and limit checks
· Integrate systems with market data sources (Bloomberg, Reuters, ICE) and OMS/EMS platforms.
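As a rough sketch of the Spring side, a trade-booking controller with basic validation. The endpoint, payload fields, and rules are hypothetical examples and would sit inside a standard Spring Boot service alongside reference-data lookups and downstream booking calls.

```java
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

// Illustrative only: endpoint, record fields, and validation rules are hypothetical.
@RestController
@RequestMapping("/trades")
public class TradeBookingController {

    // Minimal trade payload; a real service would carry full economics and party details.
    public record TradeRequest(String instrumentId, String side, long quantity, double price) {}

    @PostMapping("/book")
    public ResponseEntity<String> bookTrade(@RequestBody TradeRequest trade) {
        // Basic checks before the trade is persisted or routed downstream.
        if (trade.quantity() <= 0 || trade.price() <= 0) {
            return ResponseEntity.badRequest().body("Quantity and price must be positive");
        }
        if (!"BUY".equals(trade.side()) && !"SELL".equals(trade.side())) {
            return ResponseEntity.badRequest().body("Side must be BUY or SELL");
        }
        // In a real system this would call a booking service and reference-data lookups.
        return ResponseEntity.ok("Trade accepted for " + trade.instrumentId());
    }
}
```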
Databricks Lakehouse Architecture
· Build Capital Markets-aligned Bronze → Silver → Gold layers for the following (see the layering sketch after this list):
· Trade lifecycle
· Positions & holdings
· Market data
· Pricing & curves
· Regulatory/reporting datasets
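A simplified sketch of one Bronze → Silver → Gold hop for the trade lifecycle, assuming hypothetical table names and columns; real layers would add schema enforcement, MERGE-based upserts, and data-quality checks.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.to_date;

// Illustrative only: layer, table, and column names are hypothetical.
public class TradeLifecycleSilver {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("trade-lifecycle-silver")
                .getOrCreate();

        // Bronze: raw trade lifecycle events as landed from source systems.
        Dataset<Row> bronze = spark.read().table("bronze.trade_events");

        // Silver: cleaned, de-duplicated, conformed records ready for downstream joins.
        Dataset<Row> silver = bronze
                .dropDuplicates("event_id")
                .filter(col("trade_id").isNotNull())
                .withColumn("trade_date", to_date(col("event_timestamp")));

        silver.write()
                .format("delta")
                .mode("overwrite")
                .partitionBy("trade_date")
                .saveAsTable("silver.trade_events");

        // Gold: a business-level aggregate, e.g. end-of-day net position by instrument.
        spark.read().table("silver.trade_events")
                .groupBy("instrument_id", "trade_date")
                .sum("quantity")
                .withColumnRenamed("sum(quantity)", "net_quantity")
                .write()
                .format("delta")
                .mode("overwrite")
                .saveAsTable("gold.positions_eod");
    }
}
```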
· Use Databricks performance patterns (see the maintenance sketch after this list):
· Delta Lake optimization, Z‐ORDER, OPTIMIZE, VACUUM
· Cluster tuning, caching, partitioning, AQE
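A short sketch of routine Delta maintenance run via Spark SQL; the table name, Z-ORDER columns, and retention window are example values only.

```java
import org.apache.spark.sql.SparkSession;

// Illustrative only: table, columns, and retention policy are example values.
public class DeltaMaintenance {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("delta-maintenance")
                .getOrCreate();

        // Compact small files and co-locate data on common filter columns for faster lookups.
        spark.sql("OPTIMIZE silver.trade_events ZORDER BY (trade_id, instrument_id)");

        // Remove files no longer referenced by the Delta log, keeping 7 days of history.
        spark.sql("VACUUM silver.trade_events RETAIN 168 HOURS");
    }
}
```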
Risk, Regulatory & Reporting
· Support data pipelines for:
· Daily risk and PnL
· Market risk metrics (VaR, Greeks)
· Regulatory data feeds (MiFID II, EMIR, CAT, TRACE)
· End‐of‐day and intraday reporting
· Implement data controls: lineage, traceability, audit logs, reconciliation rules.
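As one example of such a control, a sketch that flags trades present in the booking layer but missing from a regulatory feed. Table names are placeholders; a production reconciliation would also compare trade economics and write full audit records.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

// Illustrative only: a simple completeness check against hypothetical tables.
public class TradeReconciliation {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("trade-reconciliation")
                .getOrCreate();

        Dataset<Row> booked = spark.read().table("silver.trade_events").select("trade_id");
        Dataset<Row> reported = spark.read().table("gold.regulatory_feed").select("trade_id");

        // Trades booked but missing from the regulatory feed are reconciliation breaks.
        Dataset<Row> breaks = booked.exceptAll(reported);
        long breakCount = breaks.count();

        // Persist breaks for audit and the downstream exception workflow.
        breaks.write().format("delta").mode("overwrite").saveAsTable("controls.recon_breaks");

        System.out.println("Reconciliation breaks: " + breakCount);
    }
}
```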
Cloud & DevOps
· Deploy pipelines and services on Azure/AWS (ADLS/S3, Event Hubs/Kafka, Key Vault/KMS).
· Use CI/CD pipelines (Azure DevOps/GitHub/Jenkins) for automated deployment and testing.
· Work with monitoring/alerting tools to meet low‐latency SLAs required by trading environments.
Must Have Skills:
CORE JAVA, DATA ENGINEERING, FI, OMS, ARCHITECTURE, RISK METRICS, RECONCILIATION, AZURE, KMS, CI/CD, JENKINS, AUTOMATED DEPLOYMENT, CAPITAL MARKETS, OPTIMIZE, PIPELINES, SLAS, MARKET DATA, FX, DATA FLOWS, AUDIT, OPTIMIZATION.
Thanks and Regards,
Goutham Eluri
Technical Recruiter
ASCII Group LLC.
38345 W. 10 Mile Rd, Ste.#365; Farmington, MI 48335