KDB+/Q Engineer (Onsite) Location: NYC, NY (onsite)

Overview

On Site
Accepts corp to corp applications
Contract - W2
Contract - Independent
Contract - 4+ Month(s)

Skills

Employment Authorization
Real-time
Use Cases
FTP
Capacity Management
Software Development Methodology
Continuous Integration
Continuous Delivery
Release Management
Collaboration
Application Support
DevOps
Documentation
Knowledge Transfer
Data Engineering
Time Series
RDB
EOD
Migration
Optimization
Layout
IPC
Data Compression
Query Optimization
Linux
Shell Scripting
Computer Networking
Market Analysis
Modeling
Analytics
Incident Management
Change Control
Communication
Stakeholder Management
Trading
Quantitative Analysis
Apache Kafka
Python
Java
C++
.NET
WebSocket
Cloud Computing
Storage
Amazon S3
Business Intelligence
Visualization
Dashboard
Grafana
Tableau
Regulatory Compliance
Bloomberg
Management
Research
Data Quality

Job Details


Job Title: KDB+/Q Engineer (Contract)


Location: NYC, NY (onsite)

Work Authorization:

Duration: 4+ Months

Overview:

We are seeking an experienced KDB+/Q engineer to design, build, and optimize low-latency time-series data pipelines and analytics used by trading, quant, and risk teams. This is an onsite role in NYC for one of our Big Four clients, working closely with quants, traders, and platform engineering to deliver high-performance market data and analytics solutions.

Key Responsibilities:

  • Own the end-to-end KDB+ architecture: tickerplant/rdb/hdb design, schema evolution, and storage strategy (splayed/partitioned tables, attributes, compression).
  • Build and optimize real-time and historical data pipelines for market data and trade/order events; implement robust feed handlers and entitlements.
  • Develop high-performance q code and APIs for analytics, research, and production use cases (asof joins, windowed aggregations, order book analytics); a short q sketch of these idioms follows this list.
  • Integrate KDB+ with surrounding systems (e.g., Python/PyKX, Java/C++ gateways, Kafka, REST/WebSocket services, files/FTP, cloud object storage).
  • Implement monitoring, alerting, and capacity planning; tune performance across memory, disk, and IPC; troubleshoot latency and data-quality issues.
  • Establish SDLC best practices for q: code reviews, unit/integration tests, CI/CD, versioning, and production release management.
  • Collaborate with Market Data, Application Support, and DevOps/SRE on production readiness, incident response, and runbooks.
  • Deliver clear documentation, handover materials, and knowledge transfer by contract end.
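
For context on the analytics patterns above, a minimal q sketch of an asof join and a bucketed aggregation; the tables, symbols, and values below are hypothetical and only illustrate the idioms, not the client's schema:

  / hypothetical in-memory tables for illustration only
  trade:([] time:09:30:00.000 09:30:01.500 09:30:03.250; sym:`AAPL`AAPL`MSFT; price:189.1 189.2 405.5; size:100 200 50)
  quote:([] time:09:29:59.900 09:30:01.000 09:30:03.000; sym:`AAPL`AAPL`MSFT; bid:189.0 189.1 405.4; ask:189.2 189.3 405.6)

  / asof join: attach the prevailing quote to each trade
  / (in production the quote table would carry `p# on sym for performance)
  aj[`sym`time;trade;quote]

  / one-minute OHLC bars via xbar bucketing
  select o:first price,h:max price,l:min price,c:last price by sym,bar:1 xbar time.minute from trade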

Required Skills & Qualifications:

  • 10+ years of professional data engineering experience, with deep expertise in time-series and columnar data.
  • 7+ years hands-on KDB+/q experience in production environments supporting trading, market data, or risk platforms.
  • Expert-level q: idiomatic vector programming; joins (aj/aj0/uj/lj), windowed ops, keyed/splayed/partitioned tables, enumerations, attributes (p/s/u), upsert patterns (see the storage sketch after this list).
  • Strong knowledge of kdb+tick components (tickerplant, rdb, hdb), sym management, EOD processes, and schema/version migration.
  • Proven low-latency optimization skills: memory/disk layout, IPC, batching, compression, partitioning strategy, and query tuning.
  • Solid Linux engineering background, including shell scripting, networking fundamentals, and performance profiling.
  • Practical experience with market data (e.g., ITCH/OUCH, FIX/FAST, proprietary exchange feeds), order book modeling, and tick-level analytics.
  • Production discipline: observability (logging/metrics/tracing), incident management, and change control in regulated environments.
  • Excellent communication and stakeholder management across trading, quant, and infrastructure teams; ability to operate autonomously on tight timelines.
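
As a rough illustration of the storage discipline referenced above (sym enumeration, attributes, partitioned writes, compression); paths, dates, and settings here are hypothetical:

  / sort on sym and apply the parted attribute, as an hdb date partition expects
  t:update `p#sym from `sym xasc trade

  / optionally enable compressed writes, e.g. 128kB blocks, gzip level 6
  / .z.zd:17 2 6

  / enumerate symbol columns against the hdb sym file and write one splayed date partition
  `:/data/hdb/2024.01.02/trade/ set .Q.en[`:/data/hdb;t]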

Preferred Skills & Qualifications:

  • Kafka or other pub/sub experience; schema registry and exactly-once or idempotent patterns.
  • Python (PyKX), Java/C++/.NET gateways; building REST/WebSocket services for kdb+ (a minimal WebSocket sketch follows this list).
  • Experience with KX Insights, cloud object storage (e.g., Amazon S3), or hybrid/on-prem deployments.
  • BI/visualization integration (KX Dashboards, Grafana, Tableau) and entitlements/compliance for vendor/exchange data (Bloomberg, Refinitiv, direct feeds).
  • Experience with backtesting/research platforms and data quality frameworks.
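
Finally, a minimal sketch of serving a q process over WebSocket, of the kind mentioned above; the port is arbitrary, and authentication, input validation, and error handling are deliberately omitted:

  / listen on a hypothetical port
  \p 5001

  / .z.ws fires per incoming WebSocket text frame; evaluate it and reply asynchronously as JSON
  .z.ws:{neg[.z.w] .j.j value x}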


Himanshu Goswami

Sr. IT Technical Recruiter

Stellent IT

Phone:

Email: Himanshu.goswami
Gtalk: Himanshu.goswamiom

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.