Job Details
Only NY/NJ-based candidates will be considered for this Data Engineer role.
Interview Process: Video
Key Must-Haves:
10+ years of hands-on Data Engineering experience (not analytics-heavy).
Strong Azure + Databricks (PySpark) background.
Extensive experience integrating third-party/vendor data feeds into Capital Markets trading platforms.
Proven expertise building robust, testable Python/PySpark pipelines for batch and streaming data.
Deep experience with schema evolution, PII handling, resiliency/retry patterns.
Strong SQL, CI/CD, and observability/monitoring skills.
Role Overview:
Own the end-to-end lifecycle of market, alternative, and vendor data ingestion.
Build & scale Azure/Databricks infrastructure.
Design & develop data pipelines, ingestion frameworks, and Delta Lake/Parquet patterns.
Provide production support during market hours; drive root-cause analysis (RCA) and permanent fixes.
Collaborate with PMs/Analysts to translate investment needs into data models and marts.
Nice-to-Haves:
JavaScript (internal tools/UI)
Delta Live Tables, dbt, Airflow
Terraform/Bicep