Senior Azure Data Engineer Capital Markets

Overview

Hybrid
Depends on Experience
Full Time

Skills

Capital Markets
Data Engineering
Azure
Databricks
Data Integration
Trading Platforms
Python
PySpark
Data Pipelines

Job Details

Only NY/NJ candidates will be considered for the Data Engineer role.

Interview Process: Video
Key Must-Haves:
10+ years hands-on Data Engineering experience (not analytics-heavy).
Strong Azure + Databricks (PySpark) background.
Extensive experience integrating third-party/vendor data feeds into Capital Markets trading platforms.
Proven expertise building robust, testable Python/PySpark pipelines for batch/stream data.
Deep experience with schema evolution, PII handling, and resiliency/retry patterns (an illustrative sketch follows this list).
Strong SQL, CI/CD, observability & monitoring.
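
As a rough illustration of the pipeline skills listed above, the following minimal PySpark sketch shows an explicitly typed batch ingest of a hypothetical vendor feed, hashing of an account identifier as a stand-in for PII handling, and a simple retry wrapper. All paths, column names, and the with_retries helper are illustrative assumptions, not specifics of this role.

    # Illustrative sketch only: typed batch ingest of a hypothetical vendor feed,
    # with a hashed account identifier and a simple retry wrapper.
    # Paths, columns, and helpers are placeholders.
    import time

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F
    from pyspark.sql.types import (DoubleType, StringType, StructField,
                                   StructType, TimestampType)

    def with_retries(fn, attempts=3, backoff_seconds=5):
        """Call fn, retrying a fixed number of times with linear backoff."""
        for attempt in range(1, attempts + 1):
            try:
                return fn()
            except Exception:
                if attempt == attempts:
                    raise
                time.sleep(backoff_seconds * attempt)

    # Explicit schema so malformed vendor records surface immediately instead of
    # drifting silently as inferred types change between drops.
    trade_schema = StructType([
        StructField("trade_id", StringType(), nullable=False),
        StructField("account_id", StringType(), nullable=True),
        StructField("symbol", StringType(), nullable=False),
        StructField("price", DoubleType(), nullable=True),
        StructField("trade_ts", TimestampType(), nullable=True),
    ])

    def ingest_vendor_feed(spark, source_path, target_path):
        """Read a raw vendor drop, hash the account identifier, and land Parquet."""
        df = spark.read.schema(trade_schema).json(source_path)
        cleaned = (
            df.withColumn("symbol", F.upper(F.col("symbol")))
              .withColumn("account_hash", F.sha2(F.col("account_id"), 256))
              .drop("account_id")
        )
        cleaned.write.mode("append").partitionBy("symbol").parquet(target_path)

    if __name__ == "__main__":
        spark = SparkSession.builder.appName("vendor-feed-ingest").getOrCreate()
        with_retries(lambda: ingest_vendor_feed(
            spark, "/mnt/raw/vendor_feed/", "/mnt/curated/trades/"))

Enforcing the schema at read time and hashing identifiers before landing data are typical ways to keep such pipelines testable and auditable.
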
Role Overview:
Own the end-to-end lifecycle of market, alternative, and vendor data ingestion.
Build & scale Azure/Databricks infrastructure.
Design & develop data pipelines, ingestion frameworks, and Delta Lake/Parquet patterns (see the sketch after this list).
Provide production support during market hours; drive root-cause analysis (RCA) and permanent fixes.
Collaborate with PMs/Analysts to translate investment needs into data models & marts.
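
As a brief illustration of the Delta Lake/Parquet patterns mentioned above, the sketch below appends curated data to a Delta table with additive schema evolution enabled. It assumes a Databricks (or other Delta Lake-enabled) runtime; the storage paths and partition column are placeholders.

    # Illustrative sketch only: append curated data to a Delta table with
    # schema evolution enabled. Paths and partition column are placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("delta-ingest-sketch").getOrCreate()

    # Source is assumed to be the standardized output of an upstream ingest job.
    curated = spark.read.parquet("/mnt/curated/trades/")

    (
        curated
        .withColumn("ingest_date", F.current_date())
        .write.format("delta")
        .mode("append")
        .option("mergeSchema", "true")  # tolerate additive schema changes from vendor feeds
        .partitionBy("ingest_date")
        .save("/mnt/delta/trades/")
    )

Enabling mergeSchema on append lets new, additive vendor columns flow through without manual DDL, which is one common way to absorb feed changes; stricter teams may prefer explicit schema migrations instead.
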
Nice-to-Haves:
JavaScript (internal tools/UI)
Delta Live Tables, dbt, Airflow
Terraform/Bicep
