Trading Analytics Engineer - Prop Trading

Overview

On Site
Full Time

Skills

Analytics
Use Cases
Collaboration
Front Office
Workflow
Specification Gathering
Software Engineering
Analytical Skills
SQL
Finance
Trading
Object-Oriented Programming
Python
Data Analysis
Pandas
NumPy
scikit-learn
Performance Tuning
Distributed Computing
Apache Spark
Data Lake
Databricks
Snowflake
Apache Parquet
Large Language Models (LLMs)
Productivity

Job Details

We are seeking a Trading Analytics Engineer to join a well-established Data Analytics team that supports analysis workflows across multiple trading desks. This role involves building essential tools and systems for data analysis and collaborating closely with traders to tailor existing tools to a variety of use cases. The ideal candidate is adept at translating high-level ideas into detailed technical solutions and thrives in a fast-paced, dynamic environment.

Key Responsibilities:

  • Partner with traders to prototype and develop tools and libraries that support innovative analysis techniques.
  • Collaborate directly with front-office teams to enhance analytical capabilities and streamline workflows.
  • Translate domain-specific requirements from financial markets into actionable technical specifications.
  • Apply software engineering skills to solve complex analytical problems in the trading domain.
  • Optimize data access and processing using tools such as SQL, Polars/Python, and Spark to accelerate iteration cycles.
  • Develop a broad understanding of internal analysis tools and advise researchers and traders on their effective application.


Qualifications & Experience:

  • 3+ years of engineering experience in financial markets; experience in a proprietary trading environment is a plus.
  • Proven experience working closely with traders, analysts, or quantitative researchers.
  • Strong command of Python, with the ability to write both object-oriented and functional code.
  • Advanced user of JupyterHub/JupyterLab; experience supporting multi-user environments is preferred.
  • Proficiency with Python data analysis libraries such as pandas, Polars, NumPy, SciPy, and scikit-learn, especially for performance optimization on large datasets.
  • Familiarity with distributed computing frameworks like Spark, Trino, Dremio, or Dask.
  • Experience with modern data lake technologies such as Databricks, Snowflake, Iceberg, and Parquet.
  • Comfortable using large language model (LLM) tools to enhance productivity in analysis and code generation.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions, and AI may have been used to create this description. The position description has been reviewed for accuracy, and Dice believes it to correctly reflect the job opportunity.