Overview
On Site
Full Time
Skills
ARM
Data Engineering
Data Warehouse
Quantitative Analysis
Extract
Transform
Load
Market Analysis
Finance
Time Series
Pricing
Python
SQL
Stored Procedures
NoSQL
Big Data
Cloud Computing
Workflow Management
Database
Storage
File Systems
Analytics
Computer Science
Statistics
Real-time
Data Quality
Apache Kafka
Java
C++
Rust
Commodities
Job Details
Data Engineer wanted for the systematic arm of a globally recognised hedge fund, joining the growing Data Engineering team to support the Commodities business. This opportunity will give you the chance to design, implement and evolve the data platform, including pipelines, the data repository, the data warehouse and data access APIs. You will work closely with the quant PMs and researchers in a fast-paced environment as you automate and support the Extract, Transform, and Load (ETL) processes from various market data vendors.
They are looking for a passionate engineer with strong financial data knowledge, including security master and financial time series (pricing data, etc.) across multiple asset classes. You'll need solid coding skills (Python, Java/C++, SQL) and experience working with large datasets.
If you eat, sleep and breathe data, and have fantastic interpersonal skills, this role is perfect for you!
Requirements
- 3+ years' Python experience
- Broad knowledge & experience of database concepts with proficiency in SQL and stored procedures
- Knowledge of NoSQL databases and big data technologies
- Solid experience working with cloud or on-prem technologies and data infrastructure, e.g. workflow management engines, databases, storage & file systems, analytics platforms
- Experience processing large and complex datasets
- BS in Computer Science, Engineering, Statistics, or related subject
Desirable
- Experience with any of the following: designing and implementing real-time pipelines; data quality and validation; Kafka; Java/C++/Rust/Go
- Experience with traditional and/or alternative datasets in the Commodity space
Benefits
- Work directly with all parts of the fund
- Market-leading salaries + bonus
- Entrepreneurial feel of a small firm with the infrastructure, technology and support that comes with a large company
- Inclusive, collaborative work culture
Whilst we carefully review all applications to all jobs, due to the high volume of applications we receive it is not possible to respond to candidates who have not been successful.
Contact
If you feel you are a strong match for this role, please don't hesitate to get in touch:
Greg Leigh-Jones
+1
linkedin.com/in/greg-jones-8a457011a