Data Scientist - Semi Systematic Trading

Overview

On Site
Full Time

Skills

Trading
Data Science
Innovation
Advanced Analytics
Strategic Communication
Optimization
Machine Learning (ML)
Algorithms
Forecasting
Operational Efficiency
Design Of Experiments
Management
Communication
Analytical Skill
Finance
Python
R
Pandas
scikit-learn
TensorFlow
Visualization
Dashboard
Tableau
Statistics
Database
Writing
SQL
Data Extraction
Cloud Computing
Google Cloud Platform
Big Data
Distributed Computing
Apache Spark
Apache Kafka
ETL (Extract, Transform, Load)

Job Details

Role Overview

Join our data science team to drive innovation and insights through advanced analytics, experimentation, and strategic communication. You'll help shape data-driven decisions in a dynamic environment.

Key Responsibilities
  • Data Insight Development: Investigate complex datasets to extract trends and key relationships, utilizing visual tools to communicate findings effectively.
  • Model Construction & Optimization: Design and deploy machine learning algorithms to forecast business metrics and enhance operational efficiency.
  • Experimental Design: Construct and manage experiments to validate predictive models and inform business strategies.
  • Insight Communication: Translate analytical results into actionable recommendations for cross-functional stakeholders.

Minimum Qualifications
  • Professional Experience: At least 3 years of experience in fast-paced sectors such as finance.
  • Coding Expertise: Proficient in Python and R; experienced with core libraries such as Pandas, scikit-learn, and TensorFlow.
  • Visualization Skills: Adept at building data dashboards and visual narratives using platforms like Tableau.
  • Statistical Fluency: Deep knowledge of statistical techniques and their real-world applications.
  • Database Mastery: Skilled in writing and optimizing complex SQL queries for data extraction and transformation.

Preferred Qualifications
  • Cloud Familiarity: Exposure to environments like Google Cloud Platform (GCP).
  • Big Data Acumen: Experience with distributed computing tools such as Apache Spark and Apache Kafka.
  • Pipeline Design: Understanding of data ingestion frameworks and ETL best practices.