Data Engineer

Overview

On Site
Depends on Experience
Full Time

Skills

AWS
Cloud
Data Engineer
Python
Snowflake
Git
CloudBees Jenkins
uDeploy
Pivotal Concourse

Job Details

Data Engineer - On Site in Dallas, TX

The Expertise and Skills You Bring

  • Bachelor's degree (or higher) in Computer Science, Management Information Systems, Business Information Systems, Mathematics, or a finance-related field.
  • 6+ years of data engineering experience at peer firms, utilizing agile methods and modern SDLC processes to deliver quality technological solutions in a transparent, reliable way.
  • Demonstrable mastery of industry best practices in the data lifecycle, including data quality automation and tooling.
  • Demonstrated domain experience with financial datasets including FactSet, Bloomberg, IBES, Compustat, S&P Global, Worldscope, Morningstar, MSCI, Reuters, IDC, Markit, BARRA, Axioma, Northfield, and PORT.
  • A proven track record of working with complex data environments and the technology and analytics infrastructure needed to support them. Ability to recognize business risk and surface it to key decision-makers.
  • Substantial Investment Management business domain expertise across some combination of research, portfolio management, trading and investment operations.

The Value You Deliver

  • Deliver datasets from onboarding through mapping and automated data quality (DQ) checks, so they can be handed off to the Data Operations team for DQ maintenance and consumed by research teams and production.
  • Design and implement processes and tools for data onboarding and quality, helping to deliver an industry best-practice solution for managing the data lifecycle.
  • Provide data stewardship to other team members and to Data Operations as they review datasets for completeness and quality.
  • Produce stand-alone tools that can be used by other teams to automate data quality and discover faults.

Key Skills:

  • Demonstrated data engineering experience, utilizing agile methods and modern SDLC processes to deliver quality technological solutions in a transparent, reliable way.
  • Experience designing and building data warehouse solutions, and working with large datasets in Oracle and Snowflake.
  • Demonstrable mastery of industry best practices in the data lifecycle, including data quality automation and tooling.
  • A proven track record of working with complex data environments and the technology and analytics infrastructure needed to support them. Ability to recognize business risk and surface it to key decision-makers.
  • Domain experience with financial datasets such as FactSet, Bloomberg, IBES, Compustat, S&P Global, Worldscope, Morningstar, MSCI, Reuters, IDC, Markit, BARRA, Axioma, Northfield, and PORT is a plus.
  • Demonstrated experience in Python programming and familiarity with core data science libraries, package managers, and test frameworks.
  • Experience with some of the following continuous integration/delivery tools: Git, CloudBees Jenkins, uDeploy, and/or Pivotal Concourse.
  • Knowledge of Python web frameworks such as Flask or Django is a plus.
  • Fluency in formal language design concepts, such as type systems, is a plus.
  • Experience with quant research processes, methodologies, and tools is a plus.