Big Data Engineer

Overview

On Site
BASED ON EXPERIENCE
Contract - W2
Contract - Independent

Skills

Analytics
TAS
SQL
Apache Hive
Trading
Big Data
Apache Spark
Distributed Computing
PySpark
Data Manipulation
Pandas
NumPy
Data Engineering
API
Testing
Python
Extract Transform Load (ETL)
AWS Step Functions
Amazon S3
Amazon Kinesis
Amazon RDS
PostgreSQL
Amazon DynamoDB
Time Series
Amazon SQS
Virtual Private Cloud
Amazon Web Services
NoSQL
Database
Data Modeling
Normalization
Data Warehouse
Amazon Redshift
Analytical Skill
Version Control
Unit Testing
Test-driven Development
Continuous Integration
Continuous Delivery
Communication
Data Visualization
Tableau

Job Details

Big Data Engineer
Location: Malvern, PA (3 days onsite) - LOCAL CANDIDATES ONLY
Duration: 1 year

Responsibilities:

The client's Trading Analytics and Strategy (TAS) team collaborates with global trading desks to optimize trading strategies, saving clients millions of dollars annually. The team partners with traders and portfolio managers across various asset classes and mandates, both passive and active, to conduct data-driven analyses and develop tools that shape trading strategies. As a Big Data Engineer, you will design, implement, and maintain a modern, robust, and scalable platform to meet the increasing demands of the trading desks.

Qualifications:
  • Programming Skills: Proficiency in Python programming.
  • Database Expertise: Strong expertise in SQL, Presto, Hive, and Spark.
  • Domain Knowledge: Knowledge of trading and investment data.
  • Big Data Technologies: Experience with Spark and developing distributed computing applications using PySpark.
  • Data Manipulation: Proficiency with libraries such as Pandas, Polars, and NumPy.
  • Data Engineering: Understanding of data pipelines, ETL processes, and data warehousing concepts.
  • API Development: Experience in building APIs.
  • Testing: Write, maintain, and execute automated unit tests using Python. Follow Test-Driven Development (TDD) practices.
  • AWS Services: Extensive experience with key AWS services/components including EMR, Lambda, Glue ETL, Step Functions, S3, ECS, Kinesis, IAM, RDS (PostgreSQL), DynamoDB, time-series databases, CloudWatch Events/EventBridge, Athena, SNS, SQS, and VPC.
  • Serverless Architectures: Proficiency in developing serverless architectures using AWS services.
  • Database Skills: Experience with both relational and NoSQL databases.
  • Data Modeling: Skills in designing and implementing data models, including normalization, denormalization, and schema design.
  • Data Warehousing: Knowledge of data warehousing solutions like Amazon Redshift.
  • Analytical Skills: Strong analytical skills with the ability to troubleshoot data issues.
  • Development Practices: Good understanding of source control, unit testing, test-driven development, and CI/CD.
  • Code Quality: Ability to write clean, maintainable code and comprehend code written by others.
  • Communication: Strong communication skills.
  • Data Visualization: Proficiency in data visualization tools and ability to create visual representations of data, particularly using Tableau.