Overview
Hybrid: 3 days a week onsite, 2 days remote
$60 - $65
Contract - W2, 6 months
No Travel Required
Able to Provide Sponsorship
Skills
Apache Kafka
Snowflake
Python
Data Pipelines
Streaming
Performance Tuning
Data Loading
Trading
Job Details
We are seeking a hands-on Data Service Engineer to design, develop, and maintain our Reference Data System using modern data technologies including Kafka, Snowflake, and Python. The role leads the development and optimization of batch and real-time data pipelines, and architects data integration, streaming, and analytics solutions built on Spark, Kafka, and Snowflake.
Responsibilities:
- Lead the development and optimization of batch and real-time data pipelines, ensuring scalability, reliability, and performance.
- Architect, design, and deploy data integration, streaming, and analytics solutions leveraging Spark, Kafka, and Snowflake.
- Proactively and voluntarily support team members and peers in delivering their tasks to ensure end-to-end delivery.
- Evaluate technical performance challenges and recommend tuning solutions.
- Serve as a hands-on Data Service Engineer: design, develop, and maintain our Reference Data System using modern data technologies including Kafka, Snowflake, and Python (a minimal pipeline sketch follows this list).
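To make the pipeline work concrete, here is a minimal sketch of a Kafka-to-Snowflake micro-batch loader in Python. It assumes the confluent-kafka and snowflake-connector-python packages; the topic, table, column, and connection names are illustrative placeholders, not specifics from this posting.

    import json

    from confluent_kafka import Consumer
    import snowflake.connector

    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",  # placeholder broker address
        "group.id": "refdata-loader",           # hypothetical consumer group
        "enable.auto.commit": False,            # commit offsets manually, after writes land
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["refdata.instruments"])  # hypothetical topic

    conn = snowflake.connector.connect(
        user="LOADER", password="***", account="my_account",  # placeholder credentials
        warehouse="LOAD_WH", database="REFDATA", schema="RAW",
    )
    cur = conn.cursor()

    BATCH_SIZE = 500
    rows = []
    try:
        while True:
            msg = consumer.poll(1.0)
            if msg is None:
                continue
            if msg.error():
                raise RuntimeError(msg.error())
            record = json.loads(msg.value())
            rows.append((record["id"], json.dumps(record)))  # assumes an "id" field
            if len(rows) >= BATCH_SIZE:
                # Micro-batch insert; the connector rewrites this into a multi-row INSERT.
                cur.executemany(
                    "INSERT INTO instruments_raw (id, payload) VALUES (%s, %s)", rows
                )
                consumer.commit()  # at-least-once: offsets advance only after the write
                rows.clear()
    finally:
        consumer.close()
        conn.close()

In production this hand-rolled loop is often replaced by the Kafka Connect Snowflake sink or Snowpipe Streaming; the point of the sketch is the ordering, committing offsets only after the batch is written, which gives at-least-once delivery.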
Requirements:
- Proven experience in building and maintaining data pipelines, especially using Kafka, Snowflake, and Python.
- Strong expertise in distributed data processing and streaming architectures.
- Experience with the Snowflake data warehouse platform: data loading, performance tuning, and management (see the bulk-loading sketch after this list).
- Proficiency in Python scripting and programming for data manipulation and automation.
- Familiarity with Kafka ecosystem (Confluent, Kafka Connect, Kafka Streams).
- Knowledge of SQL, data modeling, and ETL/ELT processes.
- Understanding of cloud platforms (AWS, Azure, Google Cloud Platform) is a plus.
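On the Snowflake loading and tuning requirement above, a hedged sketch of the standard bulk-load path: stage files, then load them in parallel with COPY INTO rather than with row-by-row INSERTs. The stage, table, file path, and warehouse names are assumptions for illustration.

    import snowflake.connector

    conn = snowflake.connector.connect(
        user="LOADER", password="***", account="my_account",  # placeholder credentials
        warehouse="LOAD_WH", database="REFDATA", schema="RAW",
    )
    cur = conn.cursor()

    # Push a local file set to a (hypothetical) internal stage.
    cur.execute("PUT file:///tmp/instruments/*.json @refdata_stage/instruments/")

    # COPY INTO loads staged, compressed files in parallel across the warehouse,
    # which is the main performance lever versus per-row INSERTs.
    cur.execute("""
        COPY INTO instruments_raw
        FROM @refdata_stage/instruments/
        FILE_FORMAT = (TYPE = 'JSON')
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
        ON_ERROR = 'ABORT_STATEMENT'
    """)
    for status in cur.fetchall():  # one status row per file loaded
        print(status)

    conn.close()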
Preferred, but not required:
- Experience with Trade Processing, Settlement, Reconciliation, and related back/middle-office functions within financial markets (Equities, Fixed Income, Derivatives, FX, etc.).
- Strong understanding of trade lifecycle events, order types, allocation rules, and settlement processes.
- Experience with corporate data systems: funding support, planning & analysis, and regulatory reporting & compliance.
- Knowledge of regulatory standards (such as Dodd-Frank, EMIR, MiFID II) related to trade reporting and lifecycle management.