Overview
Remote
$50 - $60 per hour
Contract - W2
Contract - 6 Months
Skills
Amazon Web Services
Data Engineering
Data Modeling
Extract, Transform, Load (ETL)
Performance Tuning
Python
Snowflake Schema
SQL
Data Quality
Job Details
Responsibilities:
- Architect, design, implement, and operate end-to-end data engineering solutions using Agile methodology.
- Develop and manage robust data integrations with external vendors and organizations (including complex API integrations).
- Collaborate closely with Data Analysts, Data Scientists, DBAs, and cross-functional teams to understand requirements and deliver high-impact data solutions.
- Lead and take ownership of assigned technical projects in a fast-paced environment.
- Drive continuous improvement in the quality, security, efficiency, and scalability of our data pipelines and infrastructure.
Skills/Experience you have:
- Bachelor of Science degree in Computer Science or equivalent practical experience.
- 4+ years of dedicated experience building and maintaining complex ETL/ELT pipelines.
- 3+ years of Python development experience, specifically building production-grade data APIs using FastAPI or similar frameworks.
- Strong expertise in SQL, including advanced query optimization and performance tuning.
- Expert-level proficiency with dbt, including managing models, macros, incremental models, and dynamic tables.
- Extensive hands-on experience with Snowflake, including performance optimization, data recovery using Time Travel, and advanced data modeling techniques.
- Practical experience with key AWS data services such as Kinesis, Firehose, Redshift, Redshift Spectrum, EMR (Elastic MapReduce), and Lambda, as well as container orchestration services such as ECS and EKS for deploying and managing data applications.
Skills/Experience preferred:
- Proven experience designing and building near real-time data processing systems.
- Experience with data observability tools for proactive monitoring of data quality, lineage, and pipeline health.
- Familiarity with modern data orchestration platforms such as Argo or Airflow.
- Hands-on experience with the full software development lifecycle (SDLC), including strong CI/CD practices for data pipelines and proficiency with data testing frameworks.
- Demonstrated ability to identify and adopt new technologies that improve data quality and reliability.
- Deep understanding of the data lifecycle, emphasizing the importance of high-quality data in applications, machine learning, business analytics, and reporting.
- Proven track record of mentoring junior team members.
- Experience with data ingestion tools such as Singer.
- Knowledge of analytics tools such as Tableau, Plotly, and Pandas.
- Experience in the financial services industry.