Overview
Hybrid: Fort Mill, SC or New York City, NY (3-4 days hybrid)
Depends on Experience
Contract - W2
Contract - Independent
Contract - 6 Month(s)
No Travel Required
Unable to Provide Sponsorship
Skills
Conflict Resolution
Amazon Web Services
Business Intelligence
Agile
Amazon Redshift
Amazon S3
Computer Science
Continuous Delivery
BFSI
Cloud Computing
Microsoft Power BI
Information Technology
Orchestration
Extract, Transform, Load
Jenkins
Performance Tuning
Data Storage
Data Warehouse Architecture
Data Modeling
Data Quality
Data Analysis
Data Engineering
Data Security
Regulatory Compliance
SQL
Scala
GitHub
Problem Solving
PySpark
Python
Collaboration
Continuous Integration
Data Lake
Wealth Management
Step Functions
Tableau
Scripting
Scrum
Snowflake Schema
Visualization
Job Details
Key Responsibilities
- Design, develop, and maintain ETL pipelines using AWS Glue, Glue Studio, and Glue Catalog.
- Ingest, transform, and load large datasets from structured and unstructured sources into AWS data lakes/warehouses.
- Work with S3, Redshift, Athena, Lambda, and Step Functions for data storage, query, and orchestration.
- Build and optimize PySpark/Scala scripts within AWS Glue for complex transformations (see the illustrative sketch after this list).
- Implement data quality checks, lineage, and monitoring across pipelines.
- Collaborate with business analysts, data scientists, and product teams to deliver reliable data solutions.
- Ensure compliance with data security, governance, and regulatory requirements (BFSI preferred).
- Troubleshoot production issues and optimize pipeline performance.
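The responsibilities above center on Glue-based ETL. Purely as an illustration of that pattern, the sketch below shows the general shape of a PySpark job running in AWS Glue that reads a table from the Glue Data Catalog, applies a transformation and a simple data quality check, and writes curated Parquet to S3. All database, table, bucket, and column names are hypothetical; the actual pipelines, schemas, and checks for this role would differ.

```python
import sys
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.sql import functions as F

# Standard AWS Glue job bootstrap
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext()
glueContext = GlueContext(sc)
spark = glueContext.spark_session
job = Job(glueContext)
job.init(args["JOB_NAME"], args)

# Read raw data registered in the Glue Data Catalog (database/table names are hypothetical)
raw_dyf = glueContext.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="transactions"
)
raw_df = raw_dyf.toDF()

# Example transformation: normalize a date column and derive a simple amount bucket
curated_df = (
    raw_df
    .withColumn("trade_date", F.to_date("trade_date", "yyyy-MM-dd"))
    .withColumn("amount_bucket",
                F.when(F.col("amount") >= 1000000, "large").otherwise("standard"))
)

# Minimal data quality check: fail fast if required keys are missing
null_keys = curated_df.filter(F.col("account_id").isNull()).count()
if null_keys > 0:
    raise ValueError(f"Data quality check failed: {null_keys} rows missing account_id")

# Write curated output to S3 as partitioned Parquet (bucket/path are hypothetical)
curated_df.write.mode("overwrite").partitionBy("trade_date").parquet(
    "s3://example-curated-bucket/transactions/"
)

job.commit()
```

In practice, jobs like this would be promoted through CI/CD, orchestrated with Step Functions, and backed by fuller quality, lineage, and monitoring tooling; the sketch only indicates the general pattern.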
Required Qualifications
- 12+ years of experience in Data Engineering, including at least 5 years on AWS cloud data services.
- Strong expertise in AWS Glue, S3, Redshift, Athena, Lambda, Step Functions, CloudWatch.
- Proficiency in PySpark, Python, SQL for ETL and data transformations.
- Experience in data modeling (star, snowflake, dimensional models) and performance tuning (illustrative query sketch after this list).
- Hands-on experience with data lake/data warehouse architecture and implementation.
- Strong problem-solving skills and ability to work in Agile/Scrum environments.
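For context on the data modeling bullet, the snippet below sketches a star-schema style query in Spark SQL via PySpark, joining a hypothetical fact table to two dimensions. It is illustrative only and does not reflect this employer's actual model or table names.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("star-schema-example").getOrCreate()

# Hypothetical curated tables: one fact and two dimensions (star schema)
spark.read.parquet("s3://example-curated-bucket/fact_trades/").createOrReplaceTempView("fact_trades")
spark.read.parquet("s3://example-curated-bucket/dim_account/").createOrReplaceTempView("dim_account")
spark.read.parquet("s3://example-curated-bucket/dim_date/").createOrReplaceTempView("dim_date")

# Typical dimensional query: join the fact to its dimensions and aggregate
summary = spark.sql("""
    SELECT d.calendar_year,
           a.account_segment,
           SUM(f.amount) AS total_amount
    FROM fact_trades f
    JOIN dim_account a ON f.account_key = a.account_key
    JOIN dim_date    d ON f.date_key    = d.date_key
    GROUP BY d.calendar_year, a.account_segment
""")
summary.show()
```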
Preferred Qualifications
- Experience in BFSI / Wealth Management domain.
- AWS Certified Data Analytics – Specialty or AWS Solutions Architect certification.
- Familiarity with CI/CD pipelines for data engineering (CodePipeline, Jenkins, GitHub Actions).
- Knowledge of BI/Visualization tools like Tableau, Power BI, QuickSight.
Education
- Bachelor’s degree in Computer Science, Information Technology, Engineering, or related field.
- Master’s degree preferred.