AWS Data Engineering

Overview

On Site
Full Time
Part Time
Accepts corp to corp applications
Contract - W2
Contract - Independent

Skills

Data Warehouse
Managed Services
Management
Reporting
Collaboration
Data Storage
Data Quality
Workflow
Scalability
Data Engineering
Data Architecture
Amazon S3
Step Functions
Amazon DynamoDB
Data Lake
Data Modeling
NoSQL
Database
SQL
Production Support
Cloud Computing
Analytical Skill
Conflict Resolution
Problem Solving
Communication
Stakeholder Management
DevOps
Continuous Integration
Continuous Delivery
Computer Science
Information Systems
Analytics
Amazon Redshift
Redshift Spectrum
Python
Apache Spark
Scala
Extract
Transform
Load
Scripting
Agile
Scrum
Project Delivery
Data Analysis
Amazon Web Services

Job Details

Job Summary:

We are seeking an experienced AWS Data Engineer to design, develop, and maintain end-to-end ETL pipelines, data lakes, and data warehouse solutions on AWS. The ideal candidate will have hands-on experience with AWS services and a strong ability to solve complex data problems, deliver scalable solutions, and support production environments effectively.



Key Responsibilities:

  • Design, develop, and deploy end-to-end ETL pipelines and data lake architectures using AWS services such as AWS Glue, Lambda, Redshift, S3, and Step Functions.

  • Build, maintain, and optimize data warehouse applications for analytical and reporting purposes.

  • Implement data ingestion, transformation, and integration solutions using AWS managed services.

  • Create, manage, and deploy infrastructure as code using CloudFormation templates.

  • Develop and maintain DynamoDB and Athena queries for analytics and operational reporting.

  • Monitor, troubleshoot, and provide production support for ETL jobs and data pipelines, including alert handling and incident resolution.

  • Collaborate with cross-functional teams to gather requirements, design data models, and implement best practices for ETL and data storage.

  • Ensure data quality, integrity, and security in all data workflows.

  • Communicate effectively with business and technical stakeholders during production issues, providing timely resolutions and updates.

  • Continuously explore and leverage the latest AWS technologies to improve ETL processes, performance, and scalability.




Required Skills and Qualifications:

  • 3-8 years of experience in data engineering, ETL development, or cloud data architecture.

  • Hands-on experience with AWS services: Glue, Lambda, Redshift, S3, Step Functions, DynamoDB, Athena, CloudFormation.

  • Experience in end-to-end ETL and data lake design on AWS.

  • Strong knowledge of data modeling, relational and NoSQL databases, and SQL.

  • Experience in production support, job monitoring, and incident resolution in cloud environments.

  • Strong analytical and problem-solving skills, with the ability to design creative and scalable solutions.

  • Excellent communication and stakeholder management skills.

  • Familiarity with DevOps concepts, CI/CD, and automation tools is a plus.

  • Bachelor's degree in Computer Science, Information Systems, or related field.




Preferred Skills:

  • Experience with the AWS analytics stack, such as QuickSight or Redshift Spectrum.

  • Exposure to Python, Spark, or Scala for ETL scripting and automation.

  • Familiarity with Agile/Scrum methodology and collaborative project delivery.

  • AWS certifications such as AWS Certified Data Analytics - Specialty or AWS Certified Solutions Architect are a plus.


Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.

About Purple Drive Technologies LLC