Overview
Hybrid (3 days in a local office)
$110,000 - $135,000
Full Time
Skills
AWS
Data Engineer
Lambda
Python
PySpark
SQL
Amazon Web Services
ELT
Reporting
Snowflake Schema
Snowflake
Data Modeling
Extract, Transform, Load (ETL)
Job Details
Role: AWS Data Engineer
Location: Hybrid (3 days to local office - New Orleans, LA; Little Rock, AR; or The Woodlands, TX)
We are seeking a skilled AWS Data Engineer with experience working with Python, PySpark, Lambda, Airflow, and Snowflake.
Responsibilities:
- Design, build, and optimize ETL pipelines using Python, PySpark, Lambda, Airflow, and other AWS services (a representative sketch follows this list).
- Create SQL queries to segment, manipulate, and format data.
- Build automations to ingest, transfer, upload, and manipulate data.
- Build or maintain data ingestion pipelines that move data from source systems into Snowflake.
- Create and manage data models to ensure data integrity and facilitate efficient data analysis.
- Implement and maintain data security and compliance measures, including access controls, encryption, and data masking.
- Ensure data quality, accuracy, and consistency through data validation, cleansing, and monitoring.
- Design and maintain ETL/ELT pipelines to ingest data into Amazon Redshift for analytics and reporting.
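As context for candidates, the following is a minimal, illustrative sketch of the kind of ingestion step described above: a PySpark job that reads raw files from S3, applies a light transformation, and appends the result to a Snowflake table via the Snowflake Spark connector. All bucket, credential, and table names are placeholders, and the connector JARs are assumed to be available on the Spark classpath; this is not part of the role's actual codebase.

```python
# Illustrative only: read raw Parquet from S3, apply a light transformation,
# and append the result to a Snowflake table using the Snowflake Spark connector.
# Bucket, credential, and table names below are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_to_snowflake").getOrCreate()

# Read raw source data landed in S3 (placeholder path).
orders = spark.read.parquet("s3://example-raw-bucket/orders/")

# Simple transformation: standardize a column name and add a load timestamp.
curated = (
    orders
    .withColumnRenamed("ORDER_ID", "order_id")
    .withColumn("loaded_at", F.current_timestamp())
)

# Placeholder Snowflake connection options; in practice these would come from
# a secrets manager or an Airflow connection, not hard-coded values.
sf_options = {
    "sfURL": "example_account.snowflakecomputing.com",
    "sfUser": "EXAMPLE_USER",
    "sfPassword": "EXAMPLE_PASSWORD",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "CURATED",
    "sfWarehouse": "LOAD_WH",
}

# Append the curated data to the target Snowflake table.
(
    curated.write
    .format("net.snowflake.spark.snowflake")
    .options(**sf_options)
    .option("dbtable", "ORDERS_CURATED")
    .mode("append")
    .save()
)
```

In practice, a job like this would typically be orchestrated from an Airflow DAG or triggered by a Lambda event rather than run standalone.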
Requirements:
- Minimum of 5 years of experience as a Data Engineer.
- 3+ years of experience with Python, PySpark, and Lambda.
- Must have experience with Airflow and Snowflake.
- Proficiency in advanced SQL query development.
- Understanding of data modeling principles and techniques.
- Knowledge of data security best practices and compliance requirements.