Data Engineer

Overview

On Site
$40 - $50
Contract - W2
Contract - 6 Month(s)

Skills

Data Engineering
Python
Snowflake
ELT frameworks
Apache Airflow
AWS cloud environment
S3
Lambda
CloudWatch

Job Details

Role: Data Engineer
Location: Maitland, FL
Duration: 6+ Months Contract

We are looking for data engineers (JD below). We're specifically looking for someone with strong experience in building (not just using) ELT frameworks with dbt, MWAA (Amazon Managed Workflows for Apache Airflow), and Snowflake, along with solid Python skills.

We're looking for a data engineer to build and maintain ELT pipelines using Apache Airflow, dbt, and Snowflake in an AWS cloud environment. The candidate should have experience writing modular Python code and deploying container-based services in AWS, with monitoring set up for those services.
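
As a rough sketch of the kind of pipeline described above (an assumption-laden illustration, not the team's actual implementation), the following Airflow DAG orchestrates dbt runs against Snowflake from MWAA; the DAG id, schedule, and dbt project path are hypothetical.

# Minimal sketch: an Airflow DAG that runs dbt models and tests against Snowflake.
# The DAG id, schedule, and dbt project location are illustrative assumptions.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

DBT_DIR = "/usr/local/airflow/dags/dbt"  # assumed location of the dbt project in MWAA

default_args = {"retries": 2, "retry_delay": timedelta(minutes=5)}

with DAG(
    dag_id="elt_snowflake_daily",        # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    # Build/refresh the dbt models in Snowflake.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command=f"cd {DBT_DIR} && dbt run --profiles-dir .",
    )

    # Validate the refreshed models with dbt tests.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command=f"cd {DBT_DIR} && dbt test --profiles-dir .",
    )

    dbt_run >> dbt_test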

Key Skills & Experience:

Strong SQL and Snowflake expertise, including performance tuning and data modeling.
Proficient in Python for scripting, automation, and working with REST APIs.
Experience with Apache Airflow for orchestration and workflow monitoring.
Hands-on with dbt for modular, version-controlled data transformations.
Solid experience with AWS services (e.g., S3, Lambda, IAM, CloudWatch) in data engineering workflows.
Experience integrating and processing data from REST APIs (a minimal sketch follows this list).
Understanding of data quality, governance, and cloud-native troubleshooting.
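
To make the REST API and S3 items above concrete, here is a hedged Python sketch using requests and boto3; the endpoint, bucket, and key layout are hypothetical placeholders.

# Minimal sketch: pull records from a REST API and land them in S3 as raw JSON.
# The endpoint URL, bucket, and key prefix are hypothetical.
import json
from datetime import datetime, timezone

import boto3
import requests

API_URL = "https://api.example.com/v1/orders"  # hypothetical endpoint
BUCKET = "raw-landing-bucket"                  # hypothetical bucket

def extract_to_s3() -> str:
    response = requests.get(API_URL, timeout=30)
    response.raise_for_status()                # fail fast on HTTP errors
    records = response.json()

    # Partition the raw landing zone by load date.
    load_date = datetime.now(timezone.utc).strftime("%Y-%m-%d")
    key = f"raw/orders/load_date={load_date}/orders.json"

    boto3.client("s3").put_object(
        Bucket=BUCKET,
        Key=key,
        Body=json.dumps(records).encode("utf-8"),
    )
    return key

if __name__ == "__main__":
    print(f"Wrote s3://{BUCKET}/{extract_to_s3()}")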

"Must have below:
10+ Years Experience
Great Communicator/Client Facing
Individual Contributor
100% Hands on in the mentioned skills
DBT Proficiency: model development:
Experience in creating complex DBT models including incremental models, snapshots and documentation. Ability to write and maintain DBT macros for reusable code
Testing and documentation:
Proficiency in implementing DBT tests for data validation and quality checks
Familiarity with generating and maintaining documentation using DBT's built in features
Version control:
Experience in managing DBT projects using git ,including implementing CI/CD process from the scratch
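
As one hedged illustration of the incremental-model requirement, the sketch below uses a dbt Python model, which recent dbt versions support on Snowflake via Snowpark; the model, table, and column names are hypothetical, and the same pattern is often written instead as an incremental SQL model guarded by is_incremental().

# models/marts/orders_incremental.py -- hypothetical dbt Python model for Snowflake.
# Model, table, and column names are illustrative assumptions.
import snowflake.snowpark.functions as F

def model(dbt, session):
    # Materialize incrementally, merging on a hypothetical unique key.
    dbt.config(materialized="incremental", unique_key="order_id")

    orders = dbt.ref("stg_orders")  # upstream dbt model, assumed to exist

    if dbt.is_incremental:
        # On incremental runs, only pick up rows newer than what is already loaded.
        max_loaded = (
            session.table(str(dbt.this))
            .agg(F.max(F.col("updated_at")))
            .collect()[0][0]
        )
        if max_loaded is not None:  # guard against an empty target table
            orders = orders.filter(F.col("updated_at") > max_loaded)

    return orders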

AWS Expertise:
Data Storage Solutions:
In-depth understanding of AWS S3 for data storage, including best practices for organization and security.
Experience with Amazon Redshift for data warehousing and performance optimization.
Data Integration:
Familiarity with AWS Glue for ETL processes and orchestration (nice to have).
Experience with AWS Lambda for serverless data processing tasks.
Workflow Orchestration:
Proficiency in using Apache Airflow on AWS to design, schedule, and monitor complex data flows.
Ability to integrate Airflow with AWS services and dbt models, such as triggering a dbt run or an EMR job, or reading from S3 and writing to Redshift.
Data Lakes and Data Warehousing:
Understanding of data lake vs. data warehouse architecture and when to use each.
Experience with Amazon Athena for querying data directly in S3 using SQL (a minimal sketch follows this section).
Monitoring and Logging:
Familiarity with Amazon CloudWatch for monitoring pipelines and setting up alerts on workflow failures.
Cloud Security:
Knowledge of AWS security best practices, including IAM roles, encryption, and dbt profile access configurations.
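
As a hedged illustration of the Athena item above, a minimal boto3 sketch that runs a SQL query against data in S3 and polls for completion; the database, table, and results location are hypothetical placeholders.

# Minimal sketch: execute an Athena query over S3 data and wait for it to finish.
# Database, table, and output location are illustrative assumptions.
import time

import boto3

athena = boto3.client("athena")

def run_athena_query(sql: str,
                     database: str = "analytics_raw",
                     output: str = "s3://my-athena-results/queries/"):
    execution = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output},
    )
    query_id = execution["QueryExecutionId"]

    # Poll until the query reaches a terminal state.
    while True:
        state = athena.get_query_execution(QueryExecutionId=query_id)[
            "QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)

    if state != "SUCCEEDED":
        raise RuntimeError(f"Athena query {query_id} ended in state {state}")

    return athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]

if __name__ == "__main__":
    print(run_athena_query("SELECT order_id, status FROM orders LIMIT 10"))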

Programming Skills:

Python:
Proficiency in pandas and NumPy for data analysis and manipulation (a minimal sketch follows this section).
Ability to write scripts that automate ETL processes and schedule jobs using Airflow.
Experience creating custom dbt macros using Jinja and Python, allowing for reusable components within dbt models.
Knowledge of how to implement conditional logic in dbt through Python.
SQL:
Advanced SQL skills, including complex joins, window functions, CTEs, and subqueries.
Experience optimizing SQL queries for performance.
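
To ground the pandas/NumPy item above, a short hedged sketch of a transformation step; the file, column names, and aggregation are illustrative assumptions.

# Minimal sketch: clean and aggregate a raw extract with pandas/NumPy before loading.
import numpy as np
import pandas as pd

def transform(raw_path: str = "orders_raw.csv") -> pd.DataFrame:
    df = pd.read_csv(raw_path, parse_dates=["order_date"])

    # Basic cleanup: drop duplicate orders and normalize status values.
    df = df.drop_duplicates(subset=["order_id"])
    df["status"] = df["status"].str.strip().str.upper()

    # Guard against divide-by-zero when deriving a unit price.
    df["unit_price"] = np.where(df["quantity"] > 0, df["amount"] / df["quantity"], np.nan)

    # Daily aggregate, ready to load into a staging table.
    return (
        df.groupby(df["order_date"].dt.date)
          .agg(order_count=("order_id", "nunique"), revenue=("amount", "sum"))
          .reset_index()
    )

if __name__ == "__main__":
    print(transform().head())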
