Senior Data Engineer - DBT


Skills

DBT

Job Details


Position Details

Requirement (Creq): TBD
No. of Positions: 1
Role: Senior Data Engineer - DBT
Location: NYC, NY (Remote)
Type of Hire: Contract / C2H (W2, 1099; USC)
Duration: Long-term
Number of Interviews: L1 + HR + Client
In-Person Interviews: No
Client Interview: Yes
No. of Submissions: No more than 3 resumes
Timeline for Submission: 2 hours
Position Requirements (Remote/Hybrid/Fully Onsite): Remote

Job Description

Must have:

  • 10+ years of experience
  • Strong communicator / client-facing
  • Individual contributor
  • 100% hands-on in the skills listed below

DBT Proficiency

Model development:

  • Experience creating complex DBT models, including incremental models, snapshots, and documentation
  • Ability to write and maintain DBT macros for reusable code
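For context, an incremental DBT model of the kind described above might look like the following sketch (the source, table, and column names are hypothetical):

```sql
-- models/fct_events.sql — hypothetical incremental model
{{
  config(
    materialized='incremental',
    unique_key='event_id'
  )
}}

select
    event_id,
    user_id,
    event_type,
    created_at
from {{ source('app', 'raw_events') }}

{% if is_incremental() %}
  -- on incremental runs, only pick up rows newer than what is already loaded
  where created_at > (select max(created_at) from {{ this }})
{% endif %}
```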

Testing and documentation:

  • Proficiency in implementing DBT tests for data validation and quality checks
  • Familiarity with generating and maintaining documentation using DBT's built-in features
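As a minimal illustration of DBT's built-in testing and documentation features, a schema file for a hypothetical `fct_events` model could look like:

```yaml
# models/schema.yml — hypothetical schema tests and column docs
version: 2

models:
  - name: fct_events
    description: "Fact table of application events"
    columns:
      - name: event_id
        description: "Primary key"
        tests:
          - unique
          - not_null
      - name: event_type
        tests:
          - accepted_values:
              values: ['click', 'view', 'purchase']
```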

Version control:

  • Experience managing DBT projects with Git, including implementing CI/CD processes from scratch

  • AWS Expertise:
    • Data storage solutions:
      • In-depth understanding of AWS S3 for data storage, including best practices for organization and security
      • Experience with AWS Redshift for data warehousing and performance optimization
    • Data integration:
      • Familiarity with AWS Glue for ETL processes and orchestration (nice to have)
      • Experience with AWS Lambda for serverless data processing tasks
    • Workflow orchestration:
      • Proficiency in using Apache Airflow on AWS to design, schedule, and monitor complex data flows
      • Ability to integrate Airflow with AWS services and DBT models, such as triggering a DBT model or EMR job, or reading from S3 and writing to Redshift
    • Data lakes and data warehousing:
      • Understanding of the architecture of data lakes vs. data warehouses and when to use each
      • Experience with Amazon Athena for querying data directly in S3 using SQL
    • Monitoring and logging:
      • Familiarity with AWS CloudWatch for monitoring pipelines and setting up alerts for workflow failures
    • Cloud security:
      • Knowledge of AWS security best practices, including IAM roles, encryption, and DBT profile access configurations

Programming Skills:

  • Python:
    • Proficiency in Pandas and NumPy for data analysis and manipulation
    • Ability to write scripts for automating ETL processes and scheduling jobs using Airflow
    • Experience creating custom DBT macros using Jinja and Python, allowing for reusable components within DBT models
    • Knowledge of how to implement conditional logic in DBT through Python
  • SQL:
    • Advanced SQL skills, including complex joins, window functions, CTEs, and subqueries
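A small sketch of the kind of SQL described above (a CTE combined with a window function), run here against an in-memory SQLite database with made-up sample data:

```python
# CTE + window function sketch: largest order per customer.
# Uses SQLite (window functions require SQLite >= 3.25); data is hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, amount INTEGER);
    INSERT INTO orders VALUES
        ('alice', 100), ('alice', 250), ('bob', 80), ('bob', 300);
""")

query = """
WITH ranked AS (
    SELECT
        customer,
        amount,
        ROW_NUMBER() OVER (
            PARTITION BY customer ORDER BY amount DESC
        ) AS rn
    FROM orders
)
SELECT customer, amount FROM ranked WHERE rn = 1 ORDER BY customer;
"""

# Each customer's single largest order
top_orders = conn.execute(query).fetchall()
print(top_orders)  # [('alice', 250), ('bob', 300)]
```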

    • Experience optimizing SQL queries for performance

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.