Overview
Remote
Full Time
Skills
Employment Authorization
Data Architecture
Data Integration
Data Modeling
Business Intelligence
Scripting
ERD
Documentation
Data Quality
Software Engineering
Version Control
SQL
Scalability
SQL Tuning
Clustering
Management
Workflow
DevOps
Collaboration
Statistics
Mathematics
Computer Science
Information Technology
Computer Engineering
Data Engineering
Analytics
Macros
Dimensional Modeling
Data Warehouse
Change Data Capture
Step Functions
Amazon Redshift
Amazon Web Services
Continuous Integration
Continuous Delivery
Python
PySpark
Agile
Offshoring
Job Details
Role: Data Engineer - DBT (Full Time Role)
Location: Remote - USA
Type: Full Time Role (No C2C/W2/C2H/1099)
Work Authorization: This role is not eligible for company sponsorship now or in the future
Responsibilities:
- Design and build robust, scalable data transformation pipelines using SQL, DBT, and Jinja templating (a brief templating sketch follows this list)
- Develop and maintain data architecture and standards for Data Integration and Data Warehousing projects using DBT and Amazon Redshift
- Collaborate with cross-functional teams to gather requirements and deliver dimensional data models that serve as a single source of truth
- Own the full stack of data modeling in DBT to empower analysts, data scientists, and BI engineers
- Enhance and maintain the analytics codebase, including DBT models, SQL scripts, and ERD documentation
- Ensure data quality, governance alignment, and operational readiness of data pipelines
- Apply software engineering best practices such as version control, CI/CD, and code reviews
- Optimize SQL queries for performance, scalability, and maintainability across large datasets
- Implement best practices for SQL performance tuning, including partitioning, clustering, and materialized views
- Build and manage infrastructure as code using AWS CDK for scalable and repeatable deployments (a brief CDK sketch also follows this list)
- Integrate and automate deployment workflows using AWS CodeCommit, CodePipeline, and related DevOps tools
- Support Agile development processes and collaborate with offshore teams
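For context on the first responsibility above, here is a minimal, illustrative Python sketch of Jinja-templated SQL, the same templating mechanism DBT models use. It is not this team's actual code; the table and column names (raw.orders, net_amount, tax_amount) are hypothetical placeholders, and it uses the jinja2 library directly rather than DBT's built-in rendering.

# Minimal sketch: rendering a parameterized SQL transformation with Jinja,
# in the spirit of a DBT model. Table and column names are hypothetical.
from jinja2 import Template

MODEL_SQL = Template("""
select
    order_id,
    customer_id,
    {% for col in amount_columns %}
    sum({{ col }}) as total_{{ col }}{{ "," if not loop.last }}
    {% endfor %}
from {{ source_table }}
group by order_id, customer_id
""")

if __name__ == "__main__":
    rendered = MODEL_SQL.render(
        source_table="raw.orders",                    # hypothetical source table
        amount_columns=["net_amount", "tax_amount"],  # hypothetical measure columns
    )
    print(rendered)

In DBT itself the framework handles rendering and dependency resolution, so models are plain SQL-plus-Jinja files rather than Python scripts, but the templating mechanics are the same.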
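Likewise, for the infrastructure-as-code responsibility, the following is a minimal AWS CDK (v2, Python) sketch showing how a stack is defined and synthesized. The stack and resource names are hypothetical, and a real deployment for this role would add the CodeCommit/CodePipeline, Glue, and Redshift constructs described above.

# Minimal sketch: defining infrastructure as code with AWS CDK (v2, Python).
# Stack and resource names are hypothetical placeholders.
import aws_cdk as cdk
from aws_cdk import aws_s3 as s3
from constructs import Construct


class AnalyticsDataStack(cdk.Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        # Versioned S3 bucket for staging raw extracts before warehouse loads
        s3.Bucket(self, "RawStagingBucket", versioned=True)


app = cdk.App()
AnalyticsDataStack(app, "AnalyticsDataStack")
app.synth()  # emits the CloudFormation template; deployed with `cdk deploy`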
Required Qualifications:
- Bachelor's or Master's degree (Master's preferred) in a quantitative or technical field such as Statistics, Mathematics, Computer Science, Information Technology, Computer Engineering, or an equivalent discipline
- 5+ years of experience in data engineering and analytics on modern data platforms
- 3+ years of extensive experience with DBT or similar data transformation tools, including building complex, maintainable DBT models and developing DBT packages/macros
- Deep familiarity with dimensional modeling/data warehousing concepts and expertise in designing, implementing, operating, and extending enterprise dimensional models
- Understanding of change data capture (CDC) concepts
- Experience working with AWS Services (Lambda, Step Functions, MWAA, Glue, Redshift)
- Hands-on experience with AWS CDK, CodeCommit, and CodePipeline for infrastructure automation and CI/CD
- Python proficiency, or general knowledge of Jinja templating in Python and/or PySpark
- Agile experience and willingness to work with extended offshore teams and assist with design and code reviews with the customer
- A great teammate and self-starter; strong detail orientation is critical in this role
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.