Overview
Remote
Depends on Experience
Contract - W2
Contract - Independent
Contract - 12 Month(s)
Skills
Apache Airflow
Fivetran
Amazon Web Services
Analytics
Business Intelligence
Cloud Computing
Continuous Delivery
Continuous Integration
Data Engineering
Data Governance
Data Processing
Data Quality
Data Warehouse
Decision-making
Documentation
Extract, Transform, Load (ETL)
ELT
Google Cloud Platform
Management
Microsoft Azure
Microsoft Power BI
Orchestration
Python
Query Optimization
SQL
Scheduling
Snowflake Schema
Tableau
Testing
Use Cases
Version Control
Job Details
We are seeking a skilled Analytics Engineer / Modern Data Engineer to design, build, and maintain scalable, reliable data pipelines and analytics-ready data models. This role will focus on leveraging Snowflake, Airflow, dbt, Fivetran, SQL, and Python to enable trusted analytics and data-driven decision-making across the organization.
The ideal candidate has strong experience with modern data stacks, understands analytics use cases, and enjoys collaborating closely with analytics, product, and business teams.
Responsibilities
- Design, build, and maintain ELT/ETL pipelines using Fivetran, Airflow, and Python
- Develop and optimize analytics-ready data models in Snowflake using dbt
- Implement data transformations, testing, and documentation following analytics engineering best practices
- Ensure data quality, accuracy, and reliability through validation checks and monitoring
- Collaborate with analytics, BI, and business stakeholders to understand data requirements
- Optimize Snowflake performance, cost, and query efficiency
- Maintain scheduling, orchestration, and dependency management using Apache Airflow
- Support downstream analytics tools (e.g., Looker, Tableau, Power BI) with well-modeled data
- Contribute to data governance, version control, and CI/CD processes for data pipelines
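As a sketch of the "validation checks and monitoring" responsibility above, a minimal row-level data-quality check might look like the following (the field names `order_id` and `amount`, and the specific checks, are illustrative assumptions, not part of this role's actual pipelines):

```python
from typing import Iterable

def validate_rows(rows: Iterable[dict]) -> dict:
    """Run simple data-quality checks and return a summary suitable for monitoring.

    Hypothetical checks: no null primary keys, no negative amounts.
    """
    summary = {"total": 0, "null_order_id": 0, "negative_amount": 0}
    for row in rows:
        summary["total"] += 1
        if row.get("order_id") is None:
            summary["null_order_id"] += 1
        amount = row.get("amount")
        if amount is not None and amount < 0:
            summary["negative_amount"] += 1
    # A pipeline task could alert or halt downstream models when this is False.
    summary["passed"] = summary["null_order_id"] == 0 and summary["negative_amount"] == 0
    return summary
```

In practice these checks would typically live in dbt tests or an Airflow task rather than ad-hoc Python, but the shape of the logic is the same.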
Qualifications
- 4+ years of experience in data engineering, analytics engineering, or similar roles
- Strong hands-on experience with Snowflake
- Advanced SQL skills (query optimization, window functions, CTEs)
- Experience with dbt for transformations, testing, and documentation
- Experience using Apache Airflow for orchestration
- Proficiency in Python for data processing and automation
- Experience with Fivetran or similar ingestion tools
- Solid understanding of data warehousing concepts and dimensional modeling
- Experience working in cloud environments (AWS, Azure, or Google Cloud Platform)
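To make the "window functions, CTEs" qualification concrete, here is a small example run against an in-memory SQLite database via Python's standard library (table and column names are made up for illustration; the same SQL pattern applies in Snowflake, and SQLite 3.25+ is assumed for window-function support):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, amount REAL);
    INSERT INTO orders VALUES ('a', 10), ('a', 30), ('b', 20);
""")

query = """
WITH customer_totals AS (          -- CTE: aggregate spend per customer
    SELECT customer, SUM(amount) AS total
    FROM orders
    GROUP BY customer
)
SELECT customer,
       total,
       RANK() OVER (ORDER BY total DESC) AS spend_rank  -- window function
FROM customer_totals
ORDER BY spend_rank
"""
rows = conn.execute(query).fetchall()
# rows → [('a', 40.0, 1), ('b', 20.0, 2)]
```

The CTE keeps the aggregation readable and the window function ranks without collapsing rows, which is the usual reason both appear together in analytics SQL.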
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions, and AI may have been used to create this description. The position description has been reviewed for accuracy, and Dice believes it to correctly reflect the job opportunity.