Data Engineer

Overview

Remote
$140,000 - $180,000
Full Time
No Travel Required

Skills

Snowflake
Python
SQL

Job Details

Senior Enterprise Data Warehouse (EDW) Engineer, 10+ Years of Experience
Location: Remote (USA)
Employment Type: W2
Experience Level: 10+ Years
Duration: Long-term
Work Authorization: Green Card (GC)
About the Role
We're looking for a Senior Enterprise Data Warehouse (EDW) Engineer with deep experience in designing and managing scalable, cloud-native data ecosystems. This role is ideal for someone who not only writes strong SQL and builds data pipelines but also architects and optimizes data warehouses end-to-end in enterprise environments.
Key Responsibilities
  • Architect and develop data pipelines and ETL/ELT processes for enterprise-scale data warehouses.
  • Design, implement, and optimize data models on Snowflake / Redshift / BigQuery.
  • Create and maintain dbt models, Airflow / Prefect workflows, and CI/CD pipelines for data.
  • Partner with data analysts, BI developers, and cloud engineers to ensure data reliability and quality.
  • Implement and enforce data governance, lineage, and security controls.
  • Support migration from on-prem databases to modern cloud platforms (AWS, Google Cloud Platform, Azure).
  • Conduct code reviews, performance tuning, and cost optimization across data layers.

Required Skills
  • 10+ years of hands-on experience with Enterprise Data Warehousing and ETL/ELT.
  • Strong proficiency in SQL, Python, and data modeling (Kimball/Inmon).
  • Expertise with at least one cloud data warehouse: Snowflake, Redshift, or BigQuery.
  • Experience with ETL orchestration tools such as Airflow, dbt Cloud, or Data Factory.
  • Solid understanding of data lake / lakehouse architectures and integration patterns.
  • Hands-on experience with version control (Git), CI/CD, and Agile development.
  • Strong analytical mindset with the ability to troubleshoot complex data issues.

Preferred Skills
  • Experience integrating with AWS Glue, Lambda, Step Functions, S3, and API-based ingestion.
  • Knowledge of Looker, Power BI, or Tableau for end-to-end data delivery.
  • Familiarity with Terraform / CloudFormation for infrastructure automation.
  • Exposure to governance and metadata tools (e.g., Alation, Collibra, Informatica).