Junior Snowflake Data Engineer

Overview

Remote
$25 - $30
Contract - W2

Skills

Snowflake
DBT
Airflow
SQL
Python
AWS
Kafka
Spark
ETL
Jenkins
GitLab
CI/CD
Docker
Terraform

Job Details

We don't need a senior person here, as the rate offered for this role is not on the higher side.

Job Title: Junior Snowflake Data Engineer

Location: Remote

Required Experience: 6+ years

Duration: 6+ months

We need someone focused on Snowflake and DBT, along with Python and Airflow.

About the Role:

We are seeking a Data Engineer with strong expertise in Snowflake, DBT, and Airflow to design, build, and optimize modern cloud data platforms. The ideal candidate has hands-on experience with ETL/ELT pipelines, data modeling, and orchestration frameworks, and can translate business requirements into scalable, secure, and performant data solutions.

Key Responsibilities:

  • Design and implement end-to-end data pipelines using Snowflake, DBT, Airflow, and Python.
  • Ingest data from diverse sources (transaction systems, APIs, streaming platforms like Kafka) into cloud data platforms (AWS, Azure, Google Cloud Platform).
  • Manage data across bronze, silver, and gold layers, ensuring scalability, lineage, and compliance.
  • Develop and optimize data models (star schema, Data Vault, SCD handling) for analytics and reporting use cases.
  • Build reusable frameworks for data ingestion, transformation, and quality checks.
  • Define and manage Airflow DAGs with dependencies, retries, SLAs, and monitoring (see the sketch after this list).
  • Implement CI/CD pipelines for DBT and Airflow jobs using GitLab/Jenkins.
  • Ensure data governance, masking, encryption, and regulatory compliance (HIPAA, GDPR, SOX, PCI DSS).
  • Collaborate with cross-functional teams (Data Architects, BI Developers, Data Scientists) to deliver business-ready datasets.
  • Troubleshoot pipeline failures, optimize query performance, and ensure 99.9% pipeline reliability.
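To give a flavor of the orchestration work described above, here is a minimal sketch of an Airflow DAG with dependencies, retries, and an SLA, assuming Airflow 2.4+. The DAG id, schedule, commands, and paths are hypothetical placeholders, not this role's actual pipeline.

```python
# Minimal sketch of an Airflow DAG with dependencies, retries, and SLAs.
# All names (DAG id, schedule, dbt project path) are hypothetical examples.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "data-engineering",
    "retries": 2,                          # retry failed tasks twice
    "retry_delay": timedelta(minutes=5),   # wait 5 minutes between retries
    "sla": timedelta(hours=1),             # flag tasks running past 1 hour
}

with DAG(
    dag_id="daily_snowflake_elt",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    default_args=default_args,
    catchup=False,
) as dag:
    # Ingest raw source data into the bronze layer (command is a placeholder).
    ingest = BashOperator(
        task_id="ingest_to_bronze",
        bash_command="python /opt/pipelines/ingest.py",
    )

    # Run DBT models to build the silver/gold layers.
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt_project",
    )

    # Run DBT tests as a data-quality gate before downstream consumption.
    test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt_project",
    )

    ingest >> transform >> test   # declare task dependencies
```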

Required Skills & Experience:

  • 6+ years of professional experience in Data Engineering.
  • Hands-on expertise with Snowflake (warehousing, task automation, RBAC, query optimization).
  • Proficiency in DBT (models, macros, testing, documentation, dbt Cloud/CLI).
  • Strong experience with Apache Airflow (DAG design, custom operators, retries, SLAs, alerting).
  • Advanced SQL and Python programming skills (see the sketch after this list).
  • Experience with AWS services (S3, Glue, Lambda, EMR, RDS, Kinesis, MSK).
  • Familiarity with streaming technologies (Kafka, Spark Streaming).
  • Strong understanding of data modeling, ETL/ELT frameworks, and pipeline orchestration.
  • Knowledge of DevOps practices (CI/CD, Terraform, Docker, GitLab/Jenkins).
  • Excellent problem-solving and communication skills; ability to explain technical solutions in business terms.
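As a concrete illustration of the Snowflake, SQL, and Python skills listed above, here is a minimal sketch of a simple type-1 SCD upsert (MERGE) executed through the official snowflake-connector-python package. The connection parameters, warehouse, and table/column names are hypothetical, not part of any actual codebase for this role.

```python
# Minimal sketch: running a Snowflake MERGE (simple SCD type-1 upsert)
# via snowflake-connector-python. All names and credentials are hypothetical.
import os

import snowflake.connector

MERGE_SQL = """
MERGE INTO analytics.dim_customer AS tgt
USING staging.customers AS src
  ON tgt.customer_id = src.customer_id
WHEN MATCHED THEN UPDATE SET
  tgt.email      = src.email,
  tgt.updated_at = CURRENT_TIMESTAMP()
WHEN NOT MATCHED THEN INSERT (customer_id, email, updated_at)
  VALUES (src.customer_id, src.email, CURRENT_TIMESTAMP())
"""

def run_merge() -> None:
    # Credentials are read from the environment rather than hard-coded.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="TRANSFORM_WH",   # hypothetical warehouse
        database="PROD",            # hypothetical database
    )
    try:
        cur = conn.cursor()
        cur.execute(MERGE_SQL)
        print(f"Rows affected: {cur.rowcount}")
    finally:
        conn.close()

if __name__ == "__main__":
    run_merge()
```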

Preferred Qualifications:

  • Prior experience in Healthcare, Finance, or Retail domains.
  • Familiarity with Redshift, Databricks, or BigQuery.
  • Exposure to data governance tools (Collibra, Alation).
  • Experience with real-time analytics use cases (fraud detection, patient monitoring, stockout reduction).
  • Certification in Snowflake, AWS, or DBT.

About Brilliant Infotech Inc.