Overview
Location: Remote
Compensation: Depends on Experience
Employment Type: Full Time
Skills
ETL (Extract, Transform, Load), Snowflake, Python, Apache Spark, SQL, big data, data ingestion and transformation, Oracle, data warehousing, data modeling, clustering, CI/CD, DevOps, AWS, Azure, Google Cloud, Apache Airflow, Azure Data Factory (ADF)
Job Details
Job Title: Backend Data & Snowflake Engineer
Location: Remote
Job Description:
We are seeking a talented, self-starting Backend Data & Snowflake Engineer to join our team. This role blends backend data engineering with deep Snowflake expertise to support growing business needs. The ideal candidate has strong experience with ETL processes, big data workloads, and Python/Apache Spark, and can design, maintain, and optimize scalable data solutions.
Key Responsibilities
Data Engineering & ETL Development
- Design, build, and optimize ETL pipelines for data ingestion, transformation, and processing.
- Integrate and process data from multiple sources, including Snowflake, Oracle, and big data platforms.
- Troubleshoot and resolve data-related issues while ensuring consistency, accuracy, and availability.
- Leverage Python and Apache Spark (or equivalent) for data processing at scale.
Snowflake Data Warehouse & Data Modeling
- Oversee the operation and performance of the Snowflake data warehouse.
- Optimize queries, storage, and compute resources to improve efficiency.
- Design and maintain new tables and data models to meet evolving business needs.
- Apply warehouse design best practices, including micro-partitioning, clustering keys, and search optimization.
- Ensure data integrity, security, and compliance across the environment.
Architecture & Design
- Lead the design and implementation of scalable data architectures.
- Collaborate with stakeholders to translate business requirements into technical solutions.
- Evaluate and adopt best practices for Snowflake and modern data engineering.
Collaboration & Team Support
- Partner with engineers, analysts, and stakeholders to deliver reliable data solutions.
- Provide guidance and mentorship on Snowflake, Python, and data engineering practices.
- Contribute to strategic decisions on infrastructure, tooling, and standards.
Required Skills & Qualifications
- Demonstrable experience with big data workloads.
- Hands-on expertise with Python and Apache Spark (or equivalent).
- Strong proficiency in SQL, data modeling, and warehouse optimization.
- In-depth experience with Snowflake development and administration.
- Familiarity with data security principles and compliance best practices.
- Strong analytical, problem-solving, and communication skills.
- Ability to design and optimize large-scale data environments.
Preferred Qualifications
- Snowflake certification (SnowPro Core or SnowPro Advanced).
- Experience with CI/CD and DevOps practices in data engineering.
- Knowledge of data governance frameworks and compliance standards.
- Familiarity with orchestration tools such as Apache Airflow or Azure Data Factory.
- Experience with cloud platforms (AWS, Azure, or Google Cloud).