Job Description: Data Engineer with 10+ years' experience in Snowflake & Python (W2 profiles only)

  • New York, NY
  • Posted 3 days ago | Updated 3 days ago

Overview

Hybrid
$80,000 - $100,000
Contract - W2
Contract - 12 Month(s)
Able to Provide Sponsorship

Skills

Snowflake and Python

Job Details

Job Description: Data Engineer with 10+ years' experience in Snowflake & Python

(W2 profiles only; candidates should be willing to relocate)

Overview:
We are seeking a highly skilled and motivated Data Engineer with expertise in Snowflake and Python to join our dynamic team. The ideal candidate will be responsible for building and optimizing data pipelines, managing cloud-based data environments, and supporting data integration efforts. You will play a key role in ensuring that data solutions are scalable, reliable, and aligned with business needs.

Key Responsibilities:

  • Design, develop, and maintain scalable data pipelines and ETL processes using Snowflake and Python.
  • Build and optimize data architectures to support business intelligence, analytics, and machine learning initiatives.
  • Collaborate with data analysts, data scientists, and stakeholders to understand data requirements and ensure smooth data flows.
  • Manage and administer Snowflake data warehouses, including schema design, performance tuning, and data security.
  • Write efficient, reusable, and maintainable code for data processing and transformation tasks using Python.
  • Implement data quality checks and validation processes to maintain data integrity.
  • Automate data workflows and improve data reliability and performance.
  • Troubleshoot and resolve data-related issues in a timely manner.
  • Maintain and document data engineering solutions, best practices, and coding standards.

Required Skills & Qualifications:

  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
  • Proven experience as a Data Engineer with hands-on expertise in Snowflake and Python.
  • Strong proficiency in SQL for querying and manipulating data within Snowflake.
  • Knowledge of Snowflake architecture, data sharing, cloning, and security features.
  • Experience in developing and managing ETL pipelines and workflows.
  • Familiarity with cloud platforms (AWS, Azure, or Google Cloud Platform) and data storage solutions.
  • Proficient in data modeling, data warehousing concepts, and database optimization techniques.
  • Experience with version control systems (e.g., Git) and CI/CD pipelines.
  • Strong problem-solving and debugging skills with attention to detail.

Preferred Skills:

  • Experience with data orchestration tools like Airflow or Prefect.
  • Knowledge of other big data technologies such as Databricks, Spark, or Kafka.
  • Familiarity with REST APIs and data integration from external sources.
  • Exposure to machine learning pipelines and AI workflows is a plus.

About yondertech