Overview
On Site
Depends on Experience
Contract - W2, 12 Month(s)
Skills
Data Engineer
data pipelines
Snowflake
data modeling
SQL
Python
Java
ETL/ELT
Job Details
Title: Senior Data Engineer
Location: Onsite in NYC (local candidates only)
Duration: 12-month contract (W2)
This role covers the design, build, and deployment of data pipelines and backend services, and requires Snowflake data modeling expertise and strong data engineering experience.
Job Summary
We are seeking a Senior Data Engineer to design, build, and deploy scalable data pipelines and backend data services. The ideal candidate will have strong hands-on experience in Snowflake data modeling, modern data engineering practices, and building reliable, high-performance data platforms that support analytics and business intelligence.
Key Responsibilities
- Design, build, and deploy end-to-end data pipelines for large-scale data processing.
- Develop and maintain backend data services and APIs to support data consumption.
- Perform Snowflake data modeling, including schema design, optimization, and performance tuning.
- Implement ELT/ETL processes for structured and semi-structured data sources.
- Optimize data storage, query performance, and cost efficiency in Snowflake.
- Ensure data quality, reliability, security, and governance across pipelines.
- Collaborate with data scientists, analysts, and business stakeholders to understand data needs.
- Implement monitoring, logging, and alerting for data pipelines.
- Support CI/CD pipelines for data engineering deployments.
- Document data flows, architectures, and best practices.
Required Skills & Qualifications
- 10+ years of experience in Data Engineering or related roles.
- Strong hands-on experience with Snowflake (data modeling, performance tuning, security).
- Expertise in SQL and data transformation techniques.
- Proficiency in Python and/or Java for backend data processing.
- Experience building scalable ETL/ELT pipelines.
- Knowledge of data orchestration tools (Airflow, Azure Data Factory, or similar).
- Experience with cloud platforms (AWS, Azure, or Google Cloud Platform).
- Familiarity with CI/CD, Git, and DevOps practices.
- Strong problem-solving and communication skills.
Nice to Have
- Experience with streaming technologies (Kafka, Kinesis).
- Exposure to dbt or modern transformation frameworks.
- Experience in large-scale enterprise or regulated environments.
- Knowledge of data governance and metadata management tools.
Regards,
Sai Srikar