Data Engineer with AWS, Python, SQL, Snowflake, Orchestration Tools, Spark, and Scala - New Providence, NJ (F2F interview and onsite) - local candidates only, 15+ years of experience required

  • New Providence, NJ
  • Posted 8 hours ago | Updated 8 hours ago

Overview

On Site
Depends on Experience
Accepts corp to corp applications
Contract - Independent
Contract - W2

Skills

Amazon Web Services
Analytical Skill
Apache Spark
Automated Testing
Automic
Communication
Conflict Resolution
Confluence
Continuous Delivery
Continuous Integration
Data Management
Data Warehouse
Development Testing
Informatica
Git
JIRA
JavaScript
Management
Meta-data Management
Orchestration
Organizational Skills
Performance Engineering
Problem Solving
Python
Quality Assurance
Snowflake
SQL
Scala
Shell Scripting
Software Development
Software Engineering

Job Details

Data Engineer Job Description

Experienced data management specialist responsible for developing, overseeing, organizing, storing, and analyzing data and data systems.

  • Participate in all aspects of the software development lifecycle for Snowflake solutions, including planning, requirements, development, testing, and quality assurance
  • Work in tandem with our engineering team to identify and implement the most optimal solutions
  • Ensure platform performance, uptime, and scale, maintaining high standards for code quality and thoughtful design
  • Troubleshoot incidents, identify root causes, fix and document problems, and implement preventive measures
  • Manage deliverables in fast-paced environments

Areas of Expertise

  • At least 8-10 years of experience designing and developing data solutions in an enterprise environment
  • At least 5 years of experience on the Snowflake platform
  • Strong hands-on SQL and Python development skills
  • Experience designing and developing data warehouses in Snowflake
  • A minimum of three years of experience developing production-ready data ingestion and processing pipelines using Spark and Scala
  • Strong hands-on experience with orchestration tools, e.g. Airflow, Informatica, Automic
  • Good understanding of metadata and data lineage
  • Hands-on knowledge of SQL analytical functions
  • Strong knowledge and hands-on experience in shell scripting and JavaScript
  • Demonstrated experience with software engineering practices, including CI/CD, automated testing, and performance engineering
  • Good understanding of and exposure to Git, Confluence, and Jira
  • Good problem-solving and troubleshooting skills
  • Team player with a collaborative approach and excellent communication skills

About Keylent