Sr. Data Engineer (Local to NY only; onsite interview is mandatory)

Overview

On Site
Accepts corp-to-corp applications
Contract - W2
Contract - 12+ months

Skills

Oracle
Java
Python
Agile
Jenkins
JSON
Documentation
Cassandra
MongoDB
Hadoop
Data structures
Scheduling
Scala
NoSQL
Git
Amazon Web Services
Postgres
Avro
DynamoDB
Continuous Integration/Delivery
Problem-solving
Teradata

Job Details

Job Title: Sr. Data Engineer

Location: New York (5 days onsite per week)

Duration: Long-term contract

Experience: 10+ years (Mandatory)

Required Skills: Python, Spark, AWS, Airflow

Job Description:

Proficiency in a data engineering programming language (preferably Python, alternatively Scala or Java)

Proficiency in at least one cluster computing framework (preferably Spark, alternatively Flink or Storm)

Proficiency in at least one cloud data lakehouse platform (preferably AWS data lake services or Databricks, alternatively Hadoop), at least one relational data store (Postgres, Oracle, or similar), and at least one NoSQL data store (Cassandra, DynamoDB, MongoDB, or similar)

Proficiency in at least one scheduling/orchestration tool (preferably Airflow, alternatively AWS Step Functions or similar)

Proficiency with data structures, data serialization formats (JSON, Avro, Protobuf, or similar), big-data storage formats (Parquet, Iceberg, or similar), data processing methodologies (batch, micro-batch, and streaming), one or more data modelling techniques (Dimensional, Data Vault, Kimball, Inmon, etc.), Agile methodology (developing PI plans and roadmaps), TDD (or BDD), and CI/CD tools (Jenkins, Git)

Strong organizational, problem-solving, and critical-thinking skills; strong documentation skills
