Overview
Hybrid, 3 times a week
$50 - $60
Contract - W2
Contract - 12 Month(s)
Skills
Hadoop
AWS
Spark
PySpark
Java
Python
Scala
Job Details
Needs: Hadoop, AWS, Spark, PySpark, Java, Python, Scala
Job Description:
Big Data Technologies: Hadoop, Spark, HDFS, Hive, Cloudera, Hortonworks
Cloud Platforms: AWS (Glue, Lambda, Redshift, S3, CloudWatch)
ETL/ELT Tools: AWS Glue, Python, PySpark, Databricks
Programming Languages: Python, Java, Scala, SQL, HiveQL
Data Integration & Migration: Experience with Hadoop, Kafka, data lakes, and real-time streaming
Data Modeling & Transformation: Dimensional data models; structured and unstructured data processing
CI/CD & Automation: Jenkins, Git, Autosys, Airflow