Hadoop Developer

Overview

On Site
Full Time
Part Time
Accepts corp to corp applications
Contract - W2
Contract - Independent

Skills

Recruiting
Employment Authorization
HDFS
MapReduce
HiveQL
Data Analysis
Reporting
RDBMS
Streaming
Data Security
Regulatory Compliance
Collaboration
Workflow
Java
Scala
Python
Apache Hive
Apache Pig
Apache HBase
Apache Sqoop
Linux
Unix
SQL
NoSQL
Database
Data Processing
Data Warehouse
Extract, Transform, Load (ETL)
Apache Spark
Real-time
Apache Kafka
Apache Flume
Cloud Computing
Amazon Web Services
Microsoft Azure
DevOps
Git
Jenkins
Continuous Integration
Continuous Delivery
Cloudera
Hortonworks
Computer Science
Big Data
Apache Hadoop

Job Details

Hiring: W2 Candidates Only

Visa: Open to any visa type with valid work authorization in the USA

Key Responsibilities

  • Design, develop, and implement scalable big data solutions using Hadoop.

  • Work with Hadoop ecosystem tools such as HDFS, MapReduce, YARN, Hive, Pig, HBase, and Spark.

  • Develop and optimize ETL pipelines for large datasets.

  • Write and optimize HiveQL queries for data analysis and reporting.

  • Integrate Hadoop systems with data sources like RDBMS, NoSQL databases, and streaming systems.

  • Monitor Hadoop cluster performance and troubleshoot issues.

  • Ensure data security, governance, and compliance.

  • Collaborate with data engineers, data scientists, and business teams.

  • Document system designs, workflows, and best practices.


Required Skills

  • Strong experience with Hadoop framework and ecosystem.

  • Proficiency in Java, Scala, or Python.

  • Hands-on experience with Hive, Pig, HBase, Spark, and Sqoop.

  • Knowledge of Linux/Unix environments.

  • Experience with SQL and NoSQL databases.

  • Understanding of distributed systems and data processing concepts.

  • Familiarity with data warehousing and ETL tools.


Preferred Qualifications

  • Experience with Apache Spark and real-time processing tools like Kafka or Flume.

  • Knowledge of cloud platforms (AWS EMR, Azure HDInsight, or Google Dataproc).

  • Understanding of DevOps tools (Git, Jenkins, CI/CD pipelines).

  • Hadoop certification (Cloudera, Hortonworks) is a plus.


Education & Experience

  • Bachelor's degree in Computer Science, Engineering, or a related field.

  • 8 years of experience in Big Data or Hadoop development (varies by role level).


Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.