Hadoop Developer | Chicago, IL & Charlotte, NC | W2 Only | Hybrid

Overview

Hybrid
Depends on Experience
Contract - W2

Skills

Hadoop
big data
Spark
Kafka
Impala

Job Details

Job Title: Hadoop Developer
Location: Chicago, IL & Charlotte, NC (Hybrid)


Must-Have Skills:
Hadoop

Job Description:

Hadoop Engineer (SME) role supporting NextGen Platforms built around big data technologies (Hadoop, Spark, Kafka, Impala, HBase, Docker containers, Ansible, and more). Requires experience in cluster management of vendor-based Hadoop and Data Science (AI/ML) products such as Cloudera, Databricks, Snowflake, Talend, Greenfield, ELK, and KPMG Ignite. The Hadoop Engineer is involved in the full life cycle of an application as part of an agile development process, and must be able to develop, engineer, and communicate collaboratively at the highest technical levels with clients, development teams, vendors, and other partners. The following section serves as a general guideline for the dimensions of project complexity, responsibility, and education/experience within this role.

- Works on complex, major, or highly visible tasks in support of multiple projects that require multiple areas of expertise
- Team member will be expected to provide subject matter expertise in managing Hadoop and Data Science Platform operations with focus around Cloudera Hadoop, Jupyter Notebook, OpenShift, Docker-Container Cluster Management and Administration
- Integrates solutions with other applications and platforms outside the framework
- Responsible for managing platform operations across all environments, including upgrades, bug fixes, deployments, metrics/monitoring for resolution and forecasting, disaster recovery, and incident/problem/capacity management
- Serves as a liaison between client partners and vendors in coordination with project managers to provide technical solutions that address user needs
- Experience with Hadoop, Kafka, Spark, Impala, Hive, HBase, etc.
- Strong knowledge of Hadoop architecture, HDFS, Hadoop clusters, and the Hadoop administrator's role
- In-depth knowledge of fully integrated AD/Kerberos authentication
- Experience setting up optimum cluster configurations
- Debugging knowledge of YARN

- Expert-level knowledge of Cloudera Hadoop components such as HDFS, Sentry, HBase, Kafka, Impala, Solr, Hue, Spark, Hive, YARN, ZooKeeper, and Postgres
- Strong technical knowledge: Unix/Linux; databases (Sybase/SQL/Oracle); Java, Python, Perl, and shell scripting; infrastructure
- Experience with monitoring, alerting, and job-scheduling systems
- Comfortable with frequent, incremental code testing and deployment
- Strong grasp of automation/DevOps tools: Ansible, Jenkins, SVN, Bitbucket
- Experience working on Big Data Technologies
- Cloudera Admin / Dev Certification
- Certification in Cloud, Docker-Container, OpenShift Technologies
