Hadoop Infrastructure Engineer

ATLAS, Amazon Web Services, Apache HBase, Apache Hadoop, Apache Hive, Apache Kafka, Apache NiFi, Apache Pig, Apache Solr, Apache Spark, Apache Storm, Big data, Cloud, Communication skills, DevOps, Git, Infrastructure, Microsoft Windows Azure, Python, Ruby, LDAP, Scripting language, Software, Software deployment, Splunk, Troubleshooting, Service management
Full Time
$125,000 - $135,000
Work from home available. Travel not required.

Job Description

This is a permanent/direct-hire opportunity. No third-party candidates or candidates who require sponsorship will be considered at this time.

Job Purpose: 
This position is responsible for collaborating with Solutions Engineering, Infrastructure Operations, and Infrastructure Service Management teams in the design and build of infrastructure solutions/blueprints for the area of responsibility; participating in the design and build of repeatable patterns (build-kits) to improve deployment times for non-prod and prod environments; transitioning knowledge to Infrastructure Operations.

Required Job Qualifications:

  • Bachelor's Degree and 5 years of Information Technology or relevant experience, OR Technical Certification and/or College Courses and 7 years of Information Technology experience, OR 9 years of Information Technology experience.
  • Expert in implementing and troubleshooting Hive, Spark, Pig, Storm, Kafka, NiFi, Atlas, Elasticsearch, Solr, Splunk, and HBase applications.
  • Willingness to provide Hadoop on-call support as needed.
  • Working knowledge of Ruby or Python and common DevOps tools such as Git and GitHub.
  • Experience with a scripting language to automate infrastructure deployment tasks.
  • Intermediate knowledge of cloud technology (Azure/AWS).
  • Ability to simplify and standardize complex concepts and processes.
  • Understanding of business priorities (e.g., vision), trends (e.g., industry knowledge), and markets (e.g., existing/planned).
  • Strong oral and written communication skills.
  • Ability to prioritize and make trade-off decisions.
  • Ability to drive cross-functional execution.
  • Adaptability and ability to introduce and manage change.
  • Teamwork and collaboration.
  • Organized and detail-oriented.
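As an illustration of the scripting-based deployment automation described above, here is a minimal sketch of a post-deployment smoke check in Python that verifies services are listening on their expected ports. The hostnames and ports are hypothetical placeholders, not part of this posting; real values would come from the environment's inventory.

```python
import socket

# Hypothetical service -> (host, port) map; values are illustrative only.
SERVICES = {
    "hiveserver2": ("hive.example.internal", 10000),
    "kafka-broker": ("kafka.example.internal", 9092),
    "hbase-master": ("hbase.example.internal", 16000),
}

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def smoke_check(services: dict) -> dict:
    """Map each service name to whether its endpoint is currently reachable."""
    return {name: port_open(host, port) for name, (host, port) in services.items()}

if __name__ == "__main__":
    for name, ok in smoke_check(SERVICES).items():
        print(f"{name}: {'UP' if ok else 'DOWN'}")
```

A check like this would typically run as the last step of a build-kit deployment, failing the pipeline if any service is unreachable.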

Preferred Job Qualifications:

  • Experience in Hadoop application infrastructure engineering and a development-methodology background.
  • Experience with Ambari, Hortonworks, HDInsight, and Cloudera distribution (CDH).
  • Experience with Kerberos, TLS encryption, SAML, LDAP.
  • Knowledge of cloud (Azure/AWS) big data solutions using EMR, HDInsight, Kinesis, Azure Event Hubs, etc.



Dice Id : solpart
Position Id : 6685990
Originally Posted : 2 days ago
