Hadoop Administrator / Engineer (Hadoop cluster server maintenance, capacity planning), 12+ Months Contract, Alpharetta, GA
Location: Alpharetta, GA (Hybrid, 3 days a week onsite)
Duration: 12+ months contract
Interview Type: In-person meeting will be needed (locals who can commute will be given preference)

Description and Requirements:
1. Expert in setting up a Hadoop cluster from scratch and in server maintenance.
2. Commissioning and decommissioning of nodes to/from the Hadoop cluster (see the sketch after this list).
3. Implementing, managing, and administering the overall Hadoop infrastructure.
4. Capacity planning and estimating the requirements for lowering or increasing the capacity of the Hadoop cluster.
5. Monitoring Hadoop cluster connectivity and the performance of the cluster for application teams.
6. Replicating huge amounts of data …
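As a hedged illustration of item 2, the sketch below automates HDFS DataNode decommissioning the standard way: append the host to the file that hdfs-site.xml's dfs.hosts.exclude points at, then ask the NameNode to re-read its host lists. The file path and hostname are assumptions, not details from the posting.

```python
import subprocess

# Assumed path: the file referenced by dfs.hosts.exclude in hdfs-site.xml.
EXCLUDE_FILE = "/etc/hadoop/conf/dfs.exclude"

def decommission_datanode(hostname: str) -> None:
    """Add a DataNode to the HDFS exclude list and trigger decommissioning."""
    # Append the host to the exclude file, skipping if already present.
    with open(EXCLUDE_FILE, "r+") as f:
        hosts = {line.strip() for line in f if line.strip()}
        if hostname not in hosts:
            f.write(hostname + "\n")

    # Tell the NameNode to re-read dfs.hosts / dfs.hosts.exclude. HDFS then
    # re-replicates the node's blocks before marking it decommissioned.
    subprocess.run(["hdfs", "dfsadmin", "-refreshNodes"], check=True)

if __name__ == "__main__":
    decommission_datanode("datanode-07.example.com")  # hypothetical host
```

Progress can be watched with `hdfs dfsadmin -report`, which lists each node's decommission status; commissioning a node is the reverse (remove it from the exclude file and refresh again).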
Data Engineer (ETL, Kafka, Elasticsearch/ELK, Snowflake, Hadoop, SQL, Python), 12+ Months Contract, Alpharetta, GA
Level 4 (8 to 10 yrs of experience)
Location: Alpharetta, GA or Menlo Park, CA (onsite 3 to 4 days a week, hybrid)
Interview Type: In-person; locals only
Duration: 12+ months contract
Open to Alpharetta, GA or Menlo Park, CA. Please state at the top of the resume which location the candidate is being considered for. Must be local to either location. 2 roles …
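Since this role pairs Kafka with Elasticsearch, here is a minimal, hypothetical sketch of the kind of glue code involved: consuming JSON events from a Kafka topic with kafka-python and indexing them into Elasticsearch. The topic name, hosts, and index are illustrative assumptions.

```python
import json

from kafka import KafkaConsumer          # pip install kafka-python
from elasticsearch import Elasticsearch  # pip install elasticsearch

# All endpoints and names below are illustrative assumptions.
consumer = KafkaConsumer(
    "events",                                  # hypothetical topic
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
    group_id="es-indexer",
)
es = Elasticsearch("http://localhost:9200")

for message in consumer:
    # Index each event one at a time; a production pipeline would
    # batch writes with the Elasticsearch bulk helpers instead.
    es.index(index="events", document=message.value)
```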
Our client, a leading financial services company, is hiring a Hadoop Administrator on a long-term contract basis.
Job ID: 82768
Work Location: Alpharetta, GA (Hybrid)
Responsibilities:
- Commissioning and decommissioning of nodes to/from the Hadoop cluster.
- Implementing, managing, and administering the overall Hadoop infrastructure.
- Capacity planning and estimating the requirements for lowering or increasing the capacity of the Hadoop cluster.
- Monitoring Hadoop cluster connectivity and the performance of the cluster for application teams (a monitoring sketch follows below).
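For the monitoring item, a hedged sketch of one common approach: polling the NameNode's JMX endpoint for cluster-health metrics. It assumes a Hadoop 3.x NameNode web UI on its default port 9870; the hostname is illustrative.

```python
import requests  # pip install requests

# Assumed: Hadoop 3.x NameNode web UI on its default port; host is illustrative.
JMX_URL = ("http://namenode.example.com:9870/jmx"
           "?qry=Hadoop:service=NameNode,name=FSNamesystem")

def check_hdfs_health() -> dict:
    """Pull core HDFS metrics (capacity, live/dead DataNodes) from NameNode JMX."""
    beans = requests.get(JMX_URL, timeout=10).json()["beans"][0]
    return {
        "capacity_used_pct": 100 * beans["CapacityUsed"] / beans["CapacityTotal"],
        "live_datanodes": beans["NumLiveDataNodes"],
        "dead_datanodes": beans["NumDeadDataNodes"],
        "under_replicated_blocks": beans["UnderReplicatedBlocks"],
    }

if __name__ == "__main__":
    print(check_hdfs_health())
```

In practice these readings would feed an alerting system (Cloudera Manager, Nagios, Prometheus, etc.) rather than a one-off script.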
Software Guidance & Assistance, Inc. (SGA) is searching for a Hadoop Developer for a CONTRACT assignment with one of our premier Financial Services clients in Alpharetta, GA.
Responsibilities:
- Commissioning and decommissioning of nodes to/from the Hadoop cluster.
- Implementing, managing, and administering the overall Hadoop infrastructure.
- Capacity planning and estimating the requirements for lowering or increasing the capacity of the Hadoop cluster (a back-of-the-envelope sizing sketch follows below).
- Monitoring Hadoop cluster connectivity and the performance of the cluster for application teams.
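The capacity-planning item is, at its core, arithmetic. The sketch below shows one common back-of-the-envelope formula: logical data volume times the replication factor, with headroom for temporary/intermediate data and a target utilization ceiling. The default values are illustrative assumptions, not figures from the posting.

```python
def raw_storage_needed_tb(
    data_tb: float,
    replication: int = 3,           # HDFS default replication factor
    temp_overhead: float = 0.25,    # headroom for shuffle/intermediate data (assumed)
    max_utilization: float = 0.70,  # keep disks below ~70% full (assumed target)
) -> float:
    """Estimate raw cluster storage required for a given logical data volume."""
    return data_tb * replication * (1 + temp_overhead) / max_utilization

# Example: 100 TB of logical data at the defaults above
# -> 100 * 3 * 1.25 / 0.70 ≈ 535.7 TB of raw disk.
if __name__ == "__main__":
    print(f"{raw_storage_needed_tb(100):.1f} TB raw")
```

Dividing the raw figure by per-node disk capacity then gives a node-count estimate, which is how "lowering or increasing the capacity" decisions are usually framed.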
We are seeking a Big Data Hadoop Engineer for one of our direct clients. It is a 100% remote opportunity, working in the PST time zone. Looking for candidates who can work on W2 only.
Must Haves:
- Strong experience in Big Data, Cloudera Distribution 7.x, and RDBMS development.
- 4-5 years of programming experience in Python, Java, Scala, and SQL is a must.
- Strong experience building data pipelines using the Hadoop components Sqoop, Hive, SOLR, MR, Impala, Spark, Spark SQL, and HBase (see the pipeline sketch after this list).
- Strong experience with REST API development using Python frameworks.
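As a hedged illustration of the pipeline requirement, here is a minimal PySpark sketch that ingests a relational table over JDBC (the Sqoop-style step) and lands it as a partitioned Hive table for downstream Impala/Spark SQL use. The JDBC URL, table names, and credentials are placeholders.

```python
from pyspark.sql import SparkSession

# Connection details below are illustrative placeholders.
spark = (
    SparkSession.builder
    .appName("rdbms-to-hive-ingest")
    .enableHiveSupport()
    .getOrCreate()
)

# Sqoop-style ingest: read a relational table over JDBC.
orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:mysql://db.example.com:3306/sales")  # hypothetical
    .option("dbtable", "orders")
    .option("user", "etl_user")
    .option("password", "***")
    .load()
)

# Land the data as a partitioned Hive table.
(
    orders.write.mode("overwrite")
    .partitionBy("order_date")
    .saveAsTable("warehouse.orders")
)
```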
Position: Big Data Hadoop Engineer with Cloudera Distribution 7.x and Java
Location: Remote (PST timing)
Duration: 12+ months (may extend up to 48 months)
Must Haves:
- Strong experience in Big Data, Cloudera Distribution 7.x, and RDBMS development.
- 4-5 years of programming experience in Python, Java, Scala, and SQL is a must.
- Strong experience building data pipelines using the Hadoop components Sqoop, Hive, SOLR, MR, Impala, Spark, Spark SQL, and HBase.
- Strong experience with REST API development using Python frameworks (see the sketch after this list).
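The posting names REST API development with Python frameworks without naming one, so the sketch below uses Flask as an assumed example, exposing a single read endpoint. The route, port, and stub data are illustrative (the job id reuses 82768 from the listing above).

```python
from flask import Flask, jsonify  # pip install flask

app = Flask(__name__)

# In a real service this would query Impala/HBase; here it is a stub.
_JOBS = {"82768": {"title": "Hadoop Administrator", "location": "Alpharetta, GA"}}

@app.route("/jobs/<job_id>", methods=["GET"])
def get_job(job_id: str):
    """Return one job record as JSON, or a 404 if the id is unknown."""
    job = _JOBS.get(job_id)
    if job is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(job)

if __name__ == "__main__":
    app.run(port=8080)  # illustrative port
```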
Currently, we are looking for talented resources for one of our listed clients. If interested, please reply with your updated resume, or feel free to reach out to me for more details.
On-site role.
Job Description:
Candidates must have:
- 10 to 15 years of experience in data science, including an extensive track record of implementing data solutions and driving data-driven decision-making.
- A Master's or Ph.D. in a quantitative field (e.g., Computer Science, Statistics, Mathematics, Engineering) …