Apache Sqoop Jobs in Seattle, WA

1 - 5 of 5 Jobs

Big Data Hadoop 100% Remote

Apex 2000

Remote

Contract

Job Title: Big Data Hadoop, 100% Remote. Skills: Over 10 years of experience with Big Data Hadoop. Hands-on experience installing, configuring, and using Hadoop ecosystem components such as Hadoop MapReduce, HDFS, HBase, Hive, Spark, Sqoop, Pig, ZooKeeper, and Flume. Should be well versed in Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and MapReduce. Good knowledge of Amazon Web Services (AWS) cloud services such as EC2, S3, EBS, RDS, and VPC.
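For context on what the Sqoop work in listings like this looks like in practice, here is a minimal sketch of a single-table Sqoop import driven from Python. It assumes the sqoop CLI is installed and on PATH; the JDBC URL, credentials file, table name, and HDFS target directory are hypothetical placeholders, not details from the posting.

    import subprocess

    # Minimal sketch of a Sqoop table import; every connection detail
    # below is a hypothetical placeholder.
    sqoop_cmd = [
        "sqoop", "import",
        "--connect", "jdbc:mysql://db.example.com/sales",  # source database (assumed)
        "--username", "etl_user",
        "--password-file", "/user/etl/.sqoop_pass",  # keeps the password off the command line
        "--table", "orders",
        "--target-dir", "/data/raw/orders",  # HDFS landing directory
        "--num-mappers", "4",  # parallel map tasks used for the transfer
    ]
    subprocess.run(sqoop_cmd, check=True)  # raises CalledProcessError on failure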

Hadoop Administrator with Cloudera

Kforce Technology Staffing

Remote or Birmingham, Alabama, USA

Contract

RESPONSIBILITIES: Kforce has a client seeking a Hadoop Administrator with Cloudera in Birmingham, AL. Summary: The Big Data Design Engineer is responsible for architecture design and implementation of the Big Data platform, Extract/Transform/Load (ETL), and analytic applications. The team is currently migrating to Snowflake, and this person will maintain the existing data lake during the migration. Primary Responsibilities: * The Hadoop Administrator oversees implementation and ongoing administration of Hadoop

Sr. Application Developer (Data Mastering) - Remote

Lincoln Financial Group

US

Full-time

Alternate Locations: Work from Home. Work Arrangement: Remote (work-at-home employee residing outside of a commutable distance to an office location). Relocation assistance is not available for this opportunity. Requisition #: 73942. The Role at a Glance: The person in this role will be responsible for designing, developing, and implementing Master Data Management solutions. It involves working closely with business stakeholders, data analysts, and IT teams to ensure the accuracy, consistency

SS - HADOOP ROLE

Apex 2000

Remote

Contract, Third Party

Hi, please send in your resume and phone number. HADOOP ROLE. Location: remote. Over 10 years of experience with Big Data Hadoop. Hands-on experience installing, configuring, and using Hadoop ecosystem components such as Hadoop MapReduce, HDFS, HBase, Hive, Spark, Sqoop, Pig, ZooKeeper, and Flume. Should be well versed in Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and MapReduce. Good knowledge of Amazon Web Services (AWS) cloud services such as EC2, S3, EBS, RDS, and VPC.

Machine Learning Engineer

Buxton Consulting

Remote

Contract

Position: Machine Learning AI Engineer. Location: Remote. Duration: Long Term. Must-Have Skills: Strong project experience in Machine Learning, Big Data, NLP, Deep Learning, and RDBMS is a must. Strong project experience with Amazon Web Services and Cloudera Data Platform is a must. 4-5 years of experience building data pipelines using Python, MLlib, PyTorch, TensorFlow, NumPy/SciPy/Pandas, Spark, and Hive. 4-5 years of programming experience in AWS, Linux, and data science notebooks is a must. Strong experience with REST APIs
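To illustrate the kind of pipeline work this posting describes, here is a minimal Spark MLlib sketch that reads a Hive table and fits a classifier. The table name (analytics.customer_features) and the column names (age, tenure, spend, churned) are hypothetical placeholders, and the sketch assumes a PySpark installation with Hive support enabled.

    from pyspark.sql import SparkSession
    from pyspark.ml import Pipeline
    from pyspark.ml.feature import VectorAssembler
    from pyspark.ml.classification import LogisticRegression

    spark = (SparkSession.builder
             .appName("pipeline-sketch")
             .enableHiveSupport()
             .getOrCreate())

    # Hypothetical Hive feature table.
    df = spark.table("analytics.customer_features")

    # Assemble numeric columns into one feature vector, then fit a
    # logistic regression classifier on the assembled features.
    assembler = VectorAssembler(inputCols=["age", "tenure", "spend"],
                                outputCol="features")
    lr = LogisticRegression(featuresCol="features", labelCol="churned")
    model = Pipeline(stages=[assembler, lr]).fit(df)
    model.transform(df).select("churned", "prediction").show(5)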