Apache Sqoop Jobs in San Jose, CA

1 - 4 of 4 Jobs

Big Data Azure Engineer



Third Party

Job Title: Big Data Engineer Location: East Coast (Remote) Major Responsibilities/Activities: Build a highly functional and efficient Big Data platform to consolidate data from diverse sources, enabling the design and execution of complex algorithms for insights into Healthcare business operations. Develop ETL Data Pipelines in Azure Cloud using Azure ADF and Databricks, utilizing PySpark and Scala. Migrate ETL Data pipelines from On-Prem Hadoop Cluster to Azure Cloud. Construct Data Ingestion

Senior Databricks Engineer - REMOTE

Perficient, Inc.

Remote or St. Louis, MO, USA


A Senior Databricks Technical Developer is expected to be knowledgeable in two or more technologies within Databricks and Spark data engineering. The Senior Technical Consultant is expected to have strong development and programming skills in Databricks and Spark, with a focus on Scala/Java, and other ETL development experience in the Big Data space. You are expected to be experienced and fluent in agile development and agile tools, as well as code repositories and agile SDLC/DevOps frameworks.

Big Data Azure Engineer

Info Dinamica Inc


Full-time, Third Party

Job Title: Big Data Azure Engineer Location: East Coast (Remote) Duration: 12+ Months Experience: 7 - 10 years Major Responsibilities/Activities: Build a highly functional and efficient Big Data platform that brings together data from disparate sources and allows FinThrive to design and run complex algorithms providing insights into Healthcare business operations. Build ETL Data Pipelines in Azure Cloud using Azure ADF and Databricks, utilizing PySpark and Scala. Migrate ETL Data pipelines from On-Prem Hado

Hadoop Engineer with Cloudera Distribution 7.x & Java, Scala, Python, Hbase - Local only

Buxton Consulting

Pleasanton, CA, USA


Must Haves:
- Strong experience in Big Data, Cloudera Distribution 7.x, Cloud migration, RDBMS (out of 10)
- Strong experience with Amazon EMR/Databricks/Cloudera CDP (out of 10)
- 4-5 years of experience building data pipelines using Hadoop components: Sqoop, Hive, Solr, MR, Impala, Spark, Spark SQL, HBase (out of 10)
- 4-5 years of programming experience in Python, Java, and Scala is a must (out of 10)
- Strong experience with REST API development using Python frameworks (Django, Flask, etc.) (out of 10)
- Micro Services/Web