Apache Sqoop Jobs in San Jose, CA

1 - 9 of 9 Jobs

Sr. Data Engineer

ACL Digital

San Jose, California, USA

Full-time

Job Title: Sr. Data Engineer. Location: San Jose, CA (Onsite). Job Description: Analytical skills: problem solving, handling multiple complex projects at one time, etc. SQL is a must (high proficiency), preferably in Teradata, but any SQL platform such as Oracle or MySQL is also acceptable. Extensive experience migrating data from Teradata --> Hadoop --> Cloud (Google Cloud Platform), with solid experience in Pre-prod and from Pre-prod to Production. Experience migrating workloads from on-premise to cloud and clou
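
For context, the Teradata --> Hadoop leg of a migration like this is commonly a single Sqoop import per table. Below is a minimal sketch driven from Python; the JDBC URL, credentials file, table name, and HDFS target directory are hypothetical placeholders, and the exact Teradata connector or driver would depend on the cluster.

```python
# Minimal sketch of a Teradata -> HDFS Sqoop import, driven from Python.
# All connection details below are hypothetical placeholders.
import subprocess

sqoop_cmd = [
    "sqoop", "import",
    "--connect", "jdbc:teradata://td-host.example.com/DATABASE=sales",  # hypothetical JDBC URL
    "--username", "etl_user",
    "--password-file", "hdfs:///user/etl_user/.td_password",            # keep credentials off the command line
    "--table", "ORDERS",                                                 # hypothetical source table
    "--target-dir", "/data/raw/orders",                                  # HDFS landing directory
    "--num-mappers", "4",                                                # parallel map tasks
    "--as-parquetfile",                                                  # land as Parquet for downstream Spark/Hive
]

# Run the import and fail loudly if Sqoop returns a non-zero exit code.
result = subprocess.run(sqoop_cmd, check=True)
print("Sqoop import finished with exit code", result.returncode)
```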

Hadoop Data Discovery and Protection Execution Engineer - 5770355

Accenture LLP

Remote or Charlotte, North Carolina, USA

Full-time

Accenture Flex offers you the flexibility of local fixed-duration project-based work powered by Accenture, a leading global professional services company. Accenture is consistently recognized on FORTUNE's 100 Best Companies to Work For and Diversity Inc's Top 50 Companies For Diversity lists. As an Accenture Flex employee, you will apply your skills and experience to help drive business transformation for leading organizations and communities. In addition to delivering innovative solutions for

Applications Development Senior Programmer Analyst

Citi

Remote or Tampa, Florida, USA

Full-time

Citibank, N.A. seeks an Applications Development Senior Programmer Analyst for its Tampa, FL location. Duties: Onboard applications in the Olympus Platform enabling controls and monitoring of platform stability. Involved in the design, implementation, and delivery of End-of-Day (EOD) Marker controls by building new connectors in Step Framework, implementing platform guidelines for platform migration, and automating outbound reports for onboarding. Support workflow for application users to make

Applications Development Group Manager

Citi

Remote or Rutherford, New Jersey, USA

Full-time

Citibank, N.A. seeks an Applications Development Group Manager for its Rutherford, NJ location. Duties: Ensure completeness and accuracy of data profiling needs on a data platform for risk and finance business use cases. Responsible for Data Integration of multiple data flows. Own and maintain end-to-end Data Flow and Architecture of Global Finance and Risk Data specifically on Big Data platform and its toolsets. Develop and implement customized data federation layer transformation (Extract Tra
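
As a generic illustration of the kind of federation-layer transformation described here (not Citi's actual implementation), such a step often conforms multiple source feeds onto one shared schema before downstream use. A minimal PySpark sketch follows; the paths, column names, and schema are hypothetical placeholders.

```python
# Generic sketch of a data-federation transform: conform two source feeds
# onto a shared schema. Paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("risk-finance-federation-sketch").getOrCreate()

risk = spark.read.parquet("/data/raw/risk_positions")        # hypothetical feed 1
finance = spark.read.parquet("/data/raw/finance_balances")   # hypothetical feed 2

# Map each feed onto a common (conformed) schema before unioning.
risk_conf = risk.select(
    F.col("acct_id").alias("account_id"),
    F.col("exposure_usd").alias("amount_usd"),
    F.lit("risk").alias("source_system"),
)
finance_conf = finance.select(
    F.col("account").alias("account_id"),
    F.col("balance_usd").alias("amount_usd"),
    F.lit("finance").alias("source_system"),
)

federated = risk_conf.unionByName(finance_conf)
federated.write.mode("overwrite").parquet("/data/conformed/account_amounts")
```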

Data Engineer with AWS, Kafka, Lambda, Python, Hadoop, Google Cloud Platform, AWS Glue, PySpark, Snowflake, Databricks, S3, Redshift, HDFS, HBase, Azure, NoSQL - Remote - must need 10+ Years

Keylent

Remote

Third Party, Contract

Job summary: Around 10 years of experience working with almost all Hadoop ecosystem components, AWS cloud services, Microsoft Azure, Google Cloud Platform, and Apache Spark; strong working background in designing, developing, and deploying complex data integration solutions. Experience with star schema modeling and kn
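
For illustration of the star-schema modeling mentioned above, a fact table is typically joined to its dimension tables on surrogate keys and then aggregated. A small PySpark sketch follows; the table names, key columns, and metric are hypothetical placeholders.

```python
# Small star-schema sketch: join a fact table to two dimensions on surrogate
# keys and aggregate. Table and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("star-schema-sketch").getOrCreate()

fact_sales = spark.table("dw.fact_sales")      # grain: one row per sale
dim_date = spark.table("dw.dim_date")          # surrogate key: date_key
dim_product = spark.table("dw.dim_product")    # surrogate key: product_key

revenue_by_month = (
    fact_sales
    .join(dim_date, "date_key")
    .join(dim_product, "product_key")
    .groupBy("year_month", "product_category")
    .agg(F.sum("sale_amount").alias("revenue"))
)
revenue_by_month.show()
```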

Master Data Management Sr. Developer

Lincoln Financial Group

US

Full-time

Alternate Locations: Charlotte, NC (North Carolina); Fort Wayne, IN (Indiana); Greensboro, NC (North Carolina); Radnor, PA (Pennsylvania); Work from Home. Work Arrangement: Hybrid Preferred (preferred employee will work 3 days a week in a Lincoln office). Relocation assistance is not available for this opportunity. Requisition #: 73942. The Role at a Glance: The person in this role will be responsible for designing, developing, and implementing Master Data Management solutions. It involves wor

Sr. Application Developer (Master Data Management)

Lincoln Financial Group

US

Full-time

Alternate Locations: Charlotte, NC (North Carolina); Fort Wayne, IN (Indiana); Greensboro, NC (North Carolina); Radnor, PA (Pennsylvania); Work from Home. Work Arrangement: Hybrid Preferred (preferred employee will work 3 days a week in a Lincoln office). Relocation assistance is not available for this opportunity. Requisition #: 73942. The Role at a Glance: The person in this role will be responsible for designing, developing, and implementing Master Data Management solutions. It involves wor

Big Data Hadoop Engineer

Harvey Nash Inc.

Remote

Contract

We are seeking a Big Data Hadoop Engineer for one of our direct clients; it is a 100% remote opportunity, working in the PST time zone. Looking for candidates who can work only on W2. Must Haves: Strong experience in Big Data, Cloudera Distribution 7.x, RDBMS development. 4-5 years of programming experience in Python, Java, Scala, and SQL is a must. Strong experience building data pipelines using Hadoop components: Sqoop, Hive, SOLR, MR, Impala, Spark, Spark SQL, HBase. Strong experience with REST API
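
For illustration of the Sqoop-plus-Spark-SQL pipeline pattern listed above, once Sqoop has landed a table as Parquet in HDFS, a Spark job can expose it to SQL for downstream queries. A minimal sketch follows; the HDFS path, view name, and columns are hypothetical placeholders.

```python
# Minimal sketch of the downstream half of a Sqoop pipeline: read the Parquet
# files a Sqoop import landed in HDFS and query them with Spark SQL.
# The HDFS path, view name, and columns are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sqoop-landing-query-sketch").getOrCreate()

orders = spark.read.parquet("/data/raw/orders")   # directory written by `sqoop import --as-parquetfile`
orders.createOrReplaceTempView("orders")

daily_totals = spark.sql("""
    SELECT order_date, COUNT(*) AS order_count, SUM(order_total) AS revenue
    FROM orders
    GROUP BY order_date
    ORDER BY order_date
""")
daily_totals.show()
```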

Big Data Hadoop Engineer with Cloudera Distribution 7.x and Java

Buxton Consulting

Remote

Contract

Position: Big Data Hadoop Engineer with Cloudera Distribution 7.x and Java. Location: Remote (PST timing). Duration: 12+ months (may extend to 48 months). Must Haves: Strong experience in Big Data, Cloudera Distribution 7.x, RDBMS development. 4-5 years of programming experience in Python, Java, Scala, and SQL is a must. Strong experience building data pipelines using Hadoop components: Sqoop, Hive, SOLR, MR, Impala, Spark, Spark SQL, HBase. Strong experience with REST API development using Python fra
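
As a small illustration of the "REST API development using Python frameworks" requirement, a service in front of pipeline output might look like the sketch below. The framework choice (Flask), route, and the in-memory payload are assumptions for illustration, not part of the posting.

```python
# Minimal Flask sketch of a REST endpoint that could sit in front of pipeline
# output. Framework choice, route, and the in-memory data are assumptions.
from flask import Flask, jsonify

app = Flask(__name__)

# Stand-in for results a batch pipeline would normally publish to a store.
PIPELINE_STATUS = {
    "job": "orders_daily_load",
    "last_run": "2024-01-01T02:00:00Z",
    "rows_loaded": 125_000,
    "state": "SUCCEEDED",
}

@app.route("/api/v1/pipeline/status", methods=["GET"])
def pipeline_status():
    """Return the latest pipeline run summary as JSON."""
    return jsonify(PIPELINE_STATUS)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```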