Software Engineer - Google Cloud Platform, Hadoop - Preferred local resources

Overview

On Site
$40+
Contract - W2
Contract - 12 Month(s)

Skills

GCP
Hadoop
Python
Elasticsearch

Job Details

Job Description:

  • Design, build, and optimize scalable big data solutions on Google Cloud Platform (BigQuery, Composer, Dataproc, GCS)
  • Execute end-to-end migration of data pipelines and ETL workflows from Hadoop/Hive to Google Cloud Platform
  • Develop robust, automated data pipelines using Python, PySpark, and SQL
  • Collaborate with product and engineering teams to deliver reliable, high-performance data products
  • Implement and manage CI/CD workflows
  • Ensure compliance with data security and privacy policies
  • Mentor junior engineers

Top Skills:

  • Hadoop: 3 years minimum
  • Google Cloud Platform: 3 years minimum
  • Elasticsearch: 2 years minimum
  • Airflow, Python, Kafka: 3 years minimum

Preferred Skills:

  • Software engineering across all aspects of the SDLC
  • Agile
  • Marketing Audience Segment Builder experience

Minimum Qualification:

  • Bachelor's Degree in Computer Science, CIS, or related field (or equivalent work experience in a related field)
  • 2 years of experience in software development or a related field
  • 2 years of experience in database technologies
  • 1 year of experience on projects implementing solutions using software development life cycle (SDLC) methodologies

