Overview
On Site
Depends on Experience
Contract - W2
Contract - Independent
Contract - 12 Month(s)
Able to Provide Sponsorship
Skills
Big Data
Data Engineering
Google Cloud
Google Cloud Platform
Apache Hadoop
Apache Hive
Python
Software Development
Software Engineering
Database
Job Details
Job Description:
- Design, build, and optimize scalable big data solutions on Google Cloud Platform (BigQuery, Composer, Dataproc, GCS)
- Execute end-to-end migration of data pipelines and ETL workflows from Hadoop/Hive to Google Cloud Platform
- Develop robust, automated data pipelines using Python, PySpark, and SQL
- Collaborate with product and engineering teams to deliver reliable, high-performance data products
- Implement and manage CI/CD workflows
- Ensure compliance with data security and privacy policies
- Mentor junior engineers
Preferred Skills:
- Software Engineering across all aspects of the SDLC
- Agile
- Marketing Audience Segment Builder experience
Minimum Qualification:
- Bachelor's Degree in Computer Science, CIS, or related field (or equivalent work experience in a related field)
- 2 years of experience in software development or a related field
- 2 years of experience in database technologies
- 1 year of experience working on project(s) involving the implementation of solutions applying the software development life cycle (SDLC)