W2 - (4) - Sr Data Engineer (Google Cloud Platform tech stack, Python, SQL, Data pipelines, Google Cloud Platform certification) - Remote

Overview

Remote
Contract - W2
Contract through 12/31/2025

Skills

Terraform
Data pipelines
GCP Dataflow
GCP BigQuery
GCP Dataproc
GCP Data Fusion
GCP Change Data Stream
Python/SQL
Cloud Functions
Cloud Events
Cloud Composer
GCP certification

Job Details

Duties:

Scope: Develops and deploys data pipelines, integrations, and transformations to support analytics and machine learning applications and solutions as part of an assigned product team, using various open-source programming languages and vendor software to meet the desired design functionality for products and programs.

The position requires maintaining an understanding of the organization's current solutions, coding languages, and tools, and regularly calls for independent judgment. The role may provide consultative services to departments, divisions, and leadership committees. Demonstrated experience designing, building, and installing data systems, and applying them within the Department of Data & Analytics technology framework, is required. The candidate will partner with product owners and Analytics and Machine Learning delivery teams to identify and retrieve data, conduct exploratory analysis, pipeline and transform data to help identify and visualize trends, build and validate analytical models, and translate qualitative and quantitative assessments into actionable insights.

Requirements:

-Required: hands-on Google Cloud Platform tech stack experience, including advanced skills in several of the following: Terraform, Dataflow, BigQuery, Dataproc, Data Fusion, Change Data Stream, Python/SQL, Cloud Functions, Cloud Events, Cloud Composer.

-Google Cloud Platform certification

100% Remote - equipment will be provided.

-Implement data pipelines using best practices for ETL/ELT, data management, and data governance.
-Analyze and process complex data sources in a fast-paced environment.
-Perform data modeling against large data sets for peak efficiency.
-Identify, design, and implement process-improvement solutions that automate manual processes and leverage standard frameworks and methodologies.
-Understand and incorporate data quality principles that ensure optimal reliability, impact, and user experience.
-Partner across teams to support cross-platform operations.
-Create and document functional and technical specifications.
-Drive exploration of new features, versions, and related technologies, and provide recommendations to enhance our offerings.
-Mentor junior engineers within the team.

Education: Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent 5+ years of experience. 5+ years of hands-on experience programming in SQL. 3+ years of experience building and maintaining automated data pipelines and data assets using batch and/or streaming processes.

Hours Per Day: 8.00

Hours Per Week 40.00

Pay rate: $/hr on W2.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.