Principal Google Cloud Platform Data Engineer -- Remote -- greenfield environments

Overview

Remote
Full Time
Contract - W2
Contract - 14 day(s)

Skills

ETL
GCP

Job Details

Job Title: Principal Google Cloud Platform Data Engineer
Location: 100% Remote (U.S. Based Only)
Contract Type: Contract-to-Hire
Conversion Timeline: 4-6 months
Work Schedule Perks: Flexible hours (Option for 4x10s or 4x9s + half-day Fridays during summer)
Position Summary:
We're seeking a Principal Google Cloud Platform Data Engineer to lead efforts in building modern data solutions from scratch - including the migration to Google Cloud Platform (GCP), creation of data pipelines using SnapLogic, and orchestration via Airflow/Cloud Composer. You'll join a dynamic, forward-thinking team focused on sunsetting legacy platforms (like Redshift and DataStage) and transforming how enterprise healthcare data is ingested, processed, and used.
This role requires a hands-on builder, not a maintainer - someone who can design and develop scalable pipelines and actively contribute to architectural discussions.
Key Responsibilities:
  • Design and build data ingestion & ETL pipelines from scratch using tools like SnapLogic, Python, SQL, and Dataflow.
  • Lead data warehouse migration efforts to Google BigQuery from Redshift and other legacy systems.
  • Build and manage orchestration workflows using Airflow or Cloud Composer, including the development of custom DAGs (see the illustrative sketch after this list).
  • Collaborate cross-functionally with data modelers, BI/reporting teams, and stakeholders to support end-to-end data delivery.
  • Support data modeling activities, applying a strong understanding of data warehousing fundamentals.
  • Participate in architectural planning, tool evaluation, and POC development for emerging technologies.
  • Be a proactive communicator, providing input, solving problems, and contributing ideas - this is not a "heads-down" or "order-taker" role.
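As a rough illustration of the custom DAG work mentioned above, the sketch below shows a minimal Airflow/Cloud Composer DAG that loads a daily GCS extract into BigQuery. It assumes Airflow 2.x with the Google provider package installed; all project, bucket, dataset, and table names are hypothetical placeholders, not details of this team's actual pipelines.

```python
# Minimal illustrative Airflow DAG: load a daily GCS extract into a BigQuery staging table.
# All resource names below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="daily_claims_load",        # hypothetical pipeline name
    schedule="@daily",                 # run once per day
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    load_to_bq = GCSToBigQueryOperator(
        task_id="load_extract_to_bigquery",
        bucket="example-landing-bucket",                  # hypothetical GCS bucket
        source_objects=["extracts/{{ ds }}/claims.csv"],  # templated daily path
        destination_project_dataset_table="example-project.staging.claims",
        source_format="CSV",
        write_disposition="WRITE_TRUNCATE",
        autodetect=True,
    )
```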
Required Qualifications:
  • 5-6+ years of experience in modern data engineering roles with increasing responsibility.
  • Proven experience in building ETL/ELT pipelines from scratch (not just maintaining existing pipelines).
  • Strong hands-on experience with SnapLogic (preferred), Python, SQL, and Apache Beam/Dataflow.
  • Hands-on experience with Airflow or Cloud Composer, including DAG development.
  • Strong working knowledge of Google Cloud Platform (GCP), specifically BigQuery.
  • Solid grasp of data warehouse architecture, data modeling concepts, and end-to-end data lifecycle.
  • Strong communication skills and a proactive, collaborative working style.
Nice to Have (Bonus Skills):
  • Experience with Kafka, Java, Alteryx, or other modern data tools.
  • Background in healthcare data and familiarity with healthcare-related data processes or standards.
  • Experience supporting enterprise data migration or modernization initiatives.
Ideal Candidate Traits:
  • Builder mindset: thrives in greenfield environments, not legacy maintenance.
  • Curious, communicative, and team-oriented: asks questions, challenges assumptions, and improves systems.
  • Clear understanding of how data flows from source systems to reporting layers.
