Job Details
Hope you are doing well. This is Jyoti Chauhan from VLink. We are currently seeking a talented and motivated consultant for the position of Google Cloud Platform Data Migration Engineer. Based on your background and experience, I believe you could be an excellent fit for this role. Please find the job description for the position below, and if you are comfortable, please share your updated resume.
Job Title: Google Cloud Platform Data Migration Engineer
Location: REMOTE (EST Only)
Employment Type: Contract (W2 only)
Duration: 6+ Months
About VLink: Founded in 2006 and headquartered in Connecticut, VLink is one of the fastest-growing digital technology services and consulting companies. Since its inception, our innovative team members have been solving the most complex business and IT challenges of our global clients.
Job Description:
Technical focus: Google Cloud Platform expertise, Python, PySpark, SQL
Must Have:
8+ years of experience in data processing, data engineering or large-scale data systems.
Python & PySpark (Spark SQL) 3+ years
Airflow (or any orchestration tool) 2+ years
Google Cloud Platform (BigQuery, GCS, Pub/Sub, Cloud Run, Cloud Functions, Cloud SQL) 3+ years
Real-time data ingestion (Kafka, webhooks, file-based) 2+ years
API integration (REST/webhooks) 2+ years
Kubernetes (GKE preferred) 1-2 years
BigQuery SQL & PostgreSQL 2+ years
YAML/config-driven pipeline design 2+ years
Schema transformation, hashing, DQF 2+ years
CI/CD, observability, lightweight dashboards (Grafana/Streamlit/Flask UI) 1+ year
Logistics or healthcare domain exposure (nice to have)
Experience Needed:
Strong with PySpark (especially Spark SQL) for complex transformation pipelines
Hands-on with Airflow for orchestration and BigQuery SQL for querying and data modeling
Good experience with Google Cloud Platform (BigQuery, GCS, Pub/Sub, Cloud Run, Cloud Functions, Cloud SQL)
Comfortable with real-time ingestion: Kafka, webhooks, file-based triggers
Solid API integration skills (REST/webhooks), with ability to handle payload-driven workflows
Experience working in Kubernetes (GKE) for deploying and scaling pipelines
Comfortable handling UI-driven configuration, YAML-based setups, and modular frameworks
Exposure to schema transformation, data validation, hashing, and DQF logic
Domain familiarity with logistics/healthcare (CVS context is a big plus)
Strong ownership mindset; able to work under pressure and balance speed with reliability
Basic understanding of Bash, SFTP transfers, networking, and access management
Bonus: PostgreSQL, CI/CD, monitoring, dashboarding, or lightweight UI development
Employment Practices:
EEO, ADA, FMLA Compliant
VLink is an equal opportunity employer. At VLink, we are committed to embracing diversity, multiculturalism, and inclusion. VLink does not discriminate on the basis of race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law. All aspects of employment including the decision to hire, promote, or discharge, will be decided on the basis of qualifications, merit, performance, and business needs.