Role: Google Cloud Platform Data Engineer + Python
Duration: 12+ month contract-to-hire, so candidates need to be able to convert to full-time.
Location: Remote
W2 only
5 positions
We are looking for DATA ENGINEERS who have strong Python.
Skills: Python, Spark, PySpark, Google Cloud Platform.
Overview: At a high level, the client has migrated from Hadoop to Google Cloud Platform for data processing and now runs a Google Cloud Platform data environment, predominantly for big data applications in the cloud. They are seeking 3-5 senior-level Data Engineers with strong Python skills to support ongoing data migration and ingestion efforts.
Source systems: data comes from multiple external channels - provider data, healthcare groups, hospitals, etc. These sources send data, the platform processes it, and the output is provided to operational systems.
Their end state is not a data warehouse for analytics; the data directly feeds applications.
Currently, a lot of the data ingestion is done manually, and they are looking to automate it.
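To give candidates a feel for the work, here is a minimal sketch of the kind of automated PySpark ingestion job this could involve. This is our illustration, not the client's code; all bucket, project, dataset, table, and column names are made up, and the BigQuery write assumes the spark-bigquery connector is available on the cluster.

# Hypothetical PySpark ingestion job - illustrative only, names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("provider-data-ingest").getOrCreate()

# Read a batch of raw provider files landed in Cloud Storage.
raw = spark.read.json("gs://example-landing-bucket/provider_data/*.json")

# Light standardization: lowercase column names, stamp each row with a load
# time, and drop duplicate provider records (provider_id is a made-up key).
cleaned = (
    raw.toDF(*[c.lower() for c in raw.columns])
       .withColumn("ingested_at", F.current_timestamp())
       .dropDuplicates(["provider_id"])
)

# Hand off to BigQuery, where downstream applications consume the data
# directly (per the req, the end state feeds applications, not analytics).
(cleaned.write.format("bigquery")
    .option("table", "example-project.provider_ds.providers")
    .option("temporaryGcsBucket", "example-temp-bucket")
    .mode("append")
    .save())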
Their data pipelines are PySpark and Scala/Spark, run on Dataproc for larger volumes.
Python scripts are executed via Google Cloud Functions and on Google Kubernetes Engine (GKE).
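Again purely as an assumed illustration of how that stack fits together, a Cloud Function could kick off a Dataproc PySpark job like the one above when a new file lands. Function name, project, region, cluster, and paths below are all placeholders, not the client's actual setup.

# Hypothetical 1st-gen Cloud Function (Cloud Storage trigger) that submits a
# PySpark job to a Dataproc cluster whenever a new file arrives.
from google.cloud import dataproc_v1

PROJECT = "example-project"
REGION = "us-central1"

def on_file_landed(event, context):
    """Triggered by a new object in the landing bucket."""
    client = dataproc_v1.JobControllerClient(
        client_options={"api_endpoint": f"{REGION}-dataproc.googleapis.com:443"}
    )
    job = {
        "placement": {"cluster_name": "example-ingest-cluster"},
        "pyspark_job": {
            "main_python_file_uri": "gs://example-code-bucket/jobs/ingest.py",
            # Pass the newly landed file through to the job as an argument.
            "args": [f"gs://{event['bucket']}/{event['name']}"],
        },
    }
    client.submit_job(request={"project_id": PROJECT, "region": REGION, "job": job})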
Should have experience working with denormalized data, both structured and unstructured.
Using Cloud SQL as the relational cloud database, but okay with experience in others, e.g. Oracle, Postgres.
Building AI use cases as well - 5-6 use cases on their plate right now, including AI for data pipeline builds. Candidates who have some background in developing AI applications would be the ideal profile. If we find a strong ML or AI candidate with Python programming skills, the client could potentially find a space for them.
Must Have:
Strong hands-on Python programming
Spark/PySpark
Google Cloud Platform (BigQuery, Dataproc, Google Cloud Functions, GKE, Cloud SQL; not all of these are individual must-haves, just general awareness of the Google Cloud Platform ecosystem and its data services)
Experience working with various data types and structures
Nice to Have:
AI experience - building AI systems or models, or building inference pipelines and processing data "for AI"