DevOps/MLOps

Overview

Hybrid
Depends on Experience
Contract - W2
Contract - 12 Month(s)

Skills

DevOps
MLOps
Kubernetes
Python
Healthcare
GPU
Kubeflow
Airflow
TensorFlow

Job Details

Position: DevOps/MLOps/Data Architect (Open Position, CA)
Type: Contract
Duration: 12+ months
Location: Pleasanton, CA (Remote)

Responsibilities

Big Data

Healthcare (desirable)

Containers, Kubernetes

Python

Apache Airflow

MapR

Rancher

OpenShift

PyTorch

Traefik

How many years of related experience are you looking for in your ideal candidate?
Specific Systems Knowledge Required: YARN, YAML files, Kubernetes, Docker, Traefik, Prometheus, GitLab/GitHub, Hadoop, Hive, a thorough understanding of building AI/ML platforms from the ground up, data ingestion proficiency, Grafana, and Python expertise.
Specific Systems Knowledge Preferred: MapR storage, advanced Linux administration, scripting expertise, LDAP integration, Vault development, Ansible development, VMware administration, and experience with various operating systems (CentOS, RHEL, Ubuntu).

- This is not an analyst position, but an architect position.
- Has architected Kubernetes, Linux, Traefik, MapR, PostgreSQL, and Python environments; understands how to build these systems from the ground up on-premises and knows storage systems very well. Understands how to build out a platform with Spark, Jupyter Notebooks, and R Server/RStudio that interacts with SAS.
- It's a bit of a crossover with a Data Scientist role, but the person does not need to know how to run models. That would be nice, but they need to know how to build a platform that lets Data Scientists use their tools to run models, and what types of systems fit to make it work. They should also be very experienced with GPUs, how they are used, and the scenarios in which to use them.

Notes:

- Implementation of Traefik
- Implementation of Kubernetes (on-prem), OpenShift, or Rancher
- Implementation of R Server / RStudio / Jupyter Notebooks
- Expert Linux administration
- MapR storage implementation or administration
- Versed in YAML file configuration
- Understanding of how to architect and build out a complete Big Data solution for use by Data Scientists
- Must understand how to set up an environment for a Data Scientist workbench
- Must be an expert with Python and understand HDFS systems and how to implement them

Client is looking for someone who has worked in the research area, knows how to set up data science infrastructure, and has some data science experience.