
TensorFlow Model Deployment and Inferencing with Kubeflow

Summary of an article originally published by The New Stack (thenewstack.io).

In the last part of this series, we trained a TensorFlow model to classify images of cats and dogs. The model is stored in a shared Kubernetes persistent volume claim (PVC) that can be mounted by another Kubeflow Notebook Server to test the model.
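
As a rough sketch of that hand-off, the training notebook can export the model as a SavedModel into a directory backed by the shared PVC. The mount path, export path, and the tiny placeholder model below are assumptions for illustration (TF 2.x with Keras 2 is assumed), not details from the article:

```python
import tensorflow as tf

# Assumption: the shared PVC is mounted at /home/jovyan/models inside the
# training Notebook Server; TF Serving expects a numbered version subdirectory.
EXPORT_PATH = "/home/jovyan/models/dogs-vs-cats/1"

# Placeholder standing in for the classifier trained in the previous part.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(150, 150, 3)),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# In TF 2.x (Keras 2), saving to a directory path writes the SavedModel format,
# which both other notebooks and TF Serving can load from the PVC.
model.save(EXPORT_PATH)
```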

This notebook validates the model by passing it a few sample images.
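
A minimal sketch of such a validation step, assuming the SavedModel sits at the PVC path used above and expects 150x150 RGB inputs with a single sigmoid output; the paths, input size, and class mapping are illustrative:

```python
import numpy as np
import tensorflow as tf

# Assumption: the shared PVC is mounted at the same path in this Notebook Server.
MODEL_PATH = "/home/jovyan/models/dogs-vs-cats/1"
IMAGE_SIZE = (150, 150)

model = tf.keras.models.load_model(MODEL_PATH)

def predict(image_path: str) -> str:
    """Load one image, preprocess it, and return the predicted label."""
    img = tf.keras.utils.load_img(image_path, target_size=IMAGE_SIZE)
    batch = np.expand_dims(tf.keras.utils.img_to_array(img) / 255.0, axis=0)
    score = float(model.predict(batch)[0][0])
    return "dog" if score > 0.5 else "cat"

# Hypothetical sample images copied into the notebook workspace.
for path in ["samples/cat1.jpg", "samples/dog1.jpg"]:
    print(path, "->", predict(path))
```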

To serve the model, we essentially mount the same PVC used by the Jupyter Notebook Servers into the TF Serving deployment.
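
The sketch below uses the Kubernetes Python client to create a TF Serving Deployment that mounts that PVC; in practice this is usually written as a YAML manifest and applied with kubectl. The namespace, PVC name, labels, and paths are assumptions:

```python
from kubernetes import client, config

NAMESPACE = "kubeflow-user"        # assumed user namespace
PVC_NAME = "workspace-vol"         # assumed name of the PVC also mounted by the notebooks
MODEL_BASE_PATH = "/models/dogs-vs-cats"  # version dirs (e.g. "1") live under this path

config.load_kube_config()

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="tf-serving", labels={"app": "tf-serving"}),
    spec=client.V1DeploymentSpec(
        replicas=1,
        selector=client.V1LabelSelector(match_labels={"app": "tf-serving"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "tf-serving"}),
            spec=client.V1PodSpec(
                containers=[
                    client.V1Container(
                        name="tf-serving",
                        image="tensorflow/serving",
                        args=[
                            f"--model_base_path={MODEL_BASE_PATH}",
                            "--model_name=dogs-vs-cats",
                            "--rest_api_port=8501",
                        ],
                        ports=[client.V1ContainerPort(container_port=8501)],
                        # Mount the shared PVC so TF Serving reads the exported SavedModel.
                        volume_mounts=[
                            client.V1VolumeMount(name="model-vol", mount_path="/models")
                        ],
                    )
                ],
                volumes=[
                    client.V1Volume(
                        name="model-vol",
                        persistent_volume_claim=client.V1PersistentVolumeClaimVolumeSource(
                            claim_name=PVC_NAME
                        ),
                    )
                ],
            ),
        ),
    ),
)

client.AppsV1Api().create_namespaced_deployment(namespace=NAMESPACE, body=deployment)
```

TF Serving watches the base path and serves the highest-numbered version directory it finds there (here, 1).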

Since Kubeflow relies on Istio for authorizing requests, we need to apply an authorization policy to allow requests to TF Serving.
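
One way to express such a policy, again via the Kubernetes Python client rather than the usual kubectl-applied YAML; the namespace and the app label selecting the TF Serving pods are assumptions, and a real policy would typically restrict sources rather than allow everything:

```python
from kubernetes import client, config

NAMESPACE = "kubeflow-user"  # assumed user namespace

policy = {
    "apiVersion": "security.istio.io/v1beta1",
    "kind": "AuthorizationPolicy",
    "metadata": {"name": "allow-tf-serving", "namespace": NAMESPACE},
    "spec": {
        # Apply the policy only to the TF Serving pods.
        "selector": {"matchLabels": {"app": "tf-serving"}},
        # A single empty rule allows all requests to the selected workload;
        # tighten this with source principals or namespaces in practice.
        "rules": [{}],
    },
}

config.load_kube_config()
client.CustomObjectsApi().create_namespaced_custom_object(
    group="security.istio.io",
    version="v1beta1",
    namespace=NAMESPACE,
    plural="authorizationpolicies",
    body=policy,
)
```

With the policy in place, clients inside the mesh can reach TF Serving's REST API (port 8501 in the deployment sketch above).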
