Category: Kubernetes, Docker, yaml

In the last installment of this series, we created custom Docker images for provisioning Jupyter Notebook Servers targeting the data preparation, training, and inference stages of a machine learning project. This tutorial focuses on provisioning the storage backend for Jupyter Notebook Servers running in the Kubeflow platform.

As discussed in the earlier parts, Kubeflow needs two kinds of storage to run MLOps pipelines: shared volumes, which are mounted by multiple Notebook Servers and pipeline steps at the same time, and dedicated volumes, which serve as the private workspace of a single Notebook Server.
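In Kubernetes terms, this distinction typically maps to the access mode of a PersistentVolumeClaim: shared volumes need ReadWriteMany (backed by RWX-capable storage such as NFS), while dedicated volumes use ReadWriteOnce. The sketch below illustrates both; the names, namespace, and sizes are illustrative assumptions, not values taken from this series:

```yaml
# Shared volume: mounted concurrently by multiple Notebook Servers / pipeline steps.
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: shared-data        # illustrative name
  namespace: kubeflow-user # adjust to your Kubeflow profile namespace
spec:
  accessModes:
    - ReadWriteMany        # requires RWX-capable storage, e.g. NFS
  resources:
    requests:
      storage: 10Gi
---
# Dedicated volume: bound to a single Notebook Server as its workspace.
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: workspace-train    # illustrative name
  namespace: kubeflow-user
spec:
  accessModes:
    - ReadWriteOnce        # single-node read-write
  resources:
    requests:
      storage: 20Gi
```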

For detailed instructions on deploying and configuring Kubeflow storage, refer to the DeepOps guide for NFS and Portworx.
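Once the storage backend is deployed, it is surfaced to Kubeflow through one or more StorageClasses. The manifest below is a sketch of what an NFS-backed dynamic provisioner commonly registers; the class name and provisioner string are assumptions that depend on how NFS was actually deployed in your cluster:

```yaml
# Example StorageClass for an NFS-backed dynamic provisioner.
# The provisioner value is a common default for the
# nfs-subdir-external-provisioner chart and may differ in your deployment.
apiVersion: storage.k8s.io/v1
kind: StorageClass
metadata:
  name: nfs-client
provisioner: cluster.local/nfs-client-provisioner
reclaimPolicy: Delete
allowVolumeExpansion: true
```

Running `kubectl get storageclass` after deployment confirms which classes are available and which one is marked as the default.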

With the custom Docker container images and storage volumes in place, we are all set to launch the Notebook Servers for data preparation, training, and inference.
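To show how the pieces come together, here is a sketch of a Kubeflow Notebook custom resource that uses a custom image and mounts both a dedicated workspace volume and a shared volume. The image reference and PVC names are hypothetical placeholders:

```yaml
# Sketch: a Notebook Server mounting a dedicated workspace and a shared volume.
# Image and claim names are illustrative assumptions.
apiVersion: kubeflow.org/v1
kind: Notebook
metadata:
  name: train-notebook
  namespace: kubeflow-user
spec:
  template:
    spec:
      containers:
        - name: train-notebook
          image: registry.example.com/jupyter-train:latest  # custom training image
          volumeMounts:
            - name: workspace
              mountPath: /home/jovyan            # dedicated workspace
            - name: shared
              mountPath: /home/jovyan/shared     # shared datasets/artifacts
      volumes:
        - name: workspace
          persistentVolumeClaim:
            claimName: workspace-train
        - name: shared
          persistentVolumeClaim:
            claimName: shared-data
```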
