NOTE: This notebook requires Kaptain SDK 1.3.x or later.

This is a continuation of the Multi-Cluster Tutorial use case example. Ensure you have successfully completed the steps in Build your Notebook on your source cluster.

What You Need

To run this notebook, ensure the notebook server in your target deployment cluster is configured in the same way as on the source cluster, as specified in the Prerequisites. This includes the following steps:

  1. Create a Docker secret and an AWS credentials secret (see the kubectl sketch after this list).

  2. Create a PodDefault configuration referencing the created secrets.

  3. Launch a Jupyter notebook server with that PodDefault configuration.

You will be able to open this notebook after launching the notebook server.
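
If the secrets and PodDefault do not yet exist in the target cluster, the following sketch shows one way to create the secrets with kubectl. It is an illustration only: the secret names (docker-credentials, aws-credentials), the registry URL, the placeholder values, and the pod-default.yaml manifest are assumptions rather than names this tutorial prescribes; substitute the values from your Prerequisites setup.

    # Example only: substitute your own names, namespace, registry, and credentials.
    kubectl create secret docker-registry docker-credentials \
      --docker-server=https://index.docker.io/v1/ \
      --docker-username=<username> \
      --docker-password=<password> \
      --namespace <your-namespace>

    kubectl create secret generic aws-credentials \
      --from-literal=AWS_ACCESS_KEY_ID=<access-key-id> \
      --from-literal=AWS_SECRET_ACCESS_KEY=<secret-access-key> \
      --from-literal=AWS_REGION=<region> \
      --namespace <your-namespace>

    # Apply the PodDefault manifest that references both secrets.
    kubectl apply -f pod-default.yaml --namespace <your-namespace>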

Ensure You Are Ready to Start

Before proceeding, verify that the notebook server was configured and launched correctly:

  1. Ensure that the Docker secret is mounted. The following command should complete without an error:

    %%sh
    ls -la ~/.docker/config.json

    Output:

    lrwxrwxrwx 1 root istio 18 Oct  6 07:45 /home/kubeflow/.docker/config.json -> ..data/config.json

  2. Verify that the AWS environment variables are set. You should see AWS_ACCESS_KEY_ID, AWS_REGION, and AWS_SECRET_ACCESS_KEY:

    %%sh
    set | grep -E '^AWS_' | cut -f 1 -d '='

    Output:

    AWS_ACCESS_KEY_ID
    AWS_REGION
    AWS_SECRET_ACCESS_KEY
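
  3. Optionally, confirm that the Kaptain SDK on this notebook server meets the version requirement noted at the top of this notebook (1.3.x or later). The check below assumes the SDK is installed as a pip package named kaptain; adjust the package name if your environment differs:

    %%sh
    pip show kaptain | grep -i '^version'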

Initialize Model Configuration

  1. Before loading the model, provide the Kaptain SDK with configuration so that it can access the Docker registry and S3:

    # Model is needed below to load and deploy the saved model state.
    from kaptain import Model
    from kaptain.config import Config
    from kaptain.platform.config.s3 import S3ConfigurationProvider
    from kaptain.platform.config.docker import DockerConfigurationProvider
    from kaptain.platform.model_util import ModelUtil
    
    config = Config(
      docker_config_provider=DockerConfigurationProvider.default(),
      storage_config_provider=S3ConfigurationProvider.from_env(),
    )
  2. Load the stored model state from S3:

    # Replace the model_uri value below with the value of `model.meta().saved_model_uri`
    # reported in the previous notebook.
    model = Model.load_from_json(
        model_uri="s3://kaptain/models/dev/mnist/trained/b69dc6f6e3c246858cf43a1eba8be5f5/0001",
        config=config,
    )

    Output:

    [I 221006 08:39:36 model_util:67] Loading model state from s3://kaptain/models/dev/mnist/trained/b69dc6f6e3c246858cf43a1eba8be5f5/0001.
  3. Deploy the model:

    model.deploy(cpu="1", memory="2G", replace=True)
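
  4. Optionally, verify that the model was deployed. The command below is a sketch that assumes kubectl is available on the notebook server and that Kaptain serves the model as a KServe (or KFServing) InferenceService in the current namespace; the exact resource kind and name depend on your Kaptain and Kubeflow versions:

    %%sh
    # List model serving endpoints; expect an entry for the model deployed above.
    kubectl get inferenceservice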

Congratulations, you have completed this tutorial!