Inference Command
Command examples used to run inference with the models
Before running the commands, make sure the model weights are organized in the layout the inference script expects.
If you followed the previous training steps, the model weights will be located in the output_dir specified in the YAML config. They will already be in the correct format, so you can directly run the commands at the bottom of the page. If you are using other pre-trained weights, find the directory in which they are stored and follow the steps below.
The folder (e.g. home/data/MODEL_WEIGHTS) should include files like:
encoder_scaler.pkl, dbn.ckpt, encoder.ckpt, deep_cluster.ckpt, heir_fc.ckpt
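You can quickly confirm that these files are present before continuing. The check below is a minimal sketch that assumes the example path home/data/MODEL_WEIGHTS, so substitute your own directory:

for f in encoder_scaler.pkl dbn.ckpt encoder.ckpt deep_cluster.ckpt heir_fc.ckpt; do
    [ -f "home/data/MODEL_WEIGHTS/$f" ] || echo "missing: $f"   # replace with your path
done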
The folder should also contain two .npy files, which, if missing, can be created with:

touch home/data/MODEL_WEIGHTS/train_data.indices.npy   # replace with your path
touch home/data/MODEL_WEIGHTS/train_data.npy

Now modify the directory structure to look like this:

The script expects the files to be in specific folders, so it is important not to skip this step.
Tip: Use the mkdir and mv commands to create this structure.
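As a sketch only, the pattern looks like this; SUBFOLDER is a placeholder, not the actual folder name the script expects, so match the structure shown above:

mkdir -p home/data/MODEL_WEIGHTS/SUBFOLDER                               # SUBFOLDER is a placeholder name
mv home/data/MODEL_WEIGHTS/dbn.ckpt home/data/MODEL_WEIGHTS/SUBFOLDER/   # move each file into its expected folder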
Now we can run our commands:
The commands output zarr files.