Inference Command
Command examples used to run inference with trained models
Before running the commands, make sure the model weights are organized correctly.
If you followed the previous training steps, the model weights will be located in the `output_dir` specified in the YAML. They will already be in the correct format, so you can run the commands at the bottom of this page directly. If you are using other pre-trained weights, find the folder/directory in which they are stored and follow the steps below.
The folder (e.g. `home/data/MODEL_WEIGHTS`) should include files like:
`encoder_scaler.pkl`, `dbn.ckpt`, `encoder.ckpt`, `deep_cluster.ckpt`, `heir_fc.ckpt`
It should also include two `.npy` files, which, if missing, can be created as empty placeholders with:
touch home/data/MODEL_WEIGHTS/train_data.indices.npy ## replace with your path
touch home/data/MODEL_WEIGHTS/train_data.npy
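Before moving on, it can help to confirm that all the files listed above are present. A minimal sketch, assuming the example path `home/data/MODEL_WEIGHTS` (substitute your own weights directory):

```shell
# Report any expected weight file that is missing from the directory.
WEIGHTS=home/data/MODEL_WEIGHTS   ## replace with your path
for f in encoder_scaler.pkl dbn.ckpt encoder.ckpt deep_cluster.ckpt \
         heir_fc.ckpt train_data.indices.npy train_data.npy; do
    [ -f "$WEIGHTS/$f" ] || echo "missing: $WEIGHTS/$f"
done
```

If the loop prints nothing, every expected file was found.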
Now modify the directory structure to look like this:

The script expects the files in specific folders, so it is important not to skip this step.
Tip: Use the `mkdir` and `mv` commands to create this structure.
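The pattern looks like the following. Note that the subfolder name `encoder/` below is purely hypothetical, as is the placeholder checkpoint created for demonstration; use the folder names shown in the directory diagram above.

```shell
WEIGHTS=home/data/MODEL_WEIGHTS      ## replace with your path
mkdir -p "$WEIGHTS"
touch "$WEIGHTS/encoder.ckpt"        # stand-in for an existing weight file

mkdir -p "$WEIGHTS/encoder"          # hypothetical subfolder name
mv "$WEIGHTS/encoder.ckpt" "$WEIGHTS/encoder/"
```

Repeat the `mkdir`/`mv` pair for each file until the layout matches the expected structure.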
Now we can run our commands:
cd <path_to_SIT_FUSE>/SIT_FUSE/src/sit_fuse/inference
python3 generate_output.py -y <path_to_yaml>
# Same YAML as in the previous steps. See Code configuration
This outputs Zarr files.