PyTorch (MNIST)
Running an MNIST dataset classifier through custom image endpoints
Overview
This example shows a complete pipeline served through custom endpoints, consisting of the following steps:
Pre-processing step
Serve a model step
Post-processing step
Adding the package to the classpath
We need to add the main package to the classpath so that the notebook can load all the necessary libraries from Konduit-Serving into the Jupyter Notebook kernel.
%classpath add jar ../../konduit.jar
Viewing the configuration file
To view the configuration file contents, use the following command and select the JSON file you want to view.
%%bash
less config.json
You'll be able to view the following.
{
  "host" : "localhost",
  "port" : 0,
  "protocol" : "HTTP",
  "pipeline" : {
    "steps" : [ {
      "@type" : "IMAGE_TO_NDARRAY",
      "config" : {
        "height" : 28,
        "width" : 28,
        "dataType" : "FLOAT",
        "includeMinibatchDim" : true,
        "aspectRatioHandling" : "CENTER_CROP",
        "format" : "CHANNELS_FIRST",
        "channelLayout" : "GRAYSCALE",
        "normalization" : {
          "type" : "SCALE"
        },
        "listHandling" : "NONE"
      },
      "keys" : [ "image" ],
      "outputNames" : [ "Input3" ],
      "keepOtherValues" : true,
      "metadata" : false,
      "metadataKey" : "@ImageToNDArrayStepMetadata"
    }, {
      "@type" : "LOGGING",
      "logLevel" : "INFO",
      "log" : "KEYS_AND_VALUES"
    }, {
      "@type" : "ONNX",
      "modelUri" : "mnist.onnx",
      "inputNames" : [ "Input3" ],
      "outputNames" : [ "Plus214_Output_0" ]
    }, {
      "@type" : "CLASSIFIER_OUTPUT",
      "inputName" : "Plus214_Output_0",
      "labels" : [ "0", "1", "2", "3", "4", "5", "6", "7", "8", "9" ],
      "allProbabilities" : false
    } ]
  }
}
Starting a server
Start a server in the background with an id of onnx-mnist, using config.json as the configuration file, without creating the manifest JAR file before launching the server.
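Below is a minimal sketch of the corresponding CLI call. The -id, -c, -b and -rwm (run without creating the manifest JAR) flags are assumptions about the konduit serve options available in this build:
%%bash
# Assumed flags: -id (serving id), -c (config file), -rwm (skip manifest JAR), -b (run in background)
konduit serve -id onnx-mnist -c config.json -rwm -b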
You'll get the following message once the server starts in the background.
We use the konduit list command to view the list of running servers.
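For example (a sketch, assuming the konduit CLI is on the PATH as in the earlier cells):
%%bash
# Lists the Konduit-Serving servers currently running on this machine
konduit list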
The list of running servers is shown below.
To view the logs of the running server, use the konduit logs command.
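A sketch of that call, assuming konduit logs accepts the serving id as its argument:
%%bash
# Prints the logs of the server with the given serving id
konduit logs onnx-mnist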
The server log output is shown below.
Making a prediction
Let's display the test image before feeding it into the model for classification.
Here is the test image used in the prediction for this example:

The konduit predict command is used to classify the image with the model served in Konduit-Serving under the id given earlier.
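A sketch of the call, assuming a multipart input type and a test image saved locally as test-image.jpg (the filename is hypothetical, and the --input-type option is an assumption about this CLI build); the "image" key matches the key configured in the IMAGE_TO_NDARRAY step:
%%bash
# Sends the image as a multipart request to the server with id onnx-mnist
konduit predict onnx-mnist --input-type multipart "image=@test-image.jpg"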
You'll get output similar to the following.
Stopping the server
After we're finished with the server, we can terminate it with the konduit stop command.
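For example (assuming the command takes the serving id, as with the other commands above):
%%bash
# Shuts down the server with id onnx-mnist
konduit stop onnx-mnist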
You'll receive this message once the server is terminated.