Tensorflow
Example of Tensorflow framework with CUSTOM endpoints
Overview
In this example, we demonstrate Konduit-Serving with a complete pipeline consisting of:
A pre-processing step
Running a deep learning model
A post-processing step, expressing the output in a way humans can understand
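The three stages above can be sketched in plain Python with a stub in place of the real TensorFlow model. All names here are illustrative assumptions, not part of the Konduit-Serving API:

```python
# Hedged sketch of the pipeline stages described above. "stub_model" stands
# in for the served deep learning model and returns fixed fake probabilities.

def preprocess(image):
    # Pre-processing: scale 0-255 grayscale pixel values into [0, 1].
    return [pixel / 255.0 for pixel in image]

def stub_model(features):
    # Stand-in for the model: a fake probability for each digit class 0-9,
    # with class 5 given the highest score.
    return [0.01] * 5 + [0.91] + [0.01] * 4

def postprocess(probabilities):
    # Post-processing: turn raw class probabilities into a readable label.
    label = max(range(len(probabilities)), key=probabilities.__getitem__)
    return f"predicted digit: {label}"

result = postprocess(stub_model(preprocess([0, 128, 255])))
```

In the real example, Konduit-Serving performs these stages server-side according to the pipeline declared in the configuration file.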
Adding the package to the classpath
Let's add the main package of Konduit-Serving so that the Jupyter Notebook kernel can load all the required libraries.
%classpath add jar ../../konduit.jar
Starting a server
Before starting a server, let's check whether a server with the id tensorflow-mnist is already running, and stop it if so. You can also use this command once you are finished with the server.
%%bash
konduit stop tensorflow-mnist
You'll get the following message if there is no server running with the mentioned id.
No konduit server exists with an id: 'tensorflow-mnist'.
Otherwise, if a server is running, you'll receive a message like the following.
Now, let's start the server with the id tensorflow-mnist, using tensorflow.json as the configuration file, running in the background without creating the manifest JAR file before launching the server.
You should see a message similar to the one below.
View the last 1000 lines of the logs (-l) for a given id by using the konduit logs command.
The log output is similar to the following.
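As a minimal sketch, the logs command can be wrapped so the notebook cell degrades gracefully when the CLI is not on the PATH. Only the konduit logs subcommand and its -l flag come from the text above; the exact argument order is an assumption:

```shell
# Fetch the last 1000 log lines for the server id used in this example.
# The guard is only so this cell fails softly when konduit is not installed.
SERVER_ID=tensorflow-mnist
if command -v konduit >/dev/null 2>&1; then
  konduit logs "$SERVER_ID" -l 1000
else
  echo "konduit CLI not found on PATH; cannot fetch logs for $SERVER_ID"
fi
```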
Sending an input to served model
We can display all the test images available for inference with the model. You'll see pictures of the digits from zero to nine.
Let's take one of the test images and send it to the model served by Konduit-Serving. The pipeline defined in the configuration translates the image into an array and feeds it into the model, giving a straightforward result: the predicted digit label, derived from the prediction probabilities.
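Client-side, the request body for such an image can be sketched as below. The input name "image" and the payload layout are assumptions for illustration only; the actual names come from the pipeline declared in tensorflow.json:

```python
# Hedged sketch: package a 28x28 grayscale digit as a JSON body that a
# served MNIST model might accept. The key "image" is a hypothetical name.
import json

def build_payload(pixels):
    # Normalize 0-255 grayscale values to [0, 1], keep the 28x28 layout,
    # and wrap it in a leading batch dimension of size 1.
    normalized = [[value / 255.0 for value in row] for row in pixels]
    return json.dumps({"image": [normalized]})

blank_digit = [[0] * 28 for _ in range(28)]
payload = build_payload(blank_digit)
```

This payload could then be sent to the server's HTTP endpoint (for example with urllib or curl against the port shown in the server logs).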