TensorFlow Step
Example of applying a TensorFlow step
In this example, we chain a series of steps in Konduit-Serving's core abstraction, the pipeline. Each pipeline step performs a task such as:
Pre-processing the input
Running the model
Post-processing the output into a form that humans can understand
Once the pipeline is set up, the server can be deployed to serve the model.
import ai.konduit.serving.data.image.convert.ImageToNDArrayConfig;
import ai.konduit.serving.data.image.convert.config.AspectRatioHandling;
import ai.konduit.serving.data.image.convert.config.ImageNormalization;
import ai.konduit.serving.data.image.convert.config.NDChannelLayout;
import ai.konduit.serving.data.image.convert.config.NDFormat;
import ai.konduit.serving.data.image.step.ndarray.ImageToNDArrayStep;
import ai.konduit.serving.examples.utils.Train;
import ai.konduit.serving.models.nd4j.tensorflow.step.Nd4jTensorFlowStep;
import ai.konduit.serving.pipeline.api.data.NDArrayType;
import ai.konduit.serving.pipeline.impl.pipeline.SequencePipeline;
import ai.konduit.serving.pipeline.impl.step.ml.classifier.ClassifierOutputStep;
import ai.konduit.serving.vertx.api.DeployKonduitServing;
import ai.konduit.serving.vertx.config.InferenceConfiguration;
import ai.konduit.serving.vertx.config.InferenceDeploymentResult;
import com.mashape.unirest.http.Unirest;
import com.mashape.unirest.http.exceptions.UnirestException;
import io.vertx.core.DeploymentOptions;
import io.vertx.core.VertxOptions;
import org.nd4j.common.io.ClassPathResource;
import java.io.IOException;
import java.util.Arrays;
Configure the pipeline step
Let's start in the main function by declaring the labels that will be used by ClassifierOutputStep() and obtaining the trained model.
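A minimal sketch of this setup is shown below. The digit labels and the model file name "mnist_model.pb" are illustrative assumptions; the original example obtains the trained model through the examples' Train utility, whose exact method is not shown here, so ClassPathResource is used as a stand-in.

// Output labels consumed later by ClassifierOutputStep (assumed MNIST-style digits)
String[] labels = {"0", "1", "2", "3", "4", "5", "6", "7", "8", "9"};

// Location of the trained TensorFlow model; the file name is an illustrative assumption.
// getFile() throws IOException, so the enclosing main method should declare it.
String modelPath = new ClassPathResource("mnist_model.pb").getFile().getAbsolutePath();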
Create an inference configuration with default settings.
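A one-line sketch, assuming InferenceConfiguration exposes a no-argument constructor that leaves host, port and protocol at their defaults:

// Inference configuration with default settings
InferenceConfiguration inferenceConfiguration = new InferenceConfiguration();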
We'll need to include a pre-processing step using ImageToNDArrayStep() to convert the input image into an n-dimensional array and to specify the characteristics of the expected input image. To run the model, add Nd4jTensorFlowStep() to the pipeline and specify modelUri, inputNames and outputNames. ClassifierOutputStep() can be included to transform the output into a form that humans can understand, with the following settings (a sketch of the full pipeline follows this list):
inputName: name of the model's output layer
labels: list of output labels for the classifier
allProbabilities: false
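Below is a sketch of the full pipeline under these settings. The fluent setter names, the pipeline(...) setter on InferenceConfiguration, the 28x28 grayscale input shape, the layer names "input_layer" and "output_layer", and the ImageNormalization constructor form are assumptions for illustration, not the library's confirmed API.

// Assemble the three steps into a SequencePipeline and attach it to the configuration
inferenceConfiguration.pipeline(SequencePipeline.builder()
        .add(new ImageToNDArrayStep()                          // pre-processing: image -> NDArray
                .config(new ImageToNDArrayConfig()
                        .height(28)                            // assumed input height
                        .width(28)                             // assumed input width
                        .dataType(NDArrayType.FLOAT)
                        .channelLayout(NDChannelLayout.GRAYSCALE)
                        .format(NDFormat.CHANNELS_LAST)
                        .aspectRatioHandling(AspectRatioHandling.CENTER_CROP)
                        .normalization(new ImageNormalization(ImageNormalization.Type.SCALE))))
        .add(new Nd4jTensorFlowStep()                          // run the TensorFlow model
                .modelUri(modelPath)
                .inputNames(Arrays.asList("input_layer"))      // assumed input layer name
                .outputNames(Arrays.asList("output_layer")))   // assumed output layer name
        .add(new ClassifierOutputStep()                        // human-readable classifier output
                .inputName("output_layer")
                .labels(Arrays.asList(labels))
                .allProbabilities(false))
        .build());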
Deploy the server
Let's deploy the model to the server by calling DeployKonduitServing with the configuration built above. The handler, a callback function, is invoked only after the server deployment has succeeded or failed, and the corresponding logic goes inside the handler block, as shown in the sketch below.
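A sketch of the deployment call, assuming the four-argument DeployKonduitServing.deploy overload and a getActualPort() accessor on InferenceDeploymentResult:

// Deploy the configuration; the handler runs once deployment has succeeded or failed
DeployKonduitServing.deploy(new VertxOptions(), new DeploymentOptions(),
        inferenceConfiguration,
        handler -> {
            if (handler.succeeded()) {
                InferenceDeploymentResult result = handler.result();
                int port = result.getActualPort();             // assumed accessor for the bound port
                System.out.println("Server started on port " + port);
                // ... send a test request here (see the request sketch further below)
            } else {
                System.out.println("Deployment failed: " + handler.cause().getMessage());
            }
        });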
Note that this example uses only a single test input image for inference, to demonstrate the model's deployment in Konduit-Serving. After running the program, a successful server deployment prints the output text below.
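Inside the succeeded branch of the handler, the single test image could be sent with Unirest along these lines; the "/predict" endpoint path, the "image" form-field name, and the test file name are assumptions for illustration.

// Hypothetical test request: POST the image to the running server and print the result
try {
    String response = Unirest.post("http://localhost:" + port + "/predict")
            .field("image", new ClassPathResource("test-image.png").getFile())
            .asString()
            .getBody();
    System.out.println(response);
} catch (UnirestException | IOException e) {
    e.printStackTrace();
}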
The complete inference configuration in YAML format is as follows.