Creating a Sequence Pipeline
To create boilerplate configurations
A Sequence Pipeline treats the data and the Machine Learning or Deep Learning model as a series of steps, from pre-processing to model serving and post-processing of the output. In this example, the konduit config CLI command is used to generate the configuration file for serving models on Konduit-Serving. You can follow this example in your local terminal from any directory.
If deploying the model requires neither pre- nor post-processing, only one step, the deep learning model itself, is needed. This configuration is defined using a single Step to serve a model, and the configuration file is created with a command like the following.
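As a minimal sketch, assuming a DL4J model step is what you want to serve; the --pipeline option is the one described below, while the -y/--yaml option for YAML output is an assumption to be checked against the konduit config help:

```bash
# Generate a boilerplate configuration for a single dl4j model step.
# --pipeline takes the step name(s); --yaml (assumed option) prints the
# configuration as YAML instead of the default JSON.
konduit config --pipeline dl4j --yaml
```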
The YAML configuration is as follows.
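The exact schema depends on your Konduit-Serving version; the snippet below is an abbreviated, illustrative sketch, and field names such as modelUri, inputNames, and outputNames should be taken from the file the command actually generates rather than copied from here:

```yaml
# Abbreviated sketch of a single-step (dl4j) serving configuration.
# Field names are illustrative; regenerate with `konduit config` to get
# the exact schema for your version.
host: "localhost"
port: 0
protocol: "HTTP"
pipeline:
  steps:
    - '@type': "DL4J"
      modelUri: "<path/to/your/model>"
      inputNames: ["input"]
      outputNames: ["output"]
```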
The Steps included in --pipeline depend on the model's requirements and on how the output should be represented. For example, if the model takes an image as input, the image_to_ndarray pre-processing step should be used to convert the image into an array. The table below shows all steps that can be used in the Sequence Pipeline; an illustrative command combining several of them follows the table.
| Pre-processing Step | Model/Python Step | Post-processing Step | Logging |
| ------------------- | ----------------- | -------------------- | ------- |
| image_to_ndarray    | dl4j              | crop_grid            | logging |
|                     | keras             | crop_fixed_grid      |         |
|                     | tensorflow        | draw_bounding_box    |         |
|                     | nd4jtensorflow    | draw_fixed_grid      |         |
|                     | onnx              | draw_segmentation    |         |
|                     | samediff          | extract_bounding_box |         |
|                     | python            | camera_frame_capture |         |
|                     |                   | video_frame_capture  |         |
|                     |                   | ssd_to_bounding_box  |         |
|                     |                   | show_image           |         |
|                     |                   | classifier_output    |         |
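For instance, several of these step names can be chained in the --pipeline option. The combination below is illustrative only (an SSD-style object detection flow; whether it applies depends on your actual model) and assumes comma-separated step names are accepted:

```bash
# Illustrative only: chain a pre-processing step, a model step, and
# post-processing/drawing steps by listing them in --pipeline.
konduit config --pipeline image_to_ndarray,onnx,ssd_to_bounding_box,draw_bounding_box,show_image
```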
Here is another example with a series of steps in a Sequence Pipeline (image_to_ndarray to nd4jtensorflow to classifier_output). The input image needs to be converted into an n-dimensional array before it is fed into the model, which then produces the classification output. A command like the one below creates the configuration.
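As a sketch, assuming an -o/--output option that writes the generated configuration to a file:

```bash
# Create a three-step pipeline (pre-processing -> model -> classifier
# output) and write the configuration to a JSON file (-o is assumed).
konduit config --pipeline image_to_ndarray,nd4jtensorflow,classifier_output -o config.json
```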
The command produces the configuration file in JSON with the complete Pipeline Steps.
Here is an example that uses a similar set of steps. You can find the JSON file at https://github.com/ShamsUlAzeem/konduit-serving-demo/blob/master/demos/4-tensorflow-mnist/tensorflow.json
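As an abbreviated, illustrative skeleton only (the @type values and the per-step fields such as image dimensions, model path, and input/output names should be taken from the linked file, not from here), such a configuration has roughly this shape:

```json
{
  "host": "localhost",
  "port": 0,
  "pipeline": {
    "steps": [
      { "@type": "IMAGE_TO_NDARRAY" },
      { "@type": "TENSORFLOW" },
      { "@type": "CLASSIFIER_OUTPUT" }
    ]
  }
}
```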
Every Step in the Pipeline needs to be modified based on the input characteristics, the model configuration, and how the output should look in the end. Using a configuration file like this, you can serve the model with Konduit-Serving.