Running an IRIS dataset classifier through custom endpoints
Overview
This documentation shows how Konduit-Serving can serve a custom model and include a post-processing step in the pipeline so that the output is a label directly understood by a human. In this example, an Iris model is deployed on the server as a classifier through custom endpoints.
Adding package to the classpath
First of all, we need to add the main package to the classpath so that the notebook can load all the necessary Konduit-Serving libraries into the Jupyter Notebook kernel.
The classpath can be considered similar to site-packages in the Python ecosystem: it is the location from which each library imported in your code is loaded.
We package almost everything you need to get started into the konduit.jar package, so you can start working on the actual code without having to care about any boilerplate configuration.
%classpath add jar ../../konduit.jar
Let's ensure the working directory is correct and list all the files available in the directory.
We're creating a PyTorch model from scratch here and then converting it into the ONNX format.
%%bash
less train.py
You can browse the source code to see how the training takes place.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score, recall_score

import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.autograd import Variable


class Net(nn.Module):
    # define nn
    def __init__(self):
        super(Net, self).__init__()
        self.fc1 = nn.Linear(4, 100)
        self.fc2 = nn.Linear(100, 100)
        self.fc3 = nn.Linear(100, 3)
        self.softmax = nn.Softmax(dim=1)

    def forward(self, X):
        X = F.relu(self.fc1(X))
        X = self.fc2(X)
        X = self.fc3(X)
        X = self.softmax(X)
        return X


# load IRIS dataset
dataset = pd.read_csv('dataset/iris.csv')

# transform species to numerics
dataset.loc[dataset.species == 'Iris-setosa', 'species'] = 0
dataset.loc[dataset.species == 'Iris-versicolor', 'species'] = 1
dataset.loc[dataset.species == 'Iris-virginica', 'species'] = 2

train_X, test_X, train_y, test_y = train_test_split(dataset[dataset.columns[0:4]].values,
                                                    dataset.species.values, test_size=0.8)

# wrap up with Variable in pytorch
train_X = Variable(torch.Tensor(train_X).float())
test_X = Variable(torch.Tensor(test_X).float())
train_y = Variable(torch.Tensor(train_y).long())
test_y = Variable(torch.Tensor(test_y).long())

net = Net()
criterion = nn.CrossEntropyLoss()  # cross entropy loss
optimizer = torch.optim.SGD(net.parameters(), lr=0.01)

for epoch in range(1000):
    optimizer.zero_grad()
    out = net(train_X)
    loss = criterion(out, train_y)
    loss.backward()
    optimizer.step()

    if epoch % 100 == 0:
        print('number of epoch', epoch, 'loss', loss.item())

predict_out = net(test_X)
_, predict_y = torch.max(predict_out, 1)

print('prediction accuracy', accuracy_score(test_y.data, predict_y.data))
print('macro precision', precision_score(test_y.data, predict_y.data, average='macro'))
print('micro precision', precision_score(test_y.data, predict_y.data, average='micro'))
print('macro recall', recall_score(test_y.data, predict_y.data, average='macro'))
print('micro recall', recall_score(test_y.data, predict_y.data, average='micro'))

# Input to the model
x = torch.randn(1, 4, requires_grad=True)

# Export the model
torch.onnx.export(net,                       # model being run
                  x,                         # model input (or a tuple for multiple inputs)
                  "iris.onnx",               # where to save the model (can be a file or file-like object)
                  export_params=True,        # store the trained parameter weights inside the model file
                  opset_version=10,          # the ONNX version to export the model to
                  do_constant_folding=True,  # whether to execute constant folding for optimization
                  input_names=['input'],     # the model's input names
                  output_names=['output'],   # the model's output names
                  dynamic_axes={'input': {0: 'batch_size'},     # variable length axes
                                'output': {0: 'batch_size'}})
Viewing the configuration file
The configuration for the custom endpoint is as follows:
%%bash
less onnx.yaml
The output shows the configuration in YAML, in which you can see two steps in the pipeline: one serving the model and one post-processing its output into a label directly understood by a human.
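The post-processing step essentially maps the model's probability vector back to a species name. A minimal sketch of that mapping, where the `LABELS` list and `postprocess` helper are illustrative (not part of the Konduit-Serving API), reversing the numeric encoding used in `train.py`:

```python
import numpy as np

# Illustrative label list matching the numeric encoding used during training
LABELS = ["Iris-setosa", "Iris-versicolor", "Iris-virginica"]

def postprocess(probs):
    """Turn one row of softmax probabilities into a human-readable label."""
    return LABELS[int(np.argmax(probs))]

print(postprocess([0.1, 0.7, 0.2]))  # Iris-versicolor
```

The post-processing step in the pipeline performs the same argmax-and-lookup on the server side, so clients receive the species name directly rather than a raw probability vector.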