Python SDK
A Konduit Serving instance can be created by:
1. creating a Python object of the Server class, using the Server() function or the server_from_file() function from the konduit.load module; and
2. starting the server using the .start() method of the Server object created in step 1.
We will use the server_from_file() function to configure Konduit Serving in this example. In Python, specify the path to your configuration in konduit_yaml_path (the path below is a placeholder for your own YAML file):
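```python
konduit_yaml_path = "config.yaml"  # placeholder: path to your Konduit Serving YAML configuration
```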
Initialize a Konduit Serving instance with code along the following lines:
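```python
from konduit.load import server_from_file

# Build a Server object from the YAML configuration, then start the serving instance.
server = server_from_file(konduit_yaml_path)
server.start()
```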
Note that the file also contains the Client configuration. To create a Client object, use the client_from_file() function from the konduit.load module, for example:
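```python
from konduit.load import client_from_file

# Create a Client configured against the running Serving instance,
# reusing the same YAML file.
client = client_from_file(konduit_yaml_path)
```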
The Client class provides a .predict() method that sends data to the Serving instance. First, create some sample data as a NumPy array (the shape below is a placeholder; match it to the input your pipeline expects):
Assuming your data is declared in the data_input object, data can be passed to client for prediction with a call along these lines:
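```python
# Send the data to the Serving instance and print the returned prediction.
prediction = client.predict(data_input)
print(prediction)
```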
The .predict() method takes a single argument, data_input, which is typically a dictionary. A NumPy array can be passed directly to the .predict() method if the input name is default.
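For instance, the dictionary form of the same call keys the array by its input name (here the default name, "default"):

```python
# Equivalent call using a dictionary keyed by input name.
prediction = client.predict({"default": data_input})
```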
Next steps
To build configurations using the YAML format, check out the YAML configurations page:
YAML configurations are sufficient for most use cases. In particular, if your use case:
- does not involve DataVec transformations, and
- for Python steps, has one transformation script at each pipeline step,
then you should use a YAML configuration.
For more complex configurations, you should use the Python SDK. To build configurations with Python steps, start with the Python pipeline steps page:
To build configurations in Python with TensorFlow, DL4J and Keras models using DL4J and JavaCPP Presets, refer to the example for the respective framework:
To build ETL processes into your serving pipeline, refer to the DataVec example: