EdgeClient.inferences.build_inference_request
Prepare input data as an inference request object for the Modzy Inference API
EdgeClient.inferences.build_inference_request(model_identifier: str, model_version: str, input_sources: List[InputSource], explain=False, tags=None)
This method packages input data into the InferenceRequest object format expected by the Inference API.
Parameters
Parameter | Type | Description | Example |
---|---|---|---|
model_identifier | str | The model identifier. | 'ed542963de' |
model_version | str | The model version string, in semantic version format. | '1.0.1' |
input_sources | List[InputSource] | A list of input sources of type InputSource. | [InputSource(key="input.txt", text="Today is a great day.")] |
explain | bool | If the model supports explainability, set this flag to return an explanation of the predictions. | True |
tags | Mapping[str, str] | An arbitrary set of key/value tags to associate with this inference. | |
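For reference, the sketch below shows two ways an InputSource can be constructed, based on the fields used in this table and in the example further down: inline text via the text argument, or raw bytes via the data argument. The key values and file path here are placeholders; the key must match the input filename defined by the model author.

from modzy.edge import InputSource

# Inline text input; the key must match the input filename the model expects
text_input = InputSource(key="input.txt", text="Today is a great day.")

# Binary input read from disk, e.g. an image (key and path are placeholders)
with open("dog.jpg", "rb") as f:
    image_input = InputSource(key="image", data=f.read())

# build_inference_request expects a list of InputSource objects
input_sources = [text_input]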
Returns
An InferenceRequest object that can be submitted to the Inference API.
Examples
import os
from modzy import EdgeClient
from modzy.edge import InputSource

img_folder = "./images"  # placeholder path to a folder of images to run inference on

# build a list of inference requests to pass to the stream method
requests = []
with EdgeClient('localhost', 55000) as client:
    for img in os.listdir(img_folder):
        with open(os.path.join(img_folder, img), 'rb') as f:
            input_object = InputSource(
                key="image",  # input filename defined by model author
                data=f.read()
            )
        requests.append(
            client.inferences.build_inference_request(
                "<model-id>", "<model-version>", [input_object], explain=False, tags=None
            )
        )
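As a rough follow-on, and assuming the EdgeClient.inferences.stream method that the comment above refers to accepts an iterable of InferenceRequest objects, the collected requests could then be submitted in a single call:

# Sketch: submit the prepared requests via the stream method mentioned above
with EdgeClient('localhost', 55000) as client:
    results = client.inferences.stream(requests)
    for result in results:
        print(result)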