EdgeClient.inferences.stream

Provides a convenience method for running multiple sequential synchronous inferences.

EdgeClient.inferences.stream(input_iterator: Iterable[InferenceRequest])

This convenience method runs multiple synchronous inferences consecutively, allowing users to submit an iterable of requests to be processed sequentially in real time.
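To illustrate the sequential-processing behavior, here is a minimal, self-contained sketch of the pattern. The `FakeRequest`, `FakeResult`, and `stream` names are hypothetical stand-ins for illustration only, not the real `modzy` types or implementation:

```python
from dataclasses import dataclass
from typing import Iterable, Iterator

@dataclass
class FakeRequest:  # hypothetical stand-in for InferenceRequest
    model_id: str
    payload: bytes

@dataclass
class FakeResult:  # hypothetical stand-in for Inference
    model_id: str
    summary: str

def stream(input_iterator: Iterable[FakeRequest]) -> Iterator[FakeResult]:
    """Process each request synchronously, yielding results in submission order."""
    for request in input_iterator:
        # a real client would run the model here; this sketch just echoes the payload size
        yield FakeResult(request.model_id, f"{len(request.payload)} bytes")

results = list(stream([FakeRequest("m1", b"abc"), FakeRequest("m1", b"hello")]))
```

Because the method accepts any iterable, requests can come from a list, a generator, or any other lazily produced sequence.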

Parameters

| Parameter | Type | Description | Example |
| --- | --- | --- | --- |
| `input_iterator` | `Iterable[InferenceRequest]` | Iterable object containing `InferenceRequest` objects. | `'ed542963de'` |

Returns

An iterable object containing `Inference` objects.

Examples

import os

from modzy import EdgeClient
from modzy.edge import InputSource

img_folder = "./images"  # example path; folder containing input images

with EdgeClient('localhost', 55000) as client:
  # generate a list of inference requests to pass to the stream method
  requests = []
  for img in os.listdir(img_folder):
    input_object = InputSource(
      key="image",  # input filename defined by model author
      data=open(os.path.join(img_folder, img), 'rb').read()
    )
    requests.append(
      client.inferences.build_inference_request("<model-id>", "<model-version>", input_object, explain=False, tags=None)
    )

  # submit the list of inference requests to the streaming API
  streaming_results = client.inferences.stream(requests)