Requires proficiency in Docker and gRPC
Only follow this model packaging guide if your model cannot be containerized by Chassis or if you are proficient in Docker and gRPC.
In this guide, we will prepare a machine learning model for deployment to Modzy using our Python gRPC template. This template will allow you to create an OMI-compliant container that can be imported directly into Modzy.
This containerization process includes three steps:
- Construct Model Container
- Construct Metadata Configuration File
- Test and Validate Model Container
Edit skeleton template scripts and Dockerfile to build a Modzy-compatible Docker container image that contains all model code and dependencies.
Migrate your existing Model Library or Develop a Model Library from Scratch
Use model_lib/src to store your model library and model_lib/tests to store its associated test suite. Your existing model library can be imported directly into this repository with any structure; at a minimum, however, you must expose functionality to instantiate your model and perform inference with it. It is also recommended that the complete training code and model architecture be included and documented within your model library to ensure full reproducibility and traceability.
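The minimum interface described above can be sketched as a small Python class. This is an illustrative example only: the class name, method names, and toy inference logic are assumptions, not part of the template.

```python
# Hypothetical minimal model library module for model_lib/src/.
# Names and logic here are illustrative stand-ins for your real model.

class MyModel:
    """Exposes the two required capabilities: instantiation and inference."""

    def __init__(self, weights_path=None):
        # Load weights or other artifacts here; this toy example
        # just stores a fixed decision threshold.
        self.threshold = 0.5

    def predict(self, value: float) -> str:
        # Replace with your real inference logic.
        return "positive" if value >= self.threshold else "negative"
```

Keeping inference behind a method like this makes it straightforward to call from both your test suite in model_lib/tests and the wrapper class described below.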
Integrate your Model into the Modzy Model Wrapper Class
Navigate to the model_lib/src/model.py file within the repository, which contains the Modzy Model Wrapper Class. Fill out the handle_discrete_input() method by following the instructions provided in the comments for this module.
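The shape of that method typically follows a deserialize, infer, serialize pattern. The sketch below is a guess at that pattern, not the template's actual signature or input format, which are defined in model.py's comments; the JSON payload and threshold logic are illustrative assumptions.

```python
import json

class ExampleModel:
    def __init__(self):
        # Instantiate your model from model_lib here; this stand-in
        # just keeps a decision threshold.
        self.threshold = 0.5

    def handle_discrete_input(self, input_bytes: bytes) -> bytes:
        # 1) Deserialize the raw input bytes passed to the server.
        payload = json.loads(input_bytes.decode("utf-8"))
        # 2) Run inference with your model library.
        label = "positive" if payload["score"] >= self.threshold else "negative"
        # 3) Serialize the result back to bytes for the gRPC response.
        return json.dumps({"label": label}).encode("utf-8")
```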
Host gRPC server inside a Docker Container
Set up the Dockerfile correctly to ensure your gRPC model server can be spun up inside your Docker container.
- Complete the handle_discrete_input_batch() method to enable custom batch processing for your model.
- Rename the ExampleModel class to give your model a custom name.
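A simple way to implement the batch method is to delegate to the single-input handler. The sketch below assumes the batch method receives a list of raw inputs and returns one output per input; the class name, signatures, and payload format are illustrative assumptions, not the template's actual API.

```python
import json

class SentimentClassifier:  # ExampleModel renamed, per the checklist above
    def __init__(self):
        self.threshold = 0.5  # illustrative stand-in for real model setup

    def handle_discrete_input(self, input_bytes: bytes) -> bytes:
        payload = json.loads(input_bytes.decode("utf-8"))
        label = "positive" if payload["score"] >= self.threshold else "negative"
        return json.dumps({"label": label}).encode("utf-8")

    def handle_discrete_input_batch(self, batch: list) -> list:
        # Naive batching: call the single-input handler per item.
        # Replace with vectorized inference if your model supports it.
        return [self.handle_discrete_input(item) for item in batch]
```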
Fill in the YAML configuration file from the template; it contains important metadata the API uses to run the model on the Modzy Platform.
Provide model Metadata
Create a new version of your model using semantic versioning (x.x.x) and create a new directory for this version under asset_bundle. Fill out a docker_metadata.yaml file under asset_bundle/x.x.x/ according to the proper specification, then update the __VERSION__ = x.x.x variable located in grpc_model/__init__.py prior to performing the release for your new version of the model. Finally, update the following line in the Dockerfile:

```shell
COPY asset_bundle/x.x.x ./asset_bundle/x.x.x/
```
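Concretely, the release steps above might look like the following for a hypothetical version 0.1.0 (substitute your own version number throughout):

```shell
# Example release prep for a hypothetical version 0.1.0
NEW_VERSION=0.1.0
mkdir -p "asset_bundle/${NEW_VERSION}"
# Then, by hand:
#   - fill out asset_bundle/0.1.0/docker_metadata.yaml
#   - set __VERSION__ = "0.1.0" in grpc_model/__init__.py
#   - update the Dockerfile line to: COPY asset_bundle/0.1.0 ./asset_bundle/0.1.0/
```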
Add unit tests and run validation tests on model container analogously to the way the Modzy Platform will spin up the model container and run inference.
First, create a virtual environment (venv, conda, or other preferred virtual environment), activate it, and install any pip-installable packages defined in your requirements.txt file. Using venv on Linux or macOS:

```shell
python -m venv ./grpc-model
source grpc-model/bin/activate
pip install -r requirements.txt
```

On Windows:

```shell
python -m venv .\grpc-model
.\grpc-model\Scripts\activate
pip install -r requirements.txt
```
Then, test your gRPC server and client connection in two separate terminals. In your first terminal, kick off the gRPC server.
```shell
python -m grpc_model.src.model_server
```
You will see your model instantiate and begin running on port 45000. This runs the Status() remote procedure call. Next, after properly configuring the grpc_model/src/model_client.py file, run the gRPC client in a separate terminal.
```shell
python -m grpc_model.src.model_client
```
This will run the custom gRPC client and execute the Run() remote procedure call with the data you defined in the client script. Pending a successful inference run, you can move on to testing your model inside a newly built Docker container.
Build your container, and spin up your model inside the container:

```shell
docker build -t <container-image-name>:<tag> .
docker run --rm -it -p 45000:45000 <container-image-name>:<tag>
```
Then, in a separate terminal, test the containerized server from a local gRPC model client:
```shell
python -m grpc_model.src.model_client
```
Pending a successful local client test, you can proceed knowing your model container runs as expected.
Congratulations! You have now successfully containerized your Python model. To deploy your new model container to Modzy, follow the Import Container guide.
For reference, view these example implementations of the Python gRPC template:
- Basic Image Classification
- TensorFlow Object Detection
- Scikit-learn Tabular Data Classification
- Sentiment Analysis
Follow along in this video tutorial to learn more.