Deploying Models
There are several ways to deploy your model into your Modzy instance. As the model developer, you decide how to package and import your model based on the framework or language you are working with.
Figure 1. Model Deployment Home Page
Deploy Now
No-Code Model Import Options:
Deploy an OMI-Compliant Container
- Import a Docker container you have already built that meets the Open Model Interface (OMI) specification
Automatically Build and Deploy Model Container
- PyTorch
- Scikit-learn
- XGBoost
- LightGBM
- Fastai
- MXNet
- ONNX
- PMML
- General Chassis Template - if you do not see your model framework in the list above, it is likely still supported. Follow this general template to automatically build and containerize your model. (Note: TensorFlow/Keras is currently the only known framework not supported by Chassis.)
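The automated build options above wrap a user-supplied inference function into a container. Below is a minimal sketch of what such a function can look like: it takes raw input bytes and returns a JSON-serializable result. The function name `process`, the toy length-based "classifier", and the output shape are all illustrative assumptions, not the exact interface required by Chassis; in practice you would load your trained model (scikit-learn, XGBoost, etc.) once at startup and call it inside the function.

```python
import json

# Hypothetical sketch: the kind of single-input inference function an
# automated container builder wraps. The "model" here is a trivial
# stand-in; replace the labeled step with your own model.predict(...).

def process(input_bytes: bytes) -> dict:
    """Accept raw input bytes, run inference, return a JSON-serializable result."""
    text = input_bytes.decode("utf-8")
    # Stand-in inference step (assumption): classify by input length.
    label = "long" if len(text) > 10 else "short"
    return {
        "data": {
            "result": {
                "classPredictions": [{"class": label, "score": 1.0}]
            }
        }
    }

# Example call with raw bytes, as a build tool or model runner might do:
result = process(b"sample input text")
print(json.dumps(result))
```

Keeping the function a pure bytes-in, dict-out transformation makes it easy to test locally before handing it to any build tooling.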
Manually Create OMI-Compliant Container
Set Up a CI/CD Pipeline