converter.utils.upload_model_dir()
converter.utils.upload_model_dir(model_dir, container, model_key, storage_key, storage_secret, storage_provider, platform)
Uploads the model artifacts required to execute a successful model converter job to a cloud storage bucket of your choice. Applies only to Azure ML and MLflow models.
Parameters
Parameter | Type | Description | Example |
---|---|---|---|
model_dir | str | Path to saved model directory | "./path/to/my/mlflow/model/directory/" |
container | str | Storage provider container name | "my-bucket-name" |
model_key | str | Desired key for the model archive once uploaded to the storage provider. The container and model key together identify the full path of the uploaded file (e.g., "my-bucket-name/path/to/model.tar.gz"). | "path/to/model.tar.gz" |
storage_key | str | Storage provider access key. For an AWS S3 bucket, this corresponds to your Access key ID; for Azure Blob Storage, your Storage account name; for Google Cloud Storage, your Client email (see the sketch following this table). | |
storage_secret | str | Storage provider secret key. For an AWS S3 bucket, this corresponds to your Secret access key; for Azure Blob Storage, your Access key; for Google Cloud Storage, your Private key. | |
storage_provider | str | Storage provider name (must be one of "S3", "AZURE_BLOBS", or "GOOGLE_STORAGE"). | "AZURE_BLOBS" |
platform | str | Training platform used to develop the model; must be either "azure" or "mlflow". | "mlflow" |
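Because the meaning of storage_key and storage_secret differs by provider, it can help to resolve the correct credential pair in one place. The helper and environment variable names below are illustrative assumptions, not part of the SDK:
import os
# Hypothetical helper: map each supported provider to its credential pair.
# The environment variable names are assumptions; use whatever your setup defines.
def storage_credentials(provider):
    if provider == "S3":
        # AWS: Access key ID / Secret access key
        return os.getenv("AWS_ACCESS_KEY_ID"), os.getenv("AWS_SECRET_ACCESS_KEY")
    if provider == "AZURE_BLOBS":
        # Azure: Storage account name / Access key
        return os.getenv("AZURE_STORAGE_ACCOUNT"), os.getenv("AZURE_STORAGE_KEY")
    if provider == "GOOGLE_STORAGE":
        # GCS: Client email / Private key
        return os.getenv("GCP_CLIENT_EMAIL"), os.getenv("GCP_PRIVATE_KEY")
    raise ValueError("Unsupported storage provider: " + provider)

storage_key, storage_secret = storage_credentials("S3")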
Returns
None
Examples
# Import dependencies
import os
from modzy.converter.utils import upload_model_dir
blob_storage_provider = "S3"
blob_storage_container = "my-s3-bucket-name"
mlflow_model_dir = "local/path/to/mlflow-model/"
mlflow_model_key = "path/to/model.tar.gz"
upload_model_dir(
mlflow_model_dir,
blob_storage_container,
mlflow_model_key,
os.getenv("SP_ACCESS_KEY_ID"),
os.getenv("SP_SECRET_ACCESS_KEY"),
blob_storage_provider,
"mlflow"
)
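The same call works for an Azure ML model; only the model directory, key, and platform argument change. The paths below are placeholders:
# Upload an Azure ML model directory using the same storage settings
azure_model_dir = "local/path/to/azureml-model/"
azure_model_key = "path/to/azure-model.tar.gz"
upload_model_dir(
    azure_model_dir,
    blob_storage_container,
    azure_model_key,
    os.getenv("SP_ACCESS_KEY_ID"),
    os.getenv("SP_SECRET_ACCESS_KEY"),
    blob_storage_provider,
    "azure"
)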