Inputs from Databases

Include database queries within inference requests sent to Modzy

Modzy provides a JDBC connector that makes it possible to send database query results directly to a model for inference. The examples below demonstrate how to query a PostgreSQL database and submit the returned rows to Modzy's Named Entity Recognition, English model. To use the examples below, replace the placeholder values with valid inputs:

  • Replace API_KEY with a valid API key string
  • Replace DB_USER_NAME and DB_PASSWORD with your database credentials
  • Update the other database fields, as necessary, to connect to your database
  • If necessary, replace the base URL with the URL of your instance of Modzy
POST /api/jobs HTTP/1.1
Authorization: ApiKey API_KEY
Content-Type: application/json
Accept: application/json

{
  "model": {
    "identifier": "a92fc413b5",
    "version": "0.0.12"
  },
  "explain": true,
  "input": {
    "type": "jdbc",
    "url": "jdbc:postgresql://",
    "username": DB_USER_NAME,
    "password": DB_PASSWORD,
    "driver": "org.postgresql.Driver",
    "query": "SELECT \"mailaddr\" as \"input.txt\" FROM \"user/demo_repo\".\"atl_parcel_attr\" LIMIT 10;"
  }
}
from modzy import ApiClient

# Initialize your Modzy client
client = ApiClient(base_url="", api_key="<INSERT YOUR API KEY HERE>")

# Add database connection and query information (example using PostgreSQL)
db_url = "jdbc:postgresql://"
db_username = DB_USER_NAME
db_password = DB_PASSWORD
db_driver = "org.postgresql.Driver"
# We SELECT as "input.txt" because that is the required input name for this model
db_query = "SELECT \"mailaddr\" as \"input.txt\" FROM \"user/demo_repo\".\"atl_parcel_attr\" LIMIT 10;"

# Once you are ready, submit the job to v0.0.12 of the Named Entity Recognition, English model
job = client.jobs.submit_jdbc("a92fc413b5", "0.0.12", db_url, db_username, db_password, db_driver, db_query)

# Print the response from your job submission
print(job)
const modzy = require("@modzy/modzy-sdk");

// Initialize the client
const modzyClient = new modzy.ModzyClient("", API_KEY);

// Add database connection and query information (example using PostgreSQL)
const dbUrl = "jdbc:postgresql://";
const dbUserName = DB_USER_NAME;
const dbPassword = DB_PASSWORD;
const dbDriver = "org.postgresql.Driver";
// We SELECT as "input.txt" because that is the required input name for this model
const dbQuery = "SELECT \"mailaddr\" as \"input.txt\" FROM \"user/demo_repo\".\"atl_parcel_attr\" LIMIT 10;";

// Submit all the addresses to Modzy's Named Entity Recognition model (Id: a92fc413b5)
modzyClient
  .submitJobJDBC("a92fc413b5", "0.0.12", dbUrl, dbUserName, dbPassword, dbDriver, dbQuery)
  .then((job) => {
    console.log("job: " + JSON.stringify(job));
  })
  .catch((error) => {
    console.error("Modzy job submission failed with code " + error.code + " and message " + error.message);
  });
package main

import (
    "context"
    "fmt"
    "log"
    "time"

    modzy ""
)

func main() {
    ctx := context.TODO()

    // Replace BASE_URL and API_KEY with valid values
    baseURL := ""
    apiKey := API_KEY

    // Initialize the API client
    client := modzy.NewClient(baseURL).WithAPIKey(apiKey)

    // Query a database and send the results to the Named Entity Recognition, English model
    submitResponse, err := client.Jobs().SubmitJobJDBC(ctx, &modzy.SubmitJobJDBCInput{
        ModelIdentifier:   "a92fc413b5",
        ModelVersion:      "0.0.12",
        Explain:           false,
        JDBCConnectionURL: "jdbc:postgresql://",
        DatabaseUsername:  DB_USER_NAME,
        DatabasePassword:  DB_PASSWORD,
        Query:             "SELECT \"mailaddr\" as \"input.txt\" FROM \"user/demo_repo\".\"atl_parcel_attr\" LIMIT 10;",
    })
    if err != nil {
        log.Fatalf("Modzy job submission failed: %v", err)
    }

    // Check on the status of your request once a second
    jobDetails, _ := submitResponse.WaitForCompletion(ctx, time.Second)

    // When the input is done processing, retrieve the results from Modzy and log them to your terminal
    if jobDetails.Details.Status == modzy.JobStatusCompleted {
        fmt.Printf("Job details: %+v\n", jobDetails.Details)
    }
}
When using the JDBC connector to submit multiple database entries in the same inference request, each entry will use its Row ID as a unique key.
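For example, the ten rows returned by the query above would each appear in the job results under their own Row ID key rather than under a filename. The sketch below illustrates how you might iterate such a result set; the dictionary shape here is a simplified illustration, not an exact Modzy API response:

```python
# Illustrative results for a JDBC job: one entry per database row,
# keyed by that row's Row ID (structure simplified for illustration).
results = {
    "1": {"input.txt": {"entities": ["123 Main St"]}},
    "2": {"input.txt": {"entities": ["456 Oak Ave"]}},
}

# Iterate over each database row's output using its Row ID key
for row_id, output in results.items():
    print(row_id, output["input.txt"]["entities"])
```

Because the keys are Row IDs, you can join each model output back to the database row that produced it.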


Inputs limited to 1000 per job on Modzy Basic

On Modzy Basic, inference jobs cannot include more than 1,000 inputs, so be sure to include a LIMIT N (or TOP N) clause in your query. If you have more than 1,000 inputs to process, you'll need to submit multiple inference requests, each with a batch of database entries.
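One way to batch a larger table is to page through it with LIMIT/OFFSET and submit one job per page. The helper below is a hypothetical sketch of that approach; the `batched_queries` function is not part of the Modzy SDK, and the commented-out submission call assumes the Python client shown earlier:

```python
def batched_queries(base_query, total_rows, batch_size=1000):
    """Split one large query into LIMIT/OFFSET pages of at most batch_size rows.

    base_query must be a complete SELECT with no trailing semicolon or LIMIT.
    """
    return [
        f"{base_query} LIMIT {batch_size} OFFSET {offset};"
        for offset in range(0, total_rows, batch_size)
    ]

queries = batched_queries(
    'SELECT "mailaddr" as "input.txt" FROM "user/demo_repo"."atl_parcel_attr"',
    total_rows=2500,
)
# 2,500 rows at 1,000 per job -> 3 jobs
print(len(queries))

# Each query would then be submitted as its own inference job, e.g.:
# for q in queries:
#     client.jobs.submit_jdbc("a92fc413b5", "0.0.12", db_url,
#                             db_username, db_password, db_driver, q)
```

Note that LIMIT/OFFSET paging assumes a stable row order, so add an ORDER BY clause to the base query if the table may change between batches.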
