Single Model Inference

This endpoint allows you to do inference on a single model that you have deployed.

Written by Ted Tigerschiöld

Do inference on one of your models.

Resource URL

https://api.app.labelf.ai/v2/models/{model_id}/inference

Resource information

Response formats: JSON
Requires authentication? Yes, Bearer token

Path parameters

model_id (required, int): The model id for your model.

JSON body parameters

texts (required, list of strings): The texts to run inference on.
Example: ["Breakfast was not tasty", "breakfast was very good"]

max_predictions (not required, int): The number of predictions you want back. For example, setting this to three will yield the three class predictions with the highest probability. If you skip this parameter, you will get predictions for all classes.
Example: 3

label_filter (not required, list of strings): If you only want predictions for certain classes and want to ignore all others, list those classes here. Partial matches are also included, so "po" will yield the class "positive". Leaving this out will yield predictions for all classes.
Example: ["positive", "negative"]

Example Request

POST /v2/models/{model_id}/inference HTTP/1.1
Host: api.app.labelf.ai
Authorization: Bearer YOUR_BEARER_TOKEN
Content-Type: application/json;charset=UTF-8

{
  "texts": ["Breakfast was not tasty"],
  "max_predictions": 2
}

Example Curl Request

curl --location --request POST 'https://api.app.labelf.ai/v2/models/{model_id}/inference' \
--header 'Authorization: Bearer YOUR_BEARER_TOKEN' \
--header 'Content-Type: application/json' \
--data-raw '{
  "texts": ["Breakfast was not tasty"],
  "max_predictions": 2
}'

Example Response

HTTP/1.1 200 OK
Status: 200 OK
Content-Type: application/json; charset=utf-8
...

[
  {
    "text": "Breakfast was not tasty",
    "predictions": [
      {
        "label": "positive",
        "score": 0.93
      },
      {
        "label": "neutral",
        "score": 0.03
      }
    ]
  }
]
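
The response is a list with one entry per input text, each holding a list of label/score predictions. Continuing the Python sketch above, one way you might pick out the top prediction for each text (a hypothetical example, not part of the API itself):

results = response.json()

for item in results:
    # take the highest-scoring prediction for each input text
    top = max(item["predictions"], key=lambda p: p["score"])
    print(f'{item["text"]!r} -> {top["label"]} ({top["score"]:.2f})')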