Inference

Model Inference

POST /v2/models/{model_name}/versions/{model_version}/infer

An inference request is made with an HTTP POST to this endpoint. Both the model name and the model version must be given in the URL, so the request is served by that specific version of the model.

Path parameters
model_name (string, Required)
model_version (string, Required)
Body
id (string, Optional)
Responses
200 OK (application/json)
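A minimal client sketch for this endpoint is shown below. It assumes a server listening on http://localhost:8000 and a model named mymodel with version 1; the "inputs" layout follows the V2/Open Inference Protocol request format, and the tensor name, shape, datatype, and data are placeholder values.

```python
# Sketch of a versioned inference request; host, model name, version, and
# input tensor values are placeholder assumptions, not part of the spec above.
import requests

payload = {
    "id": "request-42",            # optional request identifier (Body: id)
    "inputs": [
        {
            "name": "input-0",     # hypothetical input tensor name
            "shape": [2, 2],
            "datatype": "FP32",
            "data": [1.0, 2.0, 3.0, 4.0],
        }
    ],
}

resp = requests.post(
    "http://localhost:8000/v2/models/mymodel/versions/1/infer",
    json=payload,
    timeout=10,
)
resp.raise_for_status()
print(resp.json())                 # 200 OK returns an application/json body
```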

Model Inference

POST /v2/models/{model_name}/infer

An inference request is made with an HTTP POST to this endpoint. Only the model name is given in the URL; because no version is provided, the server may choose a version based on its own policies or return an error.

Path parameters
model_name (string, Required)
Body
id (string, Optional)
Responses
200 OK (application/json)
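The sketch below shows the same call without a version in the URL, leaving version selection to the server. The host, model name, and input tensor are again placeholder assumptions.

```python
# Unversioned inference request: the server resolves the model version
# itself, or returns an error if it cannot.
import requests

resp = requests.post(
    "http://localhost:8000/v2/models/mymodel/infer",
    json={
        "inputs": [
            {"name": "input-0", "shape": [2], "datatype": "FP32", "data": [0.5, 1.5]}
        ]
    },
    timeout=10,
)
resp.raise_for_status()
# A successful call returns 200 OK with a JSON body; fields beyond the status
# (e.g. outputs, an echoed id) follow the V2 response format.
print(resp.json())
```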
