Inference
An inference request is made with an HTTP POST to an inference endpoint. The model name and (optionally) the model version must be provided in the URL. If a version is not provided, the server may choose a version based on its own policies or return an error.
POST /v2/models/{model_name}/versions/{model_version}/infer

Path parameters
model_name (string, required)
model_version (string, required)

Body
id (string, optional)

Responses
200 OK (application/json)
400 Bad Request (application/json)
404 Not Found (application/json)
500 Internal Server Error (application/json)
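A minimal sketch of a versioned inference request is shown below. The host, port, model name, version, and the "inputs" field are assumptions for illustration; only the optional "id" field is documented in this section, while the tensor inputs follow the wider V2 inference protocol and may differ per server.

```python
import requests

# Assumed: a local inference server and a hypothetical model "my_model", version "1".
url = "http://localhost:8000/v2/models/my_model/versions/1/infer"

payload = {
    "id": "request-0",  # optional request identifier (documented above)
    # Assumed from the V2 inference protocol, not documented in this section:
    "inputs": [
        {"name": "input0", "shape": [1, 4], "datatype": "FP32",
         "data": [0.1, 0.2, 0.3, 0.4]},
    ],
}

resp = requests.post(url, json=payload, timeout=10)
resp.raise_for_status()      # 400, 404, or 500 raise an HTTPError
print(resp.json())           # 200 OK returns an application/json body
```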
POST /v2/models/{model_name}/infer

Path parameters
model_name (string, required)

Body
id (string, optional)

Responses
200 OK (application/json)
400 Bad Request (application/json)
404 Not Found (application/json)
500 Internal Server Error (application/json)
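For the unversioned endpoint, the request is identical except that the URL omits the version segment; the server then selects a version per its own policy or returns an error. A minimal sketch, again assuming a local server and a hypothetical model name:

```python
import requests

# Same request shape as above; only the URL changes (no version segment).
# Host, port, model name, and the "inputs" field are assumptions.
url = "http://localhost:8000/v2/models/my_model/infer"
payload = {
    "id": "request-1",
    "inputs": [{"name": "input0", "shape": [1, 4], "datatype": "FP32",
                "data": [0.1, 0.2, 0.3, 0.4]}],
}

resp = requests.post(url, json=payload, timeout=10)
print(resp.status_code)      # 200 on success; 400, 404, or 500 otherwise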