OpenAPI Support


Last updated 7 months ago


MLServer follows the Open Inference Protocol (previously known as the "V2 Protocol"). You can find the full OpenAPI spec for the protocol in the table below:

| Name | Description | OpenAPI Spec |
| --- | --- | --- |
| Open Inference Protocol | Main dataplane for inference, health and metadata | `dataplane.json` |
| Model Repository Extension | Extension to the protocol which provides a control plane to load / unload models dynamically | `model_repository.json` |

Swagger UI

On top of the OpenAPI spec above, MLServer also autogenerates a Swagger UI which can be used to interact dynamically with the Open Inference Protocol.

The autogenerated Swagger UI can be accessed under the /v2/docs endpoint.

Besides the Swagger UI, you can also access the raw OpenAPI spec through the /v2/docs/dataplane.json endpoint.
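As a quick sketch, assuming MLServer is running locally on its default HTTP port (8080), these endpoints resolve to the following URLs:

```python
# Sketch: where MLServer exposes its autogenerated docs, assuming a server
# running locally on MLServer's default HTTP port (8080).
BASE_URL = "http://localhost:8080"

SWAGGER_UI_URL = f"{BASE_URL}/v2/docs"               # interactive Swagger UI
RAW_SPEC_URL = f"{BASE_URL}/v2/docs/dataplane.json"  # raw OpenAPI spec

# The raw spec can then be fetched and inspected, e.g. with requests:
#   import requests
#   spec = requests.get(RAW_SPEC_URL).json()

print(SWAGGER_UI_URL)
print(RAW_SPEC_URL)
```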

Model Swagger UI

The model-specific autogenerated Swagger UI can be accessed under the following endpoints:

  • /v2/models/{model_name}/docs

  • /v2/models/{model_name}/versions/{model_version}/docs

Besides the Swagger UI, you can also access the model-specific raw OpenAPI spec through the following endpoints:

  • /v2/models/{model_name}/docs/dataplane.json

  • /v2/models/{model_name}/versions/{model_version}/docs/dataplane.json

Alongside the general API documentation, MLServer will also autogenerate a Swagger UI tailored to individual models, showing the endpoints available for each one.