Codecs
Base64Codec
Codec that converts to / from a base64 input.
Methods
can_encode()
can_encode(payload: Any) -> bool
Evaluate whether the codec can encode (decode) the payload.
decode_input()
decode_input(request_input: RequestInput) -> List[bytes]
Decode a request input into a high-level Python type.
decode_output()
decode_output(response_output: ResponseOutput) -> List[bytes]
Decode a response output into a high-level Python type.
encode_input()
encode_input(name: str, payload: List[bytes], use_bytes: bool = True, **kwargs) -> RequestInput
Encode the given payload into a RequestInput.
encode_output()
encode_output(name: str, payload: List[bytes], use_bytes: bool = True, **kwargs) -> ResponseOutput
Encode the given payload into a response output.
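For example, a base64 round-trip through this codec could look like the sketch below (the raw_bytes input name is just an illustration):

```python
from mlserver.codecs import Base64Codec

# Plain bytes we want to ship as a base64-encoded BYTES input.
payload = [b"Python is fun"]

# Encode into a RequestInput (with use_bytes=True, the default, the data is
# sent as base64-encoded bytes).
request_input = Base64Codec.encode_input(name="raw_bytes", payload=payload)
print(request_input.datatype)  # "BYTES"

# Decoding recovers the original, un-encoded bytes.
decoded = Base64Codec.decode_input(request_input)
assert decoded == payload
```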
CodecError
Methods
add_note()
add_note(...)
Exception.add_note(note) -- add a note to the exception
with_traceback()
with_traceback(...)
Exception.with_traceback(tb) -- set self.__traceback__ to tb and return self.
DatetimeCodec
Codec that converts to / from a datetime input.
Methods
can_encode()
can_encode(payload: Any) -> bool
Evaluate whether the codec can encode (decode) the payload.
decode_input()
decode_input(request_input: RequestInput) -> List[datetime]
Decode a request input into a high-level Python type.
decode_output()
decode_output(response_output: ResponseOutput) -> List[datetime]
Decode a response output into a high-level Python type.
encode_input()
encode_input(name: str, payload: List[Union[str, datetime]], use_bytes: bool = True, **kwargs) -> RequestInput
Encode the given payload into a RequestInput.
encode_output()
encode_output(name: str, payload: List[Union[str, datetime]], use_bytes: bool = True, **kwargs) -> ResponseOutput
Encode the given payload into a response output.
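For example (the timestamps input name is just an illustration), datetime objects and ISO 8601 strings can be mixed on the way in, while decoding always yields datetime objects:

```python
from datetime import datetime

from mlserver.codecs import DatetimeCodec

payload = [datetime(2024, 1, 1, 12, 30), "2024-06-15T08:00:00"]

# Encode both values into a single request input.
request_input = DatetimeCodec.encode_input(name="timestamps", payload=payload)

# Decoding returns datetime objects for both entries, regardless of how they
# were originally passed in.
decoded = DatetimeCodec.decode_input(request_input)
print(decoded)
```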
InputCodec
The InputCodec interface lets you define type conversions of your raw input data to / from the Open Inference Protocol. Note that this codec applies at the individual input (output) level.
For request-wide transformations (e.g. dataframes), use the RequestCodec interface instead.
Methods
can_encode()
can_encode(payload: Any) -> bool
Evaluate whether the codec can encode (decode) the payload.
decode_input()
decode_input(request_input: RequestInput) -> Any
Decode a request input into a high-level Python type.
decode_output()
decode_output(response_output: ResponseOutput) -> Any
Decode a response output into a high-level Python type.
encode_input()
encode_input(name: str, payload: Any, **kwargs) -> RequestInput
Encode the given payload into a RequestInput.
encode_output()
encode_output(name: str, payload: Any, **kwargs) -> ResponseOutput
Encode the given payload into a response output.
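As a rough illustration of the interface, a custom input codec could look like the sketch below. BoolCodec and its bool_list content type are hypothetical, and the ContentType class attribute is assumed to be how the codec advertises the content type it handles; see register_input_codec further down for making such a codec discoverable.

```python
from typing import Any, List

from mlserver.codecs import InputCodec
from mlserver.types import RequestInput, ResponseOutput


class BoolCodec(InputCodec):
    """Hypothetical codec encoding a list of Python bools as a BOOL tensor."""

    ContentType = "bool_list"  # assumed: label used to resolve this codec

    @classmethod
    def can_encode(cls, payload: Any) -> bool:
        return isinstance(payload, list) and all(isinstance(v, bool) for v in payload)

    @classmethod
    def encode_input(cls, name: str, payload: List[bool], **kwargs) -> RequestInput:
        return RequestInput(name=name, shape=[len(payload), 1], datatype="BOOL", data=payload)

    @classmethod
    def decode_input(cls, request_input: RequestInput) -> List[bool]:
        return [bool(v) for v in request_input.data]

    @classmethod
    def encode_output(cls, name: str, payload: List[bool], **kwargs) -> ResponseOutput:
        return ResponseOutput(name=name, shape=[len(payload), 1], datatype="BOOL", data=payload)

    @classmethod
    def decode_output(cls, response_output: ResponseOutput) -> List[bool]:
        return [bool(v) for v in response_output.data]
```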
NumpyCodec
Decodes a request input (response output) as a NumPy array.
Methods
can_encode()
can_encode(payload: Any) -> bool
Evaluate whether the codec can encode (decode) the payload.
decode_input()
decode_input(request_input: RequestInput) -> ndarray
Decode a request input into a high-level Python type.
decode_output()
decode_output(response_output: ResponseOutput) -> ndarray
Decode a response output into a high-level Python type.
encode_input()
encode_input(name: str, payload: ndarray, **kwargs) -> RequestInput
Encode the given payload into a RequestInput.
encode_output()
encode_output(name: str, payload: ndarray, **kwargs) -> ResponseOutput
Encode the given payload into a response output.
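For instance, a NumPy array can be round-tripped through a single input head as in the sketch below (the input-0 name is just an illustration):

```python
import numpy as np

from mlserver.codecs import NumpyCodec

payload = np.array([[1, 2], [3, 4]], dtype=np.int32)

# Encode the array as one request input.
request_input = NumpyCodec.encode_input(name="input-0", payload=payload)
print(request_input.shape)     # [2, 2]
print(request_input.datatype)  # "INT32"

# Decoding restores an equivalent NumPy array.
decoded = NumpyCodec.decode_input(request_input)
assert np.array_equal(decoded, payload)
```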
NumpyRequestCodec
Decodes the first input (output) of a request (response) as a NumPy array. This codec can be useful for cases where the whole payload is a single NumPy tensor.
Methods
can_encode()
can_encode(payload: Any) -> bool
Evaluate whether the codec can encode (decode) the payload.
decode_request()
decode_request(request: InferenceRequest) -> Any
Decode an inference request into a high-level Python object.
decode_response()
decode_response(response: InferenceResponse) -> Any
Decode an inference response into a high-level Python object.
encode_request()
encode_request(payload: Any, **kwargs) -> InferenceRequest
Encode the given payload into an inference request.
encode_response()
encode_response(model_name: str, payload: Any, model_version: Optional[str] = None, **kwargs) -> InferenceResponse
Encode the given payload into an inference response.
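A minimal sketch of using it when the whole payload is one tensor:

```python
import numpy as np

from mlserver.codecs import NumpyRequestCodec

payload = np.array([1.0, 2.0, 3.0])

# The whole request consists of a single input head holding the tensor.
inference_request = NumpyRequestCodec.encode_request(payload)
print(len(inference_request.inputs))  # 1

decoded = NumpyRequestCodec.decode_request(inference_request)
assert np.array_equal(decoded, payload)
```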
PandasCodec
Decodes a request (response) into a Pandas DataFrame, assuming each input (output) head corresponds to a column of the DataFrame.
Methods
can_encode()
can_encode(payload: Any) -> bool
Evaluate whether the codec can encode (decode) the payload.
decode_request()
decode_request(request: InferenceRequest) -> DataFrame
Decode an inference request into a high-level Python object.
decode_response()
decode_response(response: InferenceResponse) -> DataFrame
Decode an inference response into a high-level Python object.
encode_outputs()
encode_outputs(payload: DataFrame, use_bytes: bool = True) -> List[ResponseOutput]
Encode the given payload into a list of response outputs.
encode_request()
encode_request(payload: DataFrame, use_bytes: bool = True, **kwargs) -> InferenceRequest
Encode the given payload into an inference request.
encode_response()
encode_response(model_name: str, payload: DataFrame, model_version: Optional[str] = None, use_bytes: bool = True, **kwargs) -> InferenceResponse
Encode the given payload into an inference response.
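For example, encoding a small DataFrame produces one input head per column (a minimal sketch):

```python
import pandas as pd

from mlserver.codecs import PandasCodec

df = pd.DataFrame({"age": [21, 34], "name": ["Alice", "Bob"]})

# Each column becomes one input head of the inference request.
inference_request = PandasCodec.encode_request(df)
print([inp.name for inp in inference_request.inputs])  # ["age", "name"]

# Decoding rebuilds an equivalent DataFrame from those input heads.
decoded = PandasCodec.decode_request(inference_request)
print(decoded)
```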
RequestCodec
The RequestCodec interface lets you define request-level conversions between high-level Python types and the Open Inference Protocol. This can be useful where the encoding of your payload encompasses multiple input heads (e.g. dataframes, where each column can be thought of as a separate input head).
For individual input-level encoding / decoding, use the InputCodec interface instead.
Methods
can_encode()
can_encode(payload: Any) -> bool
Evaluate whether the codec can encode (decode) the payload.
decode_request()
decode_request(request: InferenceRequest) -> Any
Decode an inference request into a high-level Python object.
decode_response()
decode_response(response: InferenceResponse) -> Any
Decode an inference response into a high-level Python object.
encode_request()
encode_request(payload: Any, **kwargs) -> InferenceRequest
Encode the given payload into an inference request.
encode_response()
encode_response(model_name: str, payload: Any, model_version: Optional[str] = None, **kwargs) -> InferenceResponse
Encode the given payload into an inference response.
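As a rough illustration, a request-level codec can delegate each head to an input-level codec. The DictOfArraysCodec below is hypothetical (it maps a dict of NumPy arrays to one input head per key, reusing NumpyCodec), and its dict_of_arrays content type is an assumption:

```python
from typing import Any, Dict, Optional

import numpy as np

from mlserver.codecs import NumpyCodec, RequestCodec
from mlserver.types import InferenceRequest, InferenceResponse


class DictOfArraysCodec(RequestCodec):
    """Hypothetical request codec: one input head per dictionary key."""

    ContentType = "dict_of_arrays"  # assumed: label used to resolve this codec

    @classmethod
    def can_encode(cls, payload: Any) -> bool:
        return isinstance(payload, dict) and all(
            isinstance(v, np.ndarray) for v in payload.values()
        )

    @classmethod
    def encode_request(cls, payload: Dict[str, np.ndarray], **kwargs) -> InferenceRequest:
        return InferenceRequest(
            inputs=[NumpyCodec.encode_input(name, arr) for name, arr in payload.items()]
        )

    @classmethod
    def decode_request(cls, request: InferenceRequest) -> Dict[str, np.ndarray]:
        return {inp.name: NumpyCodec.decode_input(inp) for inp in request.inputs}

    @classmethod
    def encode_response(
        cls,
        model_name: str,
        payload: Dict[str, np.ndarray],
        model_version: Optional[str] = None,
        **kwargs,
    ) -> InferenceResponse:
        return InferenceResponse(
            model_name=model_name,
            model_version=model_version,
            outputs=[NumpyCodec.encode_output(name, arr) for name, arr in payload.items()],
        )

    @classmethod
    def decode_response(cls, response: InferenceResponse) -> Dict[str, np.ndarray]:
        return {out.name: NumpyCodec.decode_output(out) for out in response.outputs}
```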
StringCodec
Encodes a list of Python strings as a BYTES input (output).
Methods
can_encode()
can_encode(payload: Any) -> bool
Evaluate whether the codec can encode (decode) the payload.
decode_input()
decode_input(request_input: RequestInput) -> List[str]
Decode a request input into a high-level Python type.
decode_output()
decode_output(response_output: ResponseOutput) -> List[str]
Decode a response output into a high-level Python type.
encode_input()
encode_input(name: str, payload: List[str], use_bytes: bool = True, **kwargs) -> RequestInput
Encode the given payload into a RequestInput.
encode_output()
encode_output(name: str, payload: List[str], use_bytes: bool = True, **kwargs) -> ResponseOutput
Encode the given payload into a response output.
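For example (the greetings input name is just an illustration):

```python
from mlserver.codecs import StringCodec

payload = ["hello world", "hola mundo"]

# With use_bytes=True (the default), the strings are shipped as UTF-8 bytes
# in a BYTES input.
request_input = StringCodec.encode_input(name="greetings", payload=payload)
print(request_input.datatype)  # "BYTES"

decoded = StringCodec.decode_input(request_input)
assert decoded == payload
```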
StringRequestCodec
Decodes the first input (output) of a request (response) as a list of strings. This codec can be useful for cases where the whole payload is a single list of strings.
Methods
can_encode()
can_encode(payload: Any) -> bool
Evaluate whether the codec can encode (decode) the payload.
decode_request()
decode_request(request: InferenceRequest) -> Any
Decode an inference request into a high-level Python object.
decode_response()
decode_response(response: InferenceResponse) -> Any
Decode an inference response into a high-level Python object.
encode_request()
encode_request(payload: Any, **kwargs) -> InferenceRequest
Encode the given payload into an inference request.
encode_response()
encode_response(model_name: str, payload: Any, model_version: Optional[str] = None, **kwargs) -> InferenceResponse
Encode the given payload into an inference response.
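A minimal sketch of using it when the whole payload is a single list of strings:

```python
from mlserver.codecs import StringRequestCodec

payload = ["what a nice day", "what a nice codec"]

# The list of strings becomes the first (and only) input head of the request.
inference_request = StringRequestCodec.encode_request(payload)

decoded = StringRequestCodec.decode_request(inference_request)
assert decoded == payload
```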
decode_args()
decode_args(predict: Callable) -> Callable[[MLModel, InferenceRequest], Coroutine[Any, Any, InferenceResponse]]
Decorator that wraps a model's predict method so that its type-hinted arguments are decoded from the matching request inputs and its return value is encoded into an inference response.
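For instance, a custom model can rely on decode_args to receive already-decoded inputs (MyCustomModel, questions and embedding are illustrative names; each argument is matched by name against the request inputs):

```python
from typing import List

import numpy as np

from mlserver import MLModel
from mlserver.codecs import decode_args


class MyCustomModel(MLModel):
    # Each type-hinted argument is decoded from the request input with the
    # same name; the returned array is encoded back into the response.
    @decode_args
    async def predict(self, questions: List[str], embedding: np.ndarray) -> np.ndarray:
        return embedding
```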
decode_inference_request()
decode_inference_request(inference_request: InferenceRequest, model_settings: Optional[ModelSettings] = None, metadata_inputs: Dict[str, MetadataTensor] = {}) -> Optional[Any]
Decode a full inference request, using the request- or input-level codecs resolved from the model settings and input metadata.
decode_request_input()
decode_request_input(request_input: RequestInput, metadata_inputs: Dict[str, MetadataTensor] = {}) -> Optional[Any]
Decode a single request input, using the input codec resolved from its content type (or the matching input metadata).
encode_inference_response()
encode_inference_response(payload: Any, model_settings: ModelSettings) -> Optional[InferenceResponse]
Encode an arbitrary payload into an inference response, using the request codec resolved from the model settings.
encode_response_output()
encode_response_output(payload: Any, request_output: RequestOutput, metadata_outputs: Dict[str, MetadataTensor] = {}) -> Optional[ResponseOutput]
Encode a payload into a single response output, using the codec resolved from the requested output (or the matching output metadata).
get_decoded()
get_decoded(parametrised_obj: Union[InferenceRequest, RequestInput, RequestOutput, ResponseOutput, InferenceResponse]) -> Any
Return the decoded payload cached on the given request / response object.
get_decoded_or_raw()
get_decoded_or_raw(parametrised_obj: Union[InferenceRequest, RequestInput, RequestOutput, ResponseOutput, InferenceResponse]) -> Any
Return the decoded payload if one has been cached, falling back to the raw payload otherwise.
has_decoded()
has_decoded(parametrised_obj: Union[InferenceRequest, RequestInput, RequestOutput, ResponseOutput, InferenceResponse]) -> bool
Check whether a decoded payload has been cached on the given request / response object.
register_input_codec()
register_input_codec(CodecKlass: Union[type[InputCodec], InputCodec])
Register an InputCodec so that it can be resolved by its content type (usable as a class decorator).
register_request_codec()
register_request_codec(CodecKlass: Union[type[RequestCodec], RequestCodec])
Register a RequestCodec so that it can be resolved by its content type (usable as a class decorator).
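Registering a codec makes it resolvable by its content type. A minimal sketch, assuming the hypothetical BoolCodec from the InputCodec section above:

```python
from mlserver.codecs import register_input_codec

# BoolCodec is the hypothetical codec sketched in the InputCodec section above.
register_input_codec(BoolCodec)

# Alternatively, the same functions can be applied as class decorators at
# definition time:
#
# @register_input_codec
# class BoolCodec(InputCodec):
#     ...
```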