Inference Secret Resource Reference
The inference secret resource configures a secret, such as an OpenAI API key or
a LiteLLM master key, that is used together with an inference_model resource.
Multiple inference models can use the same secret. Inference secret resources
can be listed and retrieved, but their values are stripped from the returned
payload so that they cannot leak outside the system.
kind: inference_secret
version: v1
metadata:
  name: example-openai-key
spec:
  # value is the secret value, e.g. an OpenAI key.
  value: "************************************"
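For illustration, an inference_model resource would typically reference this secret by its metadata name. The sketch below is an assumption for illustration only: the field names shown (model, secret) are hypothetical and may differ from the actual inference_model schema.

kind: inference_model
version: v1
metadata:
  name: example-gpt-4o
spec:
  # Hypothetical fields: the model identifier and a reference to the
  # inference secret above by its metadata name.
  model: gpt-4o
  secret: example-openai-key

If models reference the secret by name as sketched here, rotating a key would only require updating the single inference_secret resource rather than every model that uses it.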