Utilities for detecting and classifying the target inference server. Defines the
_ServerType enum (LOCALHOST, OPENAI, REMOTE_VLLM,
UNKNOWN) and _server_type, which classifies a URL by hostname. Also provides
is_vllm_server_with_structured_output, which probes a server’s /version
endpoint to determine whether it supports the structured_outputs parameter
introduced in vLLM ≥ 0.12.0. Used by the OpenAI-compatible backend to choose between
guided_json and structured_outputs request formats.
Functions
FUNC is_vllm_server_with_structured_output
Parameters:
- base_url: Base URL for the LLM API.
- headers: Additional headers to pass to the request.

Returns:
- True if the server is vLLM >= v0.12.0, False otherwise.