Documentation Index
Fetch the complete documentation index at: https://docs.mellea.ai/llms.txt
Use this file to discover all available pages before exploring further.
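The llms.txt file is a markdown-style index of page links. As a minimal sketch of programmatic discovery, the snippet below fetches the index and extracts `[title](url)` pairs; the `parse_llms_txt` and `fetch_index` helper names are illustrative, not part of any mellea API, and the fetch step assumes network access:

```python
import re
import urllib.request

# Markdown link pattern: [title](url)
LINK = re.compile(r"\[([^\]]+)\]\((\S+)\)")

def parse_llms_txt(text: str) -> list[tuple[str, str]]:
    """Extract (title, url) pairs from an llms.txt markdown index."""
    return [(m.group(1), m.group(2)) for m in LINK.finditer(text)]

def fetch_index(url: str = "https://docs.mellea.ai/llms.txt") -> list[tuple[str, str]]:
    """Download the index and return its page links (requires network)."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return parse_llms_txt(resp.read().decode("utf-8"))

# Offline example on a sample index line:
sample = "- [Install](https://docs.mellea.ai/install): setup instructions"
print(parse_llms_txt(sample))  # [('Install', 'https://docs.mellea.ai/install')]
```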
Prerequisites: Python 3.11+, pip or uv available.
Install
Install extras for specific backends and features:
With pip:
pip install "mellea[litellm]"    # LiteLLM multi-provider (Anthropic, Bedrock, etc.)
pip install "mellea[hf]"         # HuggingFace transformers for local inference
pip install "mellea[watsonx]"    # IBM WatsonX
pip install "mellea[tools]"      # Tool and agent dependencies (LangChain, smolagents)
pip install "mellea[telemetry]"  # OpenTelemetry tracing and metrics
With uv:
uv add "mellea[litellm]"    # LiteLLM multi-provider (Anthropic, Bedrock, etc.)
uv add "mellea[hf]"         # HuggingFace transformers for local inference
uv add "mellea[watsonx]"    # IBM WatsonX
uv add "mellea[tools]"      # Tool and agent dependencies (LangChain, smolagents)
uv add "mellea[telemetry]"  # OpenTelemetry tracing and metrics
You can combine extras:
pip install "mellea[litellm,tools,telemetry]"
uv add "mellea[litellm,tools,telemetry]"
All extras: mellea[all] installs everything. See pyproject.toml for the full list of available extras.
Default backend: Ollama
The default session connects to Ollama running locally.
Install Ollama and pull the default model before running any examples:
ollama pull granite4:micro
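With Ollama running and the model pulled, a first script looks roughly like the sketch below. It assumes `mellea.start_session()` defaults to the local Ollama backend and that `instruct()` sends a single instruction, which matches the project's quick-start examples; it requires a running Ollama server, so treat it as a smoke test rather than a definitive usage pattern:

```python
import mellea

# Open a session against the default backend (local Ollama).
m = mellea.start_session()

# Send a simple instruction and print the model's reply.
result = m.instruct("Say hello in one short sentence.")
print(result)
```

If this fails to connect, check that `ollama serve` is running and that the default model was pulled as shown above.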