Mellea programs are, in the end, just Python programs. Mellea programs can be shared via the Model Context Protocol (MCP) or via the A2A protocol, and they can also consume tools and agents that implement these protocols.

Simple MCP server running Mellea

As mentioned above, Mellea programs are ultimately Python programs, so we can wrap a simple MCP server around a program and use the server as-is. Here is an example using the FastMCP server from the MCP Python SDK.
# Import paths follow the Mellea tutorial examples.
from mcp.server.fastmcp import FastMCP

from mellea import MelleaSession
from mellea.backends import model_ids
from mellea.backends.ollama import OllamaModelBackend
from mellea.stdlib.base import ModelOutputThunk
from mellea.stdlib.requirement import Requirement, simple_validate
from mellea.stdlib.sampling import RejectionSamplingStrategy

# Create an MCP server
mcp = FastMCP("Demo")


@mcp.tool()
def write_a_poem(word_limit: int) -> str:
    """Write a poem with a word limit."""
    m = MelleaSession(OllamaModelBackend(model_ids.QWEN3_8B))
    wl_req = Requirement(
        f"Use only {word_limit} words.",
        # At most `word_limit` words satisfies the requirement.
        validation_fn=simple_validate(lambda x: len(x.split(" ")) <= word_limit),
    )

    res = m.instruct(
        "Write a poem",
        requirements=[wl_req],
        strategy=RejectionSamplingStrategy(loop_budget=4),
    )
    assert isinstance(res, ModelOutputThunk)
    return str(res.value)


if __name__ == "__main__":
    mcp.run()
This simple example shows how to build a poem-writing MCP tool with Mellea and instruct-validate-repair. Being able to speak the MCP tool protocol lets you integrate with Claude Desktop, Langflow, and other MCP clients. See the full code in mcp_example.py.

run the example stand-alone

You need to install the mcp package:
Bash
uv pip install "mcp[cli]"
and run the example in the MCP debug UI:
Bash
uv run mcp dev docs/examples/tutorial/mcp_example.py
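
Besides the debug UI, you can also exercise the tool programmatically. Here is a minimal sketch of an MCP stdio client using ClientSession and stdio_client from the mcp package; the command and arguments that spawn the server are assumptions you may need to adapt to your checkout:
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Spawn the example server over stdio (command and path are assumptions).
server_params = StdioServerParameters(
    command="uv",
    args=["run", "python", "docs/examples/tutorial/mcp_example.py"],
)


async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Call the tool exposed by the Mellea program.
            result = await session.call_tool("write_a_poem", {"word_limit": 20})
            print(result.content)


if __name__ == "__main__":
    asyncio.run(main())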

use the poem tool via MCP in Langflow

Follow this tutorial to use the MCP tool in Langflow: https://docs.langflow.org/mcp-client#mcp-stdio-mode. The JSON to register your MCP tool is the following; make sure to insert the absolute path to the directory containing the mcp_example.py file:
JSON
{
  "mcpServers": {
    "mellea_mcp_server": {
      "command": "uv",
      "args": [
        "--directory",
        "<ABSOLUTE PATH>/mellea/docs/examples/mcp",
        "run",
        "mcp",
        "run",
        "mcp_example.py"
      ]
    }
  }
}
Connect your MCP tools in tool mode as described in the Langflow tutorial. Have fun trying your new tool in the Playground!

Running Mellea programs as an OpenAI-compatible server (Experimental)

We also provide an experimental m serve utility for serving an OpenAI-compatible chat endpoint. This allows you to write m programs that masquerade as a "model". To learn more about this functionality, run:
m serve --help

Example m serve application

When deploying programs with m serve, it is important that they follow a specific structure: the program needs to have a function called serve with the following signature:
# file: https://github.com/generative-computing/mellea/blob/main/docs/examples/agents/m_serve_example.py#L25-L29
def serve(
    input: list[ChatMessage],
    model_options: None | dict = None,
    **kwargs
)
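For illustration, here is a minimal, hypothetical serve implementation (the authoritative version is in m_serve_example.py; the Mellea import paths and the string return type below are assumptions carried over from the earlier poem example):
from mellea import MelleaSession
from mellea.backends import model_ids
from mellea.backends.ollama import OllamaModelBackend

# One session per server process, reused across requests.
m = MelleaSession(OllamaModelBackend(model_ids.QWEN3_8B))


def serve(input, model_options: None | dict = None, **kwargs):
    """Answer the latest chat message with a plain instruct call."""
    # `input` is a list of ChatMessage objects; `.content` is assumed
    # to hold the message text.
    res = m.instruct(input[-1].content, model_options=model_options)
    return str(res)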
The m serve command then takes this function and runs an OpenAI-compatible server around it. For more information on how to write an m serve-compatible program, have a look at the example file. To run the example:
m serve docs/examples/tutorial/m_serve_example.py
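
Once the server is running, any OpenAI-compatible client can talk to it. Here is a minimal sketch using the openai Python package; the base_url, port, and model name are assumptions, so check m serve --help for the actual host and port options:
from openai import OpenAI

# base_url and model name are assumptions; adjust to your `m serve` settings.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

response = client.chat.completions.create(
    model="mellea",  # hypothetical name; the m program masquerades as a "model"
    messages=[{"role": "user", "content": "Write a poem with at most 20 words."}],
)
print(response.choices[0].message.content)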