ollama-mcp-bridge
Updated 15 days ago by patruff on GitHub
Bridge between Ollama and MCP servers, enabling local LLMs to use Model Context Protocol tools
What is ollama-mcp-bridge
Ollama-mcp-bridge is a tool that connects large language models (LLMs) served by Ollama with Model Context Protocol (MCP) servers, enabling local models to use MCP tools. It acts as a bridge, translating requests into a format Ollama can understand and relaying the responses back to the MCP side. This lets users combine MCP tooling with local LLMs without directly managing or integrating against Ollama's API themselves. Essentially, it makes Ollama accessible as an LLM backend within the MCP ecosystem.
How to use
The bridge operates over HTTP. MCP-side clients send requests to the bridge's endpoint (typically `/ollama/api/generate`), which translates each request into Ollama's API format (specifically the `/api/generate` endpoint) and relays Ollama's response back. A simple configuration file (`config.yaml`) allows setting the Ollama server address and other parameters; environment variables can be used for configuration as well.
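The exact schema of `config.yaml` is not spelled out here, so the following is only a sketch; the key names (`ollama_url`, `port`, `model`) are assumptions for illustration, not documented values:

```yaml
# Hypothetical config.yaml -- key names are illustrative,
# not confirmed by the project documentation.
ollama_url: "http://localhost:11434"  # address of the Ollama server
port: 8080                            # port the bridge listens on
model: "llama3"                       # default model for requests that omit one
```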
The bridge expects JSON request payloads, including fields such as `prompt` to specify the text-generation request.
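For illustration, here is a minimal Python sketch of such a request, assuming the bridge runs locally on port 8080 (the host, port, and model name are placeholders, not documented values):

```python
import requests

# Post a generation request to the bridge; it translates this into a call
# to Ollama's /api/generate endpoint and relays the response back.
resp = requests.post(
    "http://localhost:8080/ollama/api/generate",  # assumed bridge address
    json={
        "model": "llama3",                # any model available in Ollama
        "prompt": "Why is the sky blue?",
        "stream": False,                  # request a single JSON response
    },
    timeout=60,
)
resp.raise_for_status()
# With stream=False, Ollama returns the generated text in a "response" field.
print(resp.json()["response"])
```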
Key features
- Bridge between MCP and Ollama: Enables seamless interaction between the two systems.
- Configuration via YAML or Environment Variables: Provides flexible configuration options.
- Simple HTTP Interface: Exposes an easy-to-use HTTP endpoint for requests.
- Request Translation: Handles the conversion of MCP requests into Ollama-compatible requests.
- Response Relaying: Passes Ollama's responses back to the MCP environment.
- Model Parameter Configuration: Allows configuring parameters such as `model`, `format`, `stream` (streaming responses), and `options` (extra parameters for the model); see the sketch after this list.
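As a sketch, a JSON payload exercising these parameters might look like the following. The field names mirror Ollama's `/api/generate` API; the model name and option values are placeholders:

```json
{
  "model": "llama3",
  "prompt": "Summarize the Model Context Protocol in one sentence.",
  "format": "json",
  "stream": true,
  "options": {
    "temperature": 0.7,
    "num_predict": 256
  }
}
```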
Use cases
- Integrating LLMs into MCP-based applications: Allows MCP applications to use LLMs for tasks such as text generation, translation, and question answering.
- Centralized LLM access within MCP environments: Provides a single point of access to Ollama-served LLMs for all MCP components.
- Simplifying LLM deployment and management: Abstracts away the complexities of directly managing Ollama instances.
FAQ
- What is MCP (Model Context Protocol)?
  MCP is an open protocol that standardizes how applications provide context and tools to large language models; this bridge lets models served by Ollama use those tools.
- What kind of models can I use with this bridge?
  Any model that is available in and compatible with Ollama can be used through the bridge.
- How do I configure the bridge?
  You can configure the bridge using a `config.yaml` file or through environment variables. The configuration includes settings such as the address of the Ollama server.
- Can I stream responses from the LLM?
  Yes, the bridge supports streaming responses from Ollama.
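  A minimal Python sketch of consuming a streamed response, assuming the bridge relays Ollama's newline-delimited JSON chunks unchanged (the bridge address is a placeholder):

```python
import json
import requests

# Stream tokens as they are generated; Ollama emits one JSON object per line.
with requests.post(
    "http://localhost:8080/ollama/api/generate",  # assumed bridge address
    json={"model": "llama3", "prompt": "Tell a short story.", "stream": True},
    stream=True,
    timeout=300,
) as resp:
    resp.raise_for_status()
    for line in resp.iter_lines():
        if not line:
            continue
        chunk = json.loads(line)
        print(chunk.get("response", ""), end="", flush=True)
        if chunk.get("done"):  # the final chunk carries done=true
            break
```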
- What format should my requests be in?
  Requests should be JSON and must include, at minimum, a `prompt` field telling the LLM what to generate.