Ollama is an open-source tool for running large language models locally on your own hardware. With BonData, you can connect to a self-hosted Ollama instance to use local models in your Agent workflows — giving you full control over your data and infrastructure with no third-party API costs.

Ollama Connection Setup

Follow these steps to connect BonData to your Ollama server.

Step 1: Ensure Ollama is Accessible

Your Ollama server must be reachable from the internet. By default, Ollama runs on port 11434.
  1. Verify your server is running and accessible at its public URL
  2. Test the connection by visiting https://your-server.com:11434/api/tags — you should see a JSON response listing available models
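The check in step 2 can also be scripted. A minimal sketch in Python, using only the standard library (the host URL below is a placeholder for your own server):

```python
import json
from urllib.request import urlopen

def tags_url(host_url: str) -> str:
    # Build the /api/tags endpoint from the host URL (trailing slash trimmed)
    return f"{host_url.rstrip('/')}/api/tags"

def parse_models(payload: dict) -> list[str]:
    # /api/tags responds with {"models": [{"name": "llama3:latest", ...}, ...]}
    return [m["name"] for m in payload.get("models", [])]

def list_ollama_models(host_url: str) -> list[str]:
    # Fetch and parse the model list from a reachable Ollama server
    with urlopen(tags_url(host_url), timeout=10) as resp:
        return parse_models(json.load(resp))

# Example (requires a reachable server):
# print(list_ollama_models("https://ollama.your-server.com:11434"))
```

If this returns an empty list, the server is reachable but has no models pulled yet; run `ollama pull <model>` on the server first.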

Step 2: Enter Connection Details

Enter your Ollama server URL in BonData (e.g., https://ollama.your-server.com:11434).
Your Ollama server must have a public URL reachable from the internet. Local-only instances (e.g., localhost) will not work.
No API key is required — Ollama has no built-in API key authentication, so access is controlled entirely by who can reach the server URL.

Required Fields

Field    | Description
Host URL | Public URL of your Ollama server (e.g., https://ollama.your-server.com:11434)
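Since the Host URL is the only detail involved, you can verify it end to end by calling Ollama's standard /api/generate endpoint directly. A minimal non-streaming sketch in Python (the server URL and model name are placeholders; the model must already be pulled on the server):

```python
import json
from urllib.request import Request, urlopen

def generate_body(model: str, prompt: str) -> bytes:
    # Request body for Ollama's /api/generate; stream=False returns one JSON object
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def ollama_generate(host_url: str, model: str, prompt: str) -> str:
    # POST a single completion request and return the generated text
    req = Request(
        f"{host_url.rstrip('/')}/api/generate",
        data=generate_body(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urlopen(req, timeout=120) as resp:
        return json.load(resp)["response"]

# Example (requires a reachable server with the model pulled):
# print(ollama_generate("https://ollama.your-server.com:11434", "llama3", "Say hello"))
```

If this call succeeds from a machine outside your network, the same Host URL will work in BonData.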