Ollama

Integrating with a local Ollama server for on-device AI

Ollama is a popular tool for running AI models locally. It runs a local web server that other tools can access.

Allowlisting Ollama Requests from Chrome Extensions

By default, Ollama only allows requests from 127.0.0.1 and 0.0.0.0. To enable access from PixieBrix, set the OLLAMA_ORIGINS environment variable:

OLLAMA_ORIGINS=chrome-extension://*
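
How the variable is set depends on your platform. As a sketch, assuming Ollama runs as the standard macOS app or as a Linux systemd service, one way to apply the setting is shown below; restart Ollama afterwards so it picks up the change.

# macOS: set the variable for the login session, then restart the Ollama app
launchctl setenv OLLAMA_ORIGINS "chrome-extension://*"

# Linux (systemd): run `systemctl edit ollama.service` and add, under [Service]:
#   Environment="OLLAMA_ORIGINS=chrome-extension://*"
# then reload and restart the service
sudo systemctl daemon-reload
sudo systemctl restart ollama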

For more information, refer to the Ollama documentation.

Making Ollama Requests

Ollama exposes its models through a local HTTP API, so you can use PixieBrix's standard request bricks:

  • HTTP Request

  • [Experimental] Streaming HTTP Request

For a full list of available endpoints, refer to the Ollama REST API documentation.
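
As a sketch of the request shape, an HTTP Request brick that asks a local model for a completion would POST to the /api/generate endpoint on Ollama's default port, roughly equivalent to this curl call (the model name llama3.2 is only an example; use a model you have pulled):

curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Why is the sky blue?"
}'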

Response Streaming

By default, Ollama's /generate and /chat methods stream the response. To disable streaming, include stream: false in the request data.
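
For example, a non-streaming chat request to the local server might look like the following (again a sketch, with an example model name):

curl http://localhost:11434/api/chat -d '{
  "model": "llama3.2",
  "messages": [{ "role": "user", "content": "Summarize this page in one sentence." }],
  "stream": false
}'

With stream: false, Ollama returns a single JSON object rather than a sequence of newline-delimited chunks, which is usually what you want with the non-streaming HTTP Request brick.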

Frequently Asked Questions

I receive a 403 Forbidden error when trying to access the Ollama server from PixieBrix

Ensure the OLLAMA_ORIGINS variable is set and that the Ollama server has been restarted so it picks up the variable. See Allowlisting Ollama Requests from Chrome Extensions above.
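
One way to check whether the allowlist is in effect is to send a request with an Origin header that mimics the extension (the extension ID below is a placeholder); a 403 response suggests the variable has not been applied:

curl -i http://localhost:11434/api/tags \
  -H "Origin: chrome-extension://placeholder-extension-id"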
