# Ollama

[Ollama](https://ollama.com/) is a popular tool for running AI models locally. It runs a local web server that other tools can access.

### Allowlisting Ollama Requests from Chrome Extensions

By default, Ollama only allows requests from `127.0.0.1` and `0.0.0.0`. To enable access from PixieBrix, set the `OLLAMA_ORIGINS` environment variable:

```
OLLAMA_ORIGINS=chrome-extension://*
```

For more information, refer to the Ollama documentation:

* [How can I allow additional web origins to access Ollama?](https://github.com/ollama/ollama/blob/main/docs/faq.md#how-can-i-allow-additional-web-origins-to-access-ollama)
* [How do I configure Ollama Server?](https://github.com/ollama/ollama/blob/main/docs/faq.md#how-do-i-configure-ollama-server)
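
How you set the variable depends on how Ollama runs. As a sketch based on those FAQ entries (the exact steps may differ for your installation):

```
# macOS (Ollama app): set the variable, then restart the app
launchctl setenv OLLAMA_ORIGINS "chrome-extension://*"

# Linux (systemd service): run `systemctl edit ollama.service` and add:
#   [Service]
#   Environment="OLLAMA_ORIGINS=chrome-extension://*"
# then reload and restart:
systemctl daemon-reload && systemctl restart ollama
```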

### Making Ollama Requests

Ollama exposes its model API via a local HTTP server. Therefore, you can use PixieBrix's standard request bricks:

* `HTTP Request`
* `[Experimental] Streaming HTTP Request`

For a full list of available methods, refer to the [Ollama REST API documentation](https://github.com/ollama/ollama/blob/main/docs/api.md).
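
For example, you can test a `generate` call from the command line before wiring it into a brick (assuming the default port `11434` and that a model such as `llama3.2` has already been pulled):

```
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Why is the sky blue?"
}'
```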

#### Response Streaming

By default, Ollama's `/api/generate` and `/api/chat` endpoints stream the response. To disable streaming, include `stream: false` in the request data.
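
For example, the request data for a non-streaming `/api/chat` call might look like this (the model name is illustrative):

```
{
  "model": "llama3.2",
  "messages": [
    { "role": "user", "content": "Summarize this page in one sentence." }
  ],
  "stream": false
}
```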

### Frequently Asked Questions

#### I receive a `403 Forbidden` error when trying to access the Ollama server from PixieBrix

Ensure the `OLLAMA_ORIGINS` variable is set and that the Ollama server was restarted afterward so it picks up the new value. See [Allowlisting Ollama Requests from Chrome Extensions](#allowlisting-ollama-requests-from-chrome-extensions).
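
To check whether the value was picked up, you can simulate an extension request from the command line (a sketch, assuming the default port `11434`; the extension ID is a placeholder). A `200 OK` response indicates the origin is allowed, while a `403 Forbidden` indicates it is not:

```
curl -i http://localhost:11434/api/tags \
  -H "Origin: chrome-extension://<your-extension-id>"
```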
