# LM Studio

[LM Studio](https://lmstudio.ai/) is a tool for running AI models locally on your computer. LM Studio exposes a local server with LM Studio API, OpenAI-compatible, and Anthropic-compatible endpoints.

LM Studio supports popular LLM features such as streaming responses and MCP (Model Context Protocol).
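Before wiring LM Studio into PixieBrix, you can smoke-test the local server directly. The sketch below, using only the Python standard library, sends a chat request to LM Studio's OpenAI-compatible `/v1/chat/completions` endpoint on the default origin; the `"local-model"` model name is a placeholder (LM Studio serves whichever model you have loaded), and the request only succeeds if the local server is running.

```python
import json
import urllib.request

# Default LM Studio server origin; change this if you run it elsewhere.
BASE_URL = "http://127.0.0.1:1234"

# OpenAI-compatible chat completion request. "local-model" is a
# placeholder name; LM Studio uses the model currently loaded.
payload = {
    "model": "local-model",
    "messages": [{"role": "user", "content": "Hello!"}],
    "stream": False,
}

req = urllib.request.Request(
    BASE_URL + "/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

try:
    with urllib.request.urlopen(req, timeout=10) as resp:
        body = json.load(resp)
        # The response follows the OpenAI chat completion shape.
        print(body["choices"][0]["message"]["content"])
except OSError as exc:
    # Reached when LM Studio's local server is not running.
    print(f"Could not reach LM Studio at {BASE_URL}: {exc}")
```

If the server is up, this prints the model's reply; otherwise it reports the connection error, which usually means the server has not been started in LM Studio.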

#### Chat Copilot Configuration

To use LM Studio with a Chat Copilot in PixieBrix:

1. Configure a Local Integration for "HTTP No Authentication". See [Configuring Integrations](https://docs.pixiebrix.com/integrations/configuring-integrations)
   1. Base URL: the URL of your local LM Studio server. The default LM Studio server origin is <http://127.0.0.1:1234>
2. Register a Chat Copilot with the "Add AI Chat Copilot" brick
   1. LLM Integration Configuration: select the integration you configured
