LM Studio

Documentation on using LM Studio models from PixieBrix

LM Studio is a tool for running AI models locally on your computer. LM Studio exposes a local server supporting LM Studio API, OpenAI-compatible, and Anthropic-compatible endpoints.

LM Studio supports popular LLM features such as streaming responses and MCP (Model Context Protocol).
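As a quick sketch of what "OpenAI-compatible" means here, the snippet below builds a chat completion request against the default local server origin using only the Python standard library. The model name "local-model" is a placeholder for whatever model you have loaded in LM Studio, and the helper name is ours, not part of any LM Studio SDK.

```python
import json
import urllib.request

BASE_URL = "http://127.0.0.1:1234"  # default LM Studio server origin

def build_chat_request(base_url, model, prompt):
    """Build an OpenAI-compatible chat completion request for a local LM Studio server."""
    url = f"{base_url}/v1/chat/completions"
    payload = {
        "model": model,  # placeholder; use the model loaded in LM Studio
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

# Sending the request requires LM Studio's server to be running:
# with urllib.request.urlopen(build_chat_request(BASE_URL, "local-model", "Hi")) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```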

Chat Copilot Configuration

To use LM Studio with a Chat Copilot in PixieBrix:

  1. Configure a Local Integration for "HTTP No Authentication". See Configuring Integrations.

    1. Base URL: provide the URL of your local LM Studio server. The default LM Studio server origin is http://127.0.0.1:1234

  2. Register a Chat Copilot with the "Add AI Chat Copilot" brick

    1. LLM Integration Configuration: select the integration you configured
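Before wiring up the Chat Copilot, it can help to confirm the Base URL from step 1 actually reaches your LM Studio server. A minimal sketch, assuming the default origin and the OpenAI-compatible model-listing endpoint (`models_url` is our helper name, not an LM Studio API):

```python
import json
import urllib.request

def models_url(base_url):
    """Normalize the Base URL and point at the OpenAI-compatible model listing."""
    return base_url.rstrip("/") + "/v1/models"

def list_models(base_url="http://127.0.0.1:1234"):
    """Return the IDs of models the local LM Studio server has available."""
    with urllib.request.urlopen(models_url(base_url)) as resp:
        return [m["id"] for m in json.load(resp)["data"]]
```

If `list_models()` raises a connection error, check that the LM Studio server is started and that the origin matches the Base URL you entered in the integration.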
