Claude 7d12b960e3 feat: add local LLM chat with WebLLM and PPQ.ai support
Implements browser-based AI chat functionality:

- WebLLM provider for local inference via WebGPU
  - Downloads and caches models in IndexedDB
  - Runs inference in a web worker to keep the UI responsive
  - Curated list of recommended models (SmolLM2, Llama 3.2, Phi 3.5, etc.)
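A worker-based setup like this implies a small message protocol between the UI thread and the inference worker. A minimal sketch of what that protocol might look like (the message shapes and names here are assumptions for illustration, not the actual implementation):

```typescript
// Hypothetical UI <-> inference-worker message protocol.
type WorkerMsg =
  | { type: "progress"; loaded: number; total: number } // model download progress
  | { type: "token"; text: string }                     // streamed completion chunk
  | { type: "done" };                                   // generation finished

// Accumulate streamed token messages into the assistant's reply,
// ignoring progress/done messages.
function reduceReply(reply: string, msg: WorkerMsg): string {
  return msg.type === "token" ? reply + msg.text : reply;
}

const msgs: WorkerMsg[] = [
  { type: "progress", loaded: 1, total: 2 },
  { type: "token", text: "Hello" },
  { type: "token", text: ", world" },
  { type: "done" },
];
const reply = msgs.reduce(reduceReply, "");
```

Keeping the protocol to plain serializable objects like this is what lets the model run off the main thread via `postMessage` without blocking the UI.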

- PPQ.ai provider for cloud-based inference
  - OpenAI-compatible API with Lightning payments
  - Dynamic model list fetching with caching
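For the cloud provider, an OpenAI-compatible API means a chat request reduces to a standard `POST /chat/completions` payload. A sketch of building such a request (the endpoint URL and header layout are assumptions; only the OpenAI-compatible body shape is given by the commit):

```typescript
// Hypothetical request builder for an OpenAI-compatible endpoint.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

function buildChatRequest(apiKey: string, model: string, messages: ChatMessage[]) {
  return {
    url: "https://api.ppq.ai/chat/completions", // assumed endpoint path
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${apiKey}`, // assumed auth scheme
      },
      body: JSON.stringify({ model, messages, stream: true }),
    },
  };
}

const req = buildChatRequest("sk-test", "some-model", [
  { role: "user", content: "hi" },
]);
const parsedBody = JSON.parse(req.init.body);
```

The payload would then be passed to `fetch(req.url, req.init)`; because the body is OpenAI-shaped, the same builder could serve any compatible backend.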

- Provider manager for coordinating multiple providers
  - Dexie tables for persisting provider instances and conversations
  - Model list caching with 1-hour TTL
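The 1-hour TTL check is simple enough to sketch directly; the entry shape and helper names below are assumptions, but the freshness logic follows from the stated TTL:

```typescript
// Sketch of the model-list cache freshness check (names are assumptions).
const MODEL_LIST_TTL_MS = 60 * 60 * 1000; // 1 hour, per the commit message

interface CachedModelList {
  fetchedAt: number; // epoch millis when the list was fetched
  models: string[];
}

// A cached list is reusable only while it is younger than the TTL.
function isFresh(entry: CachedModelList | undefined, now: number): boolean {
  return !!entry && now - entry.fetchedAt < MODEL_LIST_TTL_MS;
}

const entry: CachedModelList = { fetchedAt: 0, models: ["llama-3.2-1b"] };
const freshAt30min = isFresh(entry, 30 * 60 * 1000);
const staleAt2h = isFresh(entry, 2 * 60 * 60 * 1000);
```

On a stale or missing entry the provider would re-fetch the list and persist it (e.g. in a Dexie table) with a new `fetchedAt`.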

- AIViewer component with sidebar pattern
  - Conversation history in resizable sidebar (mobile: sheet)
  - Model selector for WebLLM with download progress
  - Streaming chat responses
  - Auto-generated conversation titles
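Auto-generated titles are typically derived from the first user message. A hypothetical version of that derivation (the max length and truncation style are assumptions, not taken from the commit):

```typescript
// Hypothetical title generator: collapse whitespace in the first user
// message and truncate to a fixed length with an ellipsis.
function deriveTitle(firstMessage: string, maxLen = 40): string {
  const oneLine = firstMessage.replace(/\s+/g, " ").trim();
  return oneLine.length <= maxLen
    ? oneLine
    : oneLine.slice(0, maxLen - 1) + "…";
}

const shortTitle = deriveTitle("How do I cache models   in IndexedDB?");
const longTitle = deriveTitle("x".repeat(100));
```

A model-generated summary title could replace this heuristic later without changing the call site.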

- New `ai` command to launch the interface

https://claude.ai/code/session_01HqtD9R33oqfB14Gu1V5wHC
2026-01-30 21:53:48 +00:00
