
LuCI LocalAI Dashboard

LuCI interface for managing a local LLM inference server with an OpenAI-compatible API.

Installation

opkg install luci-app-localai
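
A slightly fuller install sequence, as a sketch: refreshing the package index first and restarting rpcd so the new backend is registered is common practice for LuCI apps; the exact steps may differ on your image.

```shell
# Refresh the package index, install the app, then restart rpcd so the
# luci.localai backend (see RPCD Methods) is registered.
opkg update
opkg install luci-app-localai
/etc/init.d/rpcd restart
```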

Access

LuCI menu: Services -> LocalAI

Tabs

  • Dashboard -- Service health, loaded models, API endpoint status
  • Models -- Install, remove, and manage LLM models
  • Chat -- Interactive chat interface for testing models
  • Settings -- API port, memory limits, runtime configuration
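
The Chat tab exercises LocalAI's OpenAI-compatible HTTP API, which can also be queried directly from the device. A minimal sketch, assuming LocalAI's default port 8080 (adjust to whatever the Settings tab configures):

```shell
# List installed models via the OpenAI-compatible endpoint.
# Port 8080 is LocalAI's default; change it if reconfigured in Settings.
curl http://127.0.0.1:8080/v1/models
```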

RPCD Methods

Backend: luci.localai

Method         Description
-------------  -------------------------------
status         Service status and runtime info
models         List installed models
config         Get configuration
health         API health check
metrics        Inference metrics and stats
start          Start LocalAI
stop           Stop LocalAI
restart        Restart LocalAI
model_install  Install a model by name
model_remove   Remove an installed model
chat           Send chat completion request
complete       Send text completion request
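
The backend can be exercised from a shell with ubus. A sketch of the chat call; the argument schema shown is an assumption mirroring the OpenAI chat-completions shape, and "example-model" is a hypothetical model name.

```shell
# Build a chat-completions style payload (shape is an assumption;
# "example-model" is a hypothetical model name).
payload='{"model":"example-model","messages":[{"role":"user","content":"Hello"}]}'
echo "$payload"

# On the router, invoke the RPCD backend directly:
#   ubus call luci.localai status
#   ubus call luci.localai chat "$payload"
```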

Dependencies

  • luci-base
  • secubox-app-localai

License

Apache-2.0