LuCI LocalAI Dashboard

LuCI management interface for LocalAI, a local LLM inference server with an OpenAI-compatible API.
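
Once the service is running, the API can be exercised like any OpenAI-compatible endpoint. Below is a minimal sketch of building and checking a chat request payload; the host, port, and model name are placeholders, not defaults shipped by this package:

```shell
# Build a chat completion request for the OpenAI-compatible API.
# NOTE: the host, port, and model name are assumptions for illustration.
API="http://192.168.1.1:8080"

cat > /tmp/chat.json <<'EOF'
{"model": "phi-2", "messages": [{"role": "user", "content": "Hello"}]}
EOF

# On the device, with LocalAI running and curl installed:
#   curl -s "$API/v1/chat/completions" \
#        -H 'Content-Type: application/json' -d @/tmp/chat.json

# Sanity-check the payload before sending it.
grep -c '"messages"' /tmp/chat.json
```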

Installation

opkg install luci-app-localai

Access

LuCI menu: Services -> LocalAI

Tabs

  • Dashboard -- Service health, loaded models, API endpoint status
  • Models -- Install, remove, and manage LLM models
  • Chat -- Interactive chat interface for testing models
  • Settings -- API port, memory limits, runtime configuration
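
The Settings tab presumably persists through UCI, as is conventional for LuCI apps. A hypothetical /etc/config/localai fragment is sketched below; the section and option names are invented for illustration and may not match the package's actual schema:

```
config localai 'main'
	option enabled '1'
	option port '8080'
	option memory_limit_mb '2048'
```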

RPCD Methods

Backend: luci.localai

Method         Description
-------------  --------------------------------
status         Service status and runtime info
models         List installed models
config         Get configuration
health         API health check
metrics        Inference metrics and stats
start          Start LocalAI
stop           Stop LocalAI
restart        Restart LocalAI
model_install  Install a model by name
model_remove   Remove an installed model
chat           Send chat completion request
complete       Send text completion request
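
These methods are reached over ubus in the usual rpcd fashion. The sketch below only echoes the invocations (ubus exists on the OpenWrt device, not on a development host), and the JSON argument keys in the comments are assumptions, not taken from the package source:

```shell
# Print the ubus invocations for the read-only methods of the
# luci.localai backend (echoed rather than executed, since ubus
# is only available on the OpenWrt device itself).
for method in status models config health metrics; do
    echo "ubus call luci.localai $method"
done

# Methods taking arguments would look roughly like this
# (argument key names are assumptions):
#   ubus call luci.localai model_install '{"name": "phi-2"}'
#   ubus call luci.localai model_remove '{"name": "phi-2"}'
```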

Dependencies

  • luci-base
  • secubox-app-localai

License

Apache-2.0