
SecuBox LocalAI

Native LLM server with an OpenAI-compatible REST API. Supports GGUF models on ARM64 and x86_64.

Installation

opkg install secubox-app-localai

Configuration

UCI config file: /etc/config/localai

config localai 'main'
    option enabled '0'
    option port '8080'
    option models_path '/srv/localai/models'
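
The service ships disabled (enabled '0'). Assuming the standard OpenWrt uci CLI, enabling it and changing the port looks like this (the port value is an example):

```shell
# Enable LocalAI and move the API to port 9090 (example value)
uci set localai.main.enabled='1'
uci set localai.main.port='9090'
uci commit localai

# Restart via the controller so the new settings take effect
localaictl stop
localaictl start
```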

Usage

# Install the binary (downloaded on first run)
localaictl install

# Start / stop the service
localaictl start
localaictl stop

# Check status
localaictl status

# Download a model
localaictl model-pull <model-name>

The LocalAI binary is downloaded from GitHub releases the first time localaictl install is run; the wget-ssl and ca-certificates dependencies support the HTTPS download.
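
Architecture-specific downloads like this usually key off uname -m. A minimal sketch of the selection step (the detect_suffix helper and the arm64/amd64 suffix names are illustrative assumptions, not the actual localaictl code):

```shell
#!/bin/sh
# detect_suffix: map a machine architecture string to a release
# asset suffix. The suffix names are illustrative assumptions.
detect_suffix() {
    case "$1" in
        aarch64) echo "arm64" ;;
        x86_64)  echo "amd64" ;;
        *)       return 1 ;;
    esac
}

# Example: pick the asset suffix for the current machine
detect_suffix "$(uname -m)"
```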

Features

  • OpenAI-compatible REST API
  • GGUF model support (LLaMA, Mistral, Phi, TinyLlama, etc.)
  • ARM64 and x86_64 architectures
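
Once the service is running with a model loaded, the API can be exercised with curl. The endpoint path follows the OpenAI chat completions convention; the model name below is an example, substitute one pulled with model-pull:

```shell
# Ask the local server for a chat completion (requires a running
# service and a downloaded model; "tinyllama" is an example name)
curl -s http://127.0.0.1:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "tinyllama",
        "messages": [{"role": "user", "content": "Say hello"}]
      }'
```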

Files

  • /etc/config/localai -- UCI configuration
  • /usr/sbin/localaictl -- controller CLI
  • /srv/localai/models/ -- model storage directory

Dependencies

  • libstdcpp
  • libpthread
  • wget-ssl
  • ca-certificates

License

MIT