commit daa4c48375 (CyberMind-FR)
fix(localai): Add gte-small preset, fix RPC expect unwrapping and chat JSON escaping
- Add gte-small embedding model preset to localaictl with proper YAML
  config (embeddings: true, context_size: 512)
- Fix RPC expect declarations across api.js, dashboard.js, models.js to
  use empty expect objects, preserving full response including error fields
- Replace fragile sed/awk JSON escaping in RPCD chat and completion
  handlers with file I/O streaming through awk for robust handling of
  special characters in LLM responses
- Switch RPCD chat handler from curl to wget to avoid missing output
  file on timeout (curl doesn't create -o file on exit code 28)
- Bypass RPCD 30s script timeout for chat by calling LocalAI API
  directly from the browser via fetch()
- Add embeddings flag to models RPC and filter embedding models from
  chat view model selector

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
Date: 2026-02-04 08:36:20 +01:00

SecuBox LocalAI

Native LLM server with OpenAI-compatible REST API. Supports GGUF models on ARM64 and x86_64.

Installation

opkg install secubox-app-localai

Configuration

UCI config file: /etc/config/localai

config localai 'main'
    option enabled '0'
    option port '8080'
    option models_path '/srv/localai/models'
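Changing these options follows the usual UCI workflow. A sketch, assuming the standard uci CLI and an init script named localai (the port value 9090 is illustrative):

```shell
# Enable the service and move it to a non-default port (sketch)
uci set localai.main.enabled='1'
uci set localai.main.port='9090'
uci commit localai
/etc/init.d/localai restart
```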

Usage

# Install the binary (downloaded on first run)
localaictl install

# Start / stop the service
localaictl start
localaictl stop

# Check status
localaictl status

# Download a model
localaictl model-pull <model-name>

The binary is fetched from GitHub releases the first time localaictl install is run.
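Once the service is running, the OpenAI-compatible API can be queried directly. A sketch, assuming the default port from the UCI config and an already-pulled model; the model name "tinyllama" is illustrative:

```shell
# Chat completion request against the local OpenAI-compatible endpoint (sketch)
curl -s http://127.0.0.1:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "tinyllama", "messages": [{"role": "user", "content": "Hello"}]}'
```

Any client that speaks the OpenAI chat-completions wire format should work the same way against this endpoint.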

Features

  • OpenAI-compatible REST API
  • GGUF model support (LLaMA, Mistral, Phi, TinyLlama, etc.)
  • ARM64 and x86_64 architectures

Files

  • /etc/config/localai -- UCI configuration
  • /usr/sbin/localaictl -- controller CLI
  • /srv/localai/models/ -- model storage directory

Dependencies

  • libstdcpp
  • libpthread
  • wget-ssl
  • ca-certificates

License

MIT