- Replace Docker/LXC-based approach with direct binary download
- Download LocalAI v2.25.0 binary from GitHub releases
- Add localaictl CLI for install, model management, and service control
- Change default port to 8081 (avoid CrowdSec conflict on 8080)
- Remove secubox-app-localai-wb (merged into secubox-app-localai)
- Add model presets: tinyllama, phi2, mistral
Usage:

    localaictl install
    localaictl model-install tinyllama
    /etc/init.d/localai enable && /etc/init.d/localai start
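To confirm the service came up on the new default port, a quick check against LocalAI's OpenAI-compatible API can be used (a sketch; assumes a local install with the default port 8081 and a running service):

    # List the models the running LocalAI instance has loaded
    # (the /v1/models endpoint is part of LocalAI's OpenAI-compatible API)
    curl -s http://localhost:8081/v1/models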
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>