Commit Graph

11 Commits

113f41b09c feat(localai): Upgrade to v3.9.0 with Agent Jobs and Memory Reclaimer
Upgrade LocalAI from v2.25.0 to v3.9.0 with new features:

- **Agent Jobs Panel**: Schedule and manage background agentic tasks
- **Memory Reclaimer**: LRU eviction for loaded models, automatic VRAM cleanup
- **VibeVoice backend**: New voice synthesis support

Update README with:
- v3.9 feature highlights
- Complete CLI command reference
- Model presets table (tinyllama, phi2, mistral, gte-small)
- API endpoint documentation (example request below)
- SecuBox Couche 2 integration notes

This is part of the v0.18 AI Gateway roadmap.
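
A quick smoke test of the OpenAI-compatible chat endpoint (port 8081 and the tinyllama preset are assumptions taken from the commits below; adjust to the local config):

  curl -s http://127.0.0.1:8081/v1/chat/completions \
      -H 'Content-Type: application/json' \
      -d '{"model": "tinyllama", "messages": [{"role": "user", "content": "Hello"}]}'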

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-02-05 05:02:45 +01:00
612a1be6ea feat(localai): Rewrite secubox-app-localai with native binary download
- Replace Docker/LXC-based approach with direct binary download
- Download LocalAI v2.25.0 binary from GitHub releases
- Add localaictl CLI for install, model management, and service control
- Change default port to 8081 (avoids CrowdSec conflict on 8080; UCI sketch below)
- Remove secubox-app-localai-wb (merged into secubox-app-localai)
- Add model presets: tinyllama, phi2, mistral

Usage:
  localaictl install
  localaictl model-install tinyllama
  /etc/init.d/localai enable && /etc/init.d/localai start
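
A minimal sketch of pinning the port via UCI (the 'port' option name is an assumption; the localai.main section comes from the UCI options listed in the multi-runtime commit below):

  uci set localai.main.port='8081'   # option name assumed, not confirmed by this commit
  uci commit localai
  /etc/init.d/localai restart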

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-22 04:55:17 +01:00
1cce649751 feat(localai): Use Docker Registry API for LXC install (no daemon needed)
- Download Docker image layers directly via the Registry API (sketch below)
- No dockerd or podman daemon required anymore
- Same approach as mitmproxy and magicmirror2 packages
- All backends included (llama-cpp, whisper, etc.)
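
A minimal sketch of the Registry API pull, assuming Docker Hub and the image tag from the earlier Docker-based commits (variable names hypothetical; multi-arch images return a manifest list that must be resolved to a per-arch manifest first):

  IMAGE="localai/localai"
  TAG="v2.25.0-ffmpeg"
  ROOTFS="/srv/lxc/localai/rootfs"
  # Anonymous pull token from Docker Hub's token service
  TOKEN="$(curl -s "https://auth.docker.io/token?service=registry.docker.io&scope=repository:${IMAGE}:pull" \
      | jsonfilter -e '@.token')"
  # The image manifest lists the layer digests in order
  MANIFEST="$(curl -s -H "Authorization: Bearer ${TOKEN}" \
      -H 'Accept: application/vnd.docker.distribution.manifest.v2+json' \
      "https://registry-1.docker.io/v2/${IMAGE}/manifests/${TAG}")"
  mkdir -p "$ROOTFS"
  # Each layer is a gzipped tarball; unpacking in order reproduces the rootfs
  for DIGEST in $(echo "$MANIFEST" | jsonfilter -e '@.layers[*].digest'); do
      curl -sL -H "Authorization: Bearer ${TOKEN}" \
          "https://registry-1.docker.io/v2/${IMAGE}/blobs/${DIGEST}" \
          | tar -xz -C "$ROOTFS"
  done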

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 18:51:34 +01:00
55914b8b3c feat(localai): Update to LocalAI v3.10.0
- Updated default version from v2.25.0 to v3.10.0
- Fixed binary URL format: local-ai-v3.10.0-linux-arm64 (URL sketch below)
- Updated Docker image tag to v3.10.0-ffmpeg
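
For reference, the fixed URL shape (the mudler/LocalAI repository path is an assumption, not stated in this commit):

  VER="v3.10.0"
  URL="https://github.com/mudler/LocalAI/releases/download/${VER}/local-ai-${VER}-linux-arm64"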

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 18:34:05 +01:00
32aa17ab8b fix(localai): Fix standalone binary download URL
GitHub releases use: local-ai-Linux-arm64 (not local-ai-v2.25.0-linux-arm64)
- Fixed architecture naming (Linux-arm64, Linux-x86_64); mapping sketched below
- Removed version from filename
- Added URL logging for debugging
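
A minimal sketch of the mapping plus the URL logging (function name hypothetical; repository path assumed as in the sketch above):

  pick_release_arch() {
      case "$(uname -m)" in
          aarch64) echo "Linux-arm64" ;;
          x86_64)  echo "Linux-x86_64" ;;
          *)       echo "unsupported arch: $(uname -m)" >&2; return 1 ;;
      esac
  }
  URL="https://github.com/mudler/LocalAI/releases/latest/download/local-ai-$(pick_release_arch)"
  echo "Downloading: ${URL}"   # URL logged for debugging, per this commit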

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 18:32:37 +01:00
ec1f722687 fix(localai): Check Docker daemon is running before extracting rootfs
- Add runtime_is_working() to verify daemon connectivity (sketch below)
- Falls back to standalone binary with a helpful message if the daemon is not running
- Provides hint: /etc/init.d/dockerd start
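
A hedged reconstruction of the check; the actual implementation may differ ('docker info' exits non-zero when the daemon is unreachable):

  runtime_is_working() {
      docker info >/dev/null 2>&1
  }
  if ! runtime_is_working; then
      echo "Docker daemon not running, falling back to standalone binary" >&2
      echo "Hint: /etc/init.d/dockerd start" >&2
  fi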

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 18:29:53 +01:00
23d511fcae feat(localai): LXC install extracts rootfs from Docker image
When using `localaictl install --lxc`:
1. If podman/docker is available: extracts rootfs from the Docker image (sketch below)
   - Includes ALL backends (llama-cpp, whisper, etc.)
   - Creates LXC container with full LocalAI capabilities
2. If neither is available: falls back to the standalone binary
   - Limited backend support

This gives the best of both worlds:
- LXC lightweight container management
- Full Docker image backends
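
A minimal sketch of the extraction path (paths and image tag taken from neighbouring commits, not the verbatim implementation):

  ROOTFS="/srv/lxc/localai/rootfs"
  CID="$(docker create localai/localai:v2.25.0-ffmpeg)"
  mkdir -p "$ROOTFS"
  docker export "$CID" | tar -x -C "$ROOTFS"   # flattened container filesystem
  docker rm "$CID" >/dev/null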

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 18:21:05 +01:00
6ca5b20b2c feat(localai): Add multi-runtime support (LXC, Docker, Podman)
localaictl now supports all three container runtimes:
- localaictl install --lxc     (standalone binary, limited backends)
- localaictl install --docker  (full image with all backends)
- localaictl install --podman  (same as docker, rootless)

Auto-detection order: running container > podman > docker > lxc

New UCI options (example config below):
- localai.main.runtime = auto|lxc|docker|podman
- localai.lxc.path = /srv/lxc
- localai.lxc.version = v2.25.0
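
The options above map onto /etc/config/localai roughly as follows (section names follow from the option paths; section types are an assumption):

  config localai 'main'
          option runtime 'auto'

  config localai 'lxc'
          option path '/srv/lxc'
          option version 'v2.25.0'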

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 18:18:12 +01:00
b245fdb3e7 feat(localai,ollama): Switch LocalAI to Docker and add Ollama package
LocalAI changes:
- Rewrite localaictl to use Docker/Podman instead of standalone binary
- Use localai/localai:v2.25.0-ffmpeg image with all backends included
- Fix llama-cpp backend not found issue
- Auto-detect podman or docker runtime
- Update UCI config with Docker settings

New Ollama package:
- Add secubox-app-ollama as a lighter alternative to LocalAI
- Native ARM64 support with backends included
- Simple CLI: ollamactl pull/run/list (usage sketch below)
- Docker image is ~1GB vs 2-4GB for LocalAI
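
Usage sketch (argument shapes assumed from the subcommand list above):

  ollamactl pull tinyllama
  ollamactl run tinyllama
  ollamactl list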

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 17:56:40 +01:00
db3a41928e fix(luci): Fix require syntax in all LuCI views - use slashes instead of dots
All 'require module.submodule' directives were changed to 'require module/submodule'
to match LuCI's module-loading convention (audit sketch below).
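
A hedged illustration of the change, with a grep to audit for stragglers (module path and package glob hypothetical):

  # before: 'require view.localai.chat';
  # after:  'require view/localai/chat';
  grep -rln "'require .*\..*'" luci-app-*/htdocs/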

Affected packages:
- luci-app-auth-guardian
- luci-app-glances
- luci-app-localai
- luci-app-magicmirror2
- luci-app-mitmproxy
- luci-app-mmpm
- luci-app-mqtt-bridge
- luci-app-ndpid
- luci-app-network-modes
- luci-app-secubox-admin
- luci-app-secubox-portal
- luci-app-wireguard-dashboard

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 17:15:21 +01:00
6b28c4260b feat(localai): Add LocalAI LuCI app with chat, models management and portal integration
- Add secubox-app-localai package with LXC container support for LocalAI service
- Add luci-app-localai with dashboard, chat, models and settings views
- Implement RPCD backend for LocalAI API integration via /v1/models and /v1/chat/completions (smoke test below)
- Use direct RPC declarations in LuCI views for reliable frontend communication
- Add LocalAI and Glances to secubox-portal services page
- Move Glances from services to monitoring section
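
Since the RPCD backend is exposed over ubus, it can be smoke-tested from the shell (object and method names hypothetical):

  ubus list | grep localai        # discover the registered object name
  ubus call luci.localai models   # hypothetical object/method pair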

Packages:
- secubox-app-localai: 0.1.0-r1
- luci-app-localai: 0.1.0-r8
- luci-app-secubox-portal: 0.6.0-r5

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 16:54:13 +01:00