Commit Graph

4 Commits

23d511fcae feat(localai): LXC install extracts rootfs from Docker image
When using `localaictl install --lxc`:
1. If podman or docker is available: extracts the rootfs from the Docker image (sketched below)
   - Includes ALL backends (llama-cpp, whisper, etc.)
   - Creates an LXC container with full LocalAI capabilities
2. If neither docker nor podman is available: falls back to the standalone binary
   - Limited backend support

This gives the best of both worlds:
- LXC's lightweight container management
- Full set of backends from the Docker image
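
A minimal sketch of the extraction step, assuming docker is present and /srv/lxc/localai/rootfs as the LXC rootfs target (the path and temporary container handling are illustrative, not necessarily what localaictl does; podman accepts the same subcommands):

```sh
# Create a stopped container from the image so its filesystem can be exported.
cid="$(docker create localai/localai:v2.25.0-ffmpeg)"

# Unpack the exported filesystem as the LXC rootfs.
mkdir -p /srv/lxc/localai/rootfs
docker export "$cid" | tar -x -C /srv/lxc/localai/rootfs

# Remove the temporary container.
docker rm "$cid"
```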

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 18:21:05 +01:00
6ca5b20b2c feat(localai): Add multi-runtime support (LXC, Docker, Podman)
localaictl now supports all three container runtimes:
- localaictl install --lxc     (standalone binary, limited backends)
- localaictl install --docker  (full image with all backends)
- localaictl install --podman  (same as docker, rootless)

Auto-detection order: running container > podman > docker > lxc
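
A rough shell sketch of that order (the detect_runtime function and the `localai` container name are illustrative, not the exact localaictl implementation):

```sh
detect_runtime() {
    # 1. Reuse the runtime of an already-running LocalAI container.
    if command -v podman >/dev/null 2>&1 && podman ps --format '{{.Names}}' | grep -qx localai; then
        echo podman; return
    fi
    if command -v docker >/dev/null 2>&1 && docker ps --format '{{.Names}}' | grep -qx localai; then
        echo docker; return
    fi
    # 2. Otherwise prefer podman, then docker, then LXC tooling.
    command -v podman >/dev/null 2>&1 && { echo podman; return; }
    command -v docker >/dev/null 2>&1 && { echo docker; return; }
    command -v lxc-start >/dev/null 2>&1 && { echo lxc; return; }
    echo none
}
```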

New UCI options:
- localai.main.runtime = auto|lxc|docker|podman
- localai.lxc.path = /srv/lxc
- localai.lxc.version = v2.25.0
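
For example, to pin the runtime and LXC settings from the shell (assuming the package ships an /etc/init.d/localai init script):

```sh
uci set localai.main.runtime='podman'   # one of: auto, lxc, docker, podman
uci set localai.lxc.path='/srv/lxc'
uci set localai.lxc.version='v2.25.0'
uci commit localai
/etc/init.d/localai restart
```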

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 18:18:12 +01:00
b245fdb3e7 feat(localai,ollama): Switch LocalAI to Docker and add Ollama package
LocalAI changes:
- Rewrite localaictl to use Docker/Podman instead of the standalone binary
- Use the localai/localai:v2.25.0-ffmpeg image with all backends included (run sketch below)
- Fix the "llama-cpp backend not found" issue
- Auto-detect the podman or docker runtime
- Update the UCI config with Docker settings
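
Roughly what the Docker runtime path boils down to; the published port matches LocalAI's default of 8080, while the container name, host models directory, and in-container models path are assumptions:

```sh
docker run -d --name localai \
  -p 8080:8080 \
  -v /srv/localai/models:/build/models \
  localai/localai:v2.25.0-ffmpeg
```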

New Ollama package:
- Add secubox-app-ollama as a lighter alternative to LocalAI
- Native ARM64 support with backends included
- Simple CLI: ollamactl pull/run/list (usage below)
- Docker image ~1 GB vs 2-4 GB for LocalAI
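
Typical usage of the new CLI once installed (the model name is only an example):

```sh
ollamactl pull llama3.2   # download a model
ollamactl list            # show locally available models
ollamactl run llama3.2    # run a session against the model
```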

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 17:56:40 +01:00
6b28c4260b feat(localai): Add LocalAI LuCI app with chat, models management and portal integration
- Add secubox-app-localai package with LXC container support for the LocalAI service
- Add luci-app-localai with dashboard, chat, models, and settings views
- Implement an RPCD backend for LocalAI API integration via /v1/models and /v1/chat/completions (see the curl examples below)
- Use direct RPC declarations in the LuCI views for reliable frontend communication
- Add LocalAI and Glances to the secubox-portal services page
- Move Glances from the services section to the monitoring section
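
The two upstream endpoints the RPCD backend wraps can also be exercised directly; host, port, and model name below are illustrative:

```sh
# List the models LocalAI knows about.
curl -s http://127.0.0.1:8080/v1/models

# Send an OpenAI-compatible chat completion request.
curl -s http://127.0.0.1:8080/v1/chat/completions \
  -H 'Content-Type: application/json' \
  -d '{"model": "example-model", "messages": [{"role": "user", "content": "Hello"}]}'
```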

Packages:
- secubox-app-localai: 0.1.0-r1
- luci-app-localai: 0.1.0-r8
- luci-app-secubox-portal: 0.6.0-r5

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-21 16:54:13 +01:00