
LuCI LocalAI Dashboard

Local LLM inference server management with an OpenAI-compatible API.
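
Because the API is OpenAI-compatible, it can be exercised with any OpenAI-style client. A minimal sketch: the host, port (127.0.0.1:8080), and model name (gte-small) below are assumptions for illustration, not defaults documented by this package.

```shell
# Build an OpenAI-style chat completion payload.
# Model name and endpoint are assumptions, not package defaults.
chat_payload() {
  # $1 = model name, $2 = user prompt
  printf '{"model":"%s","messages":[{"role":"user","content":"%s"}]}' "$1" "$2"
}

# On a host that can reach the service, pipe the payload to curl:
# chat_payload gte-small "Hello" | \
#   curl -s http://127.0.0.1:8080/v1/chat/completions \
#        -H "Content-Type: application/json" -d @-
chat_payload gte-small "Hello"
```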

Installation

opkg install luci-app-localai

Access

LuCI menu: Services -> LocalAI

Tabs

  • Dashboard -- Service health, loaded models, API endpoint status
  • Models -- Install, remove, and manage LLM models
  • Chat -- Interactive chat interface for testing models
  • Settings -- API port, memory limits, runtime configuration

RPCD Methods

Backend: luci.localai

Method          Description
------          -----------
status          Service status and runtime info
models          List installed models
config          Get configuration
health          API health check
metrics         Inference metrics and stats
start           Start LocalAI
stop            Stop LocalAI
restart         Restart LocalAI
model_install   Install a model by name
model_remove    Remove an installed model
chat            Send a chat completion request
complete        Send a text completion request
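
For scripting, the backend can be invoked directly over ubus on the router. A sketch, assuming the standard OpenWrt `ubus` CLI; the `"name"` argument for model_install is a guess based on the method name, not confirmed by this README.

```shell
# Call the luci.localai RPCD backend. The helper falls back to printing the
# command when ubus is unavailable (e.g. when experimenting off-router).
localai_call() {
  method="$1"
  args="$2"
  [ -n "$args" ] || args='{}'
  if command -v ubus >/dev/null 2>&1; then
    ubus call luci.localai "$method" "$args"
  else
    echo "ubus call luci.localai $method $args"
  fi
}

localai_call status
# The "name" argument is an assumption, not confirmed by this README:
localai_call model_install '{ "name": "gte-small" }'
```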

Dependencies

  • luci-base
  • secubox-app-localai

License

Apache-2.0