secubox-openwrt/package/secubox/secubox-app-ollama/Makefile
CyberMind-FR d9e77745db fix(deps): Remove libubox/libubus/libuci from all SecuBox package dependencies
These base OpenWrt libraries are always present on the system but their
versions in the SDK-built feed don't match the router's installed versions,
causing opkg to fail with "Cannot satisfy dependencies" errors.

Fixed packages (18 total):
- secubox-core: removed libubox, libubus, libuci
- luci-app-ksm-manager: removed libubus, libubox
- luci-app-mqtt-bridge: removed libuci
- secubox-app-adguardhome: removed uci, libuci
- secubox-app-auth-logger: removed libubox-lua
- secubox-app-domoticz: removed uci, libuci
- secubox-app-gitea: removed uci, libuci
- secubox-app-glances: removed uci, libuci
- secubox-app-hexojs: removed uci, libuci
- secubox-app-lyrion: removed uci, libuci
- secubox-app-magicmirror2: removed uci, libuci
- secubox-app-mailinabox: removed uci, libuci
- secubox-app-mitmproxy: removed uci, libuci
- secubox-app-nextcloud: removed uci, libuci
- secubox-app-ollama: removed uci, libuci
- secubox-app-picobrew: removed uci, libuci
- secubox-app-streamlit: removed uci, libuci
- secubox-app-zigbee2mqtt: removed uci, libuci

The packages still work because these libs are implicitly available.

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-30 19:46:27 +01:00
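The commit message above describes dropping base-library entries from DEPENDS across the feed. As an illustration only (the exact before/after lines are not shown in this view, and the dependency names besides the base libs are assumptions), the pattern in an affected package Makefile would look roughly like:

```makefile
# Before: the SDK-built feed pins ABI-versioned base libs, so opkg fails
# with "Cannot satisfy dependencies" when the router's installed
# libubox/libubus/libuci versions differ from the feed's.
DEPENDS:=+libubox +libubus +libuci +jsonfilter

# After: rely on the base libraries already present on every OpenWrt
# system; keep only the genuinely optional runtime dependencies.
DEPENDS:=+jsonfilter
```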


include $(TOPDIR)/rules.mk

PKG_NAME:=secubox-app-ollama
PKG_VERSION:=0.1.0
PKG_RELEASE:=1
PKG_MAINTAINER:=CyberMind Studio <contact@cybermind.fr>
PKG_LICENSE:=MIT

include $(INCLUDE_DIR)/package.mk
define Package/secubox-app-ollama
  SECTION:=utils
  CATEGORY:=Utilities
  SUBMENU:=SecuBox Apps
  PKGARCH:=all
  TITLE:=SecuBox Ollama - Local LLM Runtime
  DEPENDS:=+jsonfilter +wget-ssl
endef

define Package/secubox-app-ollama/description
  Ollama - a simple local LLM runtime for SecuBox-powered OpenWrt systems.
  Features:
  - Easy model management (ollama pull, ollama run)
  - OpenAI-compatible API
  - Native ARM64 support with backends included
  - Lightweight compared to LocalAI
  - Support for LLaMA, Mistral, Phi, and Gemma models
  Runs in a Docker/Podman container.
  Configure in /etc/config/ollama.
endef

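# The description above points at /etc/config/ollama. A minimal sketch of
# what that UCI file might contain (option names are assumptions for
# illustration, not taken from this package's files/ directory):
#
#   config ollama 'main'
#       option enabled '1'
#       option port '11434'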
define Package/secubox-app-ollama/conffiles
/etc/config/ollama
endef

define Build/Compile
endef

define Package/secubox-app-ollama/install
	$(INSTALL_DIR) $(1)/etc/config
	$(INSTALL_CONF) ./files/etc/config/ollama $(1)/etc/config/ollama
	$(INSTALL_DIR) $(1)/etc/init.d
	$(INSTALL_BIN) ./files/etc/init.d/ollama $(1)/etc/init.d/ollama
	$(INSTALL_DIR) $(1)/usr/sbin
	$(INSTALL_BIN) ./files/usr/sbin/ollamactl $(1)/usr/sbin/ollamactl
endef

define Package/secubox-app-ollama/postinst
#!/bin/sh
[ -n "$${IPKG_INSTROOT}" ] || {
	echo ""
	echo "Ollama installed."
	echo ""
	echo "Prerequisites: install podman or docker first:"
	echo "  opkg install podman"
	echo ""
	echo "To install and start Ollama:"
	echo "  ollamactl install          # Pull Docker image (~1GB)"
	echo "  /etc/init.d/ollama start"
	echo ""
	echo "API endpoint: http://<router-ip>:11434/api"
	echo ""
	echo "Download and run models:"
	echo "  ollamactl pull tinyllama"
	echo "  ollamactl run tinyllama"
	echo ""
}
exit 0
endef

$(eval $(call BuildPackage,secubox-app-ollama))