Documentation on setting up a local LLM server on Debian from scratch, using Ollama/llama.cpp/vLLM, Open WebUI, Kokoro FastAPI, and ComfyUI.
Updated Oct 6, 2025
The goal is a simple installation and launch of Open WebUI in German, with German speech output.
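The core pieces can be brought up roughly as follows. This is a minimal sketch, not the repository's exact procedure: the model name and host port are placeholders, and it assumes a Debian host with Docker installed.

```shell
# Install Ollama via its official install script (requires root/sudo).
curl -fsSL https://ollama.com/install.sh | sh

# Pull a model to serve; "llama3.1" is only an example — choose any
# model with good German support for a German-language setup.
ollama pull llama3.1

# Run Open WebUI in Docker, pointing it at the host's Ollama instance.
# Port 3000 on the host is arbitrary; the container listens on 8080.
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main
```

After this, Open WebUI is reachable at http://localhost:3000, and the interface language (including German) can be selected in its settings.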