llamacpp
Here are 91 public repositories matching this topic...
AGiXT is a dynamic AI Automation Platform that seamlessly orchestrates instruction management and complex task execution across diverse AI providers. Combining adaptive memory, smart features, and a versatile plugin system, AGiXT delivers efficient and comprehensive AI solutions.
Updated Aug 18, 2023 · Python
Believe in AI democratization. LLaMA for Node.js, backed by llama-rs, llama.cpp, and rwkv.cpp; runs locally on your laptop CPU. Supports LLaMA, Alpaca, GPT4All, Vicuna, and RWKV models.
Updated Aug 3, 2023 · Rust
Build AI applications, chatbots, and agents with JavaScript and TypeScript.
Updated Aug 17, 2023 · TypeScript
The "vicuna-installation-guide" provides step-by-step instructions for installing and configuring Vicuna 13B and 7B.
Updated Jun 20, 2023
A Discord Bot for chatting with LLaMA, Vicuna, Alpaca, MPT, or any other Large Language Model (LLM) supported by text-generation-webui or llama.cpp.
Updated Jun 4, 2023 · Python
Run any Large Language Model behind a unified API
Updated Jul 24, 2023 · Python
LLaMA Server combines the power of LLaMA C++ with the beauty of Chatbot UI.
Updated Jun 10, 2023 · Python
LocalAGI: locally run AGI powered by LLaMA, ChatGLM, and more.
Updated Jun 25, 2023 · Python
An AI app that lets you upload a PDF and ask questions about it. It uses StableVicuna 13B and runs locally.
Updated May 7, 2023 · Python
An AI chatbot for Signal powered by Google Bard, Bing Chat, ChatGPT, HuggingChat, and llama.cpp
Updated Aug 2, 2023 · Python
.NET wrapper for LLaMA.cpp for LLaMA language model inference on CPU.
Updated May 9, 2023 · C#