llama-cpp
Here are 42 public repositories matching this topic...
Extending the Hugging Face Transformers APIs for Transformer-based models and improving the productivity of inference deployment. With extremely compressed models, the toolkit can greatly improve inference efficiency on Intel platforms.
Updated Aug 18, 2023 - Python
Build AI applications, chatbots, and agents with JavaScript and TypeScript.
Updated Aug 17, 2023 - TypeScript
UI for
Updated Aug 18, 2023 - TypeScript
This repo showcases how you can run a model locally and offline, free of OpenAI dependencies.
Updated Jul 1, 2023 - Python
Building applications with LLMs through composability, in Kotlin, Scala, ...
Updated Aug 18, 2023 - Kotlin
LLaMA Server combines the power of LLaMA C++ with the beauty of Chatbot UI.
Updated Jun 10, 2023 - Python
BabyAGI-
Updated Jun 4, 2023 - Python
InsightSolver: Colab notebooks for exploring and solving operational issues using deep learning, machine learning, and related models.
Updated Aug 11, 2023 - Jupyter Notebook
Making offline AI models accessible to all types of edge devices.
Updated Jul 24, 2023 - Dart