karpathy / minbpe
Minimal, clean code for the Byte Pair Encoding (BPE) algorithm commonly used in LLM tokenization.
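For illustration, here is a minimal sketch of the core byte-level BPE training loop that the repo implements: count adjacent token-id pairs, merge the most frequent pair into a new id, and repeat. The function and variable names below are illustrative assumptions, not minbpe's actual API.

```python
# Illustrative byte-level BPE training sketch (not minbpe's actual API).
from collections import Counter

def get_pair_counts(ids):
    """Count occurrences of each adjacent pair of token ids."""
    return Counter(zip(ids, ids[1:]))

def merge(ids, pair, new_id):
    """Replace every occurrence of `pair` in `ids` with the single token `new_id`."""
    out, i = [], 0
    while i < len(ids):
        if i < len(ids) - 1 and (ids[i], ids[i + 1]) == pair:
            out.append(new_id)
            i += 2
        else:
            out.append(ids[i])
            i += 1
    return out

def train_bpe(text, num_merges):
    """Learn `num_merges` merges on top of the 256 raw byte values."""
    ids = list(text.encode("utf-8"))        # start from raw bytes (ids 0..255)
    merges = {}                             # (id, id) -> new merged id
    for i in range(num_merges):
        counts = get_pair_counts(ids)
        if not counts:
            break
        pair = max(counts, key=counts.get)  # most frequent adjacent pair
        new_id = 256 + i
        ids = merge(ids, pair, new_id)
        merges[pair] = new_id
    return merges

merges = train_bpe("aaabdaaabac", 3)
print(merges)  # e.g. {(97, 97): 256, (256, 97): 257, (257, 98): 258}
```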
Detect file content types with deep learning
🔎 Hunt down social media accounts by username across social networks
Official PyTorch Implementation of "Scalable Diffusion Models with Transformers"
The official gpt4free repository | a collection of various powerful language models
A UI-Focused Agent for Windows OS Interaction.
PyTorch code and models for V-JEPA self-supervised learning from video.
Langchain-Chatchat (formerly langchain-ChatGLM): a local knowledge-base QA application built with Langchain and LLMs such as ChatGLM.
GUI-focused roop
Stable Diffusion web UI
Microsoft-Outlook-Remote-Code-Execution-Vulnerability
Automate the process of making money online.
YOLOv5 🚀 in PyTorch > ONNX > CoreML > TFLite
🦜🔗 Build context-aware reasoning applications
Pure Python 3 implementation for working with iDevices (iPhone, etc.).
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
The most powerful and modular Stable Diffusion GUI, API, and backend with a graph/nodes interface.
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
Easy-to-use LLM fine-tuning framework (LLaMA, BLOOM, Mistral, Baichuan, Qwen, ChatGLM)
The official repo of Qwen (通义千问), the chat and pretrained large language model proposed by Alibaba Cloud.
SplaTAM: Splat, Track & Map 3D Gaussians for Dense RGB-D SLAM
Fast and memory-efficient exact attention
An Open-Source Assistants API and GPTs alternative. Dify.AI is an LLM application development platform. It integrates the concepts of Backend as a Service and LLMOps, covering the core tech stack required for building generative AI-native applications, including a built-in RAG engine.
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.