The Reliable USB Formatting Utility (updated Mar 10, 2023; written in C)
An implementation of model parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library.
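The core idea behind model parallelism, as used in this style of GPT training, is to shard a layer's weight matrix across devices so each device computes only its slice. A minimal single-process sketch of the column-parallel matmul idea (plain NumPy standing in for devices; this is an illustration of the concept, not mesh-tensorflow's API):

```python
import numpy as np

def column_parallel_matmul(x, w, shards=2):
    # Split the weight matrix column-wise into `shards` pieces; each piece
    # could live on a separate device and be multiplied independently.
    parts = np.array_split(w, shards, axis=1)
    # Each partial result is a slice of the output; concatenating along the
    # feature axis reconstructs the full x @ w.
    return np.concatenate([x @ p for p in parts], axis=1)
```

Because each shard's output columns are independent, no communication is needed until the concatenation step, which is why this decomposition parallelizes well.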
A Bob plugin for text translation, text polishing, and grammar correction based on the ChatGPT API. Let's welcome a new era that doesn't need the Tower of Babel!
LightSeq: A High Performance Library for Sequence Processing and Generation
Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
Tensor search for humans.
RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (i.e., training is parallelizable), so it combines the best of RNNs and transformers: great performance, fast inference, low VRAM usage, fast training, "infinite" ctx_len, and free sentence embeddings.
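The "trainable like a GPT, runs like an RNN" property rests on the fact that a linear recurrence has a closed form over the whole sequence. A toy sketch of that duality for a scalar decay recurrence h_t = a·h_{t-1} + b_t (RWKV's actual time-mixing is a more elaborate weighted key-value recurrence; this only shows the shape of the idea):

```python
import numpy as np

def recurrent_mode(b, a=0.9):
    # Sequential scan with O(1) state per step -- how an RNN runs at inference.
    h, out = 0.0, []
    for b_t in b:
        h = a * h + b_t
        out.append(h)
    return np.array(out)

def parallel_mode(b, a=0.9):
    # Closed form h_t = sum_{s <= t} a^(t-s) * b_s, computed for all t at
    # once -- the parallelizable, GPT-style training view.
    t = np.arange(len(b))
    decay = a ** (t[:, None] - t[None, :])  # a^(t-s) for every (t, s) pair
    decay = np.tril(decay)                  # causal mask: only s <= t contribute
    return decay @ b
```

Both functions produce identical outputs; training uses the parallel form, inference the recurrent one.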
Transformer-related optimizations, including BERT and GPT
A client implementation for ChatGPT and Bing AI. Available as a Node.js module, REST API server, and CLI app.
Notes for software engineers getting up to speed on new AI developments. Serves as a datastore for lspace.swyx.io writing and product brainstorming, with cleaned-up canonical references under the /Resources folder.
An unnecessarily tiny implementation of GPT-2 in NumPy.
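The heart of any NumPy GPT-2 implementation is causal self-attention, which fits in a few lines. A minimal sketch (hypothetical function and parameter names, not the repo's actual code; single head, no biases):

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def causal_self_attention(x, w_q, w_k, w_v):
    # x: [seq_len, d_model]; w_q/w_k/w_v: [d_model, d_head] projections.
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.T / np.sqrt(k.shape[-1])  # scaled dot-product scores
    # Causal mask: position i may only attend to positions j <= i.
    mask = np.triu(np.ones(scores.shape, dtype=bool), k=1)
    scores[mask] = -1e10
    return softmax(scores) @ v               # [seq_len, d_head]
```

Note that the first output row attends only to the first token, so it equals that token's value projection exactly.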
Rust native ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2,...)