The Reliable USB Formatting Utility
An implementation of model-parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library.
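Since this entry centers on model parallelism, a minimal sketch may help. It shows the core idea of splitting a layer's weight matrix across devices in plain NumPy; all names here are illustrative, not the mesh-tensorflow API:

```python
import numpy as np

# Conceptual sketch of model (tensor) parallelism: a layer's weight
# matrix is split column-wise across "devices", each device computes
# its shard of the matmul, and the partial outputs are concatenated.

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 16))          # batch of activations
W = rng.normal(size=(16, 32))         # full weight matrix

shards = np.split(W, 2, axis=1)       # column split across 2 "devices"
partials = [x @ shard for shard in shards]   # each device's local matmul
y = np.concatenate(partials, axis=1)  # gather the sharded outputs

assert np.allclose(y, x @ W)          # same result as the unsharded layer
```

Splitting this way means no single device ever holds the full weight matrix, which is what makes GPT-3-scale layers fit in memory.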
LightSeq: A High Performance Library for Sequence Processing and Generation
Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
Tensor search for humans.
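As a rough illustration of what "tensor search" means, the sketch below embeds documents and a query as vectors and ranks by similarity; `embed` is a hypothetical stand-in for a trained encoder, not Marqo's API:

```python
import hashlib
import numpy as np

def embed(text: str, dim: int = 8) -> np.ndarray:
    # Stand-in embedding: a stable hash seeds a random unit vector.
    # A real system would use a trained text encoder instead.
    seed = int.from_bytes(hashlib.md5(text.encode()).digest()[:4], "little")
    v = np.random.default_rng(seed).normal(size=dim)
    return v / np.linalg.norm(v)

docs = [
    "The Reliable USB Formatting Utility",
    "Tensor search for humans.",
    "Transformer-related optimization, including BERT and GPT",
]
doc_vecs = np.stack([embed(d) for d in docs])   # index: one vector per doc

query_vec = embed("vector search")
scores = doc_vecs @ query_vec                   # cosine similarity (unit vectors)
ranked = sorted(zip(scores, docs), reverse=True)
print(ranked[0])                                # best match under this toy embedding
```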
Transformer-related optimization, including BERT and GPT.
Notes for software engineers getting up to speed on new AI developments. Serves as a datastore for lspace.swyx.io writing and product brainstorming, with cleaned-up canonical references under the /Resources folder.
RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable), combining the best of RNNs and transformers: great performance, fast inference, low VRAM usage, fast training, "infinite" ctx_len, and free sentence embeddings.
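To make that RNN/transformer duality concrete, here is a minimal sketch assuming a simplified, single-channel WKV-style recurrence (no bonus term for the current token, unlike real RWKV). It shows the same computation running recurrently, with O(1) state per step for inference, and in parallel over all timesteps for GPT-style training:

```python
import numpy as np

# Toy WKV-style recurrence. `w` (time decay), `k` (keys), and `v`
# (values) are illustrative stand-ins for RWKV's learned channels.

rng = np.random.default_rng(0)
T = 8
w = 0.5                      # per-channel time decay (> 0)
k = rng.normal(size=T)       # "keys"
v = rng.normal(size=T)       # "values"

# Recurrent form: carry two scalars of state across timesteps.
a = b = 0.0
recurrent = []
for t in range(T):
    a = np.exp(-w) * a + np.exp(k[t]) * v[t]
    b = np.exp(-w) * b + np.exp(k[t])
    recurrent.append(a / b)

# Parallel form: weight every position up to t by its decayed key at once.
parallel = []
for t in range(T):
    decay = np.exp(-w * (t - np.arange(t + 1)) + k[: t + 1])
    parallel.append(decay @ v[: t + 1] / decay.sum())

assert np.allclose(recurrent, parallel)  # identical outputs
```

The assertion confirms both forms produce the same outputs, which is the property that lets a model like this train in parallel like a GPT yet decode with constant-size state like an RNN.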
Rust-native, ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT-2, ...).
A client implementation for ChatGPT and Bing. Available as a Node.js module, REST API server, and CLI app.