The official GitHub page for the survey paper "A Survey of Large Language Models".
Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
Easy-to-use fine-tuning framework using PEFT (PT+SFT+RLHF with QLoRA) (LLaMA-2, BLOOM, Falcon, Baichuan, Qwen)
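As a rough orientation for what QLoRA-style PEFT fine-tuning involves, here is a minimal sketch assuming the Hugging Face transformers and peft libraries; the checkpoint name and hyperparameters are illustrative, not this framework's defaults:

```python
# Minimal QLoRA-style setup (an illustrative sketch, not this framework's
# actual API): load a 4-bit quantized base model and attach LoRA adapters,
# so only the small low-rank matrices are trained.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

model_name = "meta-llama/Llama-2-7b-hf"  # assumed example checkpoint

# 4-bit NF4 quantization keeps the frozen base weights small (the "Q" in QLoRA).
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
model = AutoModelForCausalLM.from_pretrained(model_name, quantization_config=bnb_config)
model = prepare_model_for_kbit_training(model)

lora_config = LoraConfig(
    r=16, lora_alpha=32, lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # typical attention projections in LLaMA-style models
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # usually well under 1% of total parameters
```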
Papers about pre-training and self-supervised learning on Graph Neural Networks (GNNs).
Awesome Graph Self-Supervised Learning
Oscar and VinVL
Awesome resources for in-context learning and prompt engineering with LLMs such as ChatGPT, GPT-3, and Flan-T5, kept up to date with the latest developments.
Pre-training of Deep Bidirectional Transformers for Language Understanding (BERT), applied here to pre-train TextCNN.
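BERT-style pre-training centers on the masked-language-model objective: randomly mask tokens and train the encoder to recover them. A toy masking step (a generic illustration, not this repository's code):

```python
# Toy BERT-style masked-language-model step (generic illustration): mask a
# fraction of token ids and keep the originals as labels for reconstruction.
import torch

def mask_tokens(input_ids: torch.Tensor, mask_id: int, mask_prob: float = 0.15):
    labels = input_ids.clone()
    mask = torch.rand(input_ids.shape, device=input_ids.device) < mask_prob
    labels[~mask] = -100                 # ignore unmasked positions in the loss
    masked = input_ids.clone()
    masked[mask] = mask_id               # replace chosen positions with [MASK]
    return masked, labels

ids = torch.randint(5, 1000, (2, 16))    # fake token ids for demonstration
masked, labels = mask_tokens(ids, mask_id=103)  # 103 is [MASK] in bert-base-uncased
```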
Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo
Awesome list for research on CLIP (Contrastive Language-Image Pre-Training).
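For orientation, a minimal zero-shot classification sketch with a pre-trained CLIP checkpoint, assuming the Hugging Face transformers implementation; the image path and prompts are illustrative:

```python
# Zero-shot image classification with a pre-trained CLIP checkpoint: score an
# image against text prompts by similarity in the shared embedding space.
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

image = Image.open("example.jpg")  # assumed local image
prompts = ["a photo of a cat", "a photo of a dog"]

inputs = processor(text=prompts, images=image, return_tensors="pt", padding=True)
outputs = model(**inputs)
# logits_per_image holds image-text similarity scores, one column per prompt.
probs = outputs.logits_per_image.softmax(dim=-1)
print(dict(zip(prompts, probs[0].tolist())))
```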
Research code for ECCV 2020 paper "UNITER: UNiversal Image-TExt Representation Learning"
Code for ICLR 2020 paper "VL-BERT: Pre-training of Generic Visual-Linguistic Representations".
An open-source knowledgeable large language model framework.
[NeurIPS 2020] "Graph Contrastive Learning with Augmentations" by Yuning You, Tianlong Chen, Yongduo Sui, Ting Chen, Zhangyang Wang, Yang Shen
Code for KDD'20 "Generative Pre-Training of Graph Neural Networks"
Multi-modality pre-training
GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training @ KDD 2020
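The contrastive graph pre-training entries above (GraphCL, GCC) share one core idea: embed two augmented views of a graph and pull matched pairs together while pushing other batch members apart. A simplified InfoNCE-style loss sketch in PyTorch (a generic illustration, not code from these repositories):

```python
# Simplified InfoNCE-style contrastive loss (generic illustration): z1[i] and
# z2[i] are embeddings of two augmented views of graph i; each row's positive
# is its counterpart, and the other rows in the batch act as negatives.
import torch
import torch.nn.functional as F

def contrastive_loss(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.5):
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature            # (B, B) cosine-similarity matrix
    targets = torch.arange(z1.size(0), device=z1.device)
    # Symmetrize: match view 1 to view 2 and view 2 to view 1.
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))

# Example: a batch of 8 graph-level embeddings of width 128.
loss = contrastive_loss(torch.randn(8, 128), torch.randn(8, 128))
```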
Probing the representations of Vision Transformers.
Conceptual 12M is a dataset containing (image-URL, caption) pairs collected for vision-and-language pre-training.
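Because the dataset ships as (image-URL, caption) pairs rather than image files, a typical first step is fetching the images yourself. A minimal download sketch, assuming one URL-tab-caption pair per line; the file names are illustrative:

```python
# Fetch images for (image-URL, caption) pairs stored as a tab-separated file
# (illustrative sketch; "cc12m.tsv" is an assumed local file name).
import csv
import pathlib
import requests

out_dir = pathlib.Path("cc12m_images")
out_dir.mkdir(exist_ok=True)

with open("cc12m.tsv", newline="", encoding="utf-8") as f:
    for i, row in enumerate(csv.reader(f, delimiter="\t")):
        url, _caption = row[0], row[1]
        try:
            resp = requests.get(url, timeout=10)
            resp.raise_for_status()
            (out_dir / f"{i:08d}.jpg").write_bytes(resp.content)
        except requests.RequestException:
            continue  # dead links are common in web-scraped datasets
```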
(CVPR 2021) Kaleido-BERT: Vision-Language Pre-training on Fashion Domain.