
NExT-GPT: Any-to-Any Multimodal LLM

Shengqiong Wu, Hao Fei*, Leigang Qu, Wei Ji, and Tat-Seng Chua. (*Correspondence)

NExT++, School of Computing, National University of Singapore



This repository hosts the code, data, and model weights of NExT-GPT, the first end-to-end MM-LLM that perceives input and generates output in arbitrary combinations (any-to-any) of text, image, video, and audio, and beyond.


🎉 News

  • [2023.09.15] 🚀🚀 Release the code of NExT-GPT in version 7b_tiva_v0.

👉 TODO

  • Release checkpoints (projection layers).
  • Release MosIT data.
  • Update NExT-GPT with more types & sizes of LLMs.
  • Empower NExT-GPT with more input & output modalities.
  • ...

Example Demos

Here we showcase examples generated from NExT-GPT. For more examples, kindly visit the webpage, or the online live demo.

example_5_Trim.mp4
example_6_Trim.mp4
example_9_Trim.mp4

Brief Introduction

NExT-GPT is built on top of an existing pre-trained LLM, a multimodal encoder, and SoTA diffusion models, with sufficient end-to-end instruction tuning.


  • Multimodal Encoding Stage. Established encoders encode inputs in various modalities; a projection layer then maps these representations into language-like representations comprehensible to the LLM.
  • LLM Understanding and Reasoning Stage. An existing open-sourced LLM serves as the core to process the input information for semantic understanding and reasoning. Besides text tokens, the LLM also produces unique “modality signal” tokens that instruct the decoding layers on whether, and which, modal content to output.
  • Multimodal Generation Stage. Given the multimodal signals with specific instructions from the LLM (if any), Transformer-based output projection layers map the signal-token representations into representations understandable to the downstream multimodal decoders.
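As a toy illustration of this three-stage flow (plain Python with dummy components; every name below is illustrative and not the actual NExT-GPT API):

```python
# Toy sketch of the encode -> reason -> decode flow. The real system uses
# ImageBind as the encoder, Vicuna as the LLM, and diffusion models as
# decoders; everything here is a stand-in for illustration only.

SIGNAL_TOKENS = {"[IMG]": "image", "[AUD]": "audio", "[VID]": "video"}

def encode(inputs):
    """Stage 1: project each (modality, payload) input into an LLM-readable token."""
    return [f"<{modality}:{payload}>" for modality, payload in inputs]

def llm(tokens, want=None):
    """Stage 2: dummy LLM emitting text plus an optional modality signal token."""
    out = ["Got", str(len(tokens)), "input(s)."]
    if want is not None:
        out.append(want)  # e.g. "[VID]" tells the video decoder to fire
    return out

def decode(llm_output):
    """Stage 3: route signal tokens to the matching (dummy) decoder."""
    text = " ".join(t for t in llm_output if t not in SIGNAL_TOKENS)
    media = [SIGNAL_TOKENS[t] for t in llm_output if t in SIGNAL_TOKENS]
    return text, media

text, media = decode(llm(encode([("image", "cat.png")]), want="[VID]"))
print(text, media)  # Got 1 input(s). ['video']
```

The point is the control flow: the LLM's ordinary text stream carries special signal tokens, and the generation stage only invokes a decoder when the corresponding token appears.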

For more technical details, kindly refer to the paper.


Getting Started

Table of Contents:


1. Code Structure

├── figures
├── data
│   ├── T-X_pair_data  
│   │   ├── audiocap                      # text-audio pairs data
│   │   │   ├── audios                    # audio files
│   │   │   └── audiocap.json             # the audio captions
│   │   ├── cc3m                          # text-image pairs data
│   │   │   ├── images                    # image files
│   │   │   └── cc3m.json                 # the image captions
│   │   └── webvid                        # text-video pairs data
│   │       ├── videos                    # video files
│   │       └── webvid.json               # the video captions
│   ├── IT_data                           # instruction data
│   │   ├── T+X-T_data                    # text+[image/audio/video] to text instruction data
│   │   │   ├── alpaca                    # textual instruction data
│   │   │   └── llava                     # visual instruction data
│   │   ├── T-T+X                         # synthesized text to text+[image/audio/video] instruction data
│   │   └── MosIT                         # Modality-switching Instruction Tuning instruction data
├── code
│   ├── config
│   │   ├── base.yaml                     # the model configuration 
│   │   ├── stage_1.yaml                  # enc-side alignment training configuration
│   │   ├── stage_2.yaml                  # dec-side alignment training configuration
│   │   └── stage_3.yaml                  # instruction-tuning configuration
│   ├── dsconfig
│   │   ├── stage_1.json                  # deepspeed configuration for enc-side alignment training
│   │   ├── stage_2.json                  # deepspeed configuration for dec-side alignment training
│   │   └── stage_3.json                  # deepspeed configuration for instruction-tuning training
│   ├── dataset
│   │   ├── base_dataset.py
│   │   ├── cc3m_dataset.py               # process and load text-image pair dataset
│   │   ├── audiocap_dataset.py           # process and load text-audio pair dataset
│   │   ├── webvid_dataset.py             # process and load text-video pair dataset
│   │   └── instruction_dataset.py        # process and load instruction pair dataset
│   ├── model                     
│   │   ├── ImageBind                     # the code from ImageBind Model
│   │   ├── common
│   │   ├── anyToImageVideoAudio.py       # the main model file
│   │   ├── agent.py
│   │   ├── modeling_llama.py
│   │   ├── custom_ad.py                  # the audio diffusion 
│   │   ├── custom_sd.py                  # the image diffusion
│   │   ├── custom_vd.py                  # the video diffusion
│   │   ├── layers.py                     # the output projection layers
│   │   └── ...  
│   ├── scripts
│   │   ├── train.sh                      # training NExT-GPT script
│   │   └── app.sh                        # deploying demo script
│   ├── header.py
│   ├── process_embeddings.py             # precompute the captions embeddings
│   ├── train.py                          # training
│   ├── inference.py                      # inference
│   ├── demo_app.py                       # deploy Gradio demonstration 
│   └── ...
├── ckpt                           
│   ├── delta_ckpt                        # tunable NExT-GPT params
│   │   ├── nextgpt         
│   │   │   ├── 7b_tiva_v0                # the directory to save the log file
│   │   │   │   ├── log                   # the logs
│   │   └── ...       
│   └── pretrained_ckpt                   # frozen params of pretrained modules
│       ├── imagebind_ckpt
│       │   ├── huge                      # version
│       │   │   └── imagebind_huge.pth
│       └── vicuna_ckpt
│           ├── 7b_v0                     # version
│           │   ├── config.json
│           │   ├── pytorch_model-00001-of-00002.bin
│           │   ├── tokenizer.model
│           │   └── ...
├── LICENCE.md
├── README.md
└── requirements.txt

2. Environment Preparation [Back to Top]

Please first clone the repo and install the required environment, which can be done by running the following commands:

conda create -n nextgpt python=3.8

conda activate nextgpt

# CUDA 11.6
conda install pytorch==1.13.1 torchvision==0.14.1 torchaudio==0.13.1 pytorch-cuda=11.6 -c pytorch -c nvidia

git clone https://github.com/NExT-GPT/NExT-GPT.git
cd NExT-GPT

pip install -r requirements.txt

3. Training/Adapting NExT-GPT on Your Own

3.1. Preparing Pre-trained Checkpoint [Back to Top]

NExT-GPT is trained based on the following excellent existing models. Please follow the instructions to prepare the checkpoints.

  • ImageBind is the unified image/video/audio encoder. The pre-trained checkpoint can be downloaded from here with version huge. Afterward, put the imagebind_huge.pth file at [./ckpt/pretrained_ckpt/imagebind_ckpt/huge].
  • Vicuna: first prepare the LLaMA by following the instructions [here]. Then put the pre-trained model at [./ckpt/pretrained_ckpt/vicuna_ckpt/].
  • Image Diffusion is used to generate images. NExT-GPT uses Stable Diffusion with version v1-5. (will be automatically downloaded)
  • Audio Diffusion for producing audio content. NExT-GPT employs AudioLDM with version l-full. (will be automatically downloaded)
  • Video Diffusion for the video generation. We employ ZeroScope with version v2_576w. (will be automatically downloaded)
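Before moving on, it can help to verify that the manually prepared checkpoints sit where the training code expects them. A small hypothetical helper, following the layout in the Code Structure section (only the manually placed files are checked, since the diffusion models download automatically; Vicuna also has more shards than listed here):

```python
import os

# Files that must be placed manually, per the layout in the
# Code Structure section above.
REQUIRED = [
    "ckpt/pretrained_ckpt/imagebind_ckpt/huge/imagebind_huge.pth",
    "ckpt/pretrained_ckpt/vicuna_ckpt/7b_v0/config.json",
    "ckpt/pretrained_ckpt/vicuna_ckpt/7b_v0/tokenizer.model",
]

def missing_checkpoints(root="."):
    """Return the required checkpoint files that are absent under root."""
    return [p for p in REQUIRED if not os.path.isfile(os.path.join(root, p))]

if __name__ == "__main__":
    for path in missing_checkpoints():
        print("missing:", path)
```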

3.2. Preparing Dataset [Back to Top]

Please download the following datasets used for model training:

A) T-X pairs data

B) Instruction data

3.3. Precomputing Embeddings [Back to Top]

In decoding-side alignment training, we minimize the distance between the representation of signal tokens and captions. To save time and memory, we precompute the text embeddings for image, audio, and video captions using the text encoder within the respective diffusion models.

Please run this command before training NExT-GPT; the produced embedding file will be saved at [./data/embed].

cd ./code/
python process_embeddings.py ../data/T-X_pair_data/cc3m/cc3m.json image ../data/embed/ runwayml/stable-diffusion-v1-5

Note of arguments:

  • args[1]: path of caption file;
  • args[2]: modality, which can be image, video, or audio;
  • args[3]: saving path of embedding file;
  • args[4]: corresponding pre-trained diffusion model name.
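Conceptually, this step iterates over the caption file, embeds each caption with the matching diffusion model's text encoder, and saves the results. A stripped-down sketch, assuming the caption file is a JSON list of records with a caption field; `embed_fn` is a stand-in for the real text encoder (e.g. Stable Diffusion's CLIP text encoder for images), and the file naming is illustrative:

```python
import json
import os
import pickle

def precompute_embeddings(caption_file, modality, save_dir, embed_fn):
    """Embed every caption with embed_fn and pickle the results.

    embed_fn is a placeholder for the text encoder of the corresponding
    diffusion model named by args[4].
    """
    with open(caption_file) as f:
        records = json.load(f)
    # One embedding per caption, keyed by the caption text.
    embeds = {r["caption"]: embed_fn(r["caption"]) for r in records}
    os.makedirs(save_dir, exist_ok=True)
    out_path = os.path.join(save_dir, f"caption_embeddings_{modality}.pkl")
    with open(out_path, "wb") as f:
        pickle.dump(embeds, f)
    return out_path
```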

3.4. Training NExT-GPT [Back to Top]

First of all, please refer to the base configuration file [./code/config/base.yaml] for the basic system setting of overall modules.

Then, the training of NExT-GPT starts with this script:

cd ./code
bash scripts/train.sh

The script runs the following command:

deepspeed --include localhost:0 --master_addr 127.0.0.1 --master_port 28459 train.py \
    --model nextgpt \
    --stage 1 \
    --dataset cc3m \
    --data_path ../data/T-X_pair_data/cc3m/cc3m.json \
    --mm_root_path ../data/T-X_pair_data/cc3m/images/ \
    --embed_path ../data/embed/ \
    --save_path ../ckpt/delta_ckpt/nextgpt/7b/ \
    --log_path ../ckpt/delta_ckpt/nextgpt/7b/log/

where the key arguments are:

  • --include: localhost:0 indicates that DeepSpeed runs on GPU 0 of the local host.
  • --stage: the training stage.
  • --dataset: the name of the dataset used for training.
  • --data_path: the path of the training data file.
  • --mm_root_path: the path of the image/video/audio files.
  • --embed_path: the path of the precomputed text embedding files.
  • --save_path: the directory for saving the trained delta weights. This directory will be created automatically.
  • --log_path: the directory for saving the log files.
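Since the same train.py is reused across all three training stages with only a few arguments changing, the per-stage invocations can be summarized with a small hypothetical helper (paths follow the example command above):

```python
def train_args(stage, dataset, mm_dir):
    """Compose the stage-specific train.py arguments (illustrative only)."""
    data_root = f"../data/T-X_pair_data/{dataset}"
    return [
        "--model", "nextgpt",
        "--stage", str(stage),
        "--dataset", dataset,
        "--data_path", f"{data_root}/{dataset}.json",
        "--mm_root_path", f"{data_root}/{mm_dir}/",
        "--embed_path", "../data/embed/",
        "--save_path", "../ckpt/delta_ckpt/nextgpt/7b/",
        "--log_path", "../ckpt/delta_ckpt/nextgpt/7b/log/",
    ]

# e.g. stage-1 alignment on the text-video pairs:
print(" ".join(train_args(1, "webvid", "videos")))
```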

The whole NExT-GPT training involves 3 steps:

  • Step-1: Encoding-side LLM-centric Multimodal Alignment. This stage trains the input projection layer while freezing ImageBind, the LLM, and the output projection layers.

    Just run the above train.sh script by setting:

    • --stage 1
    • --dataset x, where x is one of [cc3m, webvid, audiocap]
    • --data_path ../.../xxx.json, where xxx is the file name of the data in [./data/T-X_pair_data]
    • --mm_root_path .../.../x, where x is one of [images, audios, videos]

    Also refer to the running config file [./code/config/stage_1.yaml] and the deepspeed config file [./code/dsconfig/stage_1.json] for more step-wise configurations.

  • Step-2: Decoding-side Instruction-following Alignment. This stage trains the output projection layers while freezing ImageBind, the LLM, and the input projection layers.

    Just run the above train.sh script by setting:

    • --stage 2
    • --dataset x, where x is one of [cc3m, webvid, audiocap]
    • --data_path ../.../xxx.json, where xxx is the file name of the data in [./data/T-X_pair_data]
    • --mm_root_path .../.../x, where x is one of [images, audios, videos]

    Also refer to the running config file [./code/config/stage_2.yaml] and the deepspeed config file [./code/dsconfig/stage_2.json] for more step-wise configurations.

  • Step-3: Instruction Tuning. This stage instruction-tunes 1) the LLM via LoRA, 2) the input projection layer, and 3) the output projection layers on the instruction dataset.

    Just run the above train.sh script by setting:

    • --stage 3
    • --data_path ../.../xxx.json, pointing to the instruction data in [./data/IT_data]

    Also refer to the running config file [./code/config/stage_3.yaml] and the deepspeed config file [./code/dsconfig/stage_3.json] for more step-wise configurations.

4. Running NExT-GPT System [Back to Top]

4.1. Preparing Checkpoints

First, load the pre-trained NExT-GPT checkpoints.

4.2. Deploying Gradio Demo

Upon completion of the checkpoint loading, you can run the demo locally via:

cd ./code
bash scripts/app.sh

with the key argument:

  • --nextgpt_ckpt_path: the path of pre-trained NExT-GPT params.

Contact

For any questions or feedback, feel free to contact Shengqiong Wu and Hao Fei.

Citation

If you find NExT-GPT useful in your research or applications, please kindly cite:

@article{wu2023nextgpt,
  title={NExT-GPT: Any-to-Any Multimodal LLM},
  author={Shengqiong Wu and Hao Fei and Leigang Qu and Wei Ji and Tat-Seng Chua},
  journal = {CoRR},
  volume = {abs/2309.05519},
  year={2023}
}

Acknowledgements

You may refer to the related work that serves as foundations for our framework and code repository: Vicuna, ImageBind, Stable Diffusion, AudioLDM, and ZeroScope. We also partially draw inspiration from PandaGPT, VPGTrans, GILL, CoDi, Video-LLaMA, and MiniGPT-4. Thanks for their wonderful work.

License Notices

This repository is under BSD 3-Clause License. NExT-GPT is a research project intended for non-commercial use only. One must NOT use the code of NExT-GPT for any illegal, harmful, violent, racist, or sexual purposes. One is strictly prohibited from engaging in any activity that will potentially violate these guidelines. Any potential commercial use of this code should be approved by the authors.

Popular repositories

  1. NExT-GPT Public

    Code and models for NExT-GPT: Any-to-Any Multimodal Large Language Model

    Python 1k 90

  2. NExT-GPT: Any-to-Any Multimodal Large Language Model

    HTML 13 1

16 contributions in the last year

Contribution Graph
Day of Week September October November December January February March April May June July August September
Sunday No contributions on Sunday, September 18, 2022 No contributions on Sunday, September 25, 2022 No contributions on Sunday, October 2, 2022 No contributions on Sunday, October 9, 2022 No contributions on Sunday, October 16, 2022 No contributions on Sunday, October 23, 2022 No contributions on Sunday, October 30, 2022 No contributions on Sunday, November 6, 2022 No contributions on Sunday, November 13, 2022 No contributions on Sunday, November 20, 2022 No contributions on Sunday, November 27, 2022 No contributions on Sunday, December 4, 2022 No contributions on Sunday, December 11, 2022 No contributions on Sunday, December 18, 2022 No contributions on Sunday, December 25, 2022 No contributions on Sunday, January 1, 2023 No contributions on Sunday, January 8, 2023 No contributions on Sunday, January 15, 2023 No contributions on Sunday, January 22, 2023 No contributions on Sunday, January 29, 2023 No contributions on Sunday, February 5, 2023 No contributions on Sunday, February 12, 2023 No contributions on Sunday, February 19, 2023 No contributions on Sunday, February 26, 2023 No contributions on Sunday, March 5, 2023 No contributions on Sunday, March 12, 2023 No contributions on Sunday, March 19, 2023 No contributions on Sunday, March 26, 2023 No contributions on Sunday, April 2, 2023 No contributions on Sunday, April 9, 2023 No contributions on Sunday, April 16, 2023 No contributions on Sunday, April 23, 2023 No contributions on Sunday, April 30, 2023 No contributions on Sunday, May 7, 2023 No contributions on Sunday, May 14, 2023 No contributions on Sunday, May 21, 2023 No contributions on Sunday, May 28, 2023 No contributions on Sunday, June 4, 2023 No contributions on Sunday, June 11, 2023 No contributions on Sunday, June 18, 2023 No contributions on Sunday, June 25, 2023 No contributions on Sunday, July 2, 2023 No contributions on Sunday, July 9, 2023 No contributions on Sunday, July 16, 2023 No contributions on Sunday, July 23, 2023 No contributions on 
Sunday, July 30, 2023 No contributions on Sunday, August 6, 2023 No contributions on Sunday, August 13, 2023 No contributions on Sunday, August 20, 2023 No contributions on Sunday, August 27, 2023 No contributions on Sunday, September 3, 2023 No contributions on Sunday, September 10, 2023 No contributions on Sunday, September 17, 2023
Monday No contributions on Monday, September 19, 2022 No contributions on Monday, September 26, 2022 No contributions on Monday, October 3, 2022 No contributions on Monday, October 10, 2022 No contributions on Monday, October 17, 2022 No contributions on Monday, October 24, 2022 No contributions on Monday, October 31, 2022 No contributions on Monday, November 7, 2022 No contributions on Monday, November 14, 2022 No contributions on Monday, November 21, 2022 No contributions on Monday, November 28, 2022 No contributions on Monday, December 5, 2022 No contributions on Monday, December 12, 2022 No contributions on Monday, December 19, 2022 No contributions on Monday, December 26, 2022 No contributions on Monday, January 2, 2023 No contributions on Monday, January 9, 2023 No contributions on Monday, January 16, 2023 No contributions on Monday, January 23, 2023 No contributions on Monday, January 30, 2023 No contributions on Monday, February 6, 2023 No contributions on Monday, February 13, 2023 No contributions on Monday, February 20, 2023 No contributions on Monday, February 27, 2023 No contributions on Monday, March 6, 2023 No contributions on Monday, March 13, 2023 No contributions on Monday, March 20, 2023 No contributions on Monday, March 27, 2023 No contributions on Monday, April 3, 2023 No contributions on Monday, April 10, 2023 No contributions on Monday, April 17, 2023 No contributions on Monday, April 24, 2023 No contributions on Monday, May 1, 2023 No contributions on Monday, May 8, 2023 No contributions on Monday, May 15, 2023 No contributions on Monday, May 22, 2023 No contributions on Monday, May 29, 2023 No contributions on Monday, June 5, 2023 No contributions on Monday, June 12, 2023 No contributions on Monday, June 19, 2023 No contributions on Monday, June 26, 2023 No contributions on Monday, July 3, 2023 No contributions on Monday, July 10, 2023 No contributions on Monday, July 17, 2023 No contributions on Monday, July 24, 2023 No contributions on 
Monday, July 31, 2023 No contributions on Monday, August 7, 2023 No contributions on Monday, August 14, 2023 No contributions on Monday, August 21, 2023 No contributions on Monday, August 28, 2023 No contributions on Monday, September 4, 2023 No contributions on Monday, September 11, 2023 No contributions on Monday, September 18, 2023
Tuesday No contributions on Tuesday, September 20, 2022 No contributions on Tuesday, September 27, 2022 No contributions on Tuesday, October 4, 2022 No contributions on Tuesday, October 11, 2022 No contributions on Tuesday, October 18, 2022 No contributions on Tuesday, October 25, 2022 No contributions on Tuesday, November 1, 2022 No contributions on Tuesday, November 8, 2022 No contributions on Tuesday, November 15, 2022 No contributions on Tuesday, November 22, 2022 No contributions on Tuesday, November 29, 2022 No contributions on Tuesday, December 6, 2022 No contributions on Tuesday, December 13, 2022 No contributions on Tuesday, December 20, 2022 No contributions on Tuesday, December 27, 2022 No contributions on Tuesday, January 3, 2023 No contributions on Tuesday, January 10, 2023 No contributions on Tuesday, January 17, 2023 No contributions on Tuesday, January 24, 2023 No contributions on Tuesday, January 31, 2023 No contributions on Tuesday, February 7, 2023 No contributions on Tuesday, February 14, 2023 No contributions on Tuesday, February 21, 2023 No contributions on Tuesday, February 28, 2023 No contributions on Tuesday, March 7, 2023 No contributions on Tuesday, March 14, 2023 No contributions on Tuesday, March 21, 2023 No contributions on Tuesday, March 28, 2023 No contributions on Tuesday, April 4, 2023 No contributions on Tuesday, April 11, 2023 No contributions on Tuesday, April 18, 2023 No contributions on Tuesday, April 25, 2023 No contributions on Tuesday, May 2, 2023 No contributions on Tuesday, May 9, 2023 No contributions on Tuesday, May 16, 2023 No contributions on Tuesday, May 23, 2023 No contributions on Tuesday, May 30, 2023 No contributions on Tuesday, June 6, 2023 No contributions on Tuesday, June 13, 2023 No contributions on Tuesday, June 20, 2023 No contributions on Tuesday, June 27, 2023 No contributions on Tuesday, July 4, 2023 No contributions on Tuesday, July 11, 2023 No contributions on Tuesday, July 18, 2023 No contributions on 
Tuesday, July 25, 2023 No contributions on Tuesday, August 1, 2023 No contributions on Tuesday, August 8, 2023 No contributions on Tuesday, August 15, 2023 No contributions on Tuesday, August 22, 2023 No contributions on Tuesday, August 29, 2023 2 contributions on Tuesday, September 5, 2023 No contributions on Tuesday, September 12, 2023 No contributions on Tuesday, September 19, 2023
Wednesday No contributions on Wednesday, September 21, 2022 No contributions on Wednesday, September 28, 2022 No contributions on Wednesday, October 5, 2022 No contributions on Wednesday, October 12, 2022 No contributions on Wednesday, October 19, 2022 No contributions on Wednesday, October 26, 2022 No contributions on Wednesday, November 2, 2022 No contributions on Wednesday, November 9, 2022 No contributions on Wednesday, November 16, 2022 No contributions on Wednesday, November 23, 2022 No contributions on Wednesday, November 30, 2022 No contributions on Wednesday, December 7, 2022 No contributions on Wednesday, December 14, 2022 No contributions on Wednesday, December 21, 2022 No contributions on Wednesday, December 28, 2022 No contributions on Wednesday, January 4, 2023 No contributions on Wednesday, January 11, 2023 No contributions on Wednesday, January 18, 2023 No contributions on Wednesday, January 25, 2023 No contributions on Wednesday, February 1, 2023 No contributions on Wednesday, February 8, 2023 No contributions on Wednesday, February 15, 2023 No contributions on Wednesday, February 22, 2023 No contributions on Wednesday, March 1, 2023 No contributions on Wednesday, March 8, 2023 No contributions on Wednesday, March 15, 2023 No contributions on Wednesday, March 22, 2023 No contributions on Wednesday, March 29, 2023 No contributions on Wednesday, April 5, 2023 No contributions on Wednesday, April 12, 2023 No contributions on Wednesday, April 19, 2023 No contributions on Wednesday, April 26, 2023 No contributions on Wednesday, May 3, 2023 No contributions on Wednesday, May 10, 2023 No contributions on Wednesday, May 17, 2023 No contributions on Wednesday, May 24, 2023 No contributions on Wednesday, May 31, 2023 No contributions on Wednesday, June 7, 2023 No contributions on Wednesday, June 14, 2023 No contributions on Wednesday, June 21, 2023 No contributions on Wednesday, June 28, 2023 No contributions on Wednesday, July 5, 2023 No contributions on 
Wednesday, July 12, 2023 No contributions on Wednesday, July 19, 2023 No contributions on Wednesday, July 26, 2023 No contributions on Wednesday, August 2, 2023 No contributions on Wednesday, August 9, 2023 No contributions on Wednesday, August 16, 2023 No contributions on Wednesday, August 23, 2023 13 contributions on Wednesday, August 30, 2023 No contributions on Wednesday, September 6, 2023 No contributions on Wednesday, September 13, 2023 No contributions on Wednesday, September 20, 2023
Thursday No contributions on Thursday, September 22, 2022 No contributions on Thursday, September 29, 2022 No contributions on Thursday, October 6, 2022 No contributions on Thursday, October 13, 2022 No contributions on Thursday, October 20, 2022 No contributions on Thursday, October 27, 2022 No contributions on Thursday, November 3, 2022 No contributions on Thursday, November 10, 2022 No contributions on Thursday, November 17, 2022 No contributions on Thursday, November 24, 2022 No contributions on Thursday, December 1, 2022 No contributions on Thursday, December 8, 2022 No contributions on Thursday, December 15, 2022 No contributions on Thursday, December 22, 2022 No contributions on Thursday, December 29, 2022 No contributions on Thursday, January 5, 2023 No contributions on Thursday, January 12, 2023 No contributions on Thursday, January 19, 2023 No contributions on Thursday, January 26, 2023 No contributions on Thursday, February 2, 2023 No contributions on Thursday, February 9, 2023 No contributions on Thursday, February 16, 2023 No contributions on Thursday, February 23, 2023 No contributions on Thursday, March 2, 2023 No contributions on Thursday, March 9, 2023 No contributions on Thursday, March 16, 2023 No contributions on Thursday, March 23, 2023 No contributions on Thursday, March 30, 2023 No contributions on Thursday, April 6, 2023 No contributions on Thursday, April 13, 2023 No contributions on Thursday, April 20, 2023 No contributions on Thursday, April 27, 2023 No contributions on Thursday, May 4, 2023 No contributions on Thursday, May 11, 2023 No contributions on Thursday, May 18, 2023 No contributions on Thursday, May 25, 2023 No contributions on Thursday, June 1, 2023 No contributions on Thursday, June 8, 2023 No contributions on Thursday, June 15, 2023 No contributions on Thursday, June 22, 2023 No contributions on Thursday, June 29, 2023 No contributions on Thursday, July 6, 2023 No contributions on Thursday, July 13, 2023 No contributions on 
Thursday, July 20, 2023 No contributions on Thursday, July 27, 2023 No contributions on Thursday, August 3, 2023 No contributions on Thursday, August 10, 2023 No contributions on Thursday, August 17, 2023 No contributions on Thursday, August 24, 2023 No contributions on Thursday, August 31, 2023 No contributions on Thursday, September 7, 2023 No contributions on Thursday, September 14, 2023 No contributions on Thursday, September 21, 2023
Friday No contributions on Friday, September 23, 2022 No contributions on Friday, September 30, 2022 No contributions on Friday, October 7, 2022 No contributions on Friday, October 14, 2022 No contributions on Friday, October 21, 2022 No contributions on Friday, October 28, 2022 No contributions on Friday, November 4, 2022 No contributions on Friday, November 11, 2022 No contributions on Friday, November 18, 2022 No contributions on Friday, November 25, 2022 No contributions on Friday, December 2, 2022 No contributions on Friday, December 9, 2022 No contributions on Friday, December 16, 2022 No contributions on Friday, December 23, 2022 No contributions on Friday, December 30, 2022 No contributions on Friday, January 6, 2023 No contributions on Friday, January 13, 2023 No contributions on Friday, January 20, 2023 No contributions on Friday, January 27, 2023 No contributions on Friday, February 3, 2023 No contributions on Friday, February 10, 2023 No contributions on Friday, February 17, 2023 No contributions on Friday, February 24, 2023 No contributions on Friday, March 3, 2023 No contributions on Friday, March 10, 2023 No contributions on Friday, March 17, 2023 No contributions on Friday, March 24, 2023 No contributions on Friday, March 31, 2023 No contributions on Friday, April 7, 2023 No contributions on Friday, April 14, 2023 No contributions on Friday, April 21, 2023 No contributions on Friday, April 28, 2023 No contributions on Friday, May 5, 2023 No contributions on Friday, May 12, 2023 No contributions on Friday, May 19, 2023 No contributions on Friday, May 26, 2023 No contributions on Friday, June 2, 2023 No contributions on Friday, June 9, 2023 No contributions on Friday, June 16, 2023 No contributions on Friday, June 23, 2023 No contributions on Friday, June 30, 2023 No contributions on Friday, July 7, 2023 No contributions on Friday, July 14, 2023 No contributions on Friday, July 21, 2023 No contributions on Friday, July 28, 2023 No contributions on 
Friday, August 4, 2023 No contributions on Friday, August 11, 2023 No contributions on Friday, August 18, 2023 No contributions on Friday, August 25, 2023 No contributions on Friday, September 1, 2023 No contributions on Friday, September 8, 2023 1 contribution on Friday, September 15, 2023
Saturday No contributions on Saturday, September 24, 2022 No contributions on Saturday, October 1, 2022 No contributions on Saturday, October 8, 2022 No contributions on Saturday, October 15, 2022 No contributions on Saturday, October 22, 2022 No contributions on Saturday, October 29, 2022 No contributions on Saturday, November 5, 2022 No contributions on Saturday, November 12, 2022 No contributions on Saturday, November 19, 2022 No contributions on Saturday, November 26, 2022 No contributions on Saturday, December 3, 2022 No contributions on Saturday, December 10, 2022 No contributions on Saturday, December 17, 2022 No contributions on Saturday, December 24, 2022 No contributions on Saturday, December 31, 2022 No contributions on Saturday, January 7, 2023 No contributions on Saturday, January 14, 2023 No contributions on Saturday, January 21, 2023 No contributions on Saturday, January 28, 2023 No contributions on Saturday, February 4, 2023 No contributions on Saturday, February 11, 2023 No contributions on Saturday, February 18, 2023 No contributions on Saturday, February 25, 2023 No contributions on Saturday, March 4, 2023 No contributions on Saturday, March 11, 2023 No contributions on Saturday, March 18, 2023 No contributions on Saturday, March 25, 2023 No contributions on Saturday, April 1, 2023 No contributions on Saturday, April 8, 2023 No contributions on Saturday, April 15, 2023 No contributions on Saturday, April 22, 2023 No contributions on Saturday, April 29, 2023 No contributions on Saturday, May 6, 2023 No contributions on Saturday, May 13, 2023 No contributions on Saturday, May 20, 2023 No contributions on Saturday, May 27, 2023 No contributions on Saturday, June 3, 2023 No contributions on Saturday, June 10, 2023 No contributions on Saturday, June 17, 2023 No contributions on Saturday, June 24, 2023 No contributions on Saturday, July 1, 2023 No contributions on Saturday, July 8, 2023 No contributions on Saturday, July 15, 2023 No contributions on 
Saturday, July 22, 2023 No contributions on Saturday, July 29, 2023 No contributions on Saturday, August 5, 2023 No contributions on Saturday, August 12, 2023 No contributions on Saturday, August 19, 2023 No contributions on Saturday, August 26, 2023 No contributions on Saturday, September 2, 2023 No contributions on Saturday, September 9, 2023 No contributions on Saturday, September 16, 2023

Contribution activity

September 2023

Seeing something unexpected? Take a look at the GitHub profile guide.