bert
chooses 15% of tokens

The paper says:

"Instead, the training data generator chooses 15% of tokens at random, e.g., in the sentence my dog is hairy it chooses hairy."

This means that exactly 15% of the tokens are chosen. However, in https://github.com/codertimo/BERT-pytorch/blob/master/bert_pytorch/dataset/dataset.py#L68, every single token independently has a 15% chance of going through the follow-up procedure, so the number of chosen tokens varies around 15% rather than being fixed.
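The difference can be illustrated with a short simulation (hypothetical helper names, not the repo's actual code): sampling each token independently with probability 0.15 gives a masked-token count that varies from run to run, while the paper's wording implies selecting exactly 15% of the tokens.

```python
import random

def mask_per_token(tokens, p=0.15, rng=random):
    # dataset.py-style: each token independently has a 15% chance
    # of being selected, so the count is binomially distributed.
    return [i for i, _ in enumerate(tokens) if rng.random() < p]

def mask_fixed_fraction(tokens, p=0.15, rng=random):
    # paper-style reading: choose exactly 15% of the tokens at random.
    k = max(1, round(len(tokens) * p))
    return rng.sample(range(len(tokens)), k)

tokens = ["my", "dog", "is", "hairy"] * 25  # 100 tokens
print(len(mask_fixed_fraction(tokens)))  # always 15 for 100 tokens
print(len(mask_per_token(tokens)))       # varies around 15 per run
```

In practice both schemes mask roughly the same fraction in expectation; the issue is only that the code does not match the paper's "chooses 15% ... for sure" phrasing.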
PositionalEmbedding
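BERT-pytorch's PositionalEmbedding module precomputes a table of sinusoidal position encodings in the style of the original Transformer. A minimal pure-Python sketch of that table (an illustration, not the repo's actual implementation, which builds a PyTorch tensor):

```python
import math

def positional_embedding(max_len, d_model):
    # Sinusoidal positional encodings: even dimensions use sin,
    # odd dimensions use cos, with geometrically spaced frequencies.
    pe = [[0.0] * d_model for _ in range(max_len)]
    for pos in range(max_len):
        for i in range(0, d_model, 2):
            freq = math.exp(-math.log(10000.0) * i / d_model)
            pe[pos][i] = math.sin(pos * freq)
            if i + 1 < d_model:
                pe[pos][i + 1] = math.cos(pos * freq)
    return pe

pe = positional_embedding(max_len=128, d_model=16)
print(pe[0][:4])  # position 0: sin terms are 0.0, cos terms are 1.0
```

Because the table depends only on position and dimension, it is computed once and added to (or concatenated with) the token embeddings without any learned parameters.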
The architecture GPT2ForSequenceClassification was added in #7501 in PyTorch. It would be great to have it in TensorFlow (cf. issue #7622), but it would also be great to have it for other causal models: OpenAI GPT, CTRL, TransfoXL. Currently working on OpenAI GPT: @fmcurti (done). Below is a list of items to follow to make sure the integration of such an architect
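A sequence-classification head on a causal model like GPT-2 typically pools the hidden state at the last non-padding token, since in a left-to-right model only that position has attended to the whole sequence. A minimal sketch of that pooling-index logic (an illustration of the idea, not the Transformers library's actual code):

```python
def last_token_index(input_ids, pad_token_id):
    # For causal-LM sequence classification, classify from the
    # last non-padding position: scan from the right for the
    # first token that is not padding.
    for i in range(len(input_ids) - 1, -1, -1):
        if input_ids[i] != pad_token_id:
            return i
    return 0  # all-padding fallback

print(last_token_index([5, 8, 3, 0, 0], pad_token_id=0))  # → 2
```

This is why such heads need to know the pad token id: with right-padded batches, naively taking the final position would classify from a padding embedding.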