bert
Here are 1,214 public repositories matching this topic...
chooses 15% of tokens
The paper says:
"Instead, the training data generator chooses 15% of tokens at random, e.g., in the sentence my dog is hairy it chooses hairy."
This implies that exactly 15% of the tokens are always chosen.
However, in https://github.com/codertimo/BERT-pytorch/blob/master/bert_pytorch/dataset/dataset.py#L68,
each token independently has a 15% chance of going through the follow-up masking procedure, so the number of masked tokens varies from sentence to sentence.
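To illustrate the difference between the two readings, here is a minimal sketch (function names and the hard-coded 15% rate are for illustration only; this is not the repository's actual code):

```python
import random

random.seed(0)

tokens = "my dog is hairy".split()

def mask_exact(tokens, rate=0.15):
    """Paper's wording: pick exactly 15% of the tokens (at least one)."""
    k = max(1, round(len(tokens) * rate))
    return set(random.sample(range(len(tokens)), k))

def mask_per_token(tokens, rate=0.15):
    """Per-token reading: each token is masked independently with
    probability `rate`, so the masked count varies per sentence."""
    return {i for i in range(len(tokens)) if random.random() < rate}

print(len(mask_exact(tokens)))      # always 1 for a 4-token sentence
print(len(mask_per_token(tokens)))  # anywhere from 0 to 4
```

With the per-token scheme the *expected* fraction is still 15%, but short sentences can easily end up with zero masked tokens, which is the discrepancy the issue points out.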
PositionalEmbedding
Is your feature request related to a problem? Please describe.
For a fast proof of concept, users need to try Haystack on their own documents. While this is possible at the code level (e.g. Tutorial 1), in many cases an additional UI would be helpful so that users can interact with the model more easily and show it to colleagues.
Describe the solution you'd like
A minimal UI built wit
Question about the training dataset
Hi, the code appears to use Restaurants_Train.xml.seg as training data. Is this available for download somewhere, or was it generated from the XML files of SemEval-2014 Task 4? If it was generated afterwards, is the data-generation code available?
Bart is a seq2seq model, but there might be applications where one would like to use only the pre-trained BartDecoder in an EncoderDecoder setting with a "long" encoder, such as
This is already p