transformer
Here are 1,102 public repositories matching this topic...
Bidirectional RNN
Is there currently a way to train a bidirectional RNN (such as an LSTM or GRU) in trax?
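I can't speak to trax's current API, but for comparison, PyTorch exposes this as a single constructor flag; a minimal sketch (shapes are illustrative, this is not trax code):

```python
import torch
import torch.nn as nn

# A bidirectional LSTM: one direction reads the sequence left-to-right,
# the other right-to-left, and their hidden states are concatenated.
rnn = nn.LSTM(input_size=32, hidden_size=64, batch_first=True, bidirectional=True)

x = torch.randn(8, 20, 32)       # (batch, seq_len, features)
out, (h, c) = rnn(x)
print(out.shape)                 # (8, 20, 128): forward and backward halves concatenated
```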
chooses 15% of tokens
The paper says:
> Instead, the training data generator chooses 15% of tokens at random, e.g., in the sentence my dog is hairy it chooses hairy.
This wording implies that exactly 15% of the tokens are chosen. In https://github.com/codertimo/BERT-pytorch/blob/master/bert_pytorch/dataset/dataset.py#L68, however, each individual token has a 15% chance of going through the follow-up procedure, so the fraction actually chosen varies per sentence and is only 15% in expectation.
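A minimal sketch contrasting the two readings (the example sentence and names are illustrative):

```python
import random

tokens = "my dog is hairy".split()
MASK_PROB = 0.15

# Reading 1 (the paper's wording): choose exactly 15% of positions.
k = max(1, round(MASK_PROB * len(tokens)))
exact_choice = random.sample(range(len(tokens)), k)

# Reading 2 (the linked dataset.py): each token is selected independently
# with probability 0.15, so the count varies from sentence to sentence.
per_token_choice = [i for i in range(len(tokens)) if random.random() < MASK_PROB]
```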
PositionalEmbedding
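Assuming the title refers to the fixed sinusoidal positional encoding from "Attention Is All You Need", a minimal NumPy sketch (the function name is illustrative):

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """PE[pos, 2i] = sin(pos / 10000^(2i/d_model)), PE[pos, 2i+1] = cos(same).

    Assumes an even d_model.
    """
    positions = np.arange(seq_len)[:, None]            # (seq_len, 1)
    dims = np.arange(0, d_model, 2)[None, :]           # (1, d_model // 2)
    angles = positions / np.power(10000.0, dims / d_model)
    pe = np.empty((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)                       # even indices: sine
    pe[:, 1::2] = np.cos(angles)                       # odd indices: cosine
    return pe

print(sinusoidal_positional_encoding(50, 16).shape)    # (50, 16)
```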
Roadmap of MMOCR
So far, we have released an English model trained on academic datasets. We are planning to support recognition models for more languages. If you want us to support a new language, please provide two files, sketched below:
- A char_list.txt file, which lists all the characters used in the new language.
- A dict_list.txt file, which lists as many words in the new language as possible.
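The request doesn't show the expected layout of these files; a plausible sketch, assuming one entry per line (the contents are invented for illustration):

```text
# char_list.txt: one character per line
a
b
c

# dict_list.txt: one word per line
apple
banana
cherry
```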
Hi, I am very interested in your project. I wonder whether you need contributors, and how I could make my own contribution?
This is a feature request to add Wav2Vec2 Pretraining functionality to the transformers library. This is a "Good Second Issue" feature request, which means that interested contributors should have some experience with the transformers library and ideally also with training/fine-tuning Wav2Vec2.
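For context, inference with the already-released fine-tuned checkpoints is supported today; a minimal sketch using the existing transformers API (checkpoint name from the Hugging Face hub, with dummy audio standing in for real input):

```python
import numpy as np
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-base-960h")
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-960h")

speech = np.zeros(16000, dtype=np.float32)       # 1 s of silence as stand-in audio
inputs = processor(speech, sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits   # (batch, frames, vocab)

predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids))     # CTC-decoded transcription
```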
Motivation
The popular [Wav2Vec2](https://huggingface.co/models?filter=w