Here are 29 public repositories matching this topic...
Recent Advances in Vision-and-Language Pre-Trained Models (VL-PTMs)
Papers about pretraining and self-supervised learning on Graph Neural Networks (GNN).
End-to-End recipes for pre-training and fine-tuning BERT using Azure Machine Learning Service
Updated Nov 11, 2020 · Jupyter Notebook
X-modaler is a versatile and high-performance codebase for cross-modal analytics.
Updated Aug 19, 2021 · Python
Official PyTorch implementation of the paper "ImageNet-21K Pretraining for the Masses" (NeurIPS 2021)
Updated Aug 7, 2021 · Python
Paddle Distributed Training Extended: the PaddlePaddle distributed training extension package.
Updated Aug 20, 2021 · Shell
OpenAI GPT-2 pre-training and sequence prediction implementation in TensorFlow 2.0
Updated Jun 9, 2021 · Python
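For readers new to GPT-2-style sequence prediction, the training setup is simple to sketch: inputs and targets are the same sequence offset by one token, and a lower-triangular mask keeps each position from attending to the future. This is an illustrative NumPy sketch of that setup, not the repository's own code.

```python
import numpy as np

def causal_lm_batch(token_ids):
    """Illustrative GPT-2-style causal LM setup: each position predicts the
    next token, and the attention mask forbids looking ahead. Function and
    variable names here are assumptions, not the repo's API."""
    inputs = token_ids[:-1]       # tokens 0 .. n-2
    targets = token_ids[1:]       # the same sequence shifted left by one
    n = len(inputs)
    # lower-triangular causal mask: position i may attend to positions <= i
    mask = np.tril(np.ones((n, n), dtype=bool))
    return inputs, targets, mask

ids = np.array([5, 7, 2, 9, 4])
inputs, targets, mask = causal_lm_batch(ids)
```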
Transformer with Untied Positional Encoding (TUPE). Code for the paper "Rethinking Positional Encoding in Language Pre-training"; improves existing models like BERT.
Updated Jun 28, 2021 · Python
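The core idea of untied positional encoding can be sketched in a few lines: instead of adding position embeddings to token embeddings before attention, the content-to-content and position-to-position terms get separate projections and are summed in the attention logits. A minimal NumPy sketch, with illustrative names (not the paper's official code):

```python
import numpy as np

def untied_attention_scores(x, p, Wq, Wk, Uq, Uk):
    """TUPE-style untied attention logits (illustrative): content and
    position interactions are computed with separate projections and
    summed, rather than mixed by adding p to x up front."""
    d = Wq.shape[1]
    content = (x @ Wq) @ (x @ Wk).T       # token-token interactions
    position = (p @ Uq) @ (p @ Uk).T      # position-position interactions
    return (content + position) / np.sqrt(2 * d)  # rescaled sum

rng = np.random.default_rng(0)
n, d = 4, 8
x = rng.normal(size=(n, d))   # token embeddings
p = rng.normal(size=(n, d))   # absolute position embeddings
Wq, Wk, Uq, Uk = (rng.normal(size=(d, d)) for _ in range(4))
scores = untied_attention_scores(x, p, Wq, Wk, Uq, Uk)
```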
Research code for EMNLP 2020 paper "HERO: Hierarchical Encoder for Video+Language Omni-representation Pre-training"
Updated Aug 8, 2021 · Python
AAAI-20 paper: Cross-Lingual Natural Language Generation via Pre-Training
Updated Aug 4, 2021 · Python
PyTorch code for "Unifying Vision-and-Language Tasks via Text Generation" (ICML 2021)
Updated Jul 28, 2021 · Python
An official implementation of "UniVL: A Unified Video and Language Pre-Training Model for Multimodal Understanding and Generation"
Updated Apr 21, 2021 · Python
Research Code for NeurIPS 2020 Spotlight paper "Large-Scale Adversarial Training for Vision-and-Language Representation Learning": UNITER adversarial training part
Updated Jan 13, 2021 · Python
Revisiting Contrastive Methods for Unsupervised Learning of Visual Representations. [2021]
Updated Aug 4, 2021 · Python
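The contrastive methods this repository revisits typically optimize an InfoNCE-style objective: two views of the same image should be more similar to each other than to views of other images. A minimal NumPy sketch of that loss, assuming SimCLR-style matched pairs along the batch diagonal (not the repository's own implementation):

```python
import numpy as np

def info_nce_loss(z1, z2, temperature=0.1):
    """InfoNCE sketch: row i of z1 should match row i of z2 (positive pair)
    and mismatch every other row (negatives). Illustrative only."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)  # cosine similarity
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature        # (N, N) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)          # numerical stability
    # cross-entropy with the diagonal as the positive class
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 16))
loss_matched = info_nce_loss(z, z)                    # identical views
loss_random = info_nce_loss(z, rng.normal(size=(8, 16)))  # unrelated views
```

Matched views should give a much lower loss than unrelated ones, which is the signal the representation learner exploits.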
Python source code for EMNLP 2020 paper "Reusing a Pretrained Language Model on Languages with Limited Corpora for Unsupervised NMT".
Updated Dec 8, 2020 · Python
Emergent Communication Pretraining for Few-Shot Machine Translation
Updated Dec 3, 2020 · Python
Code for generating a single image pretraining dataset
Updated Aug 3, 2021 · Python
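The single-image pretraining idea is that one picture can yield many training views through aggressive cropping and augmentation. A hypothetical sketch of such a generator (function name, parameters, and the flip augmentation are all illustrative assumptions, not this repository's code):

```python
import numpy as np

def single_image_crops(image, num_crops=100, crop_size=32, seed=0):
    """Build a small pretraining set from one image by sampling random
    crops with random horizontal flips. Illustrative sketch only."""
    rng = np.random.default_rng(seed)
    h, w = image.shape[:2]
    crops = []
    for _ in range(num_crops):
        top = rng.integers(0, h - crop_size + 1)
        left = rng.integers(0, w - crop_size + 1)
        crop = image[top:top + crop_size, left:left + crop_size]
        if rng.random() < 0.5:            # random horizontal flip
            crop = crop[:, ::-1]
        crops.append(crop)
    return np.stack(crops)

image = np.arange(64 * 64 * 3, dtype=np.float32).reshape(64, 64, 3)
dataset = single_image_crops(image, num_crops=16, crop_size=32)
```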
Dynamic Transfer Learning for Low-Resource Neural Machine Translation
Updated Aug 4, 2020 · Python
Pre-training and fine-tuning GNN model on source code
Updated Jun 29, 2021 · Python
Understanding "A Lite BERT" (ALBERT): a Transformer approach for learning self-supervised language models. (WIP)
Updated Jul 14, 2021 · Python
This repository contains the source code for the Semantic Knowledge Extractor Tool (SKET). SKET is an unsupervised hybrid knowledge extraction system that combines a rule-based expert system with pre-trained machine learning models to extract cancer-related information from pathology reports.
Updated Aug 19, 2021 · Python
A utility for preparing machine learning datasets.
Updated Apr 20, 2021 · Python
Pretraining on the 2015, 2019, and IDRiD datasets with ResNet-101 and ResNet-152, with fine-tuning on the 2019 dataset only
Updated Dec 9, 2019 · Python
An overview of German transformer models related to language modeling in the field of Natural Language Processing.
Updated Jul 23, 2021 · Python
Script to pre-train Hugging Face Transformers BART with TensorFlow 2
Updated Aug 18, 2021 · Python
A flexible class for training specific layers of deep neural nets in an online manner. Supports Keras models.
Updated Dec 3, 2020 · Python
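The pattern behind such a class is straightforward: freeze the early layers and apply gradient updates only to a chosen layer, one sample at a time. A framework-free NumPy sketch of that idea (class name, layer structure, and update rule are illustrative assumptions, not the repository's API; in Keras the equivalent is setting `layer.trainable = False` on frozen layers):

```python
import numpy as np

class OnlineHeadTrainer:
    """Illustrative sketch: a fixed random feature layer feeds a trainable
    linear head; partial_fit updates only the head, one sample at a time."""
    def __init__(self, in_dim, hidden, out_dim, lr=0.01, seed=0):
        rng = np.random.default_rng(seed)
        self.W_frozen = rng.normal(size=(in_dim, hidden)) / np.sqrt(in_dim)
        self.W_head = np.zeros((hidden, out_dim))   # only this is trained
        self.lr = lr

    def forward(self, x):
        h = np.tanh(x @ self.W_frozen)              # frozen feature layer
        return h, h @ self.W_head

    def partial_fit(self, x, y):
        """One online SGD step on the head; the frozen layer gets no update."""
        h, pred = self.forward(x)
        grad = np.outer(h, pred - y)                # squared-error gradient
        self.W_head -= self.lr * grad

rng = np.random.default_rng(1)
model = OnlineHeadTrainer(4, 16, 1)
frozen_before = model.W_frozen.copy()
W_true = rng.normal(size=(4, 1))
for _ in range(200):
    x = rng.normal(size=4)
    model.partial_fit(x, x @ W_true)    # stream one sample at a time
```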