UniLM

Pre-trained models for natural language understanding (NLU) and generation (NLG) tasks

The family of UniLM:

UniLM: unified pre-training for language understanding and generation

InfoXLM (new): multilingual/cross-lingual pre-trained models for language understanding and generation

MiniLM: small pre-trained models for language understanding and generation (a minimal loading sketch follows this list)

LayoutLM: multimodal (text + layout/format + image) pre-training for document understanding (e.g. scanned documents, PDF, etc.)

s2s-ft: sequence-to-sequence fine-tuning toolkit
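
A minimal sketch of loading one of these checkpoints with the Hugging Face transformers library, on which portions of this codebase are based. The hub identifier "microsoft/MiniLM-L12-H384-uncased" is an assumption about where the released MiniLM weights are hosted; substitute whichever checkpoint you need.

```python
# Sketch: load a UniLM-family checkpoint via transformers (assumed hub name).
from transformers import AutoModel, AutoTokenizer

model_name = "microsoft/MiniLM-L12-H384-uncased"  # assumed hub identifier
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# Encode a sentence and inspect the contextual representations.
inputs = tokenizer("UniLM unifies understanding and generation.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)
```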

News

***** New February, 2020: UniLM v2 | MiniLM v1 | LayoutLM v1 | s2s-ft v1 release *****

***** October 1st, 2019: UniLM v1 release *****

License

This project is licensed under the license found in the LICENSE file in the root directory of this source tree. Portions of the source code are based on the transformers project.

Microsoft Open Source Code of Conduct

Contact Information

For help or issues using UniLM, please submit a GitHub issue.

For other communications related to UniLM, please contact Li Dong (lidong1@microsoft.com) or Furu Wei (fuwei@microsoft.com).