A framework for large scale recommendation algorithms.
Repository for Project Insight: NLP as a Service
Federated Learning Utilities and Tools for Experimentation
CLIP (Contrastive Language–Image Pre-training) for Italian
I will implement Fastai in each project present in this repository.
PyTorch implementation of image captioning using a transformer-based model.
[TMI 2023] XBound-Former: Toward Cross-scale Boundary Modeling in Transformers
Symbolic music generation taking inspiration from NLP and the human composition process
Neural Persian Poet: A sequence-to-sequence model for composing Persian poetry
Public repo for the paper: "Modeling Intensification for Sign Language Generation: A Computational Approach" by Mert Inan*, Yang Zhong*, Sabit Hassan*, Lorna Quandt, Malihe Alikhani
An AutoML framework for explainable text classification.
A repository demonstrating how to use transformers for Swahili text classification
Public repo for the paper: "COSMic: A Coherence-Aware Generation Metric for Image Descriptions" by Mert İnan, Piyush Sharma, Baber Khalid, Radu Soricut, Matthew Stone, Malihe Alikhani
Train a T5 model to generate simple Fake News and use a RoBERTa model to classify what's fake and what's real.
A Transformer Implementation that is easy to understand and customizable.
Contains work done on the fintech patent classification project. The goal of this project is to build a model that detects whether a patent is fintech based on its text content. If a patent is fintech, we want to know which of our defined fintech categories it belongs to.
Comparing the residual stream and the highway stream in transformers (BERT).
Minimalistic PyTorch implementation of transformer
This repository contains a number of experiments with multilingual Transformer models (multilingual BERT, DistilBERT, XLM-RoBERTa, mT5, and ByT5) focused on the Dutch language.
Convert the normal maps used in the game Transformers: Fall of Cybertron to the Mikk format