
fastai

Here are 608 public repositories matching this topic...

harisriaz17 commented Mar 15, 2021

In many time series datasets, the sequences are of variable length, but the TSAI dataloader expects fixed-length sequences. The most natural solution is to pad each sequence to the maximum length in the dataset, filling the extra positions with zeros (or -1, or any other padding token). The vanilla PyTorch Transformer supports this: its forward method takes an src_key_padding_mask argument that tells attention which positions are padding.
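A minimal sketch of that padding approach in plain PyTorch (not TSAI's actual dataloader): variable-length sequences are padded to the batch maximum with torch.nn.utils.rnn.pad_sequence, and a boolean mask built from the true lengths is passed as src_key_padding_mask so attention ignores the padded positions. The tensor sizes here are arbitrary illustration values.

```python
import torch
from torch.nn.utils.rnn import pad_sequence

# Three variable-length sequences, each of shape (seq_len, n_features)
seqs = [torch.randn(5, 3), torch.randn(8, 3), torch.randn(2, 3)]
lengths = torch.tensor([s.shape[0] for s in seqs])

# Pad to the max length in the batch; padding_value is the chosen token (0 here)
padded = pad_sequence(seqs, batch_first=True, padding_value=0.0)  # (3, 8, 3)

# Boolean mask: True marks padded positions attention should ignore
mask = torch.arange(padded.shape[1])[None, :] >= lengths[:, None]  # (3, 8)

layer = torch.nn.TransformerEncoderLayer(d_model=3, nhead=1, batch_first=True)
encoder = torch.nn.TransformerEncoder(layer, num_layers=1)
out = encoder(padded, src_key_padding_mask=mask)  # (3, 8, 3)
```

Because the mask is applied inside attention, the padded positions contribute nothing to the other timesteps' representations, whatever padding token was used.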

lgvaz commented Mar 17, 2021

Make use of the recently implemented parameter is_new that is passed to Parser.parse_fields.

Functions like record.set_filepath and record.set_img_size now only need to be called once; refer to this notebook for an example.

We need to update our default parsers to make use of this functionality:

  • `coco`
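The pattern the issue describes can be sketched roughly as follows. This is a hypothetical stand-in, not icevision's actual Record or parser code: it only assumes, as the comment states, that is_new is passed to parse_fields, so one-time fields are set only on the first annotation for a record.

```python
# Hypothetical Record stand-in that logs which setters were called.
class Record:
    def __init__(self):
        self.calls = []

    def set_filepath(self, fp):
        self.calls.append(("filepath", fp))

    def set_img_size(self, size):
        self.calls.append(("img_size", size))

    def add_labels(self, labels):
        self.calls.append(("labels", labels))


class MyParser:
    def parse_fields(self, o, record, is_new):
        if is_new:
            # One-time fields: set only on the record's first annotation,
            # no matter how many annotations reference the same image.
            record.set_filepath(o["filepath"])
            record.set_img_size(o["img_size"])
        record.add_labels(o["labels"])


parser, record = MyParser(), Record()
annotations = [
    {"filepath": "img1.jpg", "img_size": (640, 480), "labels": ["cat"]},
    {"filepath": "img1.jpg", "img_size": (640, 480), "labels": ["dog"]},
]
for i, o in enumerate(annotations):
    parser.parse_fields(o, record, is_new=(i == 0))
```

With this guard, set_filepath and set_img_size run once for the record while the per-annotation labels are still accumulated on every call.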
