
Highlights

  • Arctic Code Vault Contributor
  • Pro

Organizations

@huggingface

1,089 contributions in the last year


Contribution activity

July 2020

Created a pull request in huggingface/transformers that received 7 comments

[Generation] better error message

If the cur_len of the input context is as long as or longer than max_length, a clear error message should be shown.

+8 −0 7 comments
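As a rough illustration of the check the PR body describes (not the actual +8 −0 diff from huggingface/transformers), a length guard of this kind might look like the following sketch; the function name and message wording are assumptions, only `cur_len` and `max_length` come from the PR description.

```python
# Hypothetical sketch of the check described in the PR body; this is NOT the
# actual diff from huggingface/transformers, just an illustration.

def check_generation_length(cur_len: int, max_length: int) -> None:
    """Raise a clear error if the input context already uses up max_length."""
    if cur_len >= max_length:
        raise ValueError(
            f"The input context has {cur_len} tokens, but `max_length` is set "
            f"to {max_length}. Increase `max_length` so that new tokens can "
            f"actually be generated."
        )

# Example: a 20-token prompt with max_length=20 leaves no room to generate.
check_generation_length(cur_len=20, max_length=20)  # raises ValueError
```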

Created an issue in google/trax that received 1 comment

Question/Potential Issue - Attention mask in SelfAttention Layer

Description: It seems that it is not possible to mask the first position of a tensor when using the SelfAttention layer. Looking at this line:

1 comment
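For context on what "masking the first position" means here, the toy snippet below uses plain NumPy rather than trax's actual SelfAttention internals (which are not reproduced here); the True-means-attend mask convention and the large-negative-score trick are assumptions for illustration only.

```python
# Illustration only: masking position 0 of a sequence for attention, using
# plain NumPy instead of trax's SelfAttention code.
import numpy as np

seq_len = 4
mask = np.ones(seq_len, dtype=bool)
mask[0] = False  # the issue asks whether position 0 can be masked out like this

scores = np.random.randn(seq_len)        # toy attention scores for one query
scores = np.where(mask, scores, -1e9)    # masked positions get a huge negative score
weights = np.exp(scores) / np.exp(scores).sum()
print(weights)  # position 0 receives (near-)zero attention weight
```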
