Supervised Learning of Universal Sentence Representations from Natural Language Inference Data


Many modern NLP systems rely on word embeddings, previously trained in an
unsupervised manner on large corpora, as base features. Efforts to obtain
embeddings for larger chunks of text, such as sentences, have however not been
as successful. Several attempts at learning unsupervised sentence
representations have not reached performance satisfactory enough to be widely
adopted. In this paper, we show how universal sentence representations trained
on the supervised data of the Stanford Natural Language Inference datasets
consistently outperform unsupervised methods such as SkipThought vectors on a
wide range of transfer tasks. Much as computer vision uses ImageNet to obtain
features that can then be transferred to other tasks, our work indicates that
natural language inference is well suited as a source task for transfer
learning to other NLP tasks. Our encoder is publicly available.

Source: http://lslink.info/?c=R2z
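The abstract only names the general recipe: train a sentence encoder on supervised NLI data, then reuse it as a fixed feature extractor for other tasks. The sketch below is a minimal, hypothetical illustration of that setup in PyTorch. It assumes a BiLSTM encoder with max pooling and the common [u, v, |u-v|, u*v] pair-feature scheme for the NLI classifier; the paper's publicly released encoder may differ in architecture and details.

# Hypothetical sketch: train a sentence encoder on NLI pairs, then reuse its
# fixed-size embeddings as features for downstream transfer tasks.
import torch
import torch.nn as nn

class SentenceEncoder(nn.Module):
    """Maps a batch of padded token-id sequences to fixed-size vectors."""
    def __init__(self, vocab_size, embed_dim=300, hidden_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                            bidirectional=True)

    def forward(self, token_ids):                      # (batch, seq_len)
        states, _ = self.lstm(self.embed(token_ids))   # (batch, seq_len, 2*hidden)
        return states.max(dim=1).values                # max-pool over time

class NLIClassifier(nn.Module):
    """3-way entailment/neutral/contradiction classifier over sentence pairs."""
    def __init__(self, encoder, hidden_dim=512):
        super().__init__()
        self.encoder = encoder
        feat_dim = 4 * 2 * hidden_dim                  # [u, v, |u-v|, u*v]
        self.mlp = nn.Sequential(nn.Linear(feat_dim, 512), nn.ReLU(),
                                 nn.Linear(512, 3))

    def forward(self, premise_ids, hypothesis_ids):
        u = self.encoder(premise_ids)                  # shared encoder for both
        v = self.encoder(hypothesis_ids)               # sentences in the pair
        features = torch.cat([u, v, (u - v).abs(), u * v], dim=1)
        return self.mlp(features)

# After training on the NLI data, only the encoder is kept; its (usually
# frozen) outputs serve as "universal" sentence features for transfer tasks,
# with a small task-specific classifier trained on top.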

