Supervised Learning of Universal Sentence Representations from Natural Language Inference Data

Many modern NLP systems rely on word embeddings, previously trained in an
unsupervised manner on large corpora, as base features. Efforts to obtain
embeddings for larger chunks of text, such as sentences, have, however, not
been as successful: several attempts at learning unsupervised sentence
representations have not reached sufficiently good performance to be widely
adopted. In this paper, we show how universal sentence representations trained
on the supervised data of the Stanford Natural Language Inference (SNLI)
dataset can consistently outperform unsupervised methods such as SkipThought
vectors on a wide range of transfer tasks. Much as computer vision uses
ImageNet to obtain features that can then be transferred to other tasks, our
work indicates the suitability of natural language inference for transfer
learning to other NLP tasks. Our sentence encoder is publicly available.
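To make the transfer setup concrete, here is a minimal sketch (not the authors' released code) of how two sentence embeddings u and v are typically combined into classifier features in NLI models: the concatenation of u, v, their element-wise absolute difference, and their element-wise product. The toy vectors below are illustrative only.

```python
# Sketch of a common NLI feature combination: [u, v, |u - v|, u * v].
# Assumes u and v are fixed-size sentence embeddings of equal dimension.

def combine_features(u, v):
    """Combine two sentence embeddings into NLI classifier features."""
    assert len(u) == len(v), "embeddings must have the same dimension"
    diff = [abs(a - b) for a, b in zip(u, v)]   # element-wise |u - v|
    prod = [a * b for a, b in zip(u, v)]        # element-wise u * v
    return u + v + diff + prod                  # 4 * d features

# Toy "premise" and "hypothesis" embeddings (dimension 3 for illustration):
u = [0.1, -0.2, 0.3]
v = [0.4, 0.0, -0.3]
feats = combine_features(u, v)
print(len(feats))  # 12, i.e. 4x the embedding dimension
```

In the transfer setting, the encoder producing u and v is trained once on the NLI task and then frozen; downstream tasks train only a small classifier on top of its sentence vectors.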

Source: http://lslink.info/?c=Km9
