Natural Language Processing (5)
- Efficient human-like semantic representations via the Information Bottleneck principle
- Specializing Word Embeddings (for Parsing) by Information Bottleneck
- How Contextual are Contextualized Word Representations? Comparing the Geometry of BERT, ELMo, and GPT-2 Embeddings
- Correlations between Word Vector Sets
- Revealing the Dark Secrets of BERT