1. Learning Better Embeddings for Rare Words Using Distributional Representations
by Irina Sergienya, Hinrich Schütze
Presenter: @Quasi_quant2010
EMNLP2015 Reading Group
12. References
[S.Qiu] Co-learning of Word Representations and Morpheme Representations. COLING 2014
[N.Djuric] Hierarchical Neural Language Models for Joint Representation of Streaming Documents and their Content. WWW 2015
[K.Hashimoto] Learning Embeddings for Transitive Verb Disambiguation by Implicit Tensor Factorization. CVSC 2015
[X.Rong] word2vec Parameter Learning Explained. arXiv 2014
[O.Levy] Improving Distributional Similarity with Lessons Learned from Word Embeddings. TACL 2015
[O.Levy] Neural Word Embedding as Implicit Matrix Factorization. NIPS 2014
[Y.Li] Word Embedding Revisited: A New Representation Learning and Explicit Matrix Factorization Perspective. IJCAI 2015