Find topics in texts whose words are semantically embedded using techniques like word2vec or GloVe. This topic modelling technique models each word with a categorical distribution whose natural parameter is the inner product between a word embedding and an embedding of its assigned topic. The technique is explained in detail in the paper 'Topic Modeling in Embedding Spaces' by Adji B. Dieng, Francisco J. R. Ruiz and David M. Blei (2019), available at <arXiv:1907.04907>.
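A minimal sketch in base R of the word distribution described above (the toy dimensions V, L and K are chosen here purely for illustration): given word embeddings rho and a topic embedding alpha_k, the probability of word v under topic k is the softmax over the inner products rho %*% alpha_k.

    set.seed(123)
    V <- 6   # toy vocabulary size (illustrative assumption)
    L <- 4   # embedding dimension
    K <- 2   # number of topics
    rho   <- matrix(rnorm(V * L), nrow = V)  # word embeddings (e.g. from word2vec/GloVe)
    alpha <- matrix(rnorm(K * L), nrow = K)  # topic embeddings
    softmax <- function(x) { x <- x - max(x); exp(x) / sum(exp(x)) }
    ## beta[k, ] is the categorical distribution over the vocabulary for topic k,
    ## with natural parameter rho %*% alpha[k, ]
    beta <- t(apply(alpha, 1, function(a) softmax(as.vector(rho %*% a))))
    rowSums(beta)  # each row sums to 1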
Version: 0.1.0
Depends: R (≥ 2.10)
Imports: graphics, stats, Matrix, torch (≥ 0.5.0)
Suggests: udpipe (≥ 0.8.4), word2vec, uwot, tinytest, textplot (≥ 0.2.0), ggrepel, ggalt
Published: 2021-11-08
Author: Jan Wijffels [aut, cre, cph] (R implementation), BNOSAC [cph] (R implementation), Adji B. Dieng [ctb, cph] (original Python implementation in inst/orig), Francisco J. R. Ruiz [ctb, cph] (original Python implementation in inst/orig), David M. Blei [ctb, cph] (original Python implementation in inst/orig)
Maintainer: Jan Wijffels <jwijffels at bnosac.be>
License: MIT + file LICENSE
NeedsCompilation: no
SystemRequirements: LibTorch (https://pytorch.org/)
Materials: README NEWS
CRAN checks: topicmodels.etm results
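Because the package imports the torch R package and lists LibTorch as a system requirement, a typical setup (a sketch, assuming a standard CRAN installation) is:

    install.packages("topicmodels.etm")
    torch::install_torch()  # downloads the LibTorch backend used by the torch package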