Knowledge-Based Topic Models (KBTM)


http://blog.csdn.net/pipisorry/article/details/44040701

Terminology

Must-link: states that two words should belong to the same topic.
Cannot-link: states that two words should not belong to the same topic.
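The two knowledge types above can be sketched as simple word-pair constraints. This is a minimal illustration (not code from any of the cited papers): it checks whether a given word-to-topic assignment satisfies a set of must-links and cannot-links.

```python
# Minimal sketch: must-links / cannot-links as word pairs, plus a check
# of whether a topic assignment satisfies them. Function and variable
# names are illustrative, not from any KBTM implementation.
def satisfies(topic_of, must_links, cannot_links):
    """topic_of: dict mapping word -> topic id; links: lists of word pairs."""
    for w1, w2 in must_links:
        if topic_of[w1] != topic_of[w2]:   # must-link violated
            return False
    for w1, w2 in cannot_links:
        if topic_of[w1] == topic_of[w2]:   # cannot-link violated
            return False
    return True

# Example: "price" and "cost" should share a topic; "price" and "battery" should not.
assignment = {"price": 0, "cost": 0, "battery": 1}
print(satisfies(assignment, [("price", "cost")], [("price", "battery")]))  # True
```

A KBTM uses such constraints not as hard filters but as priors that bias topic inference toward satisfying assignments.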

DF-LDA

DF-LDA is perhaps the earliest KBTM. It can incorporate two forms of prior knowledge from the user: must-links and cannot-links.

[Andrzejewski, David, Zhu, Xiaojin, and Craven, Mark. Incorporating domain knowledge into topic modeling via Dirichlet Forest priors. In ICML, pp. 25–32, 2009.]


DF-LDA [1]: A knowledge-based topic model that can use both must-links and cannot-links, but it assumes all the knowledge is correct.
MC-LDA [10]: A knowledge-based topic model that also uses both must-link and cannot-link knowledge. It likewise assumes that all knowledge is correct.
GK-LDA [9]: A knowledge-based topic model that uses the ratio of word probabilities under each topic to reduce the effect of wrong knowledge. However, it can only use the must-link type of knowledge.
LTM [7]: A lifelong learning topic model that learns only the must-link type of knowledge automatically. It outperformed [8].
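GK-LDA's idea of using word-probability ratios to down-weight wrong knowledge can be sketched as follows. This is a hedged illustration of the intuition only, not the paper's exact formula: if two must-linked words have very different probabilities under a topic, that must-link is probably wrong for the topic and its influence should be reduced.

```python
# Illustrative sketch (assumed formulation, not GK-LDA's exact equation):
# trust in a must-link (w1, w2) under a topic, based on the ratio of the
# words' probabilities P(w1|t) and P(w2|t) in that topic.
def link_weight(p_w1, p_w2, eps=1e-12):
    """Ratio in [0, 1]: near 1 when the words agree under the topic,
    near 0 when they disagree (suggesting wrong knowledge)."""
    return min(p_w1, p_w2) / max(p_w1, p_w2, eps)

# Words with similar topic probabilities -> high trust in the must-link.
print(link_weight(0.020, 0.018))   # 0.9
# Very different probabilities -> the link is damped for this topic.
print(link_weight(0.020, 0.0001))  # 0.005
```

Models like LTM extend this direction by mining must-link knowledge automatically across tasks rather than taking it from the user.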



Related content