This document presents a new method for estimating the posterior distribution of the Correlated Topic Model (CTM) using Stochastic Gradient Variational Bayes (SGVB). The CTM extends Latent Dirichlet Allocation (LDA) by modeling correlations between topics through a logistic-normal prior over topic proportions. The proposed method approximates the true posterior of the CTM with a factorized variational distribution and uses SGVB to maximize the evidence lower bound. This allows randomness to be incorporated into posterior inference for the CTM without requiring an explicit inversion of the topic covariance matrix. Perplexity results on several datasets were comparable to those of LDA. Future work could explore online learning for topic models using neural networks.
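The core idea can be illustrated with a small sketch. The snippet below is not the paper's implementation; it is a minimal, hypothetical example of estimating a per-document ELBO for a logistic-normal topic posterior with the SGVB reparameterization trick, assuming a diagonal variational covariance and a standard-normal prior (so the KL term has a closed form and no covariance inversion is needed). The function and variable names (`elbo_estimate`, `beta`, `doc_word_counts`) are illustrative, not from the source.

```python
import numpy as np

rng = np.random.default_rng(0)

def elbo_estimate(mu, log_sigma, beta, doc_word_counts, n_samples=8):
    """Monte Carlo ELBO for one document under a diagonal logistic-normal
    variational posterior q(eta) = N(mu, diag(sigma^2)), sampled via the
    reparameterization trick: eta = mu + sigma * eps, eps ~ N(0, I).

    beta: (K, V) topic-word probabilities; doc_word_counts: (V,) counts.
    """
    sigma = np.exp(log_sigma)
    total = 0.0
    for _ in range(n_samples):
        eps = rng.standard_normal(mu.shape)
        eta = mu + sigma * eps                 # reparameterized sample
        theta = np.exp(eta - eta.max())
        theta /= theta.sum()                   # softmax -> topic proportions
        word_probs = theta @ beta              # (V,) mixture over topics
        total += doc_word_counts @ np.log(word_probs + 1e-12)
    log_lik = total / n_samples
    # KL(q || N(0, I)) in closed form; no covariance matrix inversion
    kl = 0.5 * np.sum(sigma**2 + mu**2 - 1.0 - 2.0 * log_sigma)
    return log_lik - kl
```

Because the sample `eta` is a deterministic function of `(mu, log_sigma)` and the noise `eps`, the ELBO estimate is differentiable with respect to the variational parameters, which is what makes stochastic gradient optimization possible.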