October 06, 2019

Conditional VAE for MNIST digit generation

The KL divergence term in the loss is differentiable, so it can be backpropagated directly; it is computed using the closed-form KL divergence between two Gaussian distributions. For a diagonal Gaussian posterior N(μ, σ²) against the standard normal prior N(0, 1), each latent dimension contributes KL = -0.5 * (1 + log σ² - μ² - σ²).

import torch.nn.functional as F

def loss_fn(x, r, mu, log_var):
    # Reconstruction loss: pixel-wise binary cross-entropy, summed over the batch
    loss_r = F.binary_cross_entropy(r, x, reduction="sum")
    # KL divergence between N(mu, exp(log_var)) and the standard normal prior
    loss_kl = -0.5 * (1 + log_var - mu**2 - log_var.exp()).sum()
    return loss_r + loss_kl
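As a quick sanity check (a sketch with hypothetical MNIST-like shapes, assuming PyTorch), loss_fn can be exercised on dummy tensors; with mu = 0 and log_var = 0 the posterior equals the prior, so the KL term vanishes and only the reconstruction loss remains:

```python
import torch
import torch.nn.functional as F

# Same loss as above, repeated so this snippet runs standalone.
def loss_fn(x, r, mu, log_var):
    loss_r = F.binary_cross_entropy(r, x, reduction="sum")
    loss_kl = -0.5 * (1 + log_var - mu**2 - log_var.exp()).sum()
    return loss_r + loss_kl

torch.manual_seed(0)
batch, dim, latent = 4, 784, 20          # hypothetical shapes: 28x28 images, 20-d latent
x = torch.rand(batch, dim)               # targets in [0, 1]
r = torch.rand(batch, dim)               # fake reconstructions in [0, 1]
mu = torch.zeros(batch, latent)
log_var = torch.zeros(batch, latent)     # zero mean, unit variance -> KL term is 0

loss = loss_fn(x, r, mu, log_var)
kl = -0.5 * (1 + log_var - mu**2 - log_var.exp()).sum()
print(loss.item(), kl.item())
```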

https://stats.stackexchange.com/questions/7440/kl-divergence-between-two-univariate-gaussians
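The closed-form expression used in loss_fn can be verified against a direct numerical integral of q(z) log(q(z)/p(z)) (a small sketch for a scalar latent, NumPy only; the grid bounds and step are arbitrary choices):

```python
import numpy as np

def kl_closed_form(mu, sigma):
    # KL( N(mu, sigma^2) || N(0, 1) ), the formula behind loss_kl above
    return -0.5 * (1 + np.log(sigma**2) - mu**2 - sigma**2)

def kl_numerical(mu, sigma):
    # Integrate q(z) * log(q(z) / p(z)) on a wide, fine grid
    z = np.linspace(-12.0, 12.0, 200001)
    dz = z[1] - z[0]
    q = np.exp(-0.5 * ((z - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    p = np.exp(-0.5 * z ** 2) / np.sqrt(2 * np.pi)
    return np.sum(q * np.log(q / p)) * dz

closed = kl_closed_form(0.5, 1.5)
numeric = kl_numerical(0.5, 1.5)
print(closed, numeric)
```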

https://tilman.xyz/understanding-cvaes/

Permalink: http://57km.cc/post/understanding-cvae.html

-- EOF --
