Institute of Information Theory and Automation

Bibliography

Conference Paper (international conference)

Orthogonal Approximation of Marginal Likelihood of Generative Models

Šmídl Václav, Bím J., Pevný T.

Published in: Bayesian Deep Learning NeurIPS 2019 Workshop, 48

Conference: NeurIPS 2019, Vancouver, CA, 2019-12-08

Grant: GA18-21409S, GA ČR

Keywords: approximation, generative models, orthogonal combinations

Fulltext: http://library.utia.cas.cz/separaty/2020/AS/smidl-0522204.pdf

Abstract (eng): This paper presents a new approximation of the marginal likelihood of generative models, used as a score for anomaly detection. The score is motivated by a shortcoming of the popular reconstruction error: it can behave arbitrarily outside the known samples. The proposed score corrects this by an orthogonal combination of the reconstruction error and the likelihood in the latent space. As shown experimentally on anomaly detection benchmarks and illustrated on a toy problem, this combination makes the score robust to outliers. Generative models evaluated with this score outperformed competing methods, especially in tasks of learning a distribution from data corrupted by anomalies. Finally, the score is compatible with contemporary generative models, namely variational auto-encoders and generative adversarial networks.
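The abstract describes a score built from two parts: the reconstruction error of a sample and the likelihood of its latent code. The sketch below is a minimal Python/NumPy illustration of that idea, assuming a standard normal latent prior, a caller-supplied reconstruction, and a simple additive weighting with a hypothetical trade-off parameter alpha; it does not reproduce the paper's exact orthogonal combination, and all function names are illustrative.

    import numpy as np

    def latent_log_likelihood(z):
        # Log-density of the latent code z under a standard normal prior N(0, I).
        d = z.shape[-1]
        return -0.5 * (np.sum(z ** 2, axis=-1) + d * np.log(2.0 * np.pi))

    def reconstruction_error(x, x_hat):
        # Squared error between the input and its reconstruction by the decoder.
        return np.sum((x - x_hat) ** 2, axis=-1)

    def anomaly_score(x, x_hat, z, alpha=1.0):
        # Illustrative additive combination (not the paper's orthogonal one):
        # a high reconstruction error or a low latent likelihood raises the score.
        return reconstruction_error(x, x_hat) - alpha * latent_log_likelihood(z)

    # Usage with dummy arrays standing in for encoder/decoder outputs.
    rng = np.random.default_rng(0)
    x = rng.normal(size=(5, 10))                 # inputs
    x_hat = x + 0.1 * rng.normal(size=(5, 10))   # reconstructions
    z = rng.normal(size=(5, 2))                  # latent codes
    print(anomaly_score(x, x_hat, z))            # higher = more anomalous

A higher score marks a sample as more anomalous; the point of the paper is that relying on the reconstruction error alone can behave arbitrarily away from the training data, which the latent-space likelihood term counteracts.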

: BD

: 10201
