Research interests: MCMC, Bayesian machine learning
Thesis title: Scalable methods for deep probabilistic inference
Publications:
Neklyudov, K., Egorov, E., Shvechikov, P., & Vetrov, D. (2018). Metropolis-Hastings view on variational inference and adversarial training. arXiv preprint arXiv:1810.07151.
Egorov, E., Neklyudov, K., Kostoev, R., & Burnaev, E. (2019, July). MaxEntropy Pursuit Variational Inference. In International Symposium on Neural Networks (pp. 409-417). Springer, Cham.
Kuzina, A., Egorov, E., & Burnaev, E. (2019). BooVAE: A scalable framework for continual VAE learning under boosting approach. arXiv preprint arXiv:1908.11853.
Neklyudov, K., Egorov, E., & Vetrov, D. (2019). The Implicit Metropolis-Hastings Algorithm. arXiv preprint arXiv:1906.03644.
Proskura, P., Zaytsev, A., Braslavsky, I., Egorov, E., & Burnaev, E. (2019). Usage of multiple RTL features for Earthquake prediction. arXiv preprint arXiv:1905.10805.
Kuzina, A., Egorov, E., & Burnaev, E. (2019). Bayesian generative models for knowledge transfer in MRI semantic segmentation problems. Frontiers in Neuroscience, 13, 844.