Content provided by Oxford University. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Oxford University or its podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://uk.player.fm/legal.

Complexity of local MCMC methods for high-dimensional model selection

Duration: 1:01:51
 
Manage episode 296616627 series 1610930
Quan Zhou, Texas A&M University, gives an OxCSML Seminar on Friday 25th June 2021.

Abstract: In a model selection problem, the size of the state space typically grows exponentially (or even faster) with p (the number of variables). But MCMC methods for model selection usually rely on local moves which only look at a neighborhood of size polynomial in p. Naturally, one may wonder how efficient these sampling methods are at exploring the posterior distribution. Consider variable selection first. Yang, Wainwright and Jordan (2016) proved that the random-walk add-delete-swap sampler is rapidly mixing under mild high-dimensional assumptions. By using an informed proposal scheme, we obtain a new MCMC sampler which achieves a much faster mixing time that is independent of p, under the same assumptions. The mixing time proof relies on a novel approach called the "two-stage drift condition", which can be useful for obtaining tight complexity bounds. This result shows that the mixing rate of locally informed MCMC methods can be fast enough to offset the computational cost of local posterior evaluation, and thus such methods scale well to high-dimensional data. Next, we generalize this result to other model selection problems. It turns out that locally informed samplers attain a dimension-free mixing time if the posterior distribution satisfies a unimodal condition. We show that this condition can be established for the high-dimensional structure learning problem even when the ordering of variables is unknown. This talk is based on joint work with H. Chang, J. Yang, D. Vats, G. Roberts and J. Rosenthal.

Bio: Quan Zhou is an assistant professor in the Department of Statistics at Texas A&M University (TAMU). Before joining TAMU, he was a postdoctoral research fellow at Rice University. He received his PhD from Baylor College of Medicine.
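To make the idea concrete, here is a minimal toy sketch of a locally informed add-delete-swap sampler for variable selection: each neighbor of the current model (one add, delete, or swap move away) is proposed with probability proportional to a tempered posterior weight, with a Metropolis-Hastings correction. The score function, the tempering exponent 0.5, and all variable names are invented for illustration; this is a sketch in the spirit of the talk, not the authors' implementation (in practice the score would be a marginal likelihood plus model prior).

```python
import math
import random

p = 6                                  # number of candidate variables
true_model = frozenset({0, 2})         # hypothetical "true" subset

def log_post(model):
    # Toy log-posterior: reward overlap with true_model, penalize size.
    return 3.0 * len(model & true_model) - 1.0 * len(model)

def neighbors(model):
    # Add, delete, and swap moves: a neighborhood of size polynomial in p.
    out = []
    for j in range(p):
        out.append(model - {j} if j in model else model | {j})
    for j in model:
        for k in range(p):
            if k not in model:
                out.append(model - {j} | {k})   # swap j for k
    return out

def informed_step(model, rng):
    # Informed proposal: weight each neighbor by exp(0.5 * log_post),
    # then apply a Metropolis-Hastings correction so the chain still
    # targets the posterior exactly.
    nbrs = neighbors(model)
    w = [math.exp(0.5 * log_post(m)) for m in nbrs]
    prop = rng.choices(nbrs, weights=w)[0]
    wb = [math.exp(0.5 * log_post(m)) for m in neighbors(prop)]
    q_fwd = math.exp(0.5 * log_post(prop)) / sum(w)    # q(model -> prop)
    q_bwd = math.exp(0.5 * log_post(model)) / sum(wb)  # q(prop -> model)
    log_alpha = (log_post(prop) - log_post(model)
                 + math.log(q_bwd) - math.log(q_fwd))
    return prop if math.log(rng.random()) < log_alpha else model

rng = random.Random(1)
model = frozenset()
samples = []
for _ in range(2000):
    model = informed_step(model, rng)
    samples.append(model)
top = max(set(samples), key=samples.count)   # most-visited model
```

Because every neighbor's posterior is evaluated at each step, one informed move costs polynomially more than a random-walk move; the point of the talk is that the resulting dimension-free mixing time can offset this extra per-step cost.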

51 episodes

