Deterministic Sampling of Expensive Posteriors Using Kullback-Leibler Divergence

Posted: 2020-12-13

Title: Deterministic Sampling of Expensive Posteriors Using Kullback-Leibler Divergence

Speaker: Fasheng Sun (孙法省)

Abstract: This paper introduces a new way of discretely approximating a continuous probability distribution $F$ by a set of representative points. These points are generated by minimizing the Kullback-Leibler divergence, a statistical potential measure between two probability distributions used for testing goodness-of-fit. The Kullback-Leibler divergence is nonnegative, and equals zero exactly when the two distributions coincide. Using this property, we show that the empirical distribution of these representative points converges to the distribution $F$. The advantages of these points over Monte Carlo and other deterministic sampling methods are illustrated in simulations. Two important applications of such points are then highlighted: (a) simulation from complex probability densities, and (b) exploration and optimization of expensive black-box functions.
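As a rough illustration of the idea (not the talk's actual algorithm), one can select representative points by numerically minimizing a KL-type discrepancy between the target density and a kernel smoothing of the candidate point set. The target (a standard normal), the Gaussian kernel, the bandwidth, and the optimizer below are all illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Illustrative target density f: standard normal (assumption, not from the talk).
GRID = np.linspace(-4.0, 4.0, 201)       # quadrature grid for the KL integral
F_GRID = norm.pdf(GRID)                  # target density on the grid

def kl_objective(points, bandwidth=0.3):
    """Approximate KL(f || g), where g is a Gaussian kernel density
    estimate built from the candidate representative points."""
    diffs = GRID[:, None] - np.asarray(points)[None, :]
    kde = norm.pdf(diffs, scale=bandwidth).mean(axis=1)   # smoothed point set
    integrand = F_GRID * (np.log(F_GRID) - np.log(kde + 1e-12))
    return integrand.sum() * (GRID[1] - GRID[0])          # Riemann sum

# Start from crude equally spaced points and refine by direct minimization.
x0 = np.linspace(-2.0, 2.0, 9)
res = minimize(kl_objective, x0, method="Nelder-Mead",
               options={"maxiter": 2000, "xatol": 1e-6})
rep_points = np.sort(res.x)
print("KL before:", kl_objective(x0), "after:", res.fun)
```

The refined points spread out to track the target's mass; the same recipe applies to any density evaluable up to normalization, which is what makes such deterministic point sets attractive for expensive posteriors.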

About the speaker: Fasheng Sun is a professor and doctoral supervisor at Northeast Normal University and an Outstanding Teacher of Jilin Province. He received his Ph.D. in probability and mathematical statistics from Nankai University, and has been a visiting scholar in the Department of Statistics and Actuarial Science at Simon Fraser University, Canada, and in the Department of Statistics at the University of California, Los Angeles. His research focuses on big-data sampling and analysis, and the design and analysis of computer experiments. His honors include the Natural Science Award of the Ministry of Education's Higher Education Outstanding Scientific Research Achievement Award (Science and Technology), the National Statistical Science Research Outstanding Achievement Award, and the Jilin Province Natural Science Academic Achievement Award.

Time: November 23, 2020, 3:30-6:00 PM

Venue: Room 405, Teaching and Research Building, Nanhu Campus

Hosts: Office of Scientific Research / School of Mathematics and Statistics