Bayesian optimization: implicit information about the optimum in a Gaussian process

Antonio Sala, UPV

Difficulty: *** ,       Relevance: PIC,      Duration: 18:53

Materials:    [ Code: BOIntroOptimalSample.mlx ] [ PDF ]

Summary:

This video plots the (approximate) distribution of the global minimum of the posterior of a function modeled as a Gaussian process, after three samples of it have been observed.

First, the video reviews how to obtain said posterior. Next, we review how to obtain realizations of it, and the code to repeat them and obtain the minimum of each one (this is the basis of the "Thompson sampling" heuristic in Bayesian optimization). With this code, six thousand minima are obtained and three histograms of the results are plotted.
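
The following is a minimal sketch of that procedure, not the actual code in BOIntroOptimalSample.mlx: it assumes a one-dimensional input on [0,1], a squared-exponential kernel, and three made-up observations; the sample locations, observed values, and kernel hyperparameters are placeholders chosen only for illustration.

```matlab
% Minimal sketch (illustrative values, not the code in BOIntroOptimalSample.mlx):
% GP posterior from three observations, posterior realizations, and
% histograms of the resulting minima.
xObs = [0.15; 0.5; 0.85];                 % assumed sample locations
yObs = [0.6; -0.3; 0.2];                  % assumed observed values
sigmaF = 1; ell = 0.15;                   % assumed kernel hyperparameters
k = @(a,b) sigmaF^2*exp(-(a - b').^2/(2*ell^2));   % squared-exponential kernel

xGrid = linspace(0,1,200)';                        % grid for the realizations
Kxx = k(xObs,xObs) + 1e-9*eye(numel(xObs));        % observation covariance (+ jitter)
Ksx = k(xGrid,xObs);                               % grid/observation cross-covariance
mu    = Ksx*(Kxx\yObs);                            % posterior mean on the grid
Sigma = k(xGrid,xGrid) - Ksx*(Kxx\Ksx');           % posterior covariance on the grid
L = chol(Sigma + 1e-6*eye(numel(xGrid)),'lower');  % factor used to draw realizations

nReal = 6000;                             % number of posterior realizations
xStar = zeros(nReal,1); fStar = zeros(nReal,1);
for i = 1:nReal
    f = mu + L*randn(numel(xGrid),1);     % one realization of the posterior
    [fStar(i),idx] = min(f);              % its global minimum over the grid
    xStar(i) = xGrid(idx);
end

% Histograms of the minimizer location and of the minimum value
% (the video shows three histograms; two of them are sketched here).
subplot(2,1,1), histogram(xStar), xlabel('x^*'),    ylabel('count')
subplot(2,1,2), histogram(fStar), xlabel('f(x^*)'), ylabel('count')
```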

These histograms represent the implicitly available information about the location of the optimum. The goal of Bayesian optimization is to refine that information through sampling (explore) and/or to hit the optimum of the unknown underlying function as quickly as possible (exploit).
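
As a rough illustration of how this implicit information can be exploited (again a sketch, not code from the video), a single Thompson-sampling step draws one posterior realization and queries the unknown function at its minimizer. The snippet below continues from the sketch above (reusing mu, L, xGrid, xObs, yObs); trueF is a made-up stand-in for the unknown underlying function.

```matlab
% One Thompson-sampling step, continuing from the sketch above.
trueF = @(x) sin(6*x) + 0.5*x;        % stand-in for the unknown function (assumed)
f = mu + L*randn(numel(xGrid),1);     % draw a single posterior realization
[~,idx] = min(f);                     % its minimizer is the next query point
xNext = xGrid(idx);
yNext = trueF(xNext);                 % evaluate the unknown function there
xObs = [xObs; xNext];                 % augment the data set ...
yObs = [yObs; yNext];                 % ... then recompute the posterior and repeat
```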

*Link to my [ whole collection ] of videos in English. Link to the larger [ Colección completa ] in Spanish.

© 2024, A. Sala. All rights reserved for materials from authors affiliated to Universitat Politecnica de Valencia.
Please consult original source/authors for info regarding rights of materials from third parties.