Materials: [BOIntroTheoENG.pdf]
This video continues the introduction and motivation of Bayesian Optimization problems started in video [
In video [ the following methodology was outlined:
1. Set up a prior statistical model.
2. Acquire samples (this may be a costly procedure in some relevant application cases).
3. Compute a posterior (conditional on the measurements).
4. Carry out a statistical analysis on the posterior to decide the next sample, and go to step 2, unless some termination criterion is met (a schematic sketch of this loop is given below).
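The loop over these four steps can be written schematically. The minimal sketch below is purely illustrative: the names bo_loop, fit_posterior, choose_next and measure are hypothetical placeholders (not taken from the video or the PDF), and the dummy usage replaces the real Gaussian process and acquisition rule just to show the control flow.

```python
import numpy as np

def bo_loop(measure, fit_posterior, choose_next, x_init, y_init, n_iter=10):
    """Schematic version of the four steps above (all names are illustrative)."""
    X, y = list(x_init), list(y_init)
    for _ in range(n_iter):
        posterior = fit_posterior(X, y)     # steps 1 and 3: prior model conditioned on the data
        x_next = choose_next(posterior, X)  # step 4: statistical analysis picks the next sample
        X.append(x_next)                    # step 2: acquire a (possibly costly) new sample
        y.append(measure(x_next))
    return X, y

# Dummy usage: random proposals and no real surrogate model, only to show the flow.
rng = np.random.default_rng(0)
X, y = bo_loop(measure=lambda x: (x - 0.3) ** 2,
               fit_posterior=lambda X, y: None,
               choose_next=lambda post, X: rng.uniform(0, 1),
               x_init=[0.0], y_init=[(0.0 - 0.3) ** 2])
print(min(y))  # best (lowest) value found by the dummy search
```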
This video expands on the third step of the methodology, showing that, for a Gaussian process prior, it is carried out with the conditioning formulae of the multivariate normal distribution.
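As a minimal sketch of that conditioning step (assuming a zero-mean Gaussian process with a squared-exponential kernel; the kernel parameters and example data are arbitrary, not taken from the video), the posterior mean and covariance at test points follow the standard multivariate-normal conditioning formulae:

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=0.5, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D points."""
    d = A[:, None] - B[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior(X_train, y_train, X_test, noise=1e-6):
    """Multivariate-normal conditioning:
       mean = K*x (Kxx + s^2 I)^-1 y,   cov = K** - K*x (Kxx + s^2 I)^-1 Kx*."""
    Kxx = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    Ksx = rbf_kernel(X_test, X_train)
    Kss = rbf_kernel(X_test, X_test)
    alpha = np.linalg.solve(Kxx, y_train)
    mean = Ksx @ alpha
    cov = Kss - Ksx @ np.linalg.solve(Kxx, Ksx.T)
    return mean, cov

# Example: condition on three observations and query five new locations.
X_train = np.array([0.1, 0.4, 0.9])
y_train = np.sin(2 * np.pi * X_train)
X_test = np.linspace(0.0, 1.0, 5)
mean, cov = gp_posterior(X_train, y_train, X_test)
print(mean)          # posterior mean at the test points
print(np.diag(cov))  # posterior variances at the test points
```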
The last step is also outlined: the next sample may be selected to optimize the expected value (EV), the probability of improvement (PI), or the expected improvement (EI); we may be risky and choose a lower confidence bound (we will likely not achieve it, but we would obtain a very good sample if we did); or we may optimize the information gained by sampling (entropy search). A very brief description of each of these options is presented, just to give a glimpse of the main ideas; a detailed analysis is the topic of other materials that analyse BO in more depth.
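As a rough illustration of these criteria (a sketch assuming a minimisation setting, which matches the lower-confidence-bound remark above; the xi and kappa parameters and the example numbers are arbitrary choices, not values from the video), PI, EI and the LCB can be scored from the posterior mean and standard deviation at each candidate point:

```python
import numpy as np
from scipy.stats import norm

def probability_of_improvement(mu, sigma, f_best, xi=0.01):
    """PI (minimisation): chance the posterior value beats the incumbent f_best."""
    z = (f_best - mu - xi) / sigma
    return norm.cdf(z)

def expected_improvement(mu, sigma, f_best, xi=0.01):
    """EI (minimisation): expected amount by which we improve on f_best."""
    z = (f_best - mu - xi) / sigma
    return (f_best - mu - xi) * norm.cdf(z) + sigma * norm.pdf(z)

def lower_confidence_bound(mu, sigma, kappa=2.0):
    """LCB: optimistic (risky) target; the next sample minimises this bound."""
    return mu - kappa * sigma

# Score three candidate points given their posterior mean and standard deviation.
mu = np.array([0.20, -0.10, 0.05])
sigma = np.array([0.30, 0.05, 0.50])
f_best = 0.0  # best (lowest) value observed so far
print(probability_of_improvement(mu, sigma, f_best))
print(expected_improvement(mu, sigma, f_best))
print(np.argmin(lower_confidence_bound(mu, sigma)))  # index of the next candidate
```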
Application cases and concluding remarks are presented in video [
*Link to my [whole collection] of videos in English. Link to the larger [Colección completa] in Spanish.