keywords: Machine learning, PAC, PAC learnable
This note covers mean-field and structured variational families. Beyond standard VI, mean-field and other variational families can also be used for inference in probabilistic neural networks.
Compared to MCMC, variational inference is a faster method for approximating a difficult-to-compute posterior distribution: variational inference recasts inference as an optimization problem, while MCMC is an asymptotically exact sampling method.
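As a concrete illustration of "inference as optimization", the sketch below fits a mean-field Gaussian to a toy correlated-Gaussian posterior by stochastic gradient ascent on a Monte Carlo ELBO with the reparameterization trick. The target density, step size, and sample counts are illustrative assumptions, not taken from the note.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical target: an unnormalized 2-D Gaussian posterior with
# correlated coordinates (values are illustrative).
Sigma = np.array([[1.0, 0.8], [0.8, 1.0]])
Sigma_inv = np.linalg.inv(Sigma)

def log_p(z):
    # log of the unnormalized target density
    return -0.5 * z @ Sigma_inv @ z

# Mean-field family: q(z) = N(mu, diag(exp(2*log_s))),
# fit by stochastic gradient ascent on a Monte Carlo ELBO.
mu = np.zeros(2)
log_s = np.zeros(2)
lr, n_samples = 0.05, 32

for step in range(2000):
    eps = rng.standard_normal((n_samples, 2))
    z = mu + np.exp(log_s) * eps              # reparameterized samples
    g = -(z @ Sigma_inv)                      # grad of log p at each sample
    grad_mu = g.mean(axis=0)
    # chain rule through z = mu + exp(log_s)*eps, plus +1 per coordinate
    # from the entropy of the diagonal Gaussian
    grad_log_s = (g * eps).mean(axis=0) * np.exp(log_s) + 1.0
    mu += lr * grad_mu
    log_s += lr * grad_log_s

print("variational mean:", mu)                # ~ [0, 0]
print("variational stds:", np.exp(log_s))     # ~ 0.6, below the true
                                              # marginal std of 1.0
```

The final print line shows the well-known behavior of mean-field VI on correlated targets: it recovers the mean but underestimates the marginal variances.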
Metric embedding plays an important role in unsupervised machine learning and algorithm analysis. Embedding methods are also among the most important tools in the design of approximation algorithms.
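A minimal sketch of one classic embedding construction, a Johnson-Lindenstrauss-style Gaussian random projection; the dimensions and the random point set below are arbitrary illustrative choices, and the printed ratios show how well pairwise distances survive the embedding.

```python
import numpy as np

rng = np.random.default_rng(0)

n, d, k = 50, 1000, 200                 # points, original dim, target dim
X = rng.standard_normal((n, d))

# Gaussian random projection, scaled so squared distances are
# preserved in expectation
R = rng.standard_normal((d, k)) / np.sqrt(k)
Y = X @ R

def pairwise_dists(A):
    diff = A[:, None, :] - A[None, :, :]
    return np.sqrt((diff ** 2).sum(-1))

D_orig, D_emb = pairwise_dists(X), pairwise_dists(Y)
iu = np.triu_indices(n, k=1)            # each unordered pair once
ratios = D_emb[iu] / D_orig[iu]
print(f"distortion range: [{ratios.min():.3f}, {ratios.max():.3f}]")
```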
Parameterized functions with similar training error can diverge widely in generalization performance. However, a flat minimum may indicate a low-complexity neural network. Some SGD variants have been shown to converge to flatter minima, which potentially makes the solutions of nonconvex optimization more robust. The first part of this note reviews flat minima (Hochreiter and Schmidhuber, 1997). The second part introduces the properties and visualization of gradient descent algorithms.
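To make the flat-versus-sharp distinction concrete, here is a toy sketch: a one-dimensional surrogate loss with one flat and one sharp minimum, and a crude random-perturbation proxy for sharpness. The loss function and perturbation radius are invented for illustration and are not the flatness measure of Hochreiter and Schmidhuber.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D loss with a flat minimum near x = -2 and a sharp one near x = 2
def loss(x):
    return np.minimum(0.05 * (x + 2) ** 2, 5.0 * (x - 2) ** 2)

def sharpness(x0, radius=0.5, n=1000):
    # average loss increase under random parameter perturbations,
    # a crude proxy for flatness of the surrounding basin
    xs = x0 + rng.uniform(-radius, radius, n)
    return (loss(xs) - loss(x0)).mean()

print("flat  minimum sharpness:", sharpness(-2.0))  # small
print("sharp minimum sharpness:", sharpness(2.0))   # much larger
```

Both minima have zero training loss, yet the loss grows far faster around the sharp one, which is the intuition behind preferring flat minima for robustness.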
Generally, stochastic approximation methods are a family of iterative methods whose goal is to recover some property of a function that depends on random variables. Applications of stochastic approximation range from deep learning (e.g., SGD) to online learning methods.
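A minimal sketch of the classic Robbins-Monro scheme, the prototype of stochastic approximation: find a root of f(x) = E[F(x, xi)] using only noisy evaluations. The target function and step-size schedule below are illustrative assumptions; SGD arises as the special case where the noisy function is a stochastic gradient.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_f(x):
    # noisy observation of f(x) = x - 3, whose root is x* = 3
    return (x - 3.0) + rng.standard_normal()

x = 0.0
for t in range(1, 10001):
    a_t = 1.0 / t                 # Robbins-Monro conditions:
    x -= a_t * noisy_f(x)         # sum a_t = inf, sum a_t^2 < inf

print("estimate:", x)             # converges to the root x* = 3
```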