TL;DR: Approximate inference methods made easy

“MCMC vs VI” is no longer a discussion about your favourite Roman numeral. If you share my trepidation about model performance in the face of data sparsity, or you simply suffer from uncertainty anxiety, you might be tempted into the Bayesian world. Years later, at the precipice of your career (and your mental health), you over-engineer probabilistic models so intractable they would stress Lord Bayes himself into stomach ulcers. The solution? Approximate inference, the true antihero of model simplification.

Leon Chlon


A closed-form solution to a machine learning model is one that can be written down on a sheet of paper using a finite number of standard mathematical operations. For example, linear models have a closed-form solution if the Gram matrix of the design matrix, XᵀX, is invertible; otherwise we obtain a solution using iterative or least-squares optimisation.
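As a sketch of this distinction, the snippet below fits a linear model with the closed-form normal equations when XᵀX is invertible, and falls back to an SVD-based least-squares solve otherwise. The toy data (a noiseless line y = 1 + 2x) is purely illustrative and not from the article.

```python
import numpy as np

# Illustrative toy data: y = 1 + 2x exactly, with a constant column
# prepended to the design matrix for the intercept.
rng = np.random.default_rng(0)
x = rng.normal(size=(50, 1))
X = np.hstack([np.ones((50, 1)), x])
y = 1.0 + 2.0 * x[:, 0]

# Closed-form solution: beta = (X^T X)^{-1} X^T y,
# which only exists when X^T X is invertible (full rank).
XtX = X.T @ X
if np.linalg.matrix_rank(XtX) == XtX.shape[0]:
    beta = np.linalg.solve(XtX, X.T @ y)  # solve, don't invert explicitly
else:
    # Singular case: fall back to an SVD-based least-squares solution.
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)

print(np.round(beta, 3))  # recovers intercept ≈ 1.0 and slope ≈ 2.0
```

Using `np.linalg.solve` rather than forming the inverse is the standard numerically stable way to evaluate the closed-form expression.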


Bayesian models do not typically have exact closed-form solutions for their posterior distributions. One thing that helps is choosing simple models, Gaussian likelihoods, and conjugate priors. A prior distribution is said to be conjugate to a likelihood function if the resulting posterior belongs to the same distribution family as the prior.
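A minimal sketch of conjugacy: a Beta prior on a coin's bias is conjugate to the Bernoulli likelihood, so the posterior is again a Beta distribution and the update is just counting. The prior parameters and flips below are made up for illustration.

```python
# Beta(alpha, beta) prior on the probability of heads.
alpha, beta_ = 2.0, 2.0

# Observed Bernoulli data (1 = heads, 0 = tails).
flips = [1, 0, 1, 1, 0, 1, 1]

# Conjugate update: successes are added to alpha, failures to beta,
# so the posterior stays in the Beta family -- no integration needed.
heads = sum(flips)
tails = len(flips) - heads
alpha_post = alpha + heads
beta_post = beta_ + tails

posterior_mean = alpha_post / (alpha_post + beta_post)
print(alpha_post, beta_post, round(posterior_mean, 3))
# Beta(7, 4) posterior; mean = 7/11 ≈ 0.636
```

This is exactly the kind of tractability that disappears once the prior and likelihood are no longer conjugate, which is where approximate inference comes in.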
