2023-01-16

When Are Bayesian Methods Not the Best Option?

Bayesian methods are a powerful tool for probabilistic modeling and inference, but they may not be the best choice in certain situations. Here are a few examples of when it might be better to use alternative methods:

When the data is scarce or the model is very complex

When the model is very complex, the computation required for Bayesian inference may be infeasible or impractical, and when data is scarce the posterior can end up dominated by the choice of prior. In such cases, frequentist methods or classical machine learning algorithms may be more appropriate.
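
As a rough illustration of the frequentist route, the sketch below (assuming scikit-learn and a small synthetic binary-classification task) fits a logistic regression by maximum likelihood, returning a single point estimate instead of a full posterior.

```python
# Hypothetical sketch: a frequentist point estimate instead of a full posterior.
# Assumes scikit-learn and a small synthetic binary-classification data set.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = (X @ np.array([1.5, -2.0, 0.5]) + rng.normal(size=200) > 0).astype(int)

# Maximum-likelihood fit: one set of coefficients, no posterior to
# sample from or approximate.
clf = LogisticRegression().fit(X, y)
print(clf.coef_, clf.intercept_)
```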

When the model is well-defined and the goal is prediction rather than understanding the underlying mechanism

In these cases, Bayesian methods may be overkill; traditional machine learning algorithms are often more appropriate and computationally more efficient.
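
For a purely predictive task, a sketch like the one below (scikit-learn and synthetic data assumed) captures the typical workflow: fit an off-the-shelf model and judge it solely by held-out predictive performance, without modelling parameter uncertainty.

```python
# Hypothetical sketch: prediction-only workflow with a standard
# gradient-boosting model (scikit-learn assumed).
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=500, n_features=10, noise=0.5, random_state=0)
model = GradientBoostingRegressor(random_state=0)

# Evaluate purely on out-of-sample predictive accuracy.
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print(scores.mean())
```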

When the problem is high-dimensional

High-dimensional problems can be computationally challenging for Bayesian methods, and alternatives such as regularization or dimensionality reduction may be more appropriate.
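
A minimal sketch of that alternative (again assuming scikit-learn and synthetic data with far more features than samples): reduce the dimensionality with PCA, then fit a ridge-regularized linear model whose penalty is chosen by cross-validation.

```python
# Hypothetical sketch: dimensionality reduction plus regularization
# for a p >> n problem (scikit-learn assumed).
from sklearn.datasets import make_regression
from sklearn.decomposition import PCA
from sklearn.linear_model import RidgeCV
from sklearn.pipeline import make_pipeline

# 100 samples, 1,000 features: high-dimensional by construction.
X, y = make_regression(n_samples=100, n_features=1000, n_informative=20,
                       noise=1.0, random_state=0)

# Project onto 20 principal components, then fit a ridge regression
# whose penalty strength is selected by built-in cross-validation.
model = make_pipeline(PCA(n_components=20), RidgeCV(alphas=[0.1, 1.0, 10.0]))
model.fit(X, y)
print(model.score(X, y))
```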

When the model is non-conjugate

Bayesian methods often rely on conjugate priors, which keep the math tractable. When the model is non-conjugate, more advanced techniques such as variational inference or MCMC methods are required, which makes the computation much more challenging.
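
To make the non-conjugate case concrete, the sketch below hand-rolls a random-walk Metropolis-Hastings sampler for a one-parameter logistic regression with a Gaussian prior, a model whose posterior has no closed form. It is illustrative only; in practice a library such as PyMC or Stan would do this work.

```python
# Minimal Metropolis-Hastings sketch for a non-conjugate model:
# a Bernoulli likelihood with a logistic link plus a standard normal prior
# on the single coefficient has no closed-form posterior.
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=100)
true_beta = 1.2
y = rng.random(100) < 1.0 / (1.0 + np.exp(-true_beta * x))  # synthetic labels

def log_posterior(beta):
    logits = beta * x
    log_lik = np.sum(y * logits - np.log1p(np.exp(logits)))  # Bernoulli log-likelihood
    log_prior = -0.5 * beta**2                                # N(0, 1) prior, up to a constant
    return log_lik + log_prior

samples, beta = [], 0.0
for _ in range(5000):
    proposal = beta + rng.normal(scale=0.5)  # random-walk proposal
    if np.log(rng.random()) < log_posterior(proposal) - log_posterior(beta):
        beta = proposal                      # accept; otherwise keep the current value
    samples.append(beta)

print(np.mean(samples[1000:]))  # posterior mean estimate after burn-in
```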

These examples are not exhaustive, but they give an idea of when Bayesian methods may not be the best choice. This is not to say that Bayesian methods are never appropriate in these situations, only that applying them may be more challenging or less efficient.

Related: Learning Bayesian Methods as a Data Scientist