In parameterized form, the prior distribution is often assumed to come from a family of distributions called conjugate priors.
The usefulness of a conjugate prior is that the corresponding posterior distribution will be in the same family, and the calculation may be expressed in closed form.
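The Beta-Bernoulli pair is the standard illustration of this closed-form property: a Beta prior combined with Bernoulli observations yields a Beta posterior whose parameters are obtained by simple addition. The sketch below assumes this pairing; the function name is our own.

```python
from math import isclose

# Conjugate update for a Bernoulli likelihood with a Beta prior:
# a Beta(a, b) prior plus (successes, failures) data gives a
# Beta(a + successes, b + failures) posterior -- no integration needed.
def beta_bernoulli_update(a, b, successes, failures):
    """Return the posterior Beta parameters in closed form."""
    return a + successes, b + failures

# Uniform prior Beta(1, 1); observe 7 successes and 3 failures.
a_post, b_post = beta_bernoulli_update(1.0, 1.0, 7, 3)
posterior_mean = a_post / (a_post + b_post)  # mean of Beta(8, 4)
```

Because the posterior is again a Beta distribution, further observations can be folded in by calling the same update repeatedly.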
(In some instances, frequentist statistics can work around this problem. For example, frequentist confidence intervals and prediction intervals for a normal distribution with unknown mean and variance are constructed using a Student's t-distribution.)
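As a sketch of that frequentist construction, the interval below uses the t quantile with n − 1 degrees of freedom; it assumes SciPy is available, and the sample values are made up for illustration.

```python
import math
from scipy import stats  # assumed available

def t_confidence_interval(sample, confidence=0.95):
    """Two-sided confidence interval for the mean of a normal sample
    with unknown mean and variance, via the Student's t-distribution."""
    n = len(sample)
    mean = sum(sample) / n
    # Sample standard deviation (n - 1 in the denominator).
    s = math.sqrt(sum((x - mean) ** 2 for x in sample) / (n - 1))
    t_crit = stats.t.ppf((1 + confidence) / 2, df=n - 1)
    half_width = t_crit * s / math.sqrt(n)
    return mean - half_width, mean + half_width

lo, hi = t_confidence_interval([4.9, 5.1, 5.0, 4.8, 5.2])
```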
In this case there is almost surely no asymptotic convergence.
By parameterizing the space of models, the belief in all models may be updated in a single step.
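A grid approximation makes this concrete: each grid point stands for one model in the parameterized family, and a single vectorized Bayes step updates the belief in all of them at once. This is an illustrative sketch with made-up data, assuming NumPy is available.

```python
import numpy as np  # assumed available

# One model per grid point: Bernoulli success probability theta.
theta = np.linspace(0.01, 0.99, 99)
prior = np.full_like(theta, 1.0 / len(theta))  # uniform belief over models

# Observed data: 7 successes, 3 failures.
successes, failures = 7, 3
likelihood = theta**successes * (1.0 - theta)**failures

# Single-step update of the belief in every model simultaneously.
posterior = prior * likelihood
posterior /= posterior.sum()  # normalize over the whole model space

map_theta = theta[np.argmax(posterior)]  # most probable model a posteriori
```

With a uniform prior the posterior mode coincides with the maximum-likelihood estimate, here 7/10.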
Ian Hacking noted that traditional "Dutch book" arguments did not specify Bayesian updating: they left open the possibility that non-Bayesian updating rules could avoid Dutch books.
Hacking wrote "And neither the Dutch book argument, nor any other in the personalist arsenal of proofs of the probability axioms, entails the dynamic assumption. So the personalist requires the dynamic assumption to be Bayesian."
That is, instead of a fixed point as a prediction, a distribution over possible points is returned.
Only in this way is the entire posterior distribution of the parameter(s) used.
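As a sketch of such a predictive distribution: with a Beta posterior over a Bernoulli parameter, the number of successes in m future trials follows a Beta-Binomial distribution, obtained by averaging over the entire posterior rather than plugging in a point estimate. The example assumes SciPy's `scipy.stats.betabinom` is available and uses made-up posterior parameters.

```python
from scipy import stats  # assumed available

# Posterior Beta(8, 4), e.g. from a Beta(1, 1) prior after
# 7 successes and 3 failures; predict m = 5 future trials.
a, b, m = 8.0, 4.0, 5
predictive = stats.betabinom(m, a, b)

# A full distribution over possible outcomes, not a single point.
pmf = [predictive.pmf(k) for k in range(m + 1)]
```

The predictive mean, m·a/(a + b), matches plugging in the posterior mean, but the Beta-Binomial is wider than a plug-in Binomial because it carries the posterior uncertainty about the parameter.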
Bayesian inference is a method of statistical inference in which Bayes' theorem is used to update the probability for a hypothesis as more evidence or information becomes available.