The principle of maximum entropy is a
general method to assign values to probability distributions
on the basis of partial information. This principle,
introduced by Jaynes in 1957, forms an extension of the
classical principle of insufficient reason. It has been
further generalized, both in mathematical formulation and in
intended scope, into the principle of maximum relative entropy
or of minimum information.
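For orientation, the standard formulations behind these names (textbook definitions, not notation taken from this paper) are as follows: the maximum entropy principle selects, among all distributions compatible with the partial information, the one of largest Shannon entropy; the minimum information (maximum relative entropy) variant instead minimizes the deviation from a given prior $q$:

\[
  H(p) = -\sum_i p_i \log p_i
  \qquad\text{(maximize over all $p$ satisfying the constraints),}
\]
\[
  D(p \,\|\, q) = \sum_i p_i \log \frac{p_i}{q_i}
  \qquad\text{(minimize over the same set of $p$, given a prior $q$).}
\]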
It has been claimed that these principles are singled out as
unique methods of statistical inference that agree with
certain compelling consistency requirements. This paper
reviews these consistency arguments and the surrounding
controversy. It is shown that the uniqueness proofs are
flawed or rest on unreasonably strong assumptions. A more
general class of inference rules, maximizing the so-called
Rényi entropies, is exhibited; these rules also fulfill the
reasonable part of the consistency requirements.
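For readers unfamiliar with this family (a standard definition, not quoted from the paper): the Rényi entropy of order $\alpha$ is

\[
  H_\alpha(p) = \frac{1}{1-\alpha} \log \sum_i p_i^{\alpha},
  \qquad \alpha > 0,\ \alpha \neq 1,
\]

which recovers the Shannon entropy $H(p)$ in the limit $\alpha \to 1$, so that the maximum entropy rule appears as one member of this wider class.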