It is known that elementary symmetries provide a compelling necessary-and-sufficient foundation for probability theory, and that information (a.k.a. minus entropy, or Kullback-Leibler divergence) H(p;q) = SUM p log(p/q) is the unique quantification of divergence from source probability distribution q to destination p. Its curvature matrix grad grad H = diag(1/p) is the only generally valid candidate for a geometrical metric. This "information geometry" has been widely promoted.
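As an illustration of the two formulas above, the following sketch computes H(p;q) on a small toy distribution and checks by central finite differences that the Hessian of H with respect to p is indeed diag(1/p). The particular numbers for p and q are arbitrary choices for demonstration, not taken from the text.

```python
import numpy as np

def H(p, q):
    """Kullback-Leibler divergence H(p;q) = SUM p log(p/q)."""
    return float(np.sum(p * np.log(p / q)))

# Toy distributions (arbitrary illustrative values).
p = np.array([0.2, 0.3, 0.5])
q = np.array([0.25, 0.25, 0.5])

# Numerically estimate the Hessian of H(.;q) at p by central differences.
eps = 1e-4
hess = np.zeros((3, 3))
for i in range(3):
    for j in range(3):
        pp = p.copy(); pp[i] += eps; pp[j] += eps
        pm = p.copy(); pm[i] += eps; pm[j] -= eps
        mp = p.copy(); mp[i] -= eps; mp[j] += eps
        mm = p.copy(); mm[i] -= eps; mm[j] -= eps
        hess[i, j] = (H(pp, q) - H(pm, q) - H(mp, q) + H(mm, q)) / (4 * eps**2)

# The curvature matrix grad grad H equals diag(1/p).
assert np.allclose(hess, np.diag(1.0 / p), atol=1e-3)
```

The diagonal structure diag(1/p) is what information geometry promotes to a metric; the later counterexamples concern that promotion, not this local computation.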
A geometrical metric induces a corresponding "Fisher-Rao" density sqrt(det(metric)) to be used as an intrinsic measure on probabilities. Putting these ideas together suggests the "entropic prior" Prob(p) = exp(-H(p;flat)) / PRODUCT sqrt(p) for an unknown p. If assigning a single "best" p by symmetry, MaxEnt, or less formal judgement is deemed too definitive, this prior appears to allow flexible generalization.
Nevertheless, information geometry and the entropic prior are wrongly founded and cannot apply generally. Geometry fails because geodesic distances are inherently symmetric, d(p;q) = d(q;p), whereas the only connection that obeys the founding symmetries is H, which is asymmetric, so d cannot be H. Probability distributions do not form a metric space! The entropic prior likewise fails, mostly through its dependence on the Fisher-Rao density, but also for other reasons.
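The asymmetry argument above can be seen on any concrete pair of distributions; the following sketch (with arbitrarily chosen p and q) shows that H(p;q) and H(q;p) differ, so H cannot play the role of a geodesic distance.

```python
import numpy as np

def H(p, q):
    """Kullback-Leibler divergence H(p;q) = SUM p log(p/q)."""
    return float(np.sum(p * np.log(p / q)))

# Arbitrary illustrative pair of two-cell distributions.
p = np.array([0.1, 0.9])
q = np.array([0.5, 0.5])

# A geodesic distance would require d(p;q) = d(q;p); H violates this.
forward = H(p, q)
backward = H(q, p)
assert abs(forward - backward) > 0.1
```

Here forward is about 0.368 while backward is about 0.511 (natural logarithms), so symmetry fails already in the simplest nontrivial case.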
It remains only to generate the promised counterexamples. A selection is presented.
