# Cox's theorem

**Cox's theorem**, named after the physicist Richard Threlkeld Cox, is a derivation of the laws of probability theory from a certain set of postulates. This derivation justifies the so-called "logical" interpretation of probability, as the laws of probability derived by Cox's theorem are applicable to any proposition. Logical (also known as objective Bayesian) probability is a type of Bayesian probability. Other forms of Bayesianism, such as the subjective interpretation, are given other justifications.

## Cox's assumptions

Cox wanted his system to satisfy the following conditions:

- **Divisibility and comparability** – The plausibility of a proposition is a real number and is dependent on information we have related to the proposition.
- **Common sense** – Plausibilities should vary sensibly with the assessment of plausibilities in the model.
- **Consistency** – If the plausibility of a proposition can be derived in many ways, all the results must be equal.

The postulates as stated here are taken from Arnborg and Sjödin.[1][2][3] "Common sense" includes consistency with Aristotelian logic in the sense that logically equivalent propositions shall have the same plausibility.

The postulates as originally stated by Cox were not mathematically rigorous (although more so than the informal description above), as noted by Halpern.[4][5] However, it appears to be possible to augment them with various mathematical assumptions made either implicitly or explicitly by Cox to produce a valid proof.

Cox's notation: The plausibility of a proposition $A$ given some related information $X$ is denoted by $A \mid X$.

Cox's postulates and functional equations are:

- The plausibility of the conjunction $AB$ of two propositions $A$, $B$, given some related information $X$, is determined by the plausibility of $A$ given $X$ and that of $B$ given $AX$. In the form of a functional equation:

  $$AB \mid X = g(A \mid X,\ B \mid AX)$$

  Because of the associative nature of conjunction in propositional logic, consistency with logic gives a functional equation saying that the function $g$ is an associative binary operation. Additionally, Cox postulates the function $g$ to be monotonic. All strictly increasing associative binary operations on the real numbers are isomorphic to multiplication of numbers in a subinterval of $[0, +\infty]$, which means that there is a monotonic function $w$ mapping plausibilities to $[0, +\infty]$ such that

  $$w(AB \mid X) = w(A \mid X)\, w(B \mid AX)$$

- In case $A$ given $X$ is certain, we have $AB \mid X = B \mid X$ and $B \mid AX = B \mid X$ due to the requirement of consistency. The general equation then leads to

  $$w(B \mid X) = w(A \mid X)\, w(B \mid X)$$

  This must hold for any proposition $B$, which leads to

  $$w(A \mid X) = 1$$

- In case $A$ given $X$ is impossible, we have $AB \mid X = A \mid X$ and $A \mid BX = A \mid X$ due to the requirement of consistency. The general equation (with the $A$ and $B$ factors switched) then leads to

  $$w(A \mid X) = w(B \mid X)\, w(A \mid X)$$

  This must hold for any proposition $B$, which, without loss of generality, leads to the solution

  $$w(A \mid X) = 0$$

  Due to the requirement of monotonicity, this means that $w$ maps plausibilities to the interval $[0, 1]$.

- The plausibility of a proposition determines the plausibility of the proposition's negation. This postulates the existence of a function $f$ such that

  $$w(\text{not } A \mid X) = f(w(A \mid X))$$

  Because "a double negative is an affirmative", consistency with logic gives the functional equation

  $$f(f(x)) = x,$$

  saying that the function $f$ is an involution, i.e., it is its own inverse. Furthermore, Cox postulates the function $f$ to be monotonic.

The above functional equations and consistency with logic imply that

$$w(AB \mid X) = w(A \mid X)\, f(w(\text{not } B \mid AX)) = w(A \mid X)\, f\!\left(\frac{w(A\,\text{not } B \mid X)}{w(A \mid X)}\right)$$

Since $AB$ is logically equivalent to $BA$, we also get

$$w(A \mid X)\, f\!\left(\frac{w(A\,\text{not } B \mid X)}{w(A \mid X)}\right) = w(B \mid X)\, f\!\left(\frac{w(B\,\text{not } A \mid X)}{w(B \mid X)}\right)$$

If, in particular, $B = \text{not}(AD)$, then also $A\,\text{not } B = \text{not } B$ and $B\,\text{not } A = \text{not } A$, and we get

$$w(A\,\text{not } B \mid X) = w(\text{not } B \mid X) = f(w(B \mid X))$$

and

$$w(B\,\text{not } A \mid X) = w(\text{not } A \mid X) = f(w(A \mid X))$$

Abbreviating $w(A \mid X) = x$ and $w(B \mid X) = y$, we get the functional equation

$$x\, f\!\left(\frac{f(y)}{x}\right) = y\, f\!\left(\frac{f(x)}{y}\right)$$
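As a quick numeric sanity check of two of these steps (a sketch added for illustration, not part of Cox's argument; the particular choices of $g$, $w$, and $f$ below are assumptions), the following Python snippet confirms that an associative increasing operation can be recast as multiplication, and that the involution $f(x) = 1 - x$, which underlies ordinary probability, satisfies the final functional equation:

```python
import math
import random

# An associative, strictly increasing operation recast as multiplication:
# g(a, b) = a + b becomes multiplication under the monotonic map w(t) = exp(t),
# since exp(a + b) = exp(a) * exp(b).
for a, b in [(0.2, 1.3), (-1.0, 0.5), (2.0, -0.7)]:
    assert math.isclose(math.exp(a + b), math.exp(a) * math.exp(b))

# The involution f(x) = 1 - x satisfies x*f(f(y)/x) = y*f(f(x)/y):
# both sides reduce to x + y - 1 when the arguments stay in [0, 1].
f = lambda t: 1.0 - t
random.seed(0)
for _ in range(1000):
    x, y = random.uniform(0.5, 1.0), random.uniform(0.5, 1.0)  # keeps f(y)/x in [0, 1]
    assert math.isclose(x * f(f(y) / x), y * f(f(x) / y))
```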
## Implications of Cox's postulates

The laws of probability derivable from these postulates are the following.[6] Let $A \mid B$ be the plausibility of the proposition $A$ given $B$ satisfying Cox's postulates. Then there is a function $w$ mapping plausibilities to the interval $[0, 1]$ and a positive number $m$ such that

1. Certainty is represented by $w(A \mid B) = 1$.
2. $w^m(A \mid B) + w^m(\text{not } A \mid B) = 1$.
3. $w(AB \mid C) = w(A \mid C)\, w(B \mid AC) = w(B \mid C)\, w(A \mid BC)$.

It is important to note that the postulates imply only these general properties. We may recover the usual laws of probability by setting a new function, conventionally denoted $P$ or $\Pr$, equal to $w^m$. Then we obtain the laws of probability in a more familiar form:

1. Certain truth is represented by $\Pr(A \mid B) = 1$, and certain falsehood by $\Pr(A \mid B) = 0$.
2. $\Pr(A \mid B) + \Pr(\text{not } A \mid B) = 1$.
3. $\Pr(AB \mid C) = \Pr(A \mid C)\, \Pr(B \mid AC) = \Pr(B \mid C)\, \Pr(A \mid BC)$.

Rule 2 is a rule for negation, and rule 3 is a rule for conjunction. Given that any proposition containing conjunction, disjunction, and negation can be equivalently rephrased using conjunction and negation alone (the conjunctive normal form), we can now handle any compound proposition.
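Although the rules above are all the theorem states, the familiar sum rule for mutually exclusive propositions follows from rules 2 and 3; a standard derivation, added here for concreteness (when $A$ and $B$ exclude each other, $B\,\text{not } A$ is equivalent to $B$):

$$\begin{aligned}
\Pr(A \text{ or } B \mid C) &= 1 - \Pr(\text{not } A\ \text{not } B \mid C) \\
&= 1 - \Pr(\text{not } A \mid C)\, \Pr(\text{not } B \mid \text{not } A\, C) \\
&= 1 - \Pr(\text{not } A \mid C)\, \bigl(1 - \Pr(B \mid \text{not } A\, C)\bigr) \\
&= \Pr(A \mid C) + \Pr(B\,\text{not } A \mid C) \\
&= \Pr(A \mid C) + \Pr(B \mid C).
\end{aligned}$$

The derived rules can also be checked numerically. The Python sketch below assumes an invented joint distribution over three binary propositions (the weights and helper names are illustrative, not from any reference):

```python
from itertools import product

# An invented joint distribution over three binary propositions A, B, C,
# stored as Pr(a, b, c) for truth values in {True, False}.
weights = [4, 1, 3, 2, 2, 3, 1, 4]
joint = {abc: wgt / sum(weights)
         for abc, wgt in zip(product([True, False], repeat=3), weights)}

def pr(event, given=lambda a, b, c: True):
    """Conditional plausibility Pr(event | given), both predicates on (a, b, c)."""
    mass = lambda pred: sum(p for (a, b, c), p in joint.items() if pred(a, b, c))
    return mass(lambda a, b, c: event(a, b, c) and given(a, b, c)) / mass(given)

on_C = lambda a, b, c: c  # condition on C being true

# Rule 2 (negation): Pr(A|C) + Pr(not A|C) = 1
assert abs(pr(lambda a, b, c: a, on_C) + pr(lambda a, b, c: not a, on_C) - 1) < 1e-12

# Rule 3 (conjunction): Pr(AB|C) = Pr(A|C) Pr(B|AC) = Pr(B|C) Pr(A|BC)
lhs = pr(lambda a, b, c: a and b, on_C)
assert abs(lhs - pr(lambda a, b, c: a, on_C)
               * pr(lambda a, b, c: b, lambda a, b, c: a and c)) < 1e-12
assert abs(lhs - pr(lambda a, b, c: b, on_C)
               * pr(lambda a, b, c: a, lambda a, b, c: b and c)) < 1e-12
```

Both identities hold for any strictly positive joint distribution; the toy numbers merely make the check concrete.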

The laws thus derived yield finite additivity of probability, but not countable additivity (the requirement that $\Pr\left(\bigcup_{i=1}^{\infty} A_i\right) = \sum_{i=1}^{\infty} \Pr(A_i)$ for pairwise disjoint propositions $A_i$). The measure-theoretic formulation of Kolmogorov assumes that a probability measure is countably additive. This slightly stronger condition is necessary for the proof of certain theorems.[citation needed]

## Interpretation and further discussion

Cox's theorem has come to be used as one of the justifications for the use of Bayesian probability theory. For example, in Jaynes[6] it is discussed in detail in chapters 1 and 2 and is a cornerstone for the rest of the book. Probability is interpreted as a formal system of logic, the natural extension of Aristotelian logic (in which every statement is either true or false) into the realm of reasoning in the presence of uncertainty.
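This extension can be made concrete with a small sketch (illustrative, not from the cited texts): when plausibilities are restricted to the two values 0 and 1, the derived rules collapse to ordinary Boolean logic.

```python
# When Pr takes only the values 0 and 1, the derived rules reduce to
# Aristotelian logic: rule 2 becomes NOT and rule 3 becomes AND.
for p_a in (0, 1):
    assert 1 - p_a == int(not p_a)          # negation rule = logical NOT
    for p_b_given_a in (0, 1):
        # conjunction rule = logical AND (Pr(AB) = Pr(A) * Pr(B|A))
        assert p_a * p_b_given_a == int(p_a and p_b_given_a)
```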

It has been debated to what degree the theorem excludes alternative models for reasoning about uncertainty. For example, if certain "unintuitive" mathematical assumptions were dropped, then alternatives could be devised, e.g., an example provided by Halpern.[4] However, Arnborg and Sjödin[1][2][3] suggest additional "common sense" postulates, which would allow the assumptions to be relaxed in some cases while still ruling out the Halpern example. Other approaches were devised by Hardy[7] and by Dupré and Tipler.[8]

The original formulation of Cox's theorem is in Cox (1946), which is extended with additional results and more discussion in Cox (1961). Jaynes[6] cites Abel[9] for the first known use of the associativity functional equation. János Aczél[10] provides a long proof of the "associativity equation" (pages 256–267). Jaynes[6] (p. 27) reproduces the shorter proof by Cox, in which differentiability is assumed. A guide to Cox's theorem by Van Horn aims at comprehensively introducing the reader to all these references.[11]

## See also

- Probability axioms
- Probability logic

## References

1. Stefan Arnborg and Gunnar Sjödin, "On the foundations of Bayesianism", Preprint: Nada, KTH (1999) — ftp://ftp.nada.kth.se/pub/documents/Theory/Stefan-Arnborg/06arnborg.ps — ftp://ftp.nada.kth.se/pub/documents/Theory/Stefan-Arnborg/06arnborg.pdf
2. Stefan Arnborg and Gunnar Sjödin, "A note on the foundations of Bayesianism", Preprint: Nada, KTH (2000a) — ftp://ftp.nada.kth.se/pub/documents/Theory/Stefan-Arnborg/fobshle.ps — ftp://ftp.nada.kth.se/pub/documents/Theory/Stefan-Arnborg/fobshle.pdf
3. Stefan Arnborg and Gunnar Sjödin, "Bayes rules in finite models", in European Conference on Artificial Intelligence, Berlin (2000b) — ftp://ftp.nada.kth.se/pub/documents/Theory/Stefan-Arnborg/fobc1.ps — ftp://ftp.nada.kth.se/pub/documents/Theory/Stefan-Arnborg/fobc1.pdf
4. Joseph Y. Halpern, "A counterexample to theorems of Cox and Fine", Journal of AI Research, 10, 67–85 (1999) — http://www.jair.org/media/536/live-536-2054-jair.ps.Z
5. Joseph Y. Halpern, "Technical Addendum, Cox's theorem Revisited", Journal of AI Research, 11, 429–435 (1999) — http://www.jair.org/media/644/live-644-1840-jair.ps.Z
6. Edwin Thompson Jaynes, Probability Theory: The Logic of Science, Cambridge University Press (2003); chapters 1 to 3 of the published version available at http://bayes.wustl.edu/etj/prob/book.pdf
7. Michael Hardy, "Scaled Boolean algebras", Advances in Applied Mathematics, August 2002, pages 243–292.
8. Dupré, Maurice J. & Tipler, Frank J. (2009). "New Axioms for Rigorous Bayesian Probability", Bayesian Analysis, 4(3): 599–606.
9. Niels Henrik Abel, "Untersuchung der Functionen zweier unabhängig veränderlichen Gröszen x und y, wie f(x, y), welche die Eigenschaft haben, dasz f[z, f(x,y)] eine symmetrische Function von z, x und y ist.", Jour. Reine u. angew. Math. (Crelle's Journal), 1, 11–15 (1826).
10. János Aczél, Lectures on Functional Equations and their Applications, Academic Press, New York (1966).
11. Van Horn, K. S. (2003). "Constructing a logic of plausible inference: A guide to Cox's theorem". International Journal of Approximate Reasoning. 34: 3–24. doi:10.1016/S0888-613X(03)00051-3.

Cox, R. T. (1946). "Probability, Frequency and Reasonable Expectation". American Journal of Physics. 14: 1–10. doi:10.1119/1.1990764.

Cox, R. T. (1961). The Algebra of Probable Inference. Baltimore, MD: Johns Hopkins University Press.

Terrence L. Fine, Theories of Probability: An Examination of Foundations, Academic Press, New York (1973).
