Jacobs B., Zanasi F. (2016). A Predicate/State Transformer Semantics for Bayesian Learning. Electronic Notes in Theoretical Computer Science, Elsevier. https://doi.org/10.1016/j.entcs.2016.09.038
A Predicate/State Transformer Semantics for Bayesian Learning
Jacobs B.; Zanasi F.
2016
Abstract
This paper establishes a link between Bayesian inference (learning) and predicate and state transformer operations from programming semantics and logic. Specifically, a very general definition of backward inference is given via first applying a predicate transformer and then conditioning. Analogously, forward inference involves first conditioning and then applying a state transformer. These definitions are illustrated in many examples in discrete and continuous probability theory and also in quantum theory.
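To make the backward-inference recipe from the abstract concrete in the discrete case, the following is a minimal Python sketch: a predicate on the codomain of a channel is pulled back along the channel (the predicate transformer), and the prior state is then conditioned by the resulting predicate. All names and the example data here are illustrative assumptions, not the paper's own notation.

    # Minimal sketch of backward inference for discrete distributions.
    # A channel c: X -> Dist(Y) is a dict mapping each x to a distribution
    # over Y; a (fuzzy) predicate is a dict mapping outcomes to [0, 1].

    def predicate_transformer(channel, predicate):
        """Pull a predicate q on Y back along the channel, giving the
        predicate x |-> sum_y c(x)(y) * q(y) on X."""
        return {x: sum(p * predicate[y] for y, p in dist.items())
                for x, dist in channel.items()}

    def condition(state, predicate):
        """Condition a state (distribution) by a predicate: Bayesian
        updating with normalisation."""
        total = sum(state[x] * predicate[x] for x in state)
        return {x: state[x] * predicate[x] / total for x in state}

    # Hypothetical example: prior belief about a disease, a channel from
    # disease status to test outcomes, and evidence "test is positive".
    prior = {"sick": 0.1, "healthy": 0.9}
    test = {"sick":    {"pos": 0.9, "neg": 0.1},
            "healthy": {"pos": 0.2, "neg": 0.8}}
    positive = {"pos": 1.0, "neg": 0.0}

    # Backward inference: transform the predicate, then condition the prior.
    posterior = condition(prior, predicate_transformer(test, positive))
    print(posterior)  # {'sick': 0.333..., 'healthy': 0.666...}

Forward inference, as the abstract describes it, reverses the order: the state is conditioned first and the channel is then applied as a state transformer, pushing the updated distribution forward to the codomain.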