A Predicate/State Transformer Semantics for Bayesian Learning / Jacobs B.; Zanasi F. - In: ELECTRONIC NOTES IN THEORETICAL COMPUTER SCIENCE. - ISSN 1571-0661. - Electronic edition. - 325 (2016), pp. 185-200. (Paper presented at The 32nd Conference on the Mathematical Foundations of Programming Semantics, MFPS 2016, held in Pittsburgh, PA, USA, May 23-26, 2016) [10.1016/j.entcs.2016.09.038].
A Predicate/State Transformer Semantics for Bayesian Learning
Zanasi F.
2016
Abstract
This paper establishes a link between Bayesian inference (learning) and predicate and state transformer operations from programming semantics and logic. Specifically, a very general definition of backward inference is given via first applying a predicate transformer and then conditioning. Analogously, forward inference involves first conditioning and then applying a state transformer. These definitions are illustrated in many examples in discrete and continuous probability theory and also in quantum theory.
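For the discrete-probability case, the two recipes in the abstract can be sketched concretely. The following is a minimal illustration, not the paper's formalism: states are finite distributions (dicts mapping outcomes to probabilities), channels are maps from outcomes to distributions, and predicates are [0,1]-valued dicts. All function names are my own.

```python
# Hedged sketch of backward and forward inference on finite discrete
# distributions. A state is a dict {x: prob}, a channel is a dict
# {x: {y: prob}}, and a predicate is a dict {x: value in [0, 1]}.

def pred_transform(channel, q):
    # Predicate transformer c*(q): pull a predicate on Y back to X,
    # c*(q)(x) = sum_y c(x)(y) * q(y).
    return {x: sum(dist[y] * q[y] for y in dist)
            for x, dist in channel.items()}

def condition(state, p):
    # Condition a state by a predicate: pointwise product, normalised.
    z = sum(state[x] * p[x] for x in state)
    return {x: state[x] * p[x] / z for x in state}

def state_transform(channel, state):
    # State transformer c_*: push a state on X forward to Y,
    # c_*(w)(y) = sum_x w(x) * c(x)(y).
    out = {}
    for x, w in state.items():
        for y, pr in channel[x].items():
            out[y] = out.get(y, 0.0) + w * pr
    return out

def backward_infer(state, channel, q):
    # Backward inference: first apply the predicate transformer,
    # then condition the prior state.
    return condition(state, pred_transform(channel, q))

def forward_infer(state, p, channel):
    # Forward inference: first condition the state,
    # then apply the state transformer.
    return state_transform(channel, condition(state, p))
```

As a usage example, with a prior `{'d': 0.01, 'nd': 0.99}` over disease status, a test channel sending `'d'` to `{'pos': 0.9, 'neg': 0.1}` and `'nd'` to `{'pos': 0.05, 'neg': 0.95}`, and the sharp predicate `{'pos': 1.0, 'neg': 0.0}` for a positive result, `backward_infer` recovers the familiar Bayesian posterior P(d | pos) = 0.009 / 0.0585.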