Register for the Philosophy of Probability Workshop

Submitted by Kate Goldyn

This event will bring together philosophers, mathematicians, and scientists for a one-day workshop on April 28, 2018, to critically examine interpretations of probability theory in its many and varied applications. Probability theory is used widely in stochastic and dynamical systems, statistical and quantum physics, theories of rationality in epistemology, and theories of evidence and inductive inference in statistics and philosophy of science. It often happens that results or viewpoints from certain applications of probability theory, which would be immediately useful in other areas, go unnoticed due to a lack of communication between isolated research communities. For this reason, the purpose of this event is not just to rehash well-known debates between Bayesian and frequentist approaches to probability, but rather to put researchers with different views on probability theory in conversation. The event will serve to open channels of communication between researchers in different fields working on related questions in the foundations of probability theory.

Register for the workshop here.

Schedule of Speakers:


9:00-10:15AM

Title: Radically Finite Probability
Simon Huttegger (University of California, Irvine)

Abstract: There is a certain disconnect between the classical mathematics of scientific theories and models, on the one hand, and the practice of observation and experimentation, on the other. Classical mathematics involves infinity, whereas observational processes are inherently finite. Edward Nelson's "radically elementary probability theory" provides a framework that allows one to practice probability theory for arbitrarily large numbers of events with effectively finite means. I will apply radically elementary probability theory to Bayesian theorems on convergence to the truth and merging of opinions. This shows that those results have finite counterparts; but since radically elementary probability theory offers a richer conceptual framework than classical probability theory, these counterparts also shed light on the meaning of convergence theorems.
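For orientation, the classical convergence-to-the-truth result alluded to here can be stated, in standard notation not taken from the talk, roughly as follows: for a hypothesis $H$ and evidence variables $E_1, E_2, \dots$,
\[
P\bigl(H \mid E_1, \dots, E_n\bigr) \;\longrightarrow\; \mathbf{1}_H \quad \text{$P$-almost surely as } n \to \infty,
\]
so the posterior probability of $H$ converges, with prior probability one, to $1$ if $H$ is true and to $0$ otherwise as evidence accumulates. On one plausible reading of Nelson's framework (our gloss, not the abstract's), the limit $n \to \infty$ is replaced by a fixed, possibly nonstandard but still finite number of observations, which is the sense in which the counterparts are finite.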

10:30-11:45AM

Title: What finite additivity can add to decision theory
Teddy Seidenfeld (Carnegie Mellon University) (Joint work with Mark J. Schervish, Rafael Stern, and Joseph B. Kadane)

Abstract: We examine general decision problems with loss functions that are bounded below. We allow the loss function to assume the value infinity. No other assumptions are made about the action space, the types of data available, the types of non-randomized decision rules allowed, or the parameter space. By allowing prior distributions and the randomizations in randomized rules to be finitely-additive, we prove very general complete class and minimax theorems. Specifically, under the sole assumption that the loss function is bounded below, we show that every decision problem has a minimal complete class and all admissible rules are Bayes rules. We also show that every decision problem has a minimax rule and a least-favorable distribution and that every minimax rule is Bayes with respect to the least-favorable distribution. Some special care is required to deal properly with infinite-valued risk functions and integrals taking infinite values.
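For readers outside statistical decision theory, the standard (countably additive) versions of the notions in play can be summarized as follows; the results announced above relax the additivity assumption on the prior and on the randomizations, assuming only that the loss $L$ is bounded below. Writing
\[
R(\theta,\delta) = \mathbb{E}_\theta\bigl[L(\theta,\delta(X))\bigr], \qquad
r(\pi,\delta) = \int R(\theta,\delta)\, d\pi(\theta),
\]
a rule is Bayes for the prior $\pi$ if it minimizes the Bayes risk $r(\pi,\delta)$; a rule $\delta$ is admissible if no rule $\delta'$ satisfies $R(\theta,\delta') \le R(\theta,\delta)$ for all $\theta$ with strict inequality for some $\theta$; a rule is minimax if it minimizes $\sup_\theta R(\theta,\delta)$; a prior $\pi^*$ is least favorable if it maximizes $\inf_\delta r(\pi,\delta)$; and a class of rules is complete if every rule outside it is dominated by a rule inside it.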

12:00-1:00PM   Lunch Break

1:00-2:15PM

Title: Additivity Principles for Quantum Probabilities
Laura Ruetsche (University of Michigan) (Joint work with Aristidis Arageorgis and John Earman)

Abstract: The question of whether personal probability functions should be countably or (merely) finitely additive has been extensively debated. We ask a different question: what additivity principles govern physical, specifically quantum mechanical, probabilities? Taking this to be a question about what quantum states are physically realizable, we present a “transcendental” argument in favor of countable additivity. In order for us to have evidence supporting quantum mechanics (the argument goes), the probability functions induced by quantum states must be countably additive. Otherwise, we could never prepare systems in specific quantum states, where such preparation is a precondition for collecting evidence confirming the theory. Although our focus is physical probabilities, reflecting on state preparation allows us to draw a few quick and polemical morals concerning rational credences about quantum events. First, the Principal Principle is otiose. Second, Bayesian personalists tempted by de Finetti’s arguments in favor of merely finite additivity indulge that temptation at their peril. Finally, it is also at their peril that personalists pursue operational approaches to quantum mechanics (e.g. POVMs, effect algebras, etc.).
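As a reminder, and in notation not taken from the talk, the two additivity principles at issue can be stated for a state $\omega$ on the projections of a quantum system: for any family of mutually orthogonal projections $P_i$,
\[
\omega\Bigl(\sum_{i=1}^{n} P_i\Bigr) = \sum_{i=1}^{n} \omega(P_i)
\quad\text{(finite additivity)}, \qquad
\omega\Bigl(\sum_{i=1}^{\infty} P_i\Bigr) = \sum_{i=1}^{\infty} \omega(P_i)
\quad\text{(countable additivity)},
\]
with the infinite sum of projections understood in the strong operator topology. On the usual Hilbert-space picture (separable Hilbert space of dimension at least three), the countably additive states are exactly those given by a density operator, $\omega(P) = \mathrm{Tr}(\rho P)$; merely finitely additive states need not be representable this way.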

2:30-3:45PM

Title: Imprecise chance in evolution
Marshall Abrams (University of Alabama at Birmingham)

Abstract: I argue that biological evolution and other complex processes sometimes depend on imprecise chances, i.e. imprecise analogues of real-valued objective probabilities realized by chance setups in the world. I give a general argument for the existence of imprecise chances in nature. I then argue that natural selection, whether involving imprecise chance or not, would give rise to organisms whose behavior was imprecisely chancy. This behavior would then be part of the environment of other organisms in ways that would make the latter's evolution imprecisely chancy. Thus evolution sometimes involves imprecise chance. I explain why the absence of reports of imprecise chance in evolution is nevertheless unsurprising. More specifically, after summarizing my earlier arguments that many chances are what I call causal probabilities, (1) I give an argument for the existence of causal imprecise chances in nature that's more general than arguments for related conclusions given by Terrence Fine and collaborators, Alan Hajek, Luke Fenton-Glynn and collaborators, or Seamus Bradley. In particular I provide general reasons to think that some outcomes are objectively erratic--that they occur with no (causal) probability--and explain how this can give rise to imprecise chances. (2) I give a simple example that illustrates how erratic environmental variation could, in principle, give rise to imprecise chances and "imprecise fitnesses" that would affect natural selection. I briefly discuss relationships between imprecise fitness and imprecise decision rules. I argue that it's more fruitful, though, to focus on models of effects of imprecise chances over many generations, and I illustrate this strategy with computer simulations. (3) I then argue that if we start by assuming that environments only make evolution chancy in the traditional, precise sense, evolution would still produce organisms whose internal processes and behaviors involved imprecise chances, because producing mechanisms that reliably realize precise chances is costly. (4) However, since a population's environment includes organisms of other species, and behaviors of those other organisms would sometimes be imprecisely chancy, it's likely that evolution actually does depend on imprecise chances in many cases. (Though this conclusion contradicts the initial assumption that evolution involves only precise chances, it doesn't undermine the argument; behaviors of organisms evolving in an imprecisely chancy environment are even more likely to be imprecisely chancy.) I explain why it's unsurprising that there seem to be no reports of imprecisely chancy evolution.
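The kind of multi-generation model mentioned in point (2) can be sketched in a few lines of code. The following toy simulation is not the speaker's; it is a hypothetical illustration (all names, parameters, and the choice of Wright-Fisher-style resampling are ours) in which one type has a precise fitness while a competitor's per-generation fitness is constrained only to an interval, so long-run outcomes can only be reported as a range across runs rather than a single expected frequency.

import random

# Toy illustration (not the speaker's model): two competing types, A and B.
# Type A has a precise fitness; type B has an "imprecise fitness" represented
# by a per-generation value constrained only to lie in an interval.

GENERATIONS = 200
POP_SIZE = 1000
FITNESS_A = 1.0                  # precise fitness for type A
FITNESS_B_INTERVAL = (0.9, 1.1)  # imprecise fitness for type B: only bounds given

def fitness_b(rng):
    # Stand-in for erratic variation: any rule that stays inside the interval
    # is admissible; here we just pick an endpoint or the midpoint each generation.
    lo, hi = FITNESS_B_INTERVAL
    return rng.choice([lo, (lo + hi) / 2, hi])

def run(seed):
    rng = random.Random(seed)
    n_a = POP_SIZE // 2
    for _ in range(GENERATIONS):
        n_b = POP_SIZE - n_a
        w_a, w_b = FITNESS_A, fitness_b(rng)
        total = n_a * w_a + n_b * w_b
        # Wright-Fisher style resampling proportional to fitness.
        p_a = n_a * w_a / total
        n_a = sum(1 for _ in range(POP_SIZE) if rng.random() < p_a)
    return n_a / POP_SIZE

# Because type B's fitness is only constrained to an interval, long-run outcomes
# are summarized as a range over runs rather than a single frequency.
freqs = [run(seed) for seed in range(20)]
print("final frequency of type A across runs: "
      f"min={min(freqs):.2f}, max={max(freqs):.2f}")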

4:00-5:15PM

Keynote Title: A Third Way?
Persi Diaconis (Stanford)

Abstract: The Bayesian/frequentist debate has danced along for hundreds of years with little progress. In a little-known paper, von Mises suggested a different approach. Probability is part of mathematics. The connection between the mathematics and the real world should be treated in the same way as the connection between geometry and the real world or between mechanics and the real world. After all, no one measures straight lines and there are no real triangles. No one can measure actual velocities, just approximations using difference quotients. Yet trigonometric computations and so on work pretty well in broad generality. I think that this is the way most working statisticians think and find it interesting that this position has not been developed. I'll do the best I can.

Graduate Student Discussant: Alex Meehan (Princeton)

We gratefully acknowledge support from the UW Department of Philosophy, the Saari Endowment, the UW Department of Mathematics, and the Simpson Center for the Humanities.
