This book explores the probabilistic approach to cognitive science, which models learning and reasoning as inference in complex probabilistic models.
We examine how a broad range of empirical phenomena, including intuitive physics, concept learning, causal reasoning, social cognition, and language understanding, can be modeled using probabilistic programs (using the WebPPL language).
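To give a flavor of the approach: a probabilistic program pairs a generative model with a conditioning statement, and inference answers questions about the model given observations. The book's examples use WebPPL, which supplies this machinery directly; the sketch below approximates the same idea in plain JavaScript via rejection sampling (`flip`, `model`, and `rejectionQuery` are illustrative stand-ins, not WebPPL's API).

```javascript
// Sample from a Bernoulli distribution (a "coin flip").
function flip(p = 0.5) {
  return Math.random() < p;
}

// Generative model: three independent coin flips.
function model() {
  const a = flip(), b = flip(), c = flip();
  return { a, count: (a ? 1 : 0) + (b ? 1 : 0) + (c ? 1 : 0) };
}

// Condition on an observation (at least two heads) by rejecting
// samples that violate it, then estimate P(a | count >= 2).
function rejectionQuery(numSamples) {
  let kept = 0, aTrue = 0;
  while (kept < numSamples) {
    const s = model();
    if (s.count >= 2) {        // the conditioning statement
      kept++;
      if (s.a) aTrue++;
    }
  }
  return aTrue / kept;
}

console.log(rejectionQuery(20000)); // ≈ 0.75: conditioning shifts belief about `a`
```

Observing "at least two heads" raises the probability that the first coin came up heads from 0.5 to 0.75; this pattern of asking questions of generative models recurs throughout the book.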

This book is an open source project. We welcome content contributions (via GitHub)!

The ProbMods Contributors are:

Noah D. Goodman (editor)

Joshua B. Tenenbaum

Daphna Buchsbaum

Joshua Hartshorne

Robert Hawkins

Timothy J. O'Donnell

Michael Henry Tessler

`https://probmods.org/`

[bibtex]

```
@misc{probmods2,
  title = {{Probabilistic Models of Cognition}},
  edition = {Second},
  author = {Goodman, Noah D. and Tenenbaum, Joshua B. and The ProbMods Contributors},
  year = {2016},
  howpublished = {\url{http://probmods.org/v2}},
  note = {Accessed: }
}
```

We are grateful for crucial technical assistance from: Andreas Stuhlmüller, Tomer Ullman, John McCoy, Long Ouyang, Julius Cheng.

The construction and ongoing support of this tutorial are made possible by grants from the Office of Naval Research, the James S. McDonnell Foundation, the Stanford VPOL, and the Center for Brains, Minds, and Machines (funded by NSF STC award CCF-1231216).

- Introduction
  *A brief introduction to the philosophy.*
- Generative models
  *Representing working models with probabilistic programs.*
- Conditioning
  *Asking questions of models by conditional inference.*
- Causal and statistical dependence
  *Causal and statistical dependence.*
- Conditional dependence
  *Patterns of inference as evidence changes.*
- Bayesian data analysis
  *Making scientific inferences from data.*
- Algorithms for inference
  *The many ways to approximate inference. Efficiency tradeoffs of different algorithms.*
- Rational process models
  *The psychological reality of inference algorithms.*
- Learning as conditional inference
  *How inferences change as data accumulate.*
- Learning with a language of thought
  *Compositional hypothesis spaces.*
- Learning (deep) continuous functions
  *Functional hypothesis spaces and deep probabilistic models.*
- Hierarchical models
  *The power of statistical abstraction.*
- Occam's Razor
  *How inference penalizes extra model flexibility.*
- Mixture models
  *Models for inferring kinds of things.*
- Social cognition
  *Inference about inference.*
- Appendix - JavaScript basics
  *A very brief primer on JavaScript.*
- Appendix - Useful distributions
  *A very brief summary of some important distributions.*