Simplicity, Complexity, Unexpectedness, Cognition, Probability, Information

by Jean-Louis Dessalles
(created 31 December 2008, updated April 2016)

Logic and generation complexity

*Consequences are no more complex to generate than their causes.*

Generation complexity captures the causal part that people attribute to probability.
Generation complexity is much more natural to people than probability (Saillenfest & Dessalles, 2015).
When people face a series of choice points, they estimate the complexity of reaching a specific outcome,
without being able to translate their estimate into a probability value. Conversely, many studies
have shown that people are bad at taking probabilistic values as input in their reasoning and decisions
(see Saillenfest & Dessalles, 2015).
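The choice-point idea can be made concrete with a small sketch. The toy model below is an illustration of the general idea, not the paper's formalism: outcomes are reached through binary choice points, and what people estimate is the number of choices needed, not the resulting probability.

```python
# Toy illustration (hypothetical, not from the cited papers): outcomes
# are reached through binary choice points, and the natural quantity
# is the number of choices needed, not the resulting probability.

from itertools import product

def generation_complexity(outcome, max_depth=10):
    """Length of the shortest sequence of binary choices whose result
    satisfies `outcome` (a predicate on tuples of 0/1 choices)."""
    for n in range(max_depth + 1):
        if any(outcome(path) for path in product((0, 1), repeat=n)):
            return n
    return float("inf")

# "The coin landed heads three times in a row": three choice points.
three_heads = lambda path: path[:3] == (1, 1, 1)
print(generation_complexity(three_heads))  # 3
```

The corresponding probability would be 2⁻³ = 0.125, but on this view people reason directly with the three choice points rather than with that number.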

The most spectacular situation in which generation complexity appears more natural than probability can
be seen when we want to define independence. This concept is traditionally defined using probability
(*P*(*A* ∧ *B*) = *P*(*A*) × *P*(*B*)). This may make sense when probabilities come from statistics, but not
when we deal with unique events or when we seek explanations behind statistical dependence.
In both cases, causal reasoning is needed. This is what generation complexity offers.

Independence

*s*_{1} and *s*_{2} are independent iff

*C*_{w}(*s*_{1} & *s*_{2}) = *C*_{w}(*s*_{1}) + *C*_{w}(*s*_{2})

This definition of independence is not only closer to common sense, but can also be used to predict phenomena
that cannot be derived from probability theory. A good illustration of this is offered by coincidences.
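The additivity criterion can be sketched numerically. Generation complexity itself is not computable, so the sketch below substitutes compressed length as a crude stand-in (an assumption for illustration, not the method of the cited papers): independent situations give a roughly additive joint complexity, while a coincidence makes the joint complexity fall far below the sum.

```python
# Crude sketch: zlib-compressed length as a stand-in for complexity
# (an assumption for illustration; C_w itself is not computable).

import zlib

def C(s: bytes) -> int:
    return len(zlib.compress(s, 9))

s1 = b"the lights went out in the whole street at 9 pm " * 4
s2 = b"my neighbour bought a new red bicycle yesterday " * 4

# Unrelated situations: joint complexity is roughly additive.
print(C(s1 + s2), C(s1) + C(s2))

# A coincidence (the second situation duplicates the first): joint
# complexity falls far below the sum, revealing the dependence.
print(C(s1 + s1), C(s1) + C(s1))
```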

Generation complexity and Boolean relations

The following relations are natural consequences of the definition of generation complexity.
*C*_{w}(*a* ∨ *b*) = min(*C*_{w}(*a*), *C*_{w}(*b*))

*C*_{w}(*a* ∧ *b*) ≤ *C*_{w}(*a*) + *C*_{w}(*b*)

If (*a* ⊃ *b*), then *C*_{w}(*a*) ≥ *C*_{w}(*b*)
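These three relations can be checked in a miniature causal world. The sketch below is a hypothetical illustration, not the paper's formalism: a handful of binary causes, all off by default, where generating a situation costs one unit per cause that must be switched on.

```python
# Toy causal world (an illustration, not the cited papers' formalism):
# N binary causes, off by default; C_w(e) is the fewest activated
# causes among worlds where the event e holds.

from itertools import product

N = 4
WORLDS = list(product((0, 1), repeat=N))

def C(event):
    return min(sum(w) for w in WORLDS if event(w))

a = lambda w: w[0] == 1                  # needs one cause
b = lambda w: w[1] == 1 and w[2] == 1    # needs two causes

assert C(lambda w: a(w) or b(w)) == min(C(a), C(b))   # disjunction rule
assert C(lambda w: a(w) and b(w)) <= C(a) + C(b)      # conjunction rule
# (a and b) ⊃ b, and the antecedent is at least as complex:
assert C(lambda w: a(w) and b(w)) >= C(b)
print(C(a), C(b), C(lambda w: a(w) and b(w)))         # 1 2 3
```

Here *a* and *b* draw on disjoint causes, so the conjunction relation holds with equality, matching the independence definition above.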

Generation complexity and negation

Generation complexity is often obtained through a causal scenario *H* (see the example of
the rabid bat or the example of the running nuns).

If *H*(*s*) is the best (i.e. least demanding in parameters) available story that leads to *s*, then:
*C*_{w}(*s*) = *C*(*H*(*s*))
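In the same spirit as the toy causal world above (a hypothetical sketch, not the author's formalism), the best story *H*(*s*) is simply the cheapest satisfying scenario, and *C*_{w}(*s*) = *C*(*H*(*s*)) falls out of taking the argmin rather than the min.

```python
# Hypothetical sketch: in a toy world of binary causes, H(s) is the
# least demanding scenario producing s, and C_w(s) = C(H(s)).

from itertools import product

WORLDS = list(product((0, 1), repeat=4))

def H(event):
    """Least demanding scenario: a satisfying world with fewest causes."""
    return min((w for w in WORLDS if event(w)), key=sum)

def C(event):
    return sum(H(event))              # C_w(s) = C(H(s))

s = lambda w: w[1] == 1 or (w[2] == 1 and w[3] == 1)
print(H(s), C(s))                     # (0, 1, 0, 0) 1
```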

The *mutability* of *s* is usually defined as the propensity to consider that ¬*s* is true. Using generation complexity, we can define mutability as:
*M*(*s*) = −*C*(*H*(¬*s*)) = −*C*_{w}(¬*s*)

(note that mutability is always negative).

When *H*(¬*s*) is a complex scenario, ¬*s* is complex to generate and the fact *s* is not mutable (*M*(*s*) << −1). From the implication relation above, since (*a* ⊃ *b*) entails (¬*b* ⊃ ¬*a*), we can derive:
If (*a* ⊃ *b*), then *M*(*a*) ≥ *M*(*b*)
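Mutability can be computed in the same toy causal world (again a hypothetical sketch, not the cited papers' method): undoing a fact that rests on a single cause is cheap, so that fact is more mutable than one whose negation requires several causes.

```python
# Hypothetical sketch: mutability M(s) = -C_w(not s) in a toy world of
# binary causes, where C_w(e) counts the fewest causes to activate
# for the event e to hold.

from itertools import product

WORLDS = list(product((0, 1), repeat=4))

def C(event):
    return min(sum(w) for w in WORLDS if event(w))

def M(event):
    """Mutability: minus the cost of generating a world where the
    event fails (more mutable = closer to zero)."""
    return -C(lambda w: not event(w))

a = lambda w: w[0] == 0                  # a implies b below
b = lambda w: w[0] == 0 or w[1] == 0

# Undoing a takes one cause; undoing b takes two, so b is less mutable
# and, as the relation above states, M(a) >= M(b).
assert M(a) > M(b)
print(M(a), M(b))                        # -1 -2
```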

Bibliography

Dessalles, J.-L. (2013). Algorithmic simplicity and relevance. In D. L. Dowe (Ed.), *Algorithmic probability and friends - LNAI 7070*, 119-130. Berlin, D: Springer Verlag.
Saillenfest, A. & Dessalles, J.-L. (2015). Some probability judgments may rely on complexity assessments. *Proceedings of the 37th Annual Conference of the Cognitive Science Society*, to appear. Austin, TX: Cognitive Science Society.

Back to the Simplicity Theory page.