Definition:Objective Unknown
Definition
An objective unknown is an event which has a well-defined objective probability of occurring.
In decision theory, two objective unknowns with the same probability are treated as equivalent.
An objective unknown is appropriately modelled by means of a probability model.
Also known as
Some sources refer to an objective unknown as a risk.
A gamble on an objective unknown is sometimes referred to as a roulette lottery.
Examples
Examples of objective unknowns include:
- The fall of a coin in a game of coin-tossing
- The number selected by the spin of a roulette wheel (hence the term roulette lottery)
- The blind selection of a ball from an urn.
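Each of the examples above can be captured by a probability model: a finite set of outcomes, each assigned a well-defined objective probability. As a minimal sketch (the function names and outcome labels below are illustrative, not taken from the source), each objective unknown can be simulated as a uniform draw from its outcome set:

```python
import random

def coin_toss():
    """Fall of a coin: two outcomes, each with probability 1/2."""
    return random.choice(["heads", "tails"])

def roulette_spin():
    """Spin of a European roulette wheel: 37 pockets (0 to 36),
    each with probability 1/37 (hence the term roulette lottery)."""
    return random.randrange(37)

def urn_draw(balls):
    """Blind selection of a ball from an urn: each ball equally likely."""
    return random.choice(balls)
```

Under this model, two objective unknowns with the same probability (say, drawing a red ball from an urn of one red and one blue ball, and a coin landing heads, both with probability $1/2$) are treated as equivalent gambles.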
Sources
- 1991: Roger B. Myerson: Game Theory ...: $1.2$ Basic Concepts of Decision Theory