
# ACMMM137

## 1.2 Introduction to Probabilities

#### Probabilities

• Probability is a numerical measure of the chance of a particular event occurring.
• By definition, the probabilities of all possible outcomes sum to 1, and each individual probability is non-negative. Consequently, the probability of each outcome is a real number in the interval [0,\ 1].
• The probability of an event is equal to the sum of the probabilities of the outcomes in that event.
• Mathematically, we write the probability of an event as \operatorname{Pr}(X=x) or \operatorname{Pr}(x), where X is the quantity being observed and x is a particular possible outcome.
• Usually, we assume each outcome is equally likely to happen, i.e. has the same probability, unless there are conditions/evidence that say otherwise. Thus, if there are n equally likely outcomes, the probability of each outcome is \frac{1}{n}.
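The rules above can be sketched in a few lines of Python. This is an illustrative example only: the fair six-sided die and the `pr` helper are assumptions introduced here, not part of the notes.

```python
from fractions import Fraction

# Sample space for one roll of a fair six-sided die (an illustrative
# assumption); each outcome is equally likely.
sample_space = {1, 2, 3, 4, 5, 6}

# An event is a subset of the sample space, e.g. "roll an even number".
even = {2, 4, 6}

# With n equally likely outcomes, each outcome has probability 1/n, so
# Pr(event) — the sum of the probabilities of its outcomes — reduces to
# (number of outcomes in the event) / n.
def pr(event, space):
    return Fraction(len(event), len(space))

print(pr(even, sample_space))  # 1/2

# The probabilities of all outcomes sum to 1, as required.
print(sum(pr({x}, sample_space) for x in sample_space))  # 1
```

Using `Fraction` keeps the probabilities exact, which matches how they are usually written in course work.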

## 1.1 Sample Spaces, Events, Tree Diagrams

#### Sample Spaces and Events

• A random experiment is an experiment where all the possible outcomes are known, but we do not know which one will be observed.
• We can list all possible outcomes as a set, with each outcome an element of the set. This set is known as the sample space, denoted here by the Greek letter \varepsilon (pronounced ‘epsilon’).
• An event is a subset of the sample space, usually denoted by a capital letter. It typically represents a smaller set of possible outcomes that share a common characteristic.
• Visual representations are helpful, so we use a Venn diagram (see the example below). Each set is represented by a shape: typically the sample space is drawn as a rectangle, and each event as a circle inside it.
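The ideas above map directly onto Python's built-in sets, which can stand in for the regions of a Venn diagram. The die example and event names A and B are assumptions made here for illustration.

```python
# Sample space for one roll of a fair six-sided die (illustrative choice).
sample_space = {1, 2, 3, 4, 5, 6}

# Events are subsets of the sample space.
A = {2, 4, 6}   # "even number"
B = {4, 5, 6}   # "greater than 3"

# Regions of a Venn diagram correspond to set operations:
print(A & B)              # intersection: outcomes in both A and B
print(A | B)              # union: outcomes in A or B (or both)
print(sample_space - A)   # complement of A within the sample space

# Every event is a subset of the sample space.
print(A <= sample_space)  # True
```

Checking membership and subset relations this way mirrors reading regions off a Venn diagram by eye.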