Introduction
Probability is a measure of the likelihood that an event will occur in a random experiment. It is quantified by a number between 0 and 1, where 0 indicates impossibility and 1 indicates certainty. For example, if a fair coin is tossed, the probability of getting "head" is \(\frac{1}{2}\), which represents a 50% chance of getting a head.
Probability can be defined in three different ways (approaches):
- Classical approach
- Relative frequency approach
- Axiomatic approach
Classical approach
The classical definition of probability applies to equally likely outcomes. If an experiment can produce \( n\) mutually exclusive and equally likely outcomes, out of which \( k\) outcomes are favorable to the occurrence of an event \( A\), then the probability of \( A\) is denoted by \( P(A)\) and is defined by
\( P(A)=\frac{ \text{Number of outcomes favourable to A}}{\text{Number of possible outcomes}}=\frac{k}{n}\)
For example, if a bag contains 3 red and 5 white marbles, then the probability of drawing a red marble is
\( P(Red)=\frac{ \text{Number of outcomes favourable to Red marbles}}{\text{Number of marbles}}=\frac{3}{8}\)
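The marble calculation above can be sketched in a few lines of Python, counting favorable outcomes over total outcomes (the bag contents are taken directly from the example):

```python
from fractions import Fraction

# Bag from the example: 3 red and 5 white marbles, all equally likely to be drawn
bag = ["red"] * 3 + ["white"] * 5

favorable = bag.count("red")   # number of outcomes favorable to Red
total = len(bag)               # total number of possible outcomes

p_red = Fraction(favorable, total)
print(p_red)  # 3/8
```

Using `Fraction` keeps the answer exact, matching the \(\frac{3}{8}\) in the text.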
The classical definition of probability has been criticized for the following reasons:
- It assumes that the outcomes are equally likely.
- It is not applicable when the outcomes are not equally likely.
- It is also not applicable when the total number of outcomes is infinite or difficult to count. For example, it is difficult to count the fish in the ocean.
Example 1
A fair die is thrown. Find the probabilities that the face on the die is (a) maximum, (b) prime, (c) a multiple of 3, (d) a multiple of 7.
Solution
There are 6 possible outcomes when a die is thrown. We assume that all 6 faces are equally likely, so the classical definition of probability applies here.
The sample space is \( S=\{1,2,3,4,5,6\}\) and \( n(S)=6\)
- (a) Let \( A\) be the event that the face is maximum, thus
\( A=\{6\}, n(A)=1\)
Therefore, \( P(A) =\frac{\text{Number of outcomes favorable to A}}{\text{Number of possible outcomes in S}}=\frac{n(A)}{n(S)}=\frac{1}{6}\)
- (b) Let \( B\) be the event that the face is prime, thus
\( B=\{2,3,5\}, n(B)=3\)
Therefore, \( P(B) =\frac{\text{Number of outcomes favorable to B}}{\text{Number of possible outcomes in S}}=\frac{n(B)}{n(S)}=\frac{3}{6}\)
- (c) Let \( C\) be the event that the face is a multiple of 3, thus
\( C=\{3,6\}, n(C)=2 \)
Therefore, \( P(C) =\frac{\text{Number of outcomes favorable to C}}{\text{Number of possible outcomes in S}}=\frac{n(C)}{n(S)}=\frac{2}{6}\)
- (d) Let \( D\) be the event that the face is a multiple of 7, thus
\( D=\phi , n(D)=0 \)
Therefore, \( P(D) =\frac{\text{Number of outcomes favorable to D}}{\text{Number of possible outcomes in S}}=\frac{n(D)}{n(S)}=0\)
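All four parts of Example 1 can be checked with a short Python sketch that encodes the sample space as a set and computes \(P(A)=n(A)/n(S)\) (note that `Fraction` reduces \(\frac{3}{6}\) to \(\frac{1}{2}\) and \(\frac{2}{6}\) to \(\frac{1}{3}\)):

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}  # sample space for a fair die

def classical_prob(event):
    """P(A) = n(A) / n(S) under the classical definition (equally likely outcomes)."""
    return Fraction(len(event & S), len(S))

A = {6}                           # (a) maximum face
B = {2, 3, 5}                     # (b) prime faces
C = {x for x in S if x % 3 == 0}  # (c) multiples of 3 -> {3, 6}
D = {x for x in S if x % 7 == 0}  # (d) multiples of 7 -> empty set

print(classical_prob(A))  # 1/6
print(classical_prob(B))  # 1/2
print(classical_prob(C))  # 1/3
print(classical_prob(D))  # 0
```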
Relative frequency approach
Let us take an example. When a meteorologist states that the chance of rain is 90%, the meteorologist is speaking in terms of relative frequency. The relative frequency is the ratio of the observed frequency of an outcome to the total number of trials of a random experiment. Suppose a random experiment is repeated \(n\) times and outcome A is observed \(k\) times; then \(\frac{k}{n}\) is the relative frequency of outcome A. In this approach, probability is defined as the limiting value of this ratio:
\(P(A)= \displaystyle \lim_{n \to \infty } \frac{k}{n}\)
For example, what is the probability of rain today? What is the probability of selecting a motorbike of the YAMAHA brand in the KTM valley? Both of these answers are sample dependent: we need observed data to estimate them. Therefore, they can be answered using the relative frequency approach to probability.
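The limiting behavior of \(\frac{k}{n}\) can be illustrated with a coin-toss simulation (a sketch with an arbitrarily chosen seed; the exact estimates will vary, but they settle near \(\frac{1}{2}\) as \(n\) grows):

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

# Estimate P(head) for a fair coin by the relative frequency k/n
# for increasingly large numbers of trials n.
for n in (100, 10_000, 1_000_000):
    k = sum(random.random() < 0.5 for _ in range(n))  # count of heads in n tosses
    print(f"n = {n:>9}: k/n = {k / n:.4f}")
```

As \(n\) increases, the printed ratio clusters ever more tightly around 0.5, which is exactly what the limit definition above expresses.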
Axiomatic approach
The axiomatic approach defines probability through a set of axioms, rather than through a particular counting or frequency rule. This approach is based on three axioms. According to this approach, probability is defined as follows.
Let S be a sample space; then probability is a function that assigns a real number to every event E of S. The probability of an event E is P(E), and it satisfies the following three axioms:
- Non negativity: \(0 \le P(E)\le 1\) for any event E
- Additivity: \(P(E_1 \cup E_2 \cup \dots) = P(E_1)+P(E_2) + \dots \) for mutually exclusive events
- Certainty: \(P(S) = 1\) for the sample space S.
Key terms of Probability
- Sample Space and Event
The set of all possible outcomes of a random experiment is called the sample space. It is denoted by S. A subset E of the sample space S is called an event.
For example, if a die is rolled, then the sample space is
\(S = \{1,2,3,4,5,6\}\)
And, if we are interested in even number, then
\(E = \{2,4,6\}\) is an event.
- Mutually exclusive events
In probability, two events A and B are called mutually exclusive if they cannot occur together. For example,
- on a die, 1 and 2 are mutually exclusive events because either 1 will come up or 2 will come up, but not both at a time.
- on a coin, H and T are mutually exclusive events because either H will come up or T will come up, but not both at a time.
- on a die, even and prime are not mutually exclusive events because, if 2 is the outcome, it is even as well as prime; thus even AND prime can occur together.
- Independent events
In probability, two events A and B are called independent if the occurrence of A has no effect on the occurrence of B. For example, getting 1 on a die and getting H on a coin are independent events because the occurrence of either has no effect on the other.
NOTE: Independence applies to a series (more than one) of experiments.
- If a bag contains 3 red and 5 white marbles, and two marbles are selected one after another with replacement, then the events in the two draws are independent.
- If a bag contains 3 red and 5 white marbles, and two marbles are selected one after another without replacement, then the events in the two draws are not independent.
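The with/without-replacement contrast can be made concrete by computing the probability that the second draw is red, given that the first draw was red (a sketch using the same 3-red, 5-white bag):

```python
from fractions import Fraction

red, white = 3, 5
total = red + white  # 8 marbles in the bag

# With replacement: the bag is restored before the second draw,
# so the second draw does not depend on the first (independent).
p_second_red_with = Fraction(red, total)  # 3/8, whatever the first draw was

# Without replacement: one red marble is gone, so the second draw
# changes depending on the first (not independent).
p_second_red_given_first_red_without = Fraction(red - 1, total - 1)  # 2/7

print(p_second_red_with)                     # 3/8
print(p_second_red_given_first_red_without)  # 2/7
```

Because \(\frac{2}{7} \ne \frac{3}{8}\), the first draw clearly influences the second when marbles are not replaced.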
Random variable
A headmaster bought a dozen computer monitors, of which, unknown to him, three are broken. He checks four of the monitors at random to see how many are broken in his sample of 4. The answer, which he labels X, is one of the numbers 0, 1, 2, 3, 4. Since the value of X is random, X is called a random variable. The mathematical definition is as follows.
A random variable is a function that assigns a numerical value to each outcome in the sample space of a random experiment. A random variable is also called a chance variable; it is abbreviated \(r.v.\) and denoted by capital letters \(X, Y, \dots\) of the English alphabet. For example, if we toss a coin, the sample space is \( S = \{H, T\} \). Then we can define X = number of heads as a random variable. In this case, X = 1 represents the occurrence of 1 head. The table for the distribution of the values of X is given below.
| S (Sample space) | T | H |
| X = x (Number of heads) | 0 (no head) | 1 (1 head) |
The value of a random variable is denoted by small letters of the English alphabet, as in \(X = x\). A random variable can take several forms: univariate, bivariate, or multivariate.
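The definition "a random variable is a function on the sample space" can be written down literally. The sketch below encodes X = number of heads for one coin toss as a mapping from outcomes to numbers:

```python
# X maps each outcome of the coin-toss sample space S = {T, H}
# to the number of heads observed for that outcome.
S = ["T", "H"]
X = {outcome: (1 if outcome == "H" else 0) for outcome in S}

for outcome in S:
    print(outcome, "->", X[outcome])  # T -> 0, H -> 1
```

This mirrors the table above: the outcome T is sent to 0 and the outcome H to 1.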
Types of random variable (Discrete and Continuous)
Random variables are of two types: discrete and continuous. A random variable \( X\) is called discrete if it can assume a finite (or countably infinite) number of values, and the values it takes are separated by gaps. Some examples of discrete \( r.v.\)s are the number of students in a class, the number of books in a student's bag, and the number of family members of a student. If \( X =\) number of books in a student's bag, then \( X\) can take 0, 1, 2, and so on as values. But \( X\) cannot take 0.1 or 1.5 as a value. Here, \( X\) can take only specific values, so in this case \( X\) is a discrete random variable.
A random variable \( X\) is called continuous if it can take all values in a possible range, with no gaps between its values. For example, the weight of a student could be any real number between certain limits. Suppose \( X\) = weight of students in a certain class, which lies between 40 kg and 70 kg. Then \( X\) can take any value in the range 40 to 70. The weight of a student may be 45 kg or 45.2 kg, or it may take any value between 45 and 45.2. Here, the value of \( X\) can be any real number between 40 and 70, so \( X\) is a continuous random variable.
Probability distribution
A probability distribution is a table or a function that gives a probability for each possible value of a random variable. With a probability distribution, one can model the behavior of the random variable. In this sense, a probability distribution is called a function of a random variable.
Discrete probability distribution
Let \( X \) be a discrete random variable. Then the probability distribution of \( X \) is denoted by \( f(x) \) and defined by \(f ( x )=P ( X=x )\), satisfying (a) \(f ( x )\ge 0\) and (b) \( \sum f ( x )=1 \).
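Conditions (a) and (b) can be checked mechanically for any candidate distribution. The sketch below uses an assumed illustrative example, X = number of heads in two fair coin tosses (outcomes HH, HT, TH, TT equally likely), which is not uniform:

```python
from fractions import Fraction

# f(x) = P(X = x) for X = number of heads in two fair coin tosses:
# X = 0 from TT, X = 1 from HT or TH, X = 2 from HH.
f = {0: Fraction(1, 4), 1: Fraction(2, 4), 2: Fraction(1, 4)}

assert all(p >= 0 for p in f.values())  # (a) f(x) >= 0 for every x
assert sum(f.values()) == 1             # (b) the f(x) values sum to 1

print(f[1])  # P(X = 1) = 1/2
```

Both conditions hold, so f(x) is a valid discrete probability distribution.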