Module - 3 Probability Theory

Lesson - 2 Random Variables and Probability Distributions

Random variables and probability distributions are foundational concepts in statistics and probability theory. A probability distribution, which may be discrete or continuous, describes the possible outcomes of a random variable together with their corresponding probabilities. Expectation, variance, and the central limit theorem are key ideas built on these foundations, and probability distributions find applications in fields such as finance, engineering, physics, and medicine. Understanding random variables and probability distributions is essential for anyone interested in data analysis.

Random variables are a fundamental concept in probability theory and statistics. A random variable is a variable whose possible values are numerical outcomes of a random phenomenon. They are used to model uncertainty and randomness in a wide range of real-world scenarios, from the roll of a die to the behavior of stock prices.

Formally, a random variable is a variable whose value is determined by chance. It is usually denoted by a capital letter, such as X, and can take on a range of possible values. The set of all possible outcomes of the underlying random phenomenon is called the sample space, denoted by S; the random variable assigns a numerical value to each outcome in S.

Random variables can be **discrete or continuous**. A discrete random variable takes on a finite or countably infinite set of values, such as the number of heads obtained in a series of coin flips. A continuous random variable takes on an uncountably infinite set of values, such as the height of a randomly chosen person in a population.

The probability distribution of a random variable describes how probability is assigned across its possible values. It is denoted by P(X).

Notation and basic properties:

- P(X=x) is the probability that the random variable X takes on the value x.
- The sum of all probabilities is equal to 1: ∑P(X=x) = 1.
- The expected value of a random variable X is denoted by E(X).
- The variance of a random variable X is denoted by Var(X).

A discrete random variable is one that takes on a finite or countably infinite number of values. Examples include the number of heads in a coin toss or the number of defective items in a production line.

- The probability mass function is the function that describes the probabilities of each possible value of a discrete random variable.
- It is denoted by P(X=x).
- Properties of a PMF:
  - 0 ≤ P(X=x) ≤ 1 for all x
  - ∑P(X=x) = 1

- Example:
  - Let X be the number of heads in two coin tosses. The PMF of X is:
    - P(X=0) = 1/4
    - P(X=1) = 1/2
    - P(X=2) = 1/4
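This PMF can be checked by brute force: enumerate the four equally likely outcomes of two fair coin tosses and count heads. A minimal sketch in Python (the variable names are illustrative, not from the lesson):

```python
from itertools import product
from collections import Counter
from fractions import Fraction

# Enumerate all 4 equally likely outcomes of two fair coin tosses
outcomes = list(product("HT", repeat=2))
counts = Counter(o.count("H") for o in outcomes)

# PMF: P(X = x) = (number of outcomes with x heads) / (total outcomes)
pmf = {x: Fraction(c, len(outcomes)) for x, c in counts.items()}
# pmf maps 0 -> 1/4, 1 -> 1/2, 2 -> 1/4, matching the example above
```

Using `Fraction` keeps the probabilities exact, so the values 1/4, 1/2, 1/4 come out without floating-point rounding.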

A continuous random variable is one that takes on an uncountably infinite number of possible values. Examples include height, weight, and temperature.

Probability density function (PDF):

- The probability density function describes the relative likelihood of values of a continuous random variable; the probability that X falls in an interval is the integral of the PDF over that interval.
- It is denoted by f(x).
- Properties of a PDF:
  - f(x) ≥ 0 for all x
  - ∫f(x)dx = 1

- Example:
  - Let X be a random variable with a uniform distribution between 0 and 1. The PDF of X is:
    - f(x) = 1 for 0 ≤ x ≤ 1
    - f(x) = 0 otherwise
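Both PDF properties can be verified numerically for this uniform example. A small sketch, using a midpoint Riemann sum to approximate the integral (the function name `f` and the grid size are illustrative choices):

```python
def f(x):
    # PDF of the uniform distribution on [0, 1] from the example above
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

# f(x) >= 0 everywhere, and the integral over [0, 1] should be 1.
# Approximate the integral with a midpoint Riemann sum on n subintervals.
n = 10_000
dx = 1.0 / n
integral = sum(f((i + 0.5) * dx) * dx for i in range(n))
# integral is approximately 1.0
```

The same sum restricted to a subinterval, e.g. [0, 0.3], approximates P(0 ≤ X ≤ 0.3) = 0.3, which is how probabilities are read off a PDF.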

The expected value of a random variable X is the mean of its probability distribution. It is denoted by E(X) and is defined as:

```
E(X) = ∑xP(X=x) for discrete random variables
E(X) = ∫xf(x)dx for continuous random variables
```
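Both formulas can be evaluated for the running examples: the exact sum for the two-coin-toss PMF, and a midpoint Riemann sum for the Uniform(0, 1) density. A minimal sketch (the PMF values are taken from the coin-toss example earlier):

```python
from fractions import Fraction

# Discrete case: E(X) = sum of x * P(X=x), two-coin-toss PMF
pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}
e_discrete = sum(x * p for x, p in pmf.items())
# e_discrete == 1: on average, one head in two tosses

# Continuous case: E(X) = integral of x * f(x) dx, with f(x) = 1 on [0, 1]
n = 10_000
dx = 1.0 / n
e_continuous = sum(((i + 0.5) * dx) * 1.0 * dx for i in range(n))
# e_continuous is approximately 0.5, the midpoint of [0, 1]
```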

The variance of a random variable X measures how spread out its probability distribution is. It is denoted by Var(X) and is defined as:

```
Var(X) = E((X-E(X))^2) = E(X^2) - [E(X)]^2
```
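The two forms of the variance formula agree, which is easy to confirm on the coin-toss PMF. A quick sketch using exact fractions:

```python
from fractions import Fraction

# Two-coin-toss PMF from the earlier example
pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

e_x = sum(x * p for x, p in pmf.items())        # E(X)   = 1
e_x2 = sum(x**2 * p for x, p in pmf.items())    # E(X^2) = 3/2

# Definition: Var(X) = E((X - E(X))^2)
var_def = sum((x - e_x) ** 2 * p for x, p in pmf.items())

# Shortcut:   Var(X) = E(X^2) - [E(X)]^2
var_short = e_x2 - e_x**2
# Both give 1/2
```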

A joint probability distribution is the probability distribution of two or more random variables. It describes the probabilities of different outcomes of each random variable together.

Joint probability mass function (joint PMF):

- The joint probability mass function is the function that describes the probabilities of all possible combinations of values of two or more discrete random variables.
- It is denoted by P(X=x, Y=y).

Joint probability density function (joint PDF):

- The joint probability density function is the function that describes the probabilities of different intervals of values of two or more continuous random variables.
- It is denoted by f(x,y).
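As a hypothetical illustration of a joint PMF (this example is not in the lesson): let X and Y record two independent fair coin tosses, with 1 for heads and 0 for tails. Each of the four (x, y) pairs has probability 1/4, the joint probabilities sum to 1, and summing over one variable recovers the marginal PMF of the other:

```python
from itertools import product
from fractions import Fraction

# Joint PMF P(X=x, Y=y) for two independent fair coins (1 = heads)
joint_pmf = {(x, y): Fraction(1, 4) for x, y in product((0, 1), repeat=2)}

# Joint probabilities sum to 1
total = sum(joint_pmf.values())

# Marginal: P(X=1) = sum over y of P(X=1, Y=y)
p_x1 = sum(p for (x, y), p in joint_pmf.items() if x == 1)
# total == 1 and p_x1 == 1/2, as expected for a fair coin
```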

The central limit theorem (CLT) is a fundamental theorem in statistics that states that the sum or average of a large number of independent and identically distributed random variables will tend towards a normal distribution, regardless of the underlying distribution of the individual random variables.

The CLT has important applications in fields such as finance, engineering, physics, and more. For example, it can be used to model the distribution of stock prices or to estimate the probability of equipment failure in a manufacturing process.
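The CLT is easy to see in simulation: averages of many Uniform(0, 1) draws cluster around the mean 0.5 with standard deviation close to √(1/(12n)), even though each individual draw is far from normal. A minimal sketch (the sample sizes are arbitrary choices):

```python
import random
import statistics

random.seed(0)  # fixed seed so the run is reproducible

# Each trial averages n uniform(0, 1) draws; by the CLT these averages
# are approximately normal with mean 0.5 and st. dev. sqrt(1 / (12 * n))
n, trials = 50, 2000
averages = [statistics.fmean(random.random() for _ in range(n))
            for _ in range(trials)]

mean_of_averages = statistics.fmean(averages)   # close to 0.5
stdev_of_averages = statistics.stdev(averages)  # close to sqrt(1/600) ~ 0.041
```

Increasing `n` tightens the spread of the averages, which is exactly the 1/√n scaling the CLT predicts.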

Probability distributions have a wide range of applications in different fields. Here are a few examples:

- **Finance**: Probability distributions are used to model the behavior of stock prices, interest rates, and other financial variables.
- **Engineering**: Probability distributions are used to model the reliability of components in a system, the lifetime of products, and more.
- **Physics**: Probability distributions are used to model the behavior of subatomic particles, the distribution of energy levels in a system, and more.
- **Medicine**: Probability distributions are used to model the occurrence of diseases and to estimate the effectiveness of treatments.

Random variables and probability distributions are essential concepts in statistics and probability theory. Discrete and continuous probability distributions have their own properties and formulas, and joint probability distributions provide a way to model the behavior of multiple random variables. Expectation, variance, and the central limit theorem are important concepts in probability theory, and probability distributions have applications in various fields.

- A random variable is a variable that takes on different values based on chance, and a probability distribution is the set of all possible outcomes of a random variable together with their corresponding probabilities.
- Discrete random variables take on a finite or countably infinite number of values, whereas continuous random variables take on an uncountably infinite number of values.
- The probability mass function (PMF) describes the probabilities of each possible value of a discrete random variable, whereas the probability density function (PDF) describes the probabilities of intervals of values of a continuous random variable.
- The expected value of a random variable is the mean of its probability distribution, and the variance measures how spread out its probability distribution is.
- The central limit theorem is a fundamental theorem in statistics stating that the sum or average of a large number of independent and identically distributed random variables tends towards a normal distribution.
- Probability distributions have applications in various fields, including finance, engineering, physics, and medicine.

**1. Which of the following is a continuous random variable?**

a) Number of children in a family

b) Number of heads in a coin toss

c) Time taken to complete a task

d) Number of red balls in a bag

**Answer**: c) Time taken to complete a task

**2. Which of the following is used to describe the probabilities of each possible value of a discrete random variable?**

a) Probability density function

b) Cumulative distribution function

c) Probability mass function

d) Variance

**Answer**: c) Probability mass function

**3. Which of the following measures how spread out a random variable's probability distribution is?**

a) Variance

b) Expected value

c) Standard deviation

d) Median

**Answer**: a) Variance

**4. The central limit theorem states that:**

a) A large number of independent and identically distributed random variables will tend towards a uniform distribution

b) A large number of independent and identically distributed random variables will tend towards a normal distribution

c) A large number of independent and identically distributed random variables will tend towards a Poisson distribution

d) A large number of independent and identically distributed random variables will tend towards an exponential distribution

**Answer**: b) A large number of independent and identically distributed random variables will tend towards a normal distribution

