Joint Probability Mass Function - GeeksforGeeks (2025)

Last Updated : 02 Aug, 2024


Joint Probability Mass Function (PMF) is a fundamental concept in probability theory and statistics, used to describe the likelihood of two discrete random variables occurring simultaneously. It provides a way to calculate the probability of multiple events occurring together.

Table of Content

  • Joint Probability Mass Function
    • Characteristics of Joint Probability Mass Function
  • What is the Difference between PMF and PDF?
  • PMF of Binomial Distribution
    • Applications of PMF of Binomial Distribution
  • PMF of Poisson Distribution
  • Applications of Probability Mass Functions
  • Examples on Joint Probability Mass Function

Joint Probability Mass Function

A Joint Probability Mass Function, denoted as P(X = x, Y = y) or f(x, y), is a function that gives the probability that discrete random variables X and Y simultaneously take on specific values x and y, respectively.

Characteristics of Joint Probability Mass Function

  • Domain: Function is defined for all possible combinations of x and y in the X and Y sample space.
  • Range: 0 ≤ P(X = x, Y = y) ≤ 1 for all x and y.
  • Sum: Sum of the joint PMF over all possible values of x and y must equal 1.
  • Non-Negativity: P(X = x, Y = y) ≥ 0 for all x and y.
  • Marginal Distributions: Joint PMF can be used to find the marginal PMFs of X and Y: P(X = x) = ∑_y P(X = x, Y = y) and P(Y = y) = ∑_x P(X = x, Y = y)
  • Conditional Probability: Joint PMF can be used to calculate conditional probabilities: P(Y = y | X = x) = P(X = x, Y = y) / P(X = x)
  • Independence: Random variables X and Y are independent if and only if: P(X = x, Y = y) = P(X = x) × P(Y = y) for all x and y
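These relations can be sketched in a few lines of Python. The joint PMF values below are hypothetical, chosen only to illustrate the marginal, conditional, and independence calculations:

```python
# Hypothetical joint PMF of two discrete variables X, Y taking values in {0, 1}.
joint_pmf = {
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

# Marginal PMFs: sum the joint PMF over the other variable.
def marginal_x(x):
    return sum(p for (xi, y), p in joint_pmf.items() if xi == x)

def marginal_y(y):
    return sum(p for (x, yi), p in joint_pmf.items() if yi == y)

# Conditional PMF: P(Y = y | X = x) = P(X = x, Y = y) / P(X = x).
def conditional_y_given_x(y, x):
    return joint_pmf[(x, y)] / marginal_x(x)

# Independence holds iff P(X = x, Y = y) = P(X = x) * P(Y = y) for all pairs.
independent = all(
    abs(p - marginal_x(x) * marginal_y(y)) < 1e-9
    for (x, y), p in joint_pmf.items()
)

print(marginal_x(0))                # ≈ 0.3
print(conditional_y_given_x(1, 0))  # ≈ 0.667
print(independent)                  # False for these values
```

Here X and Y are not independent: for example, P(X = 0, Y = 0) = 0.1, while P(X = 0) × P(Y = 0) = 0.3 × 0.4 = 0.12.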

What is the Difference between PMF and PDF?

| Characteristic | Probability Mass Function (PMF) | Probability Density Function (PDF) |
|---|---|---|
| Type of Random Variable | Discrete | Continuous |
| Definition | A function that gives the probability that a discrete random variable is exactly equal to some value. | A function that describes the relative likelihood for the random variable to take on a given value. |
| Range of Values | Defined only for specific discrete values. | Defined over a continuous range of values. |
| Probability Calculation | P(X = x) = f(x), where f(x) is the PMF | P(a ≤ X ≤ b) = ∫[a to b] f(x) dx, where f(x) is the PDF |
| Properties | Σf(x) = 1 over all possible x, and 0 ≤ f(x) ≤ 1 | ∫f(x) dx = 1 over all x, and f(x) ≥ 0 for all x |
| Graphical Representation | Discrete points or bars in a histogram. | A continuous curve. |
| Example | Binomial or Poisson distribution. | Normal or exponential distribution. |
| Value at a Point | Gives the actual probability. | Gives a density, not an actual probability. |
| Units | Unitless (pure probability). | Probability per unit of measurement. |
| Cumulative Distribution | Sum the PMF values up to a point. | Integrate the PDF up to a point. |
| Expectation Calculation | E[X] = Σ x × f(x) | E[X] = ∫ x × f(x) dx |
| Interpretation | Direct probability interpretation. | Area under the curve gives probability. |

PMF of Binomial Distribution

Binomial Distribution is a discrete probability distribution that models the number of successes in a fixed number of independent Bernoulli trials. Here's a detailed explanation of its PMF:

Definition: A random variable X follows a Binomial Distribution with parameters n and p, denoted as X ~ B(n, p), if and only if:

  • There are a fixed number n of independent trials
  • Each trial has only two possible outcomes: success (with probability p) or failure (with probability 1-p)
  • Probability of success p remains constant for all trials

Probability Mass Function: For a random variable X that follows a Binomial Distribution B(n, p), the PMF is given by:

P(X = k) = C(n, k) × p^k × (1-p)^(n-k)

where:

  • k is Number of Successes (0 ≤ k ≤ n)
  • n is Total Number of Trials
  • p is Probability of Success on an Individual Trial
  • C(n,k) is Binomial Coefficient, also written as (n choose k)

Binomial Coefficient: Binomial coefficient C(n,k) represents the number of ways to choose k items from a set of n items, without replacement and regardless of order. It is calculated as:

C(n,k) = n! / {k! × (n-k)!}

where, "!" denotes the factorial operation.

Properties:

  • Mean (expected value) of X is E(X) = np
  • Variance of X is Var(X) = np(1-p)
  • PMF is symmetric if and only if p = 0.5

Example: Consider a fair coin tossed 5 times. Let X be the number of heads obtained. Then X ~ B(5, 0.5). To find P(X = 3), that is, the probability of getting exactly 3 heads, we use the PMF:

P(X = 3) = C(5, 3) × (0.5)^3 × (0.5)^(5-3)

= (5! / (3! × 2!)) × (0.5)^3 × (0.5)^2

= 10 × 0.125 × 0.25

= 0.3125
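The binomial PMF translates directly into Python; a minimal sketch using only the standard library, which reproduces the coin-toss result:

```python
from math import comb

# Binomial PMF: P(X = k) = C(n, k) * p^k * (1 - p)^(n - k)
def binomial_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

# The coin-toss example: X ~ B(5, 0.5), probability of exactly 3 heads.
print(binomial_pmf(3, 5, 0.5))  # 0.3125

# Sanity checks: the PMF sums to 1, and the mean matches E(X) = np.
total = sum(binomial_pmf(k, 5, 0.5) for k in range(6))
mean = sum(k * binomial_pmf(k, 5, 0.5) for k in range(6))
print(total, mean)  # 1.0 and 2.5 (= 5 × 0.5)
```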

Applications of PMF of Binomial Distribution

Binomial Distribution can model various real-world scenarios, such as:

  • Number of defective items in a batch of products
  • Number of successful sales calls made by a salesperson
  • Number of patients who respond to a treatment in a clinical trial

PMF of Poisson Distribution

Poisson distribution is a discrete probability distribution that expresses the probability of a given number of events occurring in a fixed interval of time or space, assuming these events occur with a known constant mean rate and independently of the time since the last event.

Let X be a random variable that follows a Poisson distribution with parameter λ (lambda). The PMF of X is given by:

P(X = k) = (e^(-λ) × λ^k) / k!

where:

  • e is Base of Natural Logarithms (approximately 2.71828)
  • λ is Average Number of Events in an Interval
  • k is Number of Events (k = 0, 1, 2, ...)
  • k! Denotes Factorial of k

Properties and Characteristics

  • PMF gives the probability that an event occurs exactly k times in an interval.
  • λ represents both the mean and the variance of the distribution, such that: E(X) = Var(X) = λ
  • Poisson distribution is often used to model rare events, and it can be applied in various fields such as physics, biology, or business.
  • As λ increases, the Poisson distribution becomes more symmetric and approaches a normal distribution.
  • Sum of two or more independent Poisson random variables is also a Poisson random variable, with a mean that is the sum of the individual means.
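Two of these properties, E(X) = Var(X) = λ and the additivity of independent Poisson variables, can be checked numerically with a short sketch; the truncation at 60 terms is an assumption that leaves negligible tail mass for small λ:

```python
from math import exp, factorial

# Poisson PMF: P(X = k) = e^(-lam) * lam^k / k!
def poisson_pmf(k, lam):
    return exp(-lam) * lam**k / factorial(k)

# E(X) = Var(X) = lambda, checked by truncating the infinite sums.
lam = 2.0
ks = range(60)
mean = sum(k * poisson_pmf(k, lam) for k in ks)
var = sum((k - mean) ** 2 * poisson_pmf(k, lam) for k in ks)
print(round(mean, 6), round(var, 6))  # 2.0 2.0

# Additivity: convolving Poisson(1) with Poisson(1.5) matches Poisson(2.5).
k = 4
conv = sum(poisson_pmf(j, 1.0) * poisson_pmf(k - j, 1.5) for j in range(k + 1))
print(abs(conv - poisson_pmf(k, 2.5)) < 1e-12)  # True
```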

Applications of PMF of Poisson Distribution

Poisson distribution can be used to model various phenomena, such as:

  • Number of customers arriving at a store in an hour
  • Number of defects in a manufactured product
  • Number of phone calls received by a call center in a day
  • Number of radioactive particle emissions detected in a fixed time interval

Relationship with Other Distributions:

  • Poisson distribution is closely related to the exponential distribution, which models the time between Poisson events.
  • As λ becomes large, the Poisson distribution can be approximated by a normal distribution with mean λ and variance λ.
  • Poisson distribution can be derived as a limiting case of the binomial distribution under certain conditions.

Example: Suppose we want to find the probability of exactly 3 events occurring in an interval where the average number of events is 2 (λ = 2).

P(X = 3) = (e^(-2) × 2^3) / 3!

= (0.1353 × 8) / 6

≈ 0.1804

Therefore, the probability of exactly 3 events occurring is approximately 0.1804 or 18.04%.
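The same Poisson calculation can be verified with a short Python sketch using only the standard library:

```python
from math import exp, factorial

# Poisson PMF: P(X = k) = e^(-lam) * lam^k / k!
def poisson_pmf(k, lam):
    return exp(-lam) * lam**k / factorial(k)

# The example above: lambda = 2, probability of exactly 3 events.
print(round(poisson_pmf(3, 2), 4))  # 0.1804
```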

Applications of Probability Mass Functions

Various applications of Probability Mass Functions include:

  • Discrete Random Variables: PMFs are used to describe the probability distribution of discrete random variables. For example, a PMF can model the number of heads that occur in a series of coin flips or the number of defective items in a batch of products.
  • Statistical Inference: PMFs play a crucial role in statistical inference. They are used to calculate probabilities and expectations that are essential for hypothesis testing and parameter estimation.
  • Bayesian Statistics: In Bayesian statistics, PMFs are used to represent prior and posterior distributions of discrete random variables. This application is particularly useful in fields such as machine learning and data science.
  • Queueing Theory: PMFs are applied in queueing theory to model the number of customers in a system or the time between arrivals. This has applications in operations research and computer network modeling.
  • Reliability Engineering: In reliability engineering, PMFs can model the number of failures that occur in a system over time. This is useful for predicting maintenance needs and system longevity.
  • Genetics and Biology: PMFs are used to model genetic inheritance patterns or the distribution of species in an ecosystem. For instance, they can describe the probability of inheriting certain traits or the likelihood of observing a specific number of organisms in a habitat.
  • Finance and Economics: In finance, PMFs can model discrete investment returns or the number of trades in a given time period. They are also used in economics to model things like household sizes or income distributions.
  • Information Theory: PMFs are fundamental in information theory, where they are used to calculate entropy and mutual information. These concepts are crucial in data compression and communication systems.
  • Game Theory: In game theory, PMFs can represent the distribution of strategies that players might choose. This application is relevant in economics, political science, and artificial intelligence.
  • Quality Control: PMFs are used in quality control to model the number of defects in a product or process. This helps in setting quality standards and implementing control measures.
  • Actuarial Science: In insurance and actuarial science, PMFs model discrete events such as the number of claims filed in a given period or the number of accidents that occur.

Related Articles:

  • Poisson Distribution
  • Bernoulli Trials
  • Binomial Distribution

Examples on Joint Probability Mass Function

Example 1: Consider an experiment where a fair six-sided die is rolled. The PMF table for this experiment is:

| x (outcome) | 1 | 2 | 3 | 4 | 5 | 6 |
|---|---|---|---|---|---|---|
| P(X = x) | 1/6 | 1/6 | 1/6 | 1/6 | 1/6 | 1/6 |

Find the probability of rolling an even number.

Solution:

P(even) = P(X = 2) + P(X = 4) + P(X = 6)

= 1/6 + 1/6 + 1/6

= 3/6 = 1/2

Example 2: A bag contains 3 red, 4 blue, and 5 green marbles. The PMF table for randomly drawing a marble is:

| x (color) | Red | Blue | Green |
|---|---|---|---|
| P(X = x) | 1/4 | 1/3 | 5/12 |

Calculate the probability of drawing a red or blue marble.

Solution:

P(red or blue) = P(X = red) + P(X = blue)

= 1/4 + 1/3 = 3/12 + 4/12

= 7/12

Example 3: Consider a discrete random variable X with the following PMF:

| x | 0 | 1 | 2 | 3 |
|---|---|---|---|---|
| P(X = x) | 0.1 | 0.3 | 0.4 | 0.2 |

Find P(X ≤ 2) and the expected value E(X).

Solution:

P(X ≤ 2) = P(X = 0) + P(X = 1) + P(X = 2)

= 0.1 + 0.3 + 0.4

= 0.8

E(X) = 0 × 0.1 + 1 × 0.3 + 2 × 0.4 + 3 × 0.2

= 0 + 0.3 + 0.8 + 0.6

= 1.7
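The same calculation can be done programmatically; a minimal Python sketch using the PMF table from this example:

```python
# PMF table from Example 3.
pmf = {0: 0.1, 1: 0.3, 2: 0.4, 3: 0.2}

# P(X <= 2): sum the PMF values for x = 0, 1, 2.
p_le_2 = sum(p for x, p in pmf.items() if x <= 2)

# E(X) = sum of x * f(x) over all x.
expectation = sum(x * p for x, p in pmf.items())

print(round(p_le_2, 4), round(expectation, 4))  # 0.8 1.7
```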

Example 4: An experiment involves flipping a fair coin twice. The PMF table for the number of heads is:

| x (number of heads) | 0 | 1 | 2 |
|---|---|---|---|
| P(X = x) | 1/4 | 1/2 | 1/4 |

Calculate the probability of getting at least one head and the variance of X.

Solution:

P(at least one head) = P(X = 1) + P(X = 2)

= 1/2 + 1/4 = 3/4

E(X) = 0 × 1/4 + 1 × 1/2 + 2 × 1/4 = 1

E(X²) = 0² × 1/4 + 1² × 1/2 + 2² × 1/4 = 1.5

Var(X) = E(X²) - [E(X)]²

= 1.5 - 1² = 0.5
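Exact fractions make the variance computation above easy to check; a minimal sketch with the standard library's fractions module:

```python
from fractions import Fraction as F

# PMF table from Example 4: number of heads in two fair coin flips.
pmf = {0: F(1, 4), 1: F(1, 2), 2: F(1, 4)}

mean = sum(x * p for x, p in pmf.items())              # E(X) = 1
second_moment = sum(x**2 * p for x, p in pmf.items())  # E(X²) = 3/2
variance = second_moment - mean**2                     # Var(X) = 1/2

print(mean, second_moment, variance)  # 1 3/2 1/2
```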

Example 5: Consider an experiment where two fair six-sided dice are rolled, and the sum of the numbers is recorded. The PMF table for this experiment is:

| x (sum) | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 |
|---|---|---|---|---|---|---|---|---|---|---|---|
| P(X = x) | 1/36 | 2/36 | 3/36 | 4/36 | 5/36 | 6/36 | 5/36 | 4/36 | 3/36 | 2/36 | 1/36 |

Find the probability that the sum is greater than 9 or equal to 7.

Solution:

P(X > 9 or X = 7) = P(X = 10) + P(X = 11) + P(X = 12) + P(X = 7)

= 3/36 + 2/36 + 1/36 + 6/36

= 12/36

= 1/3
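The two-dice PMF can be rebuilt by enumeration, which confirms both the table and the probability computed above; a minimal Python sketch:

```python
from collections import Counter
from fractions import Fraction

# Enumerate all 36 equally likely ordered outcomes of two fair dice
# and count how often each sum occurs.
counts = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))
pmf = {s: Fraction(c, 36) for s, c in counts.items()}

# P(X > 9 or X = 7), as in Example 5.
prob = sum(p for s, p in pmf.items() if s > 9 or s == 7)
print(prob)  # 1/3
```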

Conclusion

Understanding Joint PMFs is crucial for analyzing discrete multivariate probability distributions. They provide a powerful tool for modeling complex systems and making predictions in fields ranging from statistics and engineering to economics and biology.

FAQs on Joint Probability Mass Function

What is the main difference between a PMF and a PDF?

  • PMFs are for discrete random variables.
  • PDFs are for continuous random variables.

How do you calculate marginal distributions from a joint PMF?

To calculate marginal distributions from a joint PMF, sum the joint PMF over all values of the other variable.

What condition must be met for two random variables to be independent?

For two random variables to be independent, their joint PMF must equal the product of their individual PMFs for all values.

What are some common applications of PMFs?

Some common applications of PMFs include: Statistical Inference, Reliability Engineering, Finance, and Quality Control.

How is the Poisson distribution related to the binomial distribution?

Poisson distribution can be derived as a limiting case of the binomial distribution under certain conditions.


Author: nandinimi5b7m
