Continuous Probability Distributions
Explore continuous distributions beyond the normal: exponential, uniform, and more. Learn probability density functions and their applications.
Continuous vs Discrete Distributions
| Aspect | Discrete | Continuous |
|---|---|---|
| Values | Countable (0, 1, 2, …) | Uncountable (any value in range) |
| Function | PMF: P(X = x) | PDF: f(x) |
| P(X = specific value) | Can be positive | Always 0 |
| Probabilities | Sum up | Integrate (area under curve) |
Probability Density Function (PDF)
The probability density function f(x) describes the relative likelihood of values. Probability = area under the curve.
- f(x) ≥ 0 for all x
- Total area under curve = 1
- P(a ≤ X ≤ b) = area under f(x) between a and b
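For a quick numerical check of these properties, here is a minimal sketch (assuming Python with NumPy and SciPy available) using the exponential density f(x) = e^(-x) as the example PDF:

```python
# Numerical check of PDF properties for an example density: f(x) = e^(-x), x >= 0
# (the exponential PDF with lambda = 1). Requires numpy and scipy.
import numpy as np
from scipy.integrate import quad

def f(x):
    return np.exp(-x)          # f(x) >= 0 everywhere on its support

total_area, _ = quad(f, 0, np.inf)   # total area under the curve, should be 1
prob_1_to_2, _ = quad(f, 1, 2)       # P(1 <= X <= 2) = area between 1 and 2

print(f"total area under f: {total_area:.4f}")   # ~1.0000
print(f"P(1 <= X <= 2):     {prob_1_to_2:.4f}")  # ~0.2325
```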
1. Uniform Distribution
The uniform distribution assigns equal probability to all values in an interval [a, b].
PDF: f(x) = 1/(b-a) for a ≤ x ≤ b
Mean: μ = (a+b)/2
Variance: σ² = (b-a)²/12
A random number is generated uniformly between 0 and 10.
X ~ Uniform(0, 10)
P(X < 3) = (3-0)/(10-0) = 0.3
P(2 < X < 7) = (7-2)/(10-0) = 0.5
Mean and SD:
- μ = (0+10)/2 = 5
- σ² = (10-0)²/12 = 100/12 = 8.33
- σ = 2.89
Buses arrive every 15 minutes. You arrive at a random time.
Wait time X ~ Uniform(0, 15)
P(wait less than 5 minutes) = 5/15 = 1/3 ≈ 0.333
Expected wait time: μ = (0+15)/2 = 7.5 minutes
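Both uniform examples can be reproduced with `scipy.stats.uniform` (assuming Python with SciPy available); note that SciPy parameterises the distribution as loc = a and scale = b - a:

```python
# Uniform(a, b) probabilities via scipy.stats.uniform(loc=a, scale=b-a)
from scipy.stats import uniform

X = uniform(loc=0, scale=10)        # X ~ Uniform(0, 10)
print(X.cdf(3))                     # P(X < 3)     = 0.3
print(X.cdf(7) - X.cdf(2))          # P(2 < X < 7) = 0.5
print(X.mean(), X.std())            # mu = 5.0, sigma ~= 2.89

wait = uniform(loc=0, scale=15)     # bus wait ~ Uniform(0, 15)
print(wait.cdf(5))                  # P(wait < 5 min) ~= 0.333
print(wait.mean())                  # expected wait = 7.5 min
```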
2. Exponential Distribution
The exponential distribution models time between events in a Poisson process. It’s the continuous counterpart to the geometric distribution.
PDF: f(x) = λe^(-λx) for x ≥ 0
Mean: μ = 1/λ
Variance: σ² = 1/λ²
CDF: P(X ≤ x) = 1 - e^(-λx)
Customers arrive at rate λ = 4 per hour (Poisson process).
Time between arrivals X ~ Exponential(4)
Mean time between customers: 1/λ = 1/4 hour = 15 minutes
P(next customer arrives within 10 minutes = 1/6 hour) = 1 - e^(-4×1/6) ≈ 1 - 0.513 ≈ 0.487
P(wait more than 30 minutes = 0.5 hours) = e^(-4×0.5) = e^(-2) ≈ 0.135
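A short sketch of the same calculations (assuming Python with SciPy available); `scipy.stats.expon` is parameterised by scale = 1/λ:

```python
# Exponential(lambda) via scipy.stats.expon, which uses scale = 1/lambda
from scipy.stats import expon

lam = 4                      # 4 customers per hour
X = expon(scale=1/lam)       # time between arrivals, in hours

print(X.mean())              # 1/lambda = 0.25 h = 15 min
print(X.cdf(1/6))            # P(X < 10 min)  ~= 0.487
print(X.sf(0.5))             # P(X > 30 min) = e^(-2) ~= 0.135
```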
Memoryless Property
The exponential distribution is memoryless: P(X > s + t | X > s) = P(X > t), so the probability of waiting t more time units doesn’t depend on how long you’ve already waited.
Light bulb lifetime ~ Exponential with mean 1000 hours.
If a bulb has lasted 500 hours, P(lasting 200 more hours)?
Same as P(new bulb lasting 200 hours): P(X > 200) = e^(-200/1000) = e^(-0.2) ≈ 0.819
The bulb doesn’t “remember” it’s been on for 500 hours!
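A quick numerical check of the memoryless property (assuming Python with SciPy available): the conditional probability equals the unconditional one.

```python
# Memoryless check: P(X > 700 | X > 500) equals P(X > 200)
from scipy.stats import expon

bulb = expon(scale=1000)                     # mean lifetime 1000 hours
p_conditional = bulb.sf(700) / bulb.sf(500)  # P(X > 500 + 200 | X > 500)
p_fresh = bulb.sf(200)                       # P(new bulb lasts 200 hours)
print(p_conditional, p_fresh)                # both ~= 0.819
```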
3. Chi-Square Distribution
The chi-square distribution arises when you sum squared standard normal variables. It’s crucial for hypothesis testing.
If Z₁, Z₂, …, Zₖ are independent standard normal variables:
X = Z₁² + Z₂² + … + Zₖ² ~ χ²(k), a chi-square distribution with k degrees of freedom
Mean: μ = df (the degrees of freedom)
Variance: σ² = 2×df
Properties
- Only takes positive values (x ≥ 0)
- Right-skewed (especially for small df)
- Approaches normal as df increases
- Used in chi-square tests, confidence intervals for variance
Common critical values for chi-square tests:
| df | χ²₀.₉₅ (right tail = 0.05) | χ²₀.₀₅ (left tail = 0.05) |
|---|---|---|
| 1 | 3.84 | 0.004 |
| 5 | 11.07 | 1.15 |
| 10 | 18.31 | 3.94 |
| 20 | 31.41 | 10.85 |
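These critical values can be reproduced with `scipy.stats.chi2.ppf` (assuming Python with SciPy available):

```python
# Reproduce the chi-square critical values table with scipy.stats.chi2.ppf
from scipy.stats import chi2

for df in (1, 5, 10, 20):
    upper = chi2.ppf(0.95, df)   # right tail = 0.05 (chi-square_0.95)
    lower = chi2.ppf(0.05, df)   # left tail  = 0.05 (chi-square_0.05)
    print(f"df={df:2d}  upper={upper:6.2f}  lower={lower:6.3f}")
# df= 1  upper=  3.84  lower= 0.004
# df=20  upper= 31.41  lower=10.851
```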
4. t-Distribution
The t-distribution (Student’s t) is similar to the normal but with heavier tails. It’s used when estimating means with unknown population standard deviation.
Mean: μ = 0 (for df > 1)
Variance: σ² = df/(df-2) (for df > 2)
Properties
- Symmetric and bell-shaped like normal
- Heavier tails than normal (more probability in extremes)
- Approaches standard normal as df → ∞
- At df ≈ 30, very close to normal
| Confidence | z (normal) | t (df=10) | t (df=30) |
|---|---|---|---|
| 90% | 1.645 | 1.812 | 1.697 |
| 95% | 1.96 | 2.228 | 2.042 |
| 99% | 2.576 | 3.169 | 2.750 |
With small samples, you need to go further from the mean for the same confidence!
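The table above can be reproduced with SciPy (assumed available) by converting each confidence level into a two-tailed critical value:

```python
# Two-tailed critical values: z vs t at 90%, 95%, 99% confidence
from scipy.stats import norm, t

for conf in (0.90, 0.95, 0.99):
    tail = (1 - conf) / 2                 # probability in each tail
    print(f"{conf:.0%}  z={norm.ppf(1 - tail):.3f}  "
          f"t(df=10)={t.ppf(1 - tail, 10):.3f}  "
          f"t(df=30)={t.ppf(1 - tail, 30):.3f}")
# 95%  z=1.960  t(df=10)=2.228  t(df=30)=2.042
```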
5. F-Distribution
The F-distribution is the ratio of two chi-square variables. It’s used in ANOVA and comparing variances.
If U ~ χ²(df₁) and V ~ χ²(df₂) are independent:
F = (U/df₁)/(V/df₂) ~ F(df₁, df₂)
Mean: μ = df₂/(df₂-2) (for df₂ > 2)
Properties
- Only takes positive values
- Right-skewed
- Has TWO degrees of freedom parameters
- Used in F-tests, ANOVA, regression analysis
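A small sketch (assuming Python with NumPy and SciPy; df₁ = 5 and df₂ = 10 are illustrative choices) that builds the F ratio from simulated chi-squares and compares it with `scipy.stats.f`:

```python
# F-distribution as a ratio of scaled chi-squares, checked by simulation
import numpy as np
from scipy.stats import f

df1, df2 = 5, 10
rng = np.random.default_rng(0)
u = rng.chisquare(df1, size=100_000)
v = rng.chisquare(df2, size=100_000)
ratio = (u / df1) / (v / df2)        # should follow F(df1, df2)

print(ratio.mean())                  # ~ df2/(df2-2) = 1.25
print(f(df1, df2).mean())            # exact mean from scipy: 1.25
print(f.ppf(0.95, df1, df2))         # upper 5% critical value, ~3.33
```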
6. Beta Distribution
The beta distribution is flexible for modeling proportions and probabilities.
Values in [0, 1]
Mean: μ = α/(α+β)
Different α and β values create different shapes:
- α = β = 1: Uniform distribution
- α = β > 1: Symmetric, bell-shaped
- α > β: Left-skewed
- α < β: Right-skewed
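The shapes above can be explored with `scipy.stats.beta` (assuming Python with SciPy available); the α, β pairs below are illustrative choices, not values from the lesson:

```python
# Beta(alpha, beta) shapes and means via scipy.stats.beta
from scipy.stats import beta

for a, b in [(1, 1), (5, 5), (8, 2), (2, 8)]:
    dist = beta(a, b)
    print(f"alpha={a}, beta={b}:  mean={dist.mean():.3f}  "
          f"pdf(0.5)={dist.pdf(0.5):.3f}")
# alpha=beta=1 is flat (uniform); alpha=beta=5 is symmetric and bell-shaped;
# alpha=8, beta=2 piles mass near 1 (left-skewed); alpha=2, beta=8 near 0.
```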
Comparison of Continuous Distributions
| Distribution | Support | Parameters | Common Use |
|---|---|---|---|
| Normal | (-∞, ∞) | μ, σ | Natural phenomena |
| Uniform | [a, b] | a, b | Random number generation |
| Exponential | [0, ∞) | λ | Wait times |
| Chi-square | [0, ∞) | df | Variance tests |
| t | (-∞, ∞) | df | Small sample means |
| F | [0, ∞) | df₁, df₂ | ANOVA, variance comparison |
| Beta | [0, 1] | α, β | Proportions |
Choosing the Right Distribution
Scenario 1: Time until next earthquake → Exponential (waiting time for rare event)
Scenario 2: Test scores of large population → Normal (central limit theorem)
Scenario 3: Bus arrival time within a 10-minute window → Uniform (any time equally likely)
Scenario 4: Sample variance of normal data → Chi-square (sum of squared normals)
Scenario 5: Batting average, proportion successful → Beta (values between 0 and 1)
Relationships Between Distributions
Many distributions are related:
- Normal → Chi-square: Z² ~ χ²(1); summing k squared standard normals gives χ²(k)
- Chi-square + Chi-square → F: Ratio of chi-squares
- Normal + Chi-square → t: Z divided by sqrt(χ²/df)
- Poisson → Exponential: Time between Poisson events
- Binomial → Normal: For large n (CLT)
- Exponential → Gamma: Sum of exponentials
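Two of these relationships are easy to spot-check by simulation (assuming Python with NumPy available; sample sizes and parameters below are illustrative):

```python
# Spot-check two distribution relationships by simulation
import numpy as np

rng = np.random.default_rng(42)
n = 200_000

# Normal -> Chi-square: Z1^2 + Z2^2 + Z3^2 should match chi-square with df = 3
z = rng.standard_normal((n, 3))
chisq3 = (z**2).sum(axis=1)
print(chisq3.mean(), chisq3.var())   # ~3 (= df) and ~6 (= 2*df)

# Exponential -> Gamma: the sum of k independent Exponential(lambda) variables
# is Gamma-distributed with mean k/lambda
lam, k = 4, 5
total = rng.exponential(1/lam, (n, k)).sum(axis=1)
print(total.mean())                  # ~k/lambda = 1.25
```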
Summary
In this lesson, you learned:
- Continuous distributions use PDFs; probability = area under curve
- Uniform: Equal probability over an interval
- Exponential: Time between events (memoryless)
- Chi-square: Sum of squared normals (variance tests)
- t-distribution: Like normal but heavier tails (small samples)
- F-distribution: Ratio of chi-squares (ANOVA)
- Beta: Flexible distribution for proportions
- Distributions are interconnected through mathematical relationships
Practice Problems
1. X ~ Uniform(5, 15). Find: a) P(X < 8) b) P(7 < X < 12) c) Mean and standard deviation
2. Phone calls arrive at rate 2 per hour (Poisson). Time between calls is exponential. a) Mean time between calls? b) P(more than 1 hour between calls)? c) P(call arrives in next 15 minutes)?
3. A chi-square distribution has df = 8. What is: a) The mean? b) The variance?
4. For a t-distribution with df = 15, the critical value for a two-tailed test at α = 0.05 is 2.131. What does this mean?
Answers
1. a) P(X < 8) = (8-5)/(15-5) = 3/10 = 0.3; b) P(7 < X < 12) = (12-7)/(15-5) = 5/10 = 0.5; c) μ = (5+15)/2 = 10, σ² = (15-5)²/12 = 100/12 ≈ 8.33, σ ≈ 2.89
2. λ = 2 per hour. a) Mean = 1/λ = 0.5 hours = 30 minutes; b) P(X > 1) = e^(-2×1) = e^(-2) ≈ 0.135; c) P(X < 0.25) = 1 - e^(-2×0.25) = 1 - e^(-0.5) ≈ 1 - 0.607 ≈ 0.393
3. a) Mean = df = 8 b) Variance = 2×df = 2×8 = 16
4. If the test statistic |t| > 2.131, we reject the null hypothesis. This leaves 2.5% probability in each tail of the distribution.
Next Steps
Continue building your statistical knowledge:
- Sampling Distributions - Distributions of sample statistics
- Confidence Intervals - Using distributions for estimation
- T-Test Calculator - Apply t-distribution to real tests