Many statisticians only ever deal with a handful of probability distributions. The uniform and Gaussian distributions are the ones they are most likely to run across. They may know of the Poisson and exponential distributions, but are not really sure what they can do with them. They can calculate Student's t and chi-square statistics, and might even check the results in a convenient table, but just try to get one of them to explain what they are doing beyond counting the number of tails in a test.
In reality, there are many probability distributions. Theoretically, the number of possible distributions is infinite. A handful are mentioned here, but this just scratches the surface.
If all you need is formulae, most of the texts cited at the end of this document provide a mathematical expression for any particular distribution. Unfortunately, this is often insufficient to determine when a practitioner should use a particular distribution. Knowing precisely what circumstances yield random variates that follow a particular distribution can be especially fruitful in determining which hypotheses can be tested.
Instead, this article focuses on showing how the distributions are related at a conceptual level. Knowing the situations under which specific distributions arise can help modelling and testing efforts.
The concept of a Bernoulli trial occurs frequently in connection with probability distributions. Briefly, a Bernoulli trial has a fixed probability of success without regard to when it is tried, and any one Bernoulli trial is independent of all others. See the Beta, Binomial, Geometric, Pascal and Poisson distributions for different ways of looking at sequences of Bernoulli trials.
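A Bernoulli trial is easy to make concrete in code. The sketch below (in Python; the function name `bernoulli` and the seed are illustrative choices, not from the original text) draws many independent trials with a fixed success probability and checks that the observed success fraction lands near that probability:

```python
import random

def bernoulli(p, rng):
    """One Bernoulli trial: 1 (success) with probability p, else 0."""
    return 1 if rng.random() < p else 0

rng = random.Random(42)                       # fixed seed for repeatability
trials = [bernoulli(0.3, rng) for _ in range(10_000)]
frac = sum(trials) / len(trials)
print(frac)                                   # close to p = 0.3
```

Because each call is independent of the others and p never changes, any sequence produced this way is a sequence of Bernoulli trials.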
BETA. The beta distribution is a continuous distribution related to the discrete binomial distribution. Mathematically, the beta distribution describes the distribution of the probability of success of "n" Bernoulli trials given "s" successes. That is, the number of trials and successes is known, and the probability of success (p) is allowed to vary. Consequently, the domain of the beta distribution lies between 0 and 1 inclusive.
An alternative generation of the beta distribution: take two independent chi-square random variates X1 and X2 with degrees of freedom v1 and v2 respectively; then the expression X1/(X1+X2) follows a beta distribution with parameters v1/2 and v2/2.
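This chi-square construction can be checked by simulation. In the Python sketch below (the helper name `chi_square` and the parameter choices v1=4, v2=6 are illustrative), each chi-square variate is built as a sum of squared standard Gaussians, and the mean of X1/(X1+X2) is compared against the known beta mean a/(a+b) with a=v1/2 and b=v2/2:

```python
import random

def chi_square(dof, rng):
    # Sum of "dof" squared Gaussian variates (zero mean, unit variance).
    return sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(dof))

rng = random.Random(1)
v1, v2 = 4, 6
betas = []
for _ in range(20_000):
    x1, x2 = chi_square(v1, rng), chi_square(v2, rng)
    betas.append(x1 / (x1 + x2))

# Beta(a, b) with a = v1/2 = 2 and b = v2/2 = 3 has mean a/(a+b) = 0.4.
mean = sum(betas) / len(betas)
print(mean)
```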
BINOMIAL. The discrete binomial distribution describes the number of successes (s) out of a specific number of Bernoulli trials (n) with a specified probability of success (p). The range of the binomial distribution varies from 0 to n successes. When p is near 0.5 and n is very large, the binomial distribution approximates a Gaussian distribution with mean n/2. When n is very large and p is very small, the distribution is approximately Poisson.
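The definition above translates directly into a simulation: count successes in n Bernoulli trials. This Python sketch (the name `binomial_variate` and the parameters n=50, p=0.2 are illustrative) checks that the sample mean lands near the binomial mean n*p:

```python
import random

def binomial_variate(n, p, rng):
    # Count successes in n independent Bernoulli trials.
    return sum(1 for _ in range(n) if rng.random() < p)

rng = random.Random(2)
n, p = 50, 0.2
draws = [binomial_variate(n, p, rng) for _ in range(10_000)]
mean = sum(draws) / len(draws)
print(mean)   # close to n*p = 10
```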
CAUCHY. The continuous Cauchy distribution is peculiar in the sense that it has no well-defined mean or variance. However, it arises in some physical phenomena.
CHI-SQUARE. The sum of the squares of N independent Gaussian random variates (with zero mean and unit variance) follows a continuous chi-square distribution with N degrees of freedom. See the Error, Rayleigh and Maxwell distributions for special treatments when N=1, 2, 3 respectively. Snedecor's F distribution relates the ratio of chi-square variates.
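The sum-of-squares construction is simple to verify directly. A chi-square variate with N degrees of freedom has mean N, so the sample mean of many such sums should land near N. In this Python sketch the helper name `chi_square` and the choice of 5 degrees of freedom are illustrative:

```python
import random

def chi_square(dof, rng):
    # Sum of "dof" squared Gaussian variates (zero mean, unit variance).
    return sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(dof))

rng = random.Random(4)
dof = 5
draws = [chi_square(dof, rng) for _ in range(20_000)]
mean = sum(draws) / len(draws)
print(mean)   # chi-square mean equals its degrees of freedom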
Chi-square has statistical use in testing whether or not a sampling methodology produces Gaussian results. The cumulative distribution function returns the probability that an observed chi-square statistic will be less than "chs" with "dof" degrees of freedom. Low values indicate "cooked" or "biased" sampling: there are too few outliers. High values indicate significant differences between model predictions and experimental outcomes.
ERROR. The continuous distribution of the absolute values of Gaussian variates (with zero mean and unit variance). The square of an error-distributed variate follows a chi-square distribution with 1 degree of freedom.
EXPONENTIAL. The continuous exponential distribution describes the intervals between adjacent Poisson events. The discrete Poisson distribution describes the number of such events that occur within a given interval.
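The connection between the two can be seen by simulating a Poisson process from its exponential inter-event gaps. In this Python sketch (the rate of 2.0 events per unit time and the observation horizon are illustrative assumptions), the clock advances by exponential gaps, generated by inverting the exponential CDF, and the resulting event count per unit time should recover the rate:

```python
import math
import random

rng = random.Random(7)
rate = 2.0         # assumed Poisson event rate (events per unit time)
horizon = 1000.0   # total observation time

# Step the clock forward by exponential inter-event gaps and count arrivals.
t, events = 0, 0.0
t, events = 0.0, 0
while True:
    t += -math.log(1.0 - rng.random()) / rate   # exponential(rate) gap
    if t > horizon:
        break
    events += 1

print(events / horizon)   # observed rate, close to 2.0
```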
SNEDECOR'S F. If X1 and X2 are independent chi-square variates with v1 and v2 degrees of freedom, then the expression (X1/v1)/(X2/v2) (the "F-ratio") follows an F-distribution.
As a statistical test, the F-ratio checks the assumption that the variances of two populations are the same. The CDF returns the probability that an observed F-ratio will be less than "f" with "dof1" and "dof2" degrees of freedom. Both low and high values indicate significant differences between the two sample variances.
GAMMA. A continuous counterpart of the Poisson distribution: where the Poisson distribution counts the events occurring in a fixed interval, the gamma distribution describes the waiting time until the "n"-th Poisson event.
GAUSSIAN. Gaussian (Normal) distributions result from the sum of many small, independent variates; by the central limit theorem, it hardly matters what the underlying distribution might be, provided its variance is finite. Consequently, Gaussian distributions are ubiquitous in nature.
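The summing effect is easy to demonstrate with uniform variates. A classic trick (shown here as an illustration, not a recommended generator): the sum of 12 uniform(0,1) variates has mean 6 and variance 1, so subtracting 6 gives an approximately standard Gaussian variate:

```python
import random

rng = random.Random(9)
# Sum of 12 uniform(0,1) variates has mean 6 and variance 12 * (1/12) = 1;
# subtracting 6 centers it, giving an approximately standard Gaussian.
draws = [sum(rng.random() for _ in range(12)) - 6.0 for _ in range(20_000)]
mean = sum(draws) / len(draws)
var = sum(d * d for d in draws) / len(draws) - mean * mean
print(mean, var)   # near 0 and 1
```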
GEOMETRIC. The discrete distribution of the interval between Bernoulli successes: the number of trials up to and including the first success (or, equivalently, the next success).
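Counting trials until the first success gives a geometric variate directly, and its mean should be 1/p. In this Python sketch the function name `geometric` and the choice p=0.25 are illustrative:

```python
import random

def geometric(p, rng):
    # Number of Bernoulli trials up to and including the first success.
    trials = 1
    while rng.random() >= p:
        trials += 1
    return trials

rng = random.Random(11)
p = 0.25
draws = [geometric(p, rng) for _ in range(20_000)]
mean = sum(draws) / len(draws)
print(mean)   # close to 1/p = 4
```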
HYPERGEOMETRIC. This is perhaps the most primitive of the probability distributions in this collection. In a finite population of "Npop" items there is a specific number "T" of items of interest. Examine "Nsamp" of the population items (sampled without replacement). The number of items of interest in the sample follows a hypergeometric distribution.
If the population size is allowed to grow while the proportion of items of interest remains the same, the hypergeometric distribution approaches the binomial distribution: with a huge population, sampling without replacement hardly differs from sampling with replacement.
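Sampling without replacement is exactly what the construction above describes, so a direct simulation is short. In this Python sketch (the parameter values Npop=10000, T=3000, Nsamp=10 are illustrative), the population is large relative to the sample, so the mean count of items of interest should be close to the binomial mean Nsamp * (T/Npop):

```python
import random

rng = random.Random(13)
Npop, T, Nsamp = 10_000, 3_000, 10   # population, items of interest, sample
population = [1] * T + [0] * (Npop - T)

# Each draw samples Nsamp items without replacement and counts the hits.
draws = [sum(rng.sample(population, Nsamp)) for _ in range(5_000)]
mean = sum(draws) / len(draws)
print(mean)   # close to the binomial mean Nsamp * (T/Npop) = 3.0
```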
KOLMOGOROV-SMIRNOV D. Measures the maximum departure of a sample cumulative distribution function from the underlying population distribution function.
LAPLACIAN. The Laplacian or double-exponential is a continuous distribution that is a double-ended version of the exponential.
MAXWELL. If X1, X2, and X3 are independent Gaussian random variates with zero mean and unit variance, then sqrt( sqr(X1) + sqr(X2) + sqr(X3) ) has a Maxwell distribution. This distribution arises in three-dimensional applications with spherical error probabilities.
PASCAL. (Negative Binomial) The discrete distribution of the number of failures in a run of Bernoulli trials that produces exactly "n" successes, where the probability of success on each trial is "p". The domain goes from 0 to infinity.
POISSON. Poisson is a limiting case of the binomial distribution as the probability of each individual Bernoulli event goes to zero, and the number of trials goes to infinity, but the expected number of events remains constant.
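This limit can be checked numerically: hold n*p fixed while n is large and p is small, and the binomial counts behave like Poisson counts. In this Python sketch the values n=1000 and p=0.003 (so the expected count is 3.0) are illustrative:

```python
import random

rng = random.Random(3)
n, p = 1_000, 0.003   # many trials, tiny success probability
lam = n * p           # expected number of events stays at 3.0

draws = [sum(1 for _ in range(n) if rng.random() < p) for _ in range(2_000)]
mean = sum(draws) / len(draws)
print(mean)   # close to lam = 3.0
```

A Poisson(3) variate would show the same mean and, notably, a variance also near 3; the binomial variance n*p*(1-p) is already within a fraction of a percent of that.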
RAYLEIGH. If X1 and X2 are independent Gaussian random variates with zero mean and unit variance, then sqrt( sqr(X1) + sqr(X2) ) has a Rayleigh distribution. This distribution arises in two-dimensional applications with circular or cylindrical error probabilities. See also the Error and Maxwell distributions.
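The two-Gaussian construction is a one-liner to simulate; the resulting Rayleigh distribution (with unit-variance components) has mean sqrt(pi/2), about 1.2533, which the sample mean should approach:

```python
import math
import random

rng = random.Random(5)
# Radial distance of a point with independent standard Gaussian coordinates.
draws = [math.hypot(rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0))
         for _ in range(20_000)]
mean = sum(draws) / len(draws)
print(mean)   # Rayleigh mean is sqrt(pi/2), about 1.2533
```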
STUDENT'S T. The expected value of the mean of a sample from a normal population is the mean of the population. However, the exact mean is rarely observed; some difference is quite reasonable. Student's t statistic captures the distribution of the means of samples of a specific size drawn from a normal population, when the sample's own standard deviation is used in place of the unknown population value.
As with most statistically-inspired distributions, Student's t-test starts by making assumptions about the underlying distribution (normal with a specific mean and variance). Improbable values of the t-statistic are reason to reject the assumption as erroneous. Reasonable values of the t-statistic neither prove nor disprove the underlying assumption.
UNIFORM. (Rectangular) The simplest of the probability distribution functions: all outcomes within the range have equal likelihood. Many computer languages include pseudo-random functions whose results approximate a uniform distribution.
Abramowitz and Stegun, Handbook of Mathematical Functions, Government Printing Office. (Also available as a Dover reprint.)
Beyer, Handbook of Mathematical Sciences, CRC Press.
Beyer, Basic Statistical Tables, CRC Press.
Knuth, Seminumerical Algorithms (The Art of Computer Programming, Vol. 2), Addison-Wesley.
Menzel, Fundamental Formulas of Physics, Dover reprint.
Pearson, Handbook of Applied Mathematics, Van Nostrand Reinhold.
Press, et al., Numerical Recipes, Cambridge.