In probability theory and statistics, the discrete uniform distribution is a symmetric probability distribution wherein a finite number of values are equally likely to be observed; every one of \( n \) values has equal probability \( 1/n \). One example in the discrete case is rolling a single standard die: there are a total of six sides, and each side has the same probability of being rolled face up. A discrete uniform random variable \( X \) with integer parameters \( a \le b \) has probability mass function \( f(x) = \frac{1}{b - a + 1} \) for \( x \in \{a, a + 1, \ldots, b\} \). In fact, if we let \( N = b - a + 1 \), then the discrete uniform distribution determines the probability of selecting an integer between 1 and \( N \) at random. In the section on the Uniform Distribution on an Interval we explore the continuous version of the uniform distribution, where any number between \( \alpha \) and \( \beta \) can be selected.

More generally, suppose that \( S = \{x_1, x_2, \ldots, x_n\} \) is a finite subset of \( \R \) with \( x_1 \lt x_2 \lt \cdots \lt x_n \), and that \( X \) has the uniform distribution on \( S \). The distribution corresponds to picking an element of \( S \) at random. \( X \) has probability density function \( f \) given by \( f(x) = \frac{1}{n} \) for \( x \in S \). The distribution function is \( F(x) = \frac{k}{n} \) for \( x_k \le x \lt x_{k+1} \) and \( k \in \{1, 2, \ldots, n - 1\} \); this follows from the definition of the distribution function: \( F(x) = \P(X \le x) \) for \( x \in \R \). By definition, the quantile function is \( F^{-1}(p) = x_k \) for \( \frac{k - 1}{n} \lt p \le \frac{k}{n} \) and \( k \in \{1, 2, \ldots, n\} \). The mean and variance are \( \mu = \frac{1}{n} \sum_{i=1}^n x_i \) and \( \sigma^2 = \frac{1}{n} \sum_{i=1}^n (x_i - \mu)^2 \), and the entropy of \( X \) is \( H(X) = \ln[\#(S)] \), since \[ H(X) = \E\{-\ln[f(X)]\} = \sum_{x \in S} -\ln\left(\frac{1}{n}\right) \frac{1}{n} = -\ln\left(\frac{1}{n}\right) = \ln(n) \] The chapter on Finite Sampling Models explores a number of such models.

The standard discrete uniform distribution on \( n \) points is the uniform distribution on \( \{0, 1, \ldots, n - 1\} \); let \( Z \) denote a random variable with this distribution. Of course, the results in the previous paragraph apply with \( x_i = i - 1 \) and \( i \in \{1, 2, \ldots, n\} \). For \( z \in [0, n - 1] \), the values of \( Z \) that are at most \( z \) are \( 0, 1, \ldots, \lfloor z \rfloor \); thus \( k - 1 = \lfloor z \rfloor \) in this formulation, and the distribution function \( G \) of \( Z \) is given by \( G(z) = \frac{1}{n}\left(\lfloor z \rfloor + 1\right) \) for \( z \in [0, n - 1] \). Recall that \begin{align} \sum_{k=1}^{n-1} k^3 & = \frac{1}{4}(n - 1)^2 n^2 \\ \sum_{k=1}^{n-1} k^4 & = \frac{1}{30} n (n - 1) (2 n - 1)(3 n^2 - 3 n - 1) \end{align} Hence \( \E(Z^3) = \frac{1}{4}(n - 1)^2 n \) and \( \E(Z^4) = \frac{1}{30}(n - 1)(2 n - 1)(3 n^2 - 3 n - 1) \), and it follows that \( \kur(Z) = \frac{3}{5} \frac{3 n^2 - 7}{n^2 - 1} \).
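The closed-form expressions for \( G \), the higher moments \( \E(Z^3) \) and \( \E(Z^4) \), and the kurtosis of \( Z \) are easy to check numerically. The short Python sketch below is our own illustration (it is not part of the original text, and the choice \( n = 10 \) is arbitrary); it compares each formula with a brute-force computation over the \( n \) points.

import math

n = 10                 # number of points; any moderate n works, 10 is an arbitrary choice
values = range(n)      # Z takes the values 0, 1, ..., n - 1, each with probability 1/n

# Distribution function: G(z) = (floor(z) + 1) / n on [0, n - 1]
for z in [0.0, 0.3 * (n - 1), 0.5 * (n - 1), n - 1]:    # a few test points in [0, n - 1]
    direct = sum(1 for k in values if k <= z) / n        # P(Z <= z) by counting
    formula = (math.floor(z) + 1) / n
    assert abs(direct - formula) < 1e-12

# Raw moments by direct summation
m1 = sum(values) / n
m2 = sum(k**2 for k in values) / n
m3 = sum(k**3 for k in values) / n
m4 = sum(k**4 for k in values) / n
assert abs(m3 - (n - 1)**2 * n / 4) < 1e-9                               # E(Z^3)
assert abs(m4 - (n - 1) * (2*n - 1) * (3*n**2 - 3*n - 1) / 30) < 1e-9    # E(Z^4)

# Kurtosis from the central fourth moment: E[(Z - mu)^4] / var^2
var = m2 - m1**2
central4 = m4 - 4*m1*m3 + 6*m1**2*m2 - 3*m1**4
kurt = central4 / var**2
assert abs(kurt - (3/5) * (3*n**2 - 7) / (n**2 - 1)) < 1e-9
print('all closed-form expressions match the brute-force values')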
We now generalize the standard discrete uniform distribution by adding location and scale parameters. By definition we can take \( X = a + h Z \), where \( Z \) has the standard uniform distribution on \( n \) points, so that \( X \) takes the values \( a, a + h, \ldots, a + (n - 1) h \). Note that the last point is \( b = a + (n - 1) h \), so we can clearly also parameterize the distribution by the endpoints \( a \) and \( b \), and the step size \( h \). For the remainder of this discussion, we assume that \( X \) has the distribution in the definition. Our first result is that the distribution of \( X \) really is uniform: \( X \) takes each of its \( n \) values with probability \( \frac{1}{n} \).

The moments follow from the corresponding moments of \( Z \): \( \E(X) = a + \frac{1}{2}(n - 1) h = \frac{1}{2}(a + b) \), \( \var(X) = \frac{1}{12}(n^2 - 1) h^2 = \frac{1}{12}(b - a)(b - a + 2 h) \), and \( \kur(X) = \frac{3}{5} \frac{3 n^2 - 7}{n^2 - 1} \). Note that the mean is the average of the endpoints (and so is the midpoint of the interval \( [a, b] \)) while the variance depends only on the number of points and the step size. The third quartile is \( F^{-1}(3/4) = a + h \left(\lceil 3 n / 4 \rceil - 1\right) \). Finally, \( X \) has moment generating function \( M \) given by \( M(0) = 1 \) and \[ M(t) = \frac{1}{n} e^{t a} \frac{1 - e^{n t h}}{1 - e^{t h}}, \quad t \in \R \setminus \{0\} \]

Open the Special Distribution Simulation and select the discrete uniform distribution. Vary the parameters and note the graph of the probability density function. For selected values of the parameters, run the simulation 1000 times and compare the empirical mean and standard deviation to the true mean and standard deviation. Compute a few values of the distribution function and the quantile function.

Suppose that \( X_n \) has the discrete uniform distribution with endpoints \( a \) and \( b \), and step size \( (b - a) / n \), for each \( n \in \N_+ \). Then the distribution of \( X_n \) converges to the continuous uniform distribution on \( [a, b] \) as \( n \to \infty \).

Example 1 – The current (in mA) measured in a piece of copper wire is known to follow a uniform distribution over the interval \( [0, 25] \). By the basic definition of standard deviation (for the continuous uniform distribution on \( [0, 25] \)), \( \sd(X) = \sqrt{\var(X)} = \frac{25 - 0}{\sqrt{12}} \), so a standard deviation of approximately \( 7.22 \) mA.

The discrete uniform distribution itself is inherently non-parametric, but its parameters can still be estimated from data. If a sample of size \( k \) is drawn without replacement from the uniform distribution on \( \{1, 2, \ldots, N\} \) with \( N \) unknown, the uniformly minimum variance unbiased (UMVU) estimator for the maximum is given by \( \hat{N} = \frac{k + 1}{k} m - 1 \), where \( m \) is the sample maximum. For families whose support does not depend on the parameters, the Pitman–Koopman–Darmois theorem states that only exponential families have a sufficient statistic whose dimension is bounded as sample size increases. Since the support of the discrete uniform distribution does depend on its parameters, the uniform distribution is thus a simple example showing the limit of this theorem.
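As an empirical illustration of the UMVU estimator above, the following Python simulation (our own sketch; the values \( N = 1000 \), \( k = 5 \), and the number of replications are arbitrary choices) repeatedly draws samples without replacement from \( \{1, \ldots, N\} \) and averages the estimator \( \frac{k + 1}{k} m - 1 \); the average should be close to the true maximum, while the raw sample maximum alone is biased low.

import random

N = 1000            # true (unknown) maximum
k = 5               # sample size, drawn without replacement
reps = 100_000      # number of simulated samples

rng = random.Random(42)
total_umvu = 0.0
total_max = 0.0
for _ in range(reps):
    sample = rng.sample(range(1, N + 1), k)   # sampling without replacement
    m = max(sample)
    total_umvu += (k + 1) / k * m - 1         # UMVU estimator of N
    total_max += m                            # sample maximum, for comparison

print('average of UMVU estimator:', total_umvu / reps)   # close to N = 1000
print('average of sample maximum:', total_max / reps)    # close to k (N + 1) / (k + 1), about 834

The correction factor \( \frac{k + 1}{k} \) and the subtraction of 1 compensate exactly for the downward bias of the sample maximum, whose expected value under sampling without replacement is \( \frac{k}{k + 1}(N + 1) \).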