

3.4 Distributions with Monotone Likelihood Ratio

Definition (Monotone Likelihood Ratio)

We say that $(p_{\theta})_{\theta \in \Theta}$ has monotone likelihood ratio (MLR) if

\[\exists T: \mathcal{X} \rightarrow \mathbb{R}: \text{measurable} \quad \text{ s.t. } \ \forall \theta < \theta^{\prime}, \ \exists g: \mathbb{R} \rightarrow \mathbb{R}_{\ge 0}: \text{non-decreasing},\ \ \frac{ p_{\theta^{\prime}}(x) }{ p_{\theta}(x) } = g(T(x)) \quad (\forall x \in \mathcal{X}) .\]
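As an illustration (this example is not in the original text), the normal location family $N(\theta, 1)$ has MLR in $T(x) = x$: for $\theta < \theta^{\prime}$ the ratio $p_{\theta^{\prime}}(x)/p_{\theta}(x) = \exp\left((\theta^{\prime} - \theta)x + (\theta^{2} - \theta^{\prime 2})/2\right)$ is non-decreasing in $x$. A minimal numerical check:

```python
import math

def normal_pdf(x, theta):
    """Density of N(theta, 1)."""
    return math.exp(-0.5 * (x - theta) ** 2) / math.sqrt(2 * math.pi)

# For theta < theta', the likelihood ratio p_{theta'}(x) / p_{theta}(x)
# equals exp((theta' - theta) x + (theta^2 - theta'^2) / 2),
# a non-decreasing function of T(x) = x.
theta, theta_p = 0.0, 1.0
xs = [-5 + 0.1 * i for i in range(101)]
ratios = [normal_pdf(x, theta_p) / normal_pdf(x, theta) for x in xs]
is_monotone = all(a <= b for a, b in zip(ratios, ratios[1:]))
```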

Theorem 3.4.1

Suppose $(p_{\theta})_{\theta \in \Theta}$ has monotone likelihood ratio in $T$ and let $\alpha \in (0, 1)$. For testing $\Theta_{0} := \{ \theta \le \theta_{0} \}$ against $\Theta_{1} := \{ \theta > \theta_{0} \}$, there exists a UMP test at level $\alpha$ of the form

\[\phi(x) := \begin{cases} 1 & (T(x) > C) \\ \gamma & (T(x) = C) \\ 0 & (T(x) < C) \end{cases}\]

where $\gamma$ and $C$ are constants which satisfy

(i) \[\beta_{\phi}(\theta_{0}) = \alpha .\]

Moreover,

(ii) \[\forall \theta < \theta_{0}, \quad \beta_{\phi}(\theta) = \min \{ \beta_{\phi^{\prime}}(\theta) \mid \phi^{\prime} \text{ is a test with } \beta_{\phi^{\prime}}(\theta_{0}) = \alpha \} .\]

proof

(i), (ii)

It suffices to show that $\exists C, \gamma$ such that

\[\begin{eqnarray} & & \mathrm{E}_{\theta_{0}} \left[ \phi(X) \right] = \alpha \nonumber \\ & \Leftrightarrow & P_{\theta_{0}}(T(X) > C) + \gamma P_{\theta_{0}}(T(X) = C) = \alpha \nonumber \end{eqnarray}\]

Define the distribution function $\phi$ of $T(X)$ under $\theta_{0}$ by

\[\phi(C) := P_{\theta_{0}}(T(X) \le C) \quad (C \in \mathbb{R}) .\]

Then $\phi$ is non-decreasing and right continuous with left limits, and

\[\lim_{C \rightarrow -\infty} \phi(C) = 0, \quad \lim_{C \rightarrow \infty} \phi(C) = 1 .\]

Since

\[\begin{eqnarray} P_{\theta_{0}}(T(X) > C) & = & 1 - \phi(C) \nonumber \\ P_{\theta_{0}}(T(X) = C) & = & \Delta \phi(C) \nonumber \end{eqnarray}\]

where \(\Delta \phi(C) := \phi(C) - \phi(C-)\).

the level condition becomes

\[\begin{eqnarray} & & 1 - \phi(C) + \gamma \Delta \phi(C) = \alpha \nonumber \\ & \Leftrightarrow & \phi(C) = 1 - \alpha + \gamma \Delta \phi(C) . \nonumber \end{eqnarray}\]

Hence let

\[C^{*} := \inf \{ C \mid \phi(C) \ge 1 - \alpha \} \in \mathbb{R} .\]

Then by right continuity, we have

\[\phi(C^{*}) \ge 1 - \alpha .\]

In the case $\phi(C^{*}) = 1 - \alpha$, set

\[\gamma^{*} := 0 .\]

In the case \(\phi(C^{*}) > 1 - \alpha\), $C^{*}$ is a jump point of $\phi$. Indeed,

Take any \(C < C^{*}\). Then by definition of $C^{*}$ we have

\[\phi(C) < 1 - \alpha .\]

Letting $C \nearrow C^{*}$,

\[\phi(C^{*}-) = \lim_{C \nearrow C^{*}} \phi(C) \le 1 - \alpha .\]

Therefore

\[\phi(C^{*}-) \le 1 - \alpha < \phi(C^{*}) ,\]

so $\Delta \phi(C^{*}) > 0$.

Define $\gamma^{*}$ by

\[\gamma^{*} := \frac{ \phi(C^{*}) - (1 - \alpha) }{ \Delta \phi(C^{*}) } \in [0, 1] .\]

With this choice, $1 - \phi(C^{*}) + \gamma^{*} \Delta \phi(C^{*}) = \alpha$, as required.

proof of (ii)

Let $\theta^{\prime} < \theta^{\prime\prime}$ and consider the constant test $\phi^{\prime}$ with

\[\phi^{\prime} \equiv \alpha^{\prime} := \beta_{\phi}(\theta^{\prime}) \in (0, 1) .\]

By the Lemma below, $\phi$ is MP at level $\alpha^{\prime}$ for testing $\{P_{\theta^{\prime}}\}$ against $\{P_{\theta^{\prime\prime}}\}$. Since $\beta_{\phi^{\prime}}(\theta^{\prime}) = \alpha^{\prime} = \beta_{\phi}(\theta^{\prime})$, comparing $\phi$ with $\phi^{\prime}$ yields

\[\beta_{\phi}(\theta^{\prime\prime}) \ge \beta_{\phi^{\prime}}(\theta^{\prime\prime}) = \alpha^{\prime} = \beta_{\phi}(\theta^{\prime}) ,\]

so the power function $\beta_{\phi}$ is non-decreasing. Applying the same argument to $1 - \phi$, which is MP for testing $\{P_{\theta_{0}}\}$ against $\{P_{\theta}\}$ for $\theta < \theta_{0}$, gives the minimizing property (ii).

$\Box$

Lemma

$\phi$ is an MP test at level $\alpha^{\prime}$ for hypothesis \(\Theta_{0} := \{P_{\theta^{\prime}}\}\) and alternative \(\Theta_{1} := \{P_{\theta^{\prime\prime}}\}\).

proof

By MLR, there exists a non-decreasing function $g: \mathbb{R} \rightarrow \mathbb{R}$ such that

\[g(T(x)) = \frac{ p_{\theta^{\prime\prime}}(x) }{ p_{\theta^{\prime}}(x) } .\]

With $k := g(C)$, $\phi$ has the Neyman–Pearson form

\[\phi(x) = \begin{cases} 1 & (p_{\theta^{\prime\prime}}(x) > k p_{\theta^{\prime}}(x)) \\ 0 & (p_{\theta^{\prime\prime}}(x) < k p_{\theta^{\prime}}(x)) \end{cases}\]

Indeed, since $g$ is non-decreasing,

\[\begin{eqnarray} p_{\theta^{\prime\prime}}(x) > k p_{\theta^{\prime}}(x) & \Rightarrow & g(T(x)) > g(C) \Rightarrow T(x) > C , \nonumber \\ p_{\theta^{\prime\prime}}(x) < k p_{\theta^{\prime}}(x) & \Rightarrow & g(T(x)) < g(C) \Rightarrow T(x) < C . \nonumber \end{eqnarray}\]

Hence $\phi$ is MP at level $\alpha^{\prime}$ by the Neyman–Pearson lemma.
$\Box$

Corollary 3.4.1

Suppose $(p_{\theta})_{\theta \in \Theta}$ is a one-parameter exponential family of the form

\[\begin{equation} p_{\theta}(x) = C(\theta) \exp \left( Q(\theta) T(x) \right) h(x) . \label{chap03_03_19} \end{equation}\]

Then there exists a UMP test $\phi:\mathcal{X} \rightarrow [0, 1]$ at level $\alpha$ for hypothesis $\Theta_{0}$ and alternative $\Theta_{1}$. Moreover, if $Q$ is increasing,

\[\phi = \begin{cases} 1 & T(x) > C, \\ \gamma & T(x) = C, \\ 0 & T(x) < C, \end{cases}\]

where $C$ and $\gamma$ are determined by \(\mathrm{E}_{\theta_{0}}[\phi(X)] = \alpha\). If $Q$ is decreasing,

\[\phi = \begin{cases} 1 & T(x) < C, \\ \gamma & T(x) = C, \\ 0 & T(x) > C, \end{cases} .\]

proof

If $Q$ is increasing, then for $\theta < \theta^{\prime}$,

\[\frac{ p_{\theta^{\prime}}(x) }{ p_{\theta}(x) } = \frac{ C(\theta^{\prime}) }{ C(\theta) } \exp \left( (Q(\theta^{\prime}) - Q(\theta)) T(x) \right)\]

is a non-decreasing function of $T(x)$, so $(p_{\theta})$ has MLR in $T$ and Theorem 3.4.1 applies. If $Q$ is decreasing, apply the same argument with $T$ replaced by $-T$.

$\Box$

Example 3.4.2 Binomial
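The Binomial$(n, p)$ distribution is an exponential family of the form \eqref{chap03_03_19} with $T(x) = x$ and $Q(p) = \log(p/(1-p))$ increasing, so the Corollary gives a UMP test for $H_{0}: p \le p_{0}$ against $H_{1}: p > p_{0}$. A minimal sketch of computing $C$ and $\gamma$ by the construction in the proof of Theorem 3.4.1 (the values $n = 10$, $p_{0} = 0.5$, $\alpha = 0.05$ are illustrative choices, not from the text):

```python
from math import comb

def binom_pmf(n, p, k):
    """P(T = k) for T ~ Binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def ump_constants(n, p0, alpha):
    """Return (C, gamma) with P(T > C) + gamma * P(T = C) = alpha under p0:
    C = inf{c : F(c) >= 1 - alpha}, gamma = (F(C) - (1 - alpha)) / P(T = C)."""
    F = 0.0
    for c in range(n + 1):
        pmf = binom_pmf(n, p0, c)
        F += pmf
        if F >= 1 - alpha:
            return c, (F - (1 - alpha)) / pmf
    return n, 0.0

n, p0, alpha = 10, 0.5, 0.05
C, gamma = ump_constants(n, p0, alpha)
# Size of the randomized test: reject if T > C, reject w.p. gamma if T = C.
size = sum(binom_pmf(n, p0, k) for k in range(C + 1, n + 1)) + gamma * binom_pmf(n, p0, C)
```

For these values the construction gives $C = 8$ and $\gamma \approx 0.893$; the size equals $\alpha$ exactly thanks to the randomization at $T = C$.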

Definition Essentially complete

A class of tests $\mathcal{C} \subset \Phi$ is said to be essentially complete if

\[\forall \phi \in \Phi \setminus \mathcal{C}, \ \exists \phi^{\prime} \in \mathcal{C}, \ \forall \theta \in \Theta, \ R(\theta, \phi) \ge R(\theta, \phi^{\prime}) .\]
Consider losses $L_{0}(\theta)$ for accepting and $L_{1}(\theta)$ for rejecting the hypothesis, satisfying

\[L_{1}(\theta) - L_{0}(\theta) \begin{cases} > 0 & \theta < \theta_{0} \\ < 0 & \theta > \theta_{0} \end{cases}\]

Theorem 3.4.2

Let

\[\mathcal{C} := \{ \phi: \mathcal{X} \rightarrow [0, 1] \mid \exists \alpha \in [0, 1], \ \phi \text{ satisfies } \eqref{chap03_03_16}, \eqref{chap03_03_17} \} ,\]

and assume that all densities share a common support:

\[\forall \theta, \theta^{\prime} \in \Theta, \ \{ x \in \mathcal{X} \mid p_{\theta}(x) > 0 \} = \{ x \in \mathcal{X} \mid p_{\theta^{\prime}}(x) > 0 \} .\]

Then $\mathcal{C}$ is essentially complete.

proof

(i)

\[\begin{eqnarray} R(\theta, \phi) & = & \int_{\mathcal{X}} p_{\theta}(x) \left( \phi(x) L_{1}(\theta) + (1 - \phi(x)) L_{0}(\theta) \right) \ \mu(dx) \nonumber \\ & = & \int_{\mathcal{X}} p_{\theta}(x) \left( L_{0}(\theta) + (L_{1}(\theta) - L_{0}(\theta)) \phi(x) \right) \ \mu(dx) \nonumber \end{eqnarray}\]

Hence

\[\begin{eqnarray} R(\theta, \phi^{\prime}) - R(\theta, \phi) & = & (L_{1}(\theta) - L_{0}(\theta)) \int_{\mathcal{X}} (\phi^{\prime}(x) - \phi(x)) p_{\theta}(x) \ \mu(dx) \nonumber \end{eqnarray}\]

If

\[\beta_{\phi^{\prime}}(\theta) - \beta_{\phi}(\theta) = \int_{\mathcal{X}} (\phi^{\prime}(x) - \phi(x)) p_{\theta}(x) \ \mu(dx) \ \begin{cases} > 0 & (\theta > \theta_{0}) \\ < 0 & (\theta < \theta_{0}) \\ = 0 & (\theta = \theta_{0}) \end{cases} ,\]

then, since $L_{1}(\theta) - L_{0}(\theta)$ and $\beta_{\phi^{\prime}}(\theta) - \beta_{\phi}(\theta)$ never have the same strict sign,

\[\forall \theta \in \Theta, \ R(\theta, \phi^{\prime}) - R(\theta, \phi) \le 0 .\]
$\Box$

Definition Stochastically increasing

\(\{F_{\theta}\}_{\theta \in \Theta}\) is said to be stochastically increasing if

\[\forall \theta < \theta^{\prime}, \quad F_{\theta} \not\equiv F_{\theta^{\prime}} \ \text{ and } \ \forall x \in \mathbb{R}, \ F_{\theta^{\prime}}(x) \le F_{\theta}(x) .\]

Remark

For $\theta < \theta^{\prime}$, the condition is equivalent to

\[\forall x \in \mathbb{R}, \ P_{\theta}(X > x) \le P_{\theta^{\prime}}(X > x) .\]

Hence "stochastically increasing" means that the random variable tends to take larger values as $\theta$ increases.
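For example (an illustrative choice, not from the text), the Binomial$(n, p)$ family is stochastically increasing in $p$. A quick numerical check of $F_{p^{\prime}}(x) \le F_{p}(x)$ for $p < p^{\prime}$:

```python
from math import comb

def binom_cdf(n, p, x):
    """F_p(x) = P(X <= x) for X ~ Binomial(n, p)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(x + 1))

# For p < p', larger p pushes mass to larger values, so the CDF drops:
# F_{p'}(x) <= F_p(x) at every x (tolerance guards float round-off at x = n).
n, p, p_prime = 5, 0.3, 0.7
dominated = all(
    binom_cdf(n, p_prime, x) <= binom_cdf(n, p, x) + 1e-12 for x in range(n + 1)
)
```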

Lemma 3.4.1

Let $F_{0}, F_{1}$ be distribution functions on $\mathbb{R}$. Then the following statements are equivalent:

(i) \[\forall x \in \mathbb{R}, \ F_{1}(x) \le F_{0}(x) ;\]

(ii) \[\exists f_{0}, f_{1}:\mathbb{R} \rightarrow \mathbb{R}: \text{non-decreasing}, \ \exists V: \mathcal{X} \rightarrow \mathbb{R}, \ \text{ s.t. } \ \forall v, \ f_{0}(v) \le f_{1}(v), \quad F_{f_{0}(V)} = F_{0}, \ F_{f_{1}(V)} = F_{1} .\]

proof

(ii) $\Rightarrow$ (i): since $f_{0} \le f_{1}$, we have $\{f_{1}(V) \le x\} \subset \{f_{0}(V) \le x\}$, hence

\[\begin{eqnarray} F_{1}(x) & = & P(f_{1}(V) \le x) \nonumber \\ & \le & P(f_{0}(V) \le x) \nonumber \\ & = & F_{0}(x) . \nonumber \end{eqnarray}\]

(i) $\Rightarrow$ (ii)

Let $V$ be uniformly distributed on $(0, 1)$ and define

\[f_{i}(y) := \inf \{ x \mid F_{i}(x - 0) \le y \le F_{i}(x) \} \ (i = 0, 1) .\]

Then

\[\begin{eqnarray} & & \forall x, \ f_{i}(F_{i}(x)) \le x , \nonumber \\ & & \forall y, \ F_{i}(f_{i}(y)) \ge y . \nonumber \end{eqnarray}\]

It follows that

\[\begin{eqnarray} P(f_{i}(V) \le x) & = & P(V \le F_{i}(x)) = F_{i}(x) \quad (i = 0, 1), \nonumber \end{eqnarray}\]

so $F_{f_{i}(V)} = F_{i}$; and since $F_{1} \le F_{0}$, the definition of $f_{i}$ gives $f_{0}(y) \le f_{1}(y)$ for all $y$.
$\Box$
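The quantile construction in the proof can be made concrete (the exponential distributions below are an illustrative choice, not from the text): for continuous strictly increasing $F_{i}$ the generalized inverse reduces to the ordinary inverse, $V \sim U(0, 1)$ gives $f_{i}(V) \sim F_{i}$, and $F_{1} \le F_{0}$ forces $f_{0} \le f_{1}$ pointwise:

```python
import math

# F_0: Exp(rate 2), F_1: Exp(rate 1); then F_1(x) <= F_0(x) for all x >= 0.
def F0(x): return 1 - math.exp(-2 * x)
def F1(x): return 1 - math.exp(-x)

# Quantile functions f_i(y) = inf{x : F_i(x) >= y} (closed form here).
def f0(y): return -math.log(1 - y) / 2
def f1(y): return -math.log(1 - y)

# f_0 <= f_1 pointwise, and f_i inverts F_i, as in the lemma.
vs = [0.01 * i for i in range(1, 100)]
coupled = all(f0(v) <= f1(v) for v in vs)
```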

Definition location parameter family

\(\{F_{\theta}\}_{\theta \in \Theta}\) with $\Theta \subset \mathbb{R}$ is said to be a location parameter family if there exists a distribution function $F$ such that

\[\begin{eqnarray} \forall \theta, \ \forall x, \quad F_{\theta}(x) = F(x - \theta) . \nonumber \end{eqnarray}\]

Example

A location parameter family \(\{F_{\theta}\}\) is stochastically increasing. Indeed, for $\theta < \theta^{\prime}$ and any $x$, $F_{\theta^{\prime}}(x) = F(x - \theta^{\prime}) \le F(x - \theta) = F_{\theta}(x)$, since $F$ is non-decreasing and $x - \theta^{\prime} < x - \theta$.
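A minimal numerical check (the logistic base distribution $F$ is an illustrative choice): shifting by $\theta$ moves the CDF to the right, so $F_{\theta^{\prime}}(x) \le F_{\theta}(x)$ whenever $\theta < \theta^{\prime}$:

```python
import math

def F(x):
    """Base CDF: standard logistic (an illustrative choice)."""
    return 1 / (1 + math.exp(-x))

def F_theta(x, theta):
    """Location family: F_theta(x) = F(x - theta)."""
    return F(x - theta)

# For theta < theta', x - theta' < x - theta and F is increasing,
# so F_{theta'}(x) <= F_{theta}(x): the family is stochastically increasing.
theta, theta_p = 0.0, 1.5
xs = [-5 + 0.5 * i for i in range(21)]
increasing = all(F_theta(x, theta_p) <= F_theta(x, theta) for x in xs)
```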