
ISS PREVIOUS YEAR 2016 PAPER-2 SOLUTION SET-A Q.NO. 6


In the Gauss–Markov linear model, let ŷ denote the vector of fitted values and ê denote the vector of residuals.

Consider the following statements:

1. The components of ŷ are pairwise uncorrelated.

2. The components of ê are pairwise uncorrelated.

 Random Effects Model Question (ISS Level)

 Question

Suppose b₁, b₂, b₃, …, bₖ are independent N(0, σ²) random variables and eᵢⱼ are independent N(0, τ²) random variables, for i = 1, 2, 3, …, k and j = 1, 2, 3, …, n.

Suppose we observe only:

Xᵢⱼ = bᵢ + eᵢⱼ

for i = 1, 2, 3, …, k and j = 1, 2, 3, …, n.

Then which of the following assertions are true?

  1. Var(Xᵢⱼ) = σ² + τ² for all i, j

  2. Cov(Xᵢⱼ, Xᵢ′ⱼ′) = 0 for all i, j except when i = i′ and j = j′

  3. (Xᵢⱼ − Xᵢⱼ′)² / 2 is an unbiased estimator of σ² for j ≠ j′

 Options

(a) 1 and 2 only
(b) 1 and 3 only
(c) 2 and 3 only
(d) 1, 2 and 3

 


Variance, Covariance and Unbiased Estimator Problem (Step-by-Step Solution)

Consider the following statistical model:

Suppose b₁, b₂, b₃, …, bₖ are independent random variables following a normal distribution with mean 0 and variance σ². That is,

bᵢ ~ N(0, σ²) for i = 1, 2, 3, ..., k

Similarly, let εᵢⱼ be independent random variables following a normal distribution with mean 0 and variance τ²:

εᵢⱼ ~ N(0, τ²) for i = 1, 2, ..., k and j = 1, 2, ..., n

Assume that bᵢ and εᵢⱼ are mutually independent.

We observe the variable:

Xᵢⱼ = bᵢ + εᵢⱼ

for i = 1, 2, ..., k and j = 1, 2, ..., n.

We now evaluate the following statements.

1. Variance of Xᵢⱼ

Given:

Xᵢⱼ = bᵢ + εᵢⱼ

Since bᵢ and εᵢⱼ are independent,

Var(Xᵢⱼ) = Var(bᵢ + εᵢⱼ)

Using the property of variance for independent variables:

Var(Xᵢⱼ) = Var(bᵢ) + Var(εᵢⱼ)

Substituting the values:

Var(Xᵢⱼ) = σ² + τ²

Therefore,

Var(Xᵢⱼ) = σ² + τ² for all i, j.

Hence, Statement 1 is TRUE.
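As a quick sanity check (a simulation sketch, not part of the original solution; the values σ² = 4 and τ² = 1 and the sample size are arbitrary choices), generating many independent copies of Xᵢⱼ = bᵢ + εᵢⱼ should give a sample variance close to σ² + τ²:

```python
# Monte Carlo check of Var(X_ij) = sigma^2 + tau^2 in the model
# X_ij = b_i + e_ij. The parameter values below are arbitrary choices.
import random
import statistics

random.seed(0)
sigma2, tau2 = 4.0, 1.0   # Var(b_i) and Var(e_ij)
N = 100_000               # independent replications of a single cell (i, j)

b = [random.gauss(0.0, sigma2 ** 0.5) for _ in range(N)]  # random effects b_i
e = [random.gauss(0.0, tau2 ** 0.5) for _ in range(N)]    # errors e_ij
x = [bi + ei for bi, ei in zip(b, e)]                     # observed X_ij

print(statistics.pvariance(x))  # should be close to sigma2 + tau2 = 5.0
```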

2. Covariance between Xᵢⱼ and Xᵢ'ⱼ'

Consider two observations:

Xᵢⱼ = bᵢ + εᵢⱼ
Xᵢ'ⱼ' = bᵢ' + εᵢ'ⱼ'

Now compute:

Cov(Xᵢⱼ , Xᵢ'ⱼ') = Cov(bᵢ + εᵢⱼ , bᵢ' + εᵢ'ⱼ')

When i ≠ i′, all four components on the right are mutually independent, so the covariance is 0. Now consider the case when i = i′ but j ≠ j′:

Xᵢⱼ = bᵢ + εᵢⱼ
Xᵢⱼ' = bᵢ + εᵢⱼ'

Then,

Cov(Xᵢⱼ , Xᵢⱼ') = Cov(bᵢ + εᵢⱼ , bᵢ + εᵢⱼ')

Expanding covariance:

= Cov(bᵢ , bᵢ) + Cov(bᵢ , εᵢⱼ') + Cov(εᵢⱼ , bᵢ) + Cov(εᵢⱼ , εᵢⱼ')

Since the ε terms are independent of b and of one another (j ≠ j′), the last three covariance terms vanish, leaving

Cov(Xᵢⱼ , Xᵢⱼ') = Cov(bᵢ , bᵢ) = Var(bᵢ)

Cov(Xᵢⱼ , Xᵢⱼ') = σ²

Thus the covariance is not zero when i = i′ and j ≠ j′.

Therefore the statement:

Cov(Xᵢⱼ , Xᵢ'ⱼ') = 0 for all i, j except when i = i′ and j = j′

is FALSE.

Hence, Statement 2 is FALSE.
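The same kind of simulation (again a sketch with arbitrary σ² = 4 and τ² = 1) makes the nonzero within-group covariance visible: two observations sharing the same bᵢ have sample covariance close to σ², not 0:

```python
# Monte Carlo check that Cov(X_ij, X_ij') = sigma^2 when i = i' and j != j':
# the two observations share the same random effect b_i. Parameter values
# are arbitrary choices for illustration.
import random

random.seed(1)
sigma2, tau2 = 4.0, 1.0
N = 100_000

b = [random.gauss(0.0, sigma2 ** 0.5) for _ in range(N)]   # shared b_i
x1 = [bi + random.gauss(0.0, tau2 ** 0.5) for bi in b]     # X_ij
x2 = [bi + random.gauss(0.0, tau2 ** 0.5) for bi in b]     # X_ij' (same i)

m1, m2 = sum(x1) / N, sum(x2) / N
cov = sum((a - m1) * (c - m2) for a, c in zip(x1, x2)) / N
print(cov)  # should be close to sigma2 = 4.0, not 0
```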

3. Unbiased estimator

Consider the estimator:

T = (Xᵢⱼ − Xᵢⱼ')² / 2

Now compute the difference:

Xᵢⱼ − Xᵢⱼ' = (bᵢ + εᵢⱼ) − (bᵢ + εᵢⱼ')

Xᵢⱼ − Xᵢⱼ' = εᵢⱼ − εᵢⱼ'

Now calculate the variance. Since εᵢⱼ and εᵢⱼ' are independent,

Var(εᵢⱼ − εᵢⱼ') = Var(εᵢⱼ) + Var(εᵢⱼ')

Var(εᵢⱼ − εᵢⱼ') = τ² + τ²

Var(εᵢⱼ − εᵢⱼ') = 2τ²

Since εᵢⱼ − εᵢⱼ' has mean zero, its second moment equals its variance, so

E[(εᵢⱼ − εᵢⱼ')²] = 2τ²

Therefore,

E[(Xᵢⱼ − Xᵢⱼ')² / 2] = τ²

So T is an unbiased estimator of τ², not of σ².

Hence the statement that it is an unbiased estimator of σ² is FALSE.

Thus, Statement 3 is FALSE.
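A final simulation sketch (same arbitrary parameters σ² = 4, τ² = 1) confirms that the average of (Xᵢⱼ − Xᵢⱼ')²/2 settles near τ² rather than σ², because the shared bᵢ cancels in the difference:

```python
# Monte Carlo check that E[(X_ij - X_ij')^2 / 2] = tau^2, not sigma^2:
# the shared random effect b_i cancels in the difference. Parameter
# values are arbitrary choices for illustration.
import random

random.seed(2)
sigma2, tau2 = 4.0, 1.0
N = 100_000

b = [random.gauss(0.0, sigma2 ** 0.5) for _ in range(N)]   # shared b_i
x1 = [bi + random.gauss(0.0, tau2 ** 0.5) for bi in b]     # X_ij
x2 = [bi + random.gauss(0.0, tau2 ** 0.5) for bi in b]     # X_ij'

t_mean = sum((a - c) ** 2 / 2 for a, c in zip(x1, x2)) / N
print(t_mean)  # should be close to tau2 = 1.0
```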

Final Result

Statement 1 → TRUE

Statement 2 → FALSE

Statement 3 → FALSE

Therefore, only statement 1 is correct.

However, the given options do not include "1 only", so the printed options appear to be erroneous.

Correct conclusion: Only Statement 1 is true.

