Reliability for Lindley Distribution with an Outlier

In this paper, we consider the problem of estimating R = P(Y < X) when Y has a Lindley distribution with parameter a and X has a Lindley distribution with one outlier present, with parameters b and c, such that X and Y are independent. The maximum likelihood estimator of R is derived and some results of simulation studies are presented.


Introduction
In the reliability context, inferences about R = P(Y < X), when X and Y are independently distributed, are a subject of interest. For example, in the mechanical reliability of a system, if X is the strength of a component subjected to stress Y, then R is a measure of system performance: the system fails if at any time the applied stress is greater than its strength. Stress-strength reliability has been discussed in Kapur and Lamberson (1977). Sathe and Dixit (2001) estimated R in the negative binomial distribution. Baklizi and Dayyeh (2003) studied shrinkage estimation of R in the exponential case, and recently Deiri (2011) estimated R in the presence of two outliers in the exponential and gamma cases. Jafari (2011) obtained the moment, maximum likelihood and mixture estimators of R for the Rayleigh distribution in the presence of one outlier, and Jabbari, Abolhasani and Fathipour (2012) discussed the estimation of R in the six-parameter generalized Burr XII distribution with a transformation method. In this paper, we obtain the maximum likelihood estimator of R for the Lindley distribution in the presence of one outlier generated from the same distribution.
The probability density function of the Lindley distribution with parameter a is given by

$$ f(x;a) = \frac{a^2}{1+a}\,(1+x)\,e^{-ax}, \qquad x > 0,\; a > 0. \qquad (1) $$
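As a quick numerical companion, the density above and the standard Lindley distribution function can be sketched in Python (function names are illustrative, not from the paper):

```python
import math

def lindley_pdf(x, a):
    """Lindley density: a^2/(1+a) * (1+x) * exp(-a*x), for x > 0, a > 0."""
    return a**2 / (1 + a) * (1 + x) * math.exp(-a * x)

def lindley_cdf(x, a):
    """Lindley distribution function: 1 - (1 + a*x/(1+a)) * exp(-a*x)."""
    return 1 - (1 + a * x / (1 + a)) * math.exp(-a * x)
```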
In this paper we assume that the random variables (Y_1, Y_2, ..., Y_m) have a Lindley distribution with parameter a, and that the random variables (X_1, X_2, ..., X_n) are such that one of them is from a Lindley distribution with parameter c and the remaining (n-1) are from a Lindley distribution with parameter b. The paper is organized as follows: in Section 2, we obtain the joint distribution of (X_1, X_2, ..., X_n) in the presence of one outlier. Sections 3 and 4 discuss the maximum likelihood estimators of the parameters and the MLE of R, respectively. In Section 5, simulation studies are presented, and the results are summarized in Section 6.

Joint distribution of (X_1, X_2, ..., X_n) in the presence of an outlier
Assume that (X_1, X_2, ..., X_n) are such that one of them is distributed with p.d.f. g(x;c), i.e. Lindley(c), and the remaining (n-1) are distributed with p.d.f. f(x;b), i.e. Lindley(b). Since the outlier is equally likely to be any one of the n observations, the joint distribution of (X_1, X_2, ..., X_n) can be expressed as

$$ f(x_1, \ldots, x_n) = \frac{1}{n} \sum_{j=1}^{n} g(x_j;c) \prod_{i \neq j} f(x_i;b) = \frac{1}{n} \left( \frac{b^2}{1+b} \right)^{n-1} \frac{c^2}{1+c} \prod_{i=1}^{n} (1+x_i)\, e^{-b \sum_{i=1}^{n} x_i} \sum_{j=1}^{n} e^{-(c-b)x_j}. $$

A randomly chosen observation from such a sample therefore has the marginal mixture density

$$ f(x;b,c) = \frac{1+x}{n} \left[ (n-1)\,\frac{b^2}{1+b}\, e^{-bx} + \frac{c^2}{1+c}\, e^{-cx} \right], \qquad x > 0. \qquad (2) $$

Maximum likelihood estimation of the parameters

Let (Y_1, Y_2, ..., Y_m) be a random sample from Lindley(a), with log-likelihood

$$ \ell(a) = 2m \log a - m \log(1+a) + \sum_{i=1}^{m} \log(1+y_i) - a \sum_{i=1}^{m} y_i. $$

Taking the derivative with respect to a and equating it to zero gives $2m/a - m/(1+a) - \sum_{i=1}^{m} y_i = 0$, whose admissible root is the MLE of a:

$$ \hat{a} = \frac{-(\bar{y}-1) + \sqrt{(\bar{y}-1)^2 + 8\bar{y}}}{2\bar{y}}, \qquad \bar{y} = \frac{1}{m} \sum_{i=1}^{m} y_i. \qquad (3) $$

Now let X_1, X_2, ..., X_n be a random sample for X in the presence of one outlier, with joint density given above. Taking the derivatives of the log-likelihood $\ell(b,c)$ with respect to b and c and equating the results to zero, we obtain the normal equations

$$ \frac{\partial \ell}{\partial b} = (n-1)\left( \frac{2}{b} - \frac{1}{1+b} \right) - \sum_{i=1}^{n} x_i + \frac{\sum_{j=1}^{n} x_j e^{-(c-b)x_j}}{\sum_{j=1}^{n} e^{-(c-b)x_j}} = 0, \qquad (4) $$

$$ \frac{\partial \ell}{\partial c} = \frac{2}{c} - \frac{1}{1+c} - \frac{\sum_{j=1}^{n} x_j e^{-(c-b)x_j}}{\sum_{j=1}^{n} e^{-(c-b)x_j}} = 0. \qquad (5) $$

There is no closed-form solution to this system of equations, so we solve for $\hat{b}$ and $\hat{c}$ iteratively, using the Newton-Raphson method. Writing $\theta = (b, c)^T$, the update is

$$ \hat{\theta}^{(k+1)} = \hat{\theta}^{(k)} - G^{-1} g, \qquad (6) $$

where g is the vector of normal equations $(\partial \ell / \partial b, \partial \ell / \partial c)^T$ and G is the matrix of second derivatives of $\ell$. The iterations are stopped when the estimates of b and c change by less than a prescribed tolerance between successive steps; the resulting limits are taken as $\hat{b}$ and $\hat{c}$.
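The outlier log-likelihood can be maximized numerically. The paper's analytic gradient and Hessian are lengthy, so the sketch below substitutes SciPy's BFGS quasi-Newton routine for hand-coded Newton-Raphson and optimizes on the log scale to keep b and c positive; all function names, the sampler, and the simulation settings here are illustrative assumptions, not from the paper:

```python
import numpy as np
from scipy.optimize import minimize

def lindley_logpdf(x, theta):
    """log of the Lindley density theta^2/(1+theta) * (1+x) * exp(-theta*x)."""
    return 2 * np.log(theta) - np.log1p(theta) + np.log1p(x) - theta * x

def outlier_loglik(log_params, x):
    """Log-likelihood of the one-outlier model: each observation is equally
    likely to be the Lindley(c) outlier; the rest are Lindley(b)."""
    b, c = np.exp(np.clip(log_params, -20, 20))  # log scale keeps b, c > 0
    logf = lindley_logpdf(x, b)
    logg = lindley_logpdf(x, c)
    # log of (1/n) * sum_j [ g(x_j;c) * prod_{i != j} f(x_i;b) ], computed stably
    terms = logf.sum() - logf + logg
    m = terms.max()
    return m + np.log(np.exp(terms - m).sum()) - np.log(len(x))

def lindley_sample(theta, size, rng):
    """Sample Lindley(theta) via its mixture form: Exp(theta) with probability
    theta/(1+theta), Gamma(2, theta) otherwise."""
    mix = rng.random(size) < theta / (1 + theta)
    return np.where(mix, rng.exponential(1 / theta, size), rng.gamma(2.0, 1 / theta, size))

rng = np.random.default_rng(0)
x = np.append(lindley_sample(2.0, 49, rng), lindley_sample(0.5, 1, rng))  # one outlier
res = minimize(lambda p: -outlier_loglik(p, x), x0=np.log([1.0, 1.0]), method="BFGS")
b_hat, c_hat = np.exp(res.x)
```

With most of the sample drawn from Lindley(2), the estimate of b should land near 2, while c is only weakly identified by the single outlying observation.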

The maximum likelihood estimator of R
Let Y ~ Lindley(a) with pdf h(y;a), and let X be distributed with the pdf f(x;b,c) given in (2). The parameter we want to estimate is

$$ R = P(Y<X) = \int_0^\infty P(Y<x)\, f(x;b,c)\, dx = \frac{n-1}{n}\, R(a,b) + \frac{1}{n}\, R(a,c), $$

where, for Y ~ Lindley(a) independent of X ~ Lindley(t),

$$ R(a,t) = P(Y<X) = 1 - \frac{t^2 \left[ (1+a)(a+t)^2 + (1+2a)(a+t) + 2a \right]}{(1+a)(1+t)(a+t)^3}. $$

The maximum likelihood estimator of R follows by substituting $\hat{a}$, $\hat{b}$, and $\hat{c}$, which can be obtained from (3) and (6).
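A small sketch evaluating R = P(Y < X) under the one-outlier model, using the closed form of P(Y < X) for two independent Lindley variables (obtained by integrating the Lindley CDF of Y against the density of X; function names are ours):

```python
def r_two_lindley(a, b):
    """P(Y < X) for independent Y ~ Lindley(a) and X ~ Lindley(b)."""
    s = a + b
    num = b**2 * ((1 + a) * s**2 + (1 + 2 * a) * s + 2 * a)
    return 1 - num / ((1 + a) * (1 + b) * s**3)

def r_outlier(a, b, c, n):
    """R = P(Y < X) when X comes from a sample of size n with one Lindley(c)
    outlier, so X ~ Lindley(c) with probability 1/n and Lindley(b) otherwise."""
    return (n - 1) / n * r_two_lindley(a, b) + r_two_lindley(a, c) / n
```

A useful sanity check is the symmetric case: for a = b the two variables are i.i.d., so r_two_lindley returns exactly 1/2.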

Simulation Study
In this section we generate random numbers from the Lindley distribution (with and without an outlier) by the accept-reject method in Maple. Using these samples and the Newton-Raphson method, we obtain the maximum likelihood estimators of the parameters a, b and c.
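The paper performs this step in Maple; as an illustration, an accept-reject sampler for the Lindley distribution can be sketched in Python with an exponential envelope. The envelope Exp(theta/2), the constant M, and the function names below are our choices, since the paper does not specify its proposal:

```python
import math
import random

def lindley_ar(theta, size, rng=random):
    """Accept-reject sampler for Lindley(theta) using an Exp(theta/2) envelope."""
    lam = theta / 2.0
    delta = theta - lam
    # (1+x) * exp(-delta*x) is maximized at x* = max(0, 1/delta - 1)
    xstar = max(0.0, 1.0 / delta - 1.0)
    M = theta**2 / ((1 + theta) * lam) * (1 + xstar) * math.exp(-delta * xstar)
    out = []
    while len(out) < size:
        x = rng.expovariate(lam)                      # proposal draw from Exp(lam)
        target = theta**2 / (1 + theta) * (1 + x) * math.exp(-theta * x)
        envelope = M * lam * math.exp(-lam * x)
        if rng.random() < target / envelope:          # accept with prob f(x) / (M q(x))
            out.append(x)
    return out

random.seed(1)
sample = lindley_ar(2.0, 1000)
```

Accepted draws follow Lindley(theta) exactly; the acceptance rate is 1/M, so a tighter envelope would reject fewer proposals.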

Conclusion
According to the simulation results, when the values of the parameters b and c are close to each other, the biases and MSEs are often near zero, and when the difference between b and c exceeds 1, the biases and MSEs increase.