
Part of the document Sổ tay kỹ thuật sản phẩm (Pages 175–178)

PART V. Practices and Emerging Applications

7.2 Important Conditions Describing Positive Dependence

Concepts of stochastic dependence for a bivariate distribution play an important part in statistics.

For each concept, it is often convenient to refer to the bivariate distributions that satisfy it as a family. In this chapter, we are mainly concerned with positive dependence. Although negative dependence concepts do exist, they are often obtained as negative analogues of positive dependence by reversing the inequality signs.

Various notions of dependence are motivated by applications in statistical reliability (e.g. see Barlow and Proschan [3]). The baseline, or starting point, of the reliability analysis of systems is independence of the lifetimes of the components. As noted by many authors, it is often more realistic to assume some form of positive dependence among components.

Around 1970, several studies discussed different notions of positive dependence between two random variables and derived some interrelationships among them, e.g. Lehmann [4], Esary et al. [5], Esary and Proschan [6], Harris [7], and Brindley and Thompson [8], among others.

Yanagimoto [9] unified some of these notions by introducing a family of concepts of positive dependence. Some further notions of positive dependence were introduced by Shaked [10–12].

For concepts of multivariate dependence, see Block and Ting [13] and a more recent text by Joe [2].

7.2.1 Six Basic Conditions

Jogdeo [14] lists four of the following basic conditions describing positive dependence; these are in increasing order of stringency (see also Jogdeo [15]).

1. Positive correlation, cov(X, Y) ≥ 0.

2. For every pair of increasing functions a and b defined on the real line R,

cov[a(X), b(Y)] ≥ 0.

Lehmann [4] showed that this condition is equivalent to

Pr(X ≤ x, Y ≤ y) ≥ Pr(X ≤ x) Pr(Y ≤ y)   (7.1)

or, equivalently, Pr(X > x, Y > y) ≥ Pr(X > x) Pr(Y > y).

We say that (X, Y) shows positive quadrant dependence (PQD) if and only if these inequalities hold. Several families of PQD distributions will be introduced in Section 7.4.

3. Esary et al. [5] introduced the following condition, termed "association": for every pair of functions a and b defined on R², each increasing in each of the arguments (separately),

cov[a(X, Y), b(X, Y)] ≥ 0.   (7.2)

We note, in passing, that a direct verification of this dependence concept is difficult in general. However, it is often easier to verify one of the alternative positive dependence notions that imply association.

4. Y is right-tail increasing in X if Pr(Y > y | X > x) is increasing in x for all y (written as RTI(Y|X)). Similarly, Y is left-tail decreasing in X if Pr(Y ≤ y | X ≤ x) is decreasing in x for all y (written as LTD(Y|X)).

5. Y is said to be stochastically increasing in x for all y (written as SI(Y|X)) if, for every y, Pr(Y > y | X = x) is increasing in x. Similarly, we say that X is stochastically increasing in y for all x (written as SI(X|Y)) if, for every x, Pr(X > x | Y = y) is increasing in y. Note that SI(Y|X) is often simply denoted by SI. Some authors refer to this relationship as Y being positively regression dependent on X (abbreviated to PRD), and similarly X being positively regression dependent on Y.

6. Let X and Y have a joint probability density function f(x, y). Then f is said to be totally positive of order two (TP2) if, for all x1 < x2 and y1 < y2,

f(x1, y1) f(x2, y2) ≥ f(x1, y2) f(x2, y1).   (7.3)

Note that if f is TP2, then F and F̄ (the survival function) are also TP2, i.e.

F(x1, y1) F(x2, y2) ≥ F(x1, y2) F(x2, y1) and F̄(x1, y1) F̄(x2, y2) ≥ F̄(x1, y2) F̄(x2, y1).

It is easy to see that either F TP2 or F̄ TP2 implies that F is PQD.
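To make conditions (2) and (6) concrete, the sketch below checks PQD and TP2 numerically for a small discrete bivariate distribution. The 3×3 probability mass function is a hypothetical example chosen for illustration, not one taken from the text.

```python
import numpy as np

def is_tp2(p):
    """Check TP2 for a pmf on a grid: every 2x2 'minor' condition
    p[i1,j1]*p[i2,j2] >= p[i1,j2]*p[i2,j1] for i1 < i2, j1 < j2."""
    n, m = p.shape
    for i1 in range(n):
        for i2 in range(i1 + 1, n):
            for j1 in range(m):
                for j2 in range(j1 + 1, m):
                    if p[i1, j1] * p[i2, j2] < p[i1, j2] * p[i2, j1]:
                        return False
    return True

def is_pqd(p):
    """Check PQD: F(x, y) >= F_X(x) * F_Y(y) at every grid point."""
    F = p.cumsum(axis=0).cumsum(axis=1)   # joint CDF on the grid
    Fx = p.sum(axis=1).cumsum()           # marginal CDF of X
    Fy = p.sum(axis=0).cumsum()           # marginal CDF of Y
    return bool(np.all(F >= np.outer(Fx, Fy) - 1e-12))

# A hypothetical pmf with mass concentrated on the diagonal.
p = np.array([[0.20, 0.05, 0.00],
              [0.05, 0.20, 0.05],
              [0.00, 0.05, 0.40]])
print(is_tp2(p), is_pqd(p))  # TP2 holds, and, as the chain predicts, so does PQD
```

As expected from the hierarchy, every pmf for which `is_tp2` returns true also passes `is_pqd`, while the converse need not hold.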

7.2.2 The Relative Stringency of the Conditions

It is well known that these concepts are interrelated, e.g. see Joe [2]. The six conditions we listed above can be arranged in increasing order of stringency, i.e. (6) ⇒ (5) ⇒ (4) ⇒ (3) ⇒ (2) ⇒ (1). More precisely:

TP2 ⇒ SI ⇒ RTI ⇒ Association ⇒ PQD ⇒ cov(X, Y) ≥ 0

TP2 ⇒ SI ⇒ LTD ⇒ Association ⇒ PQD ⇒ cov(X, Y) ≥ 0


Some of the proofs for the chain of implications are not straightforward, whereas some others are obvious (e.g. see Barlow and Proschan [3], pp. 143–4). For example, it is easy to see that PQD implies positive correlation by applying Hoeffding's lemma, which states:

cov(X, Y) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} [F(x, y) − F_X(x) F_Y(y)] dx dy   (7.4)

This identity is useful in many areas of statistics.
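As a numerical illustration of Equation 7.4, the sketch below uses the Farlie–Gumbel–Morgenstern (FGM) family with Uniform(0, 1) marginals, F(u, v) = uv[1 + α(1 − u)(1 − v)], an example not taken from the text; for this family the covariance is known to be α/36, and the double integral of F(u, v) − F_X(u)F_Y(v) recovers it.

```python
import numpy as np

def fgm_cdf(u, v, alpha):
    """FGM joint CDF with Uniform(0,1) marginals (hypothetical example)."""
    return u * v * (1.0 + alpha * (1.0 - u) * (1.0 - v))

def hoeffding_cov(alpha, n=1000):
    """Approximate Eq. 7.4 by a midpoint rule on an n x n grid over (0,1)^2:
    cov(X, Y) = integral of [F(u, v) - F_X(u) F_Y(v)] du dv."""
    u = (np.arange(n) + 0.5) / n          # midpoints of n equal subintervals
    U, V = np.meshgrid(u, u)
    integrand = fgm_cdf(U, V, alpha) - U * V   # marginals are uniform: F_X(u) = u
    return integrand.sum() / n**2

alpha = 0.5
print(hoeffding_cov(alpha), alpha / 36.0)  # the two values agree closely
```

The integrand is non-negative exactly when the pair is PQD (for FGM, when α ≥ 0), which makes the implication PQD ⇒ cov(X, Y) ≥ 0 visible term by term.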

7.2.3 Positive Quadrant Dependent in Expectation

We now introduce a slightly less stringent dependence notion that includes the PQD distributions. For any real number x, let Y_x be the random variable with distribution function Pr(Y ≤ y | X > x). It is easy to verify that the inequality in the conditional distribution, Pr(Y ≤ y | X > x) ≤ Pr(Y ≤ y), implies an inequality in expectation, E(Y_x) ≥ E(Y), if Y is a non-negative random variable.

We say that Y is positive quadrant dependent in expectation (PQDE) on X if the last inequality involving expectations holds. Similarly, we say that Y is negative quadrant dependent in expectation on X if E(Y_x) ≤ E(Y).

It is easy to show that PQD ⇒ PQDE by observing that PQD is equivalent to Pr(Y > y | X > x) ≥ Pr(Y > y), which in turn implies E(Y_x) ≥ E(Y) (assuming Y ≥ 0).

Next, we have

cov(X, Y) = ∫∫ [F̄(x, y) − F̄_X(x) F̄_Y(y)] dx dy

= ∫ F̄_X(x) { ∫ [Pr(Y > y | X > x) − F̄_Y(y)] dy } dx

= ∫ F̄_X(x) [E(Y_x) − E(Y)] dx

which is non-negative if X and Y are PQDE. Thus, PQDE implies that cov(X, Y) ≥ 0.

In other words, PQDE lies between PQD and positive correlation. Many bivariate distributions are PQDE, because all the PQD distributions with Y ≥ 0 are also PQDE.
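The defining inequality E(Y_x) ≥ E(Y) can be checked directly for a PQD example. The sketch below again uses a hypothetical FGM pair with Uniform(0, 1) marginals, for which E(Y) = 1/2 and the joint survival function factors as F̄(u, v) = (1 − u)(1 − v)(1 + αuv), giving the closed form E(Y_x) = 1/2 + αx/6.

```python
import numpy as np

def e_cond(x, alpha, n=200000):
    """Approximate E(Y_x) = integral_0^1 Pr(Y > v | X > x) dv for an FGM
    pair with Uniform(0,1) marginals (hypothetical example, Y >= 0)."""
    v = (np.arange(n) + 0.5) / n
    # Pr(Y > v | X > x) = F_bar(x, v) / F_bar_X(x) = (1 - v)(1 + alpha*x*v)
    surv_cond = (1.0 - v) * (1.0 + alpha * x * v)
    return surv_cond.mean()           # midpoint-rule value of the integral

alpha, x = 0.8, 0.7
print(e_cond(x, alpha), 0.5 + alpha * x / 6)  # both exceed E(Y) = 0.5
```

For α > 0 the pair is PQD, and indeed E(Y_x) exceeds E(Y) for every x, as the PQD ⇒ PQDE implication requires.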

7.2.4 Associated Random Variables

Recall from Section 7.2.1 that two random variables X and Y are associated if cov[a(X, Y), b(X, Y)] ≥ 0. Obviously, this expression can be represented alternatively as

E[a(X, Y) b(X, Y)] ≥ E[a(X, Y)] E[b(X, Y)]   (7.5)

where the inequality holds for all real functions a and b that are increasing in each component and are such that the expectations in Equation 7.5 exist.

It appears from Equation 7.5 that it is almost impossible to check this condition for association directly. What one normally does is verify one of the dependence conditions higher in the hierarchy that imply association.

Barlow and Proschan [3], p. 29, considered some practical reliability situations in which the component lifetimes are not independent, but rather are associated:

1. minimal path structures of a coherent system having components in common;

2. components subject to the same set of stresses;

3. structures in which components share the same load, so that the failure of one component results in increased load on each of the remaining components.

We note that in each case the random variables of interest tend to act similarly. In fact, all the positive dependence concepts share this characteristic.

An important application of the concept of association is to provide probability bounds for system reliability. Many such bounds are presented in Barlow and Proschan [3], in Section 3 of Chapter 2 and Section 7 of Chapter 4.
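One such bound follows directly from Equation 7.5: for associated lifetimes, the reliability of a series system is at least the product of the component reliabilities, Pr(T1 > t, T2 > t) ≥ Pr(T1 > t) Pr(T2 > t). The simulation below is a hypothetical common-shock model (situation 2 above, components subject to a shared stress); all distributions and parameters are illustrative assumptions, not values from the text.

```python
import numpy as np

rng = np.random.default_rng(1)
n, t = 500_000, 1.0

# Common-shock model: each lifetime is the minimum of the component's own
# exponential failure clock and a shared exponential shock time.
shock = rng.exponential(1.0, n)                 # scale = 1.0 (mean shock time)
t1 = np.minimum(rng.exponential(2.0, n), shock)
t2 = np.minimum(rng.exponential(2.0, n), shock)

series = np.mean((t1 > t) & (t2 > t))        # series system survives iff both do
bound = np.mean(t1 > t) * np.mean(t2 > t)    # product bound for associated lifetimes
print(series, bound)                         # the true reliability exceeds the bound
```

The shared shock makes T1 and T2 associated, so the independence-based product is a conservative lower bound; treating such components as independent would understate the series-system reliability.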

7.2.5 Positively Correlated Distributions

Positive correlation is the weakest notion of dependence between two random variables X and Y. We note that it is easy to construct a positively correlated bivariate distribution. For example, such a distribution may be obtained by simply applying a well-known trivariate reduction technique described as follows.

Set X = X1 + X3 and Y = X2 + X3, with Xi (i = 1, 2, 3) being mutually independent; then the correlation coefficient of X and Y is

ρ = var(X3)/[var(X1 + X3) var(X2 + X3)]^(1/2) > 0.

For example, let Xi ∼ Poisson(λi), i = 1, 2, 3. Then X ∼ Poisson(λ1 + λ3) and Y ∼ Poisson(λ2 + λ3), with

ρ = λ3/[(λ1 + λ3)(λ2 + λ3)]^(1/2) > 0.

X and Y constructed in this manner are also PQD; see Example 1(ii) in Lehmann [4], p. 1139.
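The Poisson case of the trivariate reduction can be checked by simulation; the λ values below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
lam1, lam2, lam3 = 2.0, 3.0, 1.5
n = 200_000

# Trivariate reduction: a shared component X3 induces positive correlation.
x1 = rng.poisson(lam1, n)
x2 = rng.poisson(lam2, n)
x3 = rng.poisson(lam3, n)
x, y = x1 + x3, x2 + x3

rho_hat = np.corrcoef(x, y)[0, 1]                       # sample correlation
rho = lam3 / np.sqrt((lam1 + lam3) * (lam2 + lam3))     # theoretical value
print(rho_hat, rho)                                     # the two agree closely
```

Since var(Xi) = λi for a Poisson variable, the general formula ρ = var(X3)/[var(X1 + X3) var(X2 + X3)]^(1/2) reduces to exactly the expression computed as `rho` above.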

7.2.6 Summary of Interrelationships

Among the positive dependence concepts we have introduced so far, TP2 is the strongest. A slightly weaker notion, introduced by Harris [7], is called right corner set increasing (RCSI), meaning that Pr(X > x1, Y > y1 | X > x2, Y > y2) is increasing in x2 and y2 for all x1 and y1. Shaked [10] showed that TP2 ⇒ RCSI. By choosing x1 = −∞ and y2 = −∞, we see that RCSI ⇒ RTI.

We may summarize the chain of relations as follows (in which Y is conditional on X whenever there is a conditioning):

RCSI ⇒ RTI ⇒ Association ⇒ PQD ⇒ cov(X, Y) ≥ 0
 ⇑      ⇑        ⇑
TP2  ⇒  SI  ⇒  LTD

There are other chains of relationships between various concepts of dependence.
