Statistical Background

This chapter presents a brief review of statistical principles and terminology. The objective of this chapter is to introduce concepts from probability theory and statistics that will be used in later chapters. As such, this chapter is not intended to cover this subject completely, but rather to provide an overview of applicable concepts as a foundation that you can refer to when more complex concepts are introduced.

If you are familiar with basic probability theory and life data analysis, you may wish to skip this chapter. If you would like additional information, we encourage you to review other references on the subject.

=A Brief Introduction to Probability Theory=

Basic Definitions
Before considering the methodology for estimating system reliability, some basic concepts from probability theory should be reviewed.

The terms that follow are important in creating and analyzing reliability block diagrams.
 * 1) Experiment $$(E)$$ :  An experiment is any well-defined action that may result in a number of outcomes.  For example, the rolling of dice can be considered an experiment.
 * 2) Outcome $$(O)$$ :  An outcome is defined as any possible result of an experiment.
 * 3) Sample space $$(S)$$ :  The sample space is defined as the set of all possible outcomes of an experiment.
 * 4) Event: An event is a collection of outcomes.
 * 5) Union of two events $$A$$  and  $$B$$   $$(A\cup B)$$ :  The union of two events  $$A$$  and  $$B$$  is the set of outcomes that belong to  $$A$$  or  $$B$$  or both.
 * 6) Intersection of two events $$A$$  and  $$B$$   $$(A\cap B)$$ :  The intersection of two events  $$A$$  and  $$B$$  is the set of outcomes that belong to both  $$A$$  and  $$B$$.
 * 7) Complement of event A ( $$\overline{A}$$ ): A complement of an event  $$A$$  contains all outcomes of the sample space,  $$S$$, that do not belong to  $$A$$.
 * 8) Null event ( $$\varnothing$$ ): A null event is an empty set that has no outcomes.
 * 9) Probability: Probability is a numerical measure of the likelihood of an event relative to a set of alternative events.  For example, there is a  $$50\%$$  probability of observing heads relative to observing tails when flipping a coin (assuming a fair or unbiased coin).

Example

Consider an experiment that consists of the rolling of a six-sided die. The numbers on each side of the die are the possible outcomes. Accordingly, the sample space is $$S=\{1,2,3,4,5,6\}$$.

Let $$A$$  be the event of rolling a 3, 4 or 6 ( $$A=\{3,4,6\}$$ ) and let  $$B$$  be the event of rolling a 2, 3 or 5 ( $$B=\{2,3,5\}$$ ).
 * 1) The union of $$A$$  and  $$B$$  is:  $$A\cup B=\{2,3,4,5,6\}.$$
 * 2) The intersection of $$A$$  and  $$B$$  is:  $$A\cap B=\{3\}.$$
 * 3) The complement of $$A$$  is:  $$\overline{A}=\{1,2,5\}.$$
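The set operations in this example map directly onto Python's built-in `set` type; the following sketch reproduces the die example:

```python
# Sample space and events for a single roll of a six-sided die
S = {1, 2, 3, 4, 5, 6}
A = {3, 4, 6}   # event of rolling a 3, 4 or 6
B = {2, 3, 5}   # event of rolling a 2, 3 or 5

union = A | B           # A union B
intersection = A & B    # A intersect B
complement_A = S - A    # complement of A within the sample space

print(sorted(union))         # [2, 3, 4, 5, 6]
print(sorted(intersection))  # [3]
print(sorted(complement_A))  # [1, 2, 5]
```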

Probability Properties, Theorems and Axioms
The probability of an event $$A$$  is expressed as  $$P(A)$$  and has the following properties:
 * 1) $$0\le P(A)\le 1$$
 * 2) $$P(A)=1-P(\overline{A})$$
 * 3) $$P(\varnothing)=0$$
 * 4) $$P(S)=1$$

In other words, when an event is certain to occur, it has a probability equal to $$1$$ ; when it is impossible for the event to occur, it has a probability equal to  $$0$$.

It can also be shown that the probability of the union of two events $$A$$  and  $$B$$  is:


 * $$P(A\cup B)=P(A)+P(B)-P(A\cap B)\ $$

Similarly, the probability of the union of three events, $$A$$,  $$B$$  and  $$C$$  is given by:


 * $$\begin{align} P(A\cup B\cup C)= & P(A)+P(B)+P(C) \\ & -P(A\cap B)-P(A\cap C) \\ & -P(B\cap C)+P(A\cap B\cap C) \end{align}$$
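The three-event inclusion-exclusion formula can be checked by brute-force enumeration over the die sample space. The event `C` below is a hypothetical third event added for illustration; exact fractions avoid floating-point round-off:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
A, B, C = {3, 4, 6}, {2, 3, 5}, {1, 3}  # C is an assumed extra event

def P(event):
    # Each outcome of a fair die is equally likely
    return Fraction(len(event), len(S))

lhs = P(A | B | C)
rhs = (P(A) + P(B) + P(C)
       - P(A & B) - P(A & C) - P(B & C)
       + P(A & B & C))
print(lhs == rhs)  # True
print(lhs)         # 1 (these three events happen to cover the sample space)
```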

Mutually Exclusive Events
Two events $$A$$  and  $$B$$  are said to be mutually exclusive if it is impossible for them to occur simultaneously ( $$A\cap B$$  =  $$\varnothing$$ ). In such cases, the expression for the union of these two events reduces to the following, since the probability of the intersection of these events is defined as zero.


 * $$P(A\cup B)=P(A)+P(B)$$

Conditional Probability
The conditional probability of two events $$A$$  and  $$B$$  is defined as the probability of one of the events occurring, knowing that the other event has already occurred. The expression below denotes the probability of $$A$$  occurring given that  $$B$$  has already occurred.


 * $$P(A|B)=\frac{P(A\cap B)}{P(B)}\ $$

Note that knowing that event $$B$$  has occurred reduces the sample space.
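As a quick numeric sketch of the conditional probability formula, using the die events $$A$$ and $$B$$ from the earlier example:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
A = {3, 4, 6}   # rolling a 3, 4 or 6
B = {2, 3, 5}   # rolling a 2, 3 or 5

def P(event):
    return Fraction(len(event), len(S))

# P(A|B): knowing B occurred reduces the sample space to B's three outcomes
P_A_given_B = P(A & B) / P(B)
print(P_A_given_B)  # 1/3
```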

Independent Events
If knowing $$B$$  gives no information about  $$A$$, then the events are said to be independent and the conditional probability expression reduces to:


 * $$P(A|B)=P(A)\ $$

From the definition of conditional probability, $$P(A|B)=\frac{P(A\cap B)}{P(B)}\ $$ can be written as:


 * $$P(A\cap B)=P(A|B)P(B)\ $$

Since events $$A$$  and  $$B$$  are independent, the expression reduces to:


 * $$P(A\cap B)=P(A)P(B)\ $$

If a group of $$n$$  events  $${{A}_{i}}$$  are independent, then:


 * $$P\left[ \underset{i=1}{\overset{n}{\mathop \bigcap }}\,{{A}_{i}} \right]=\underset{i=1}{\overset{n}{\mathop \prod }}\,P({{A}_{i}})\ $$

As an illustration, consider the outcome of a six-sided die roll. The probability of rolling a 3 is one out of six or:


 * $$P(O=3)=1/6=0.16667$$

All subsequent rolls of the die are independent events, since knowing the outcome of the first die roll gives no information as to the outcome of subsequent die rolls (unless the die is loaded). Thus the probability of rolling a 3 on the second die roll is again:


 * $$P(O=3)=1/6=0.16667$$

However, if one were to ask the probability of rolling a double 3 with two dice, the result would be:


 * $$\begin{align} 0.16667\cdot 0.16667= & 0.027778 \\ = & \frac{1}{36} \end{align}$$
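This double-3 calculation can be reproduced with exact fractions, which makes the independence product explicit:

```python
from fractions import Fraction

p_three = Fraction(1, 6)       # P(rolling a 3) on one fair die
p_double_three = p_three ** 2  # independent rolls multiply
print(p_double_three)          # 1/36
print(float(p_double_three))   # 0.027777...
```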

Example 1
Consider a system where two hinged members are holding a load in place, as shown next.



The system fails if either member fails and the load is moved from its position.
 * 1) Let $$A=$$  the event of failure of Component 1 and let  $$\overline{A}$$   $$=$$  the event of Component 1 not failing.
 * 2) Let $$B=$$  the event of failure of Component 2 and let  $$\overline{B}$$   $$=$$  the event of Component 2 not failing.

Failure occurs if Component 1 or Component 2 or both fail. The system probability of failure (or unreliability) is:


 * $${{P}_{f}}=P(A\cup B)=P(A)+P(B)-P(A\cap B)$$

Assuming independence (or that the failure of either component is not influenced by the success or failure of the other component), the system probability of failure becomes the sum of the probabilities of $$A$$  and  $$B$$  occurring minus the product of the probabilities:


 * $${{P}_{f}}=P(A\cup B)=P(A)+P(B)-P(A)P(B)$$

Another approach is to calculate the probability of the system not failing (i.e., the reliability of the system):


 * $$\begin{align} P(no\text{ }failure)= & Reliability \\ = & P(\overline{A}\cap\overline{B}) \\ = & P(\overline{A})P(\overline{B}) \end{align}$$

Then the probability of system failure is simply 1 (or 100%) minus the reliability:


 * $${{P}_{f}}=1-Reliability$$
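Both routes to the system probability of failure can be checked numerically. The component failure probabilities below are hypothetical values chosen only for illustration, not taken from the text:

```python
# Hypothetical failure probabilities for the two hinged members
p_A = 0.10  # P(Component 1 fails) -- illustrative value
p_B = 0.20  # P(Component 2 fails) -- illustrative value

# Route 1: union formula, assuming independent failures
p_f = p_A + p_B - p_A * p_B

# Route 2: reliability = P(neither component fails), then complement
reliability = (1 - p_A) * (1 - p_B)
p_f_alt = 1 - reliability

print(round(p_f, 6))      # 0.28
print(round(p_f_alt, 6))  # 0.28 -- both routes agree
```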

Example 2
Consider a system with a load being held in place by two rigid members, as shown next.


 * 1) Let $$A=$$  event of failure of Component 1.
 * 2) Let $$B=$$  event of failure of Component 2.
 * 3) The system fails if Component 1 fails and Component 2 fails. In other words, both components must fail for the system to fail.

The system probability of failure is defined as the intersection of events $$A$$  and  $$B$$ :


 * $${{P}_{f}}=P(A\cap B)$$

Case 1

Assuming independence (i.e., each member alone is sufficiently strong to hold the load in place, so the failure of one does not affect the other), the probability of system failure becomes the product of the probabilities of $$A$$ and  $$B$$  occurring:


 * $${{P}_{f}}=P(A\cap B)=P(A)P(B)$$

The reliability of the system now becomes:


 * $$Reliability=1-{{P}_{f}}=1-P(A)P(B)\ $$
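A short numeric check of Case 1, again using hypothetical component failure probabilities for illustration:

```python
p_A = 0.10  # hypothetical P(Component 1 fails)
p_B = 0.20  # hypothetical P(Component 2 fails)

# Both rigid members must fail (independently) for the system to fail
p_f = p_A * p_B
reliability = 1 - p_f

print(round(p_f, 6))          # 0.02
print(round(reliability, 6))  # 0.98
```

Note that this parallel (redundant) arrangement is far more reliable than the series arrangement of Example 1 with the same component probabilities.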

Case 2

If independence is not assumed (e.g., when one component fails, the other becomes more likely to fail), then the simplification $$Reliability=1-{{P}_{f}}=1-P(A)P(B)\ $$ is no longer applicable. In this case, the general expression $${{P}_{f}}=P(A\cap B)$$ must be used, with the dependency between the events taken into account. We will examine this dependency in later sections under the subject of load sharing.

=A Brief Introduction to Continuous Life Distributions=

=A Brief Introduction to Life-Stress Relationships=

In certain cases when one or more of the characteristics of the distribution change based on an outside factor, one may be interested in formulating a model that includes both the life distribution and a model that describes how a characteristic of the distribution changes. In reliability, the most common "outside factor" is the stress applied to the component. In system analysis, stress comes into play when dealing with units in a load sharing configuration. When components of a system operate in a load sharing configuration, each component supports a portion of the total load for that aspect of the system. When one or more load sharing components fail, the operating components must take on an increased portion of the load in order to compensate for the failure(s). Therefore, the reliability of each component is dependent upon the performance of the other components in the load sharing configuration.

Traditionally in a reliability block diagram, one assumes independence and thus an item's failure characteristics can be fully described by its failure distribution. However, if the configuration includes load sharing redundancy, then a single failure distribution is no longer sufficient to describe an item's failure characteristics. Instead, the item will fail differently when operating under different loads, and the load applied to the component will vary depending on the performance of the other component(s) in the configuration. Therefore, a more complex model is needed to fully describe the failure characteristics of such blocks. This model must describe both the effect of the load (or stress) on the life of the product and the probability of failure of the item at the specified load. The models, theory and methodology used in Quantitative Accelerated Life Testing (QALT) data analysis can be used to obtain the desired model for this situation. The objective of QALT analysis is to relate the applied stress to life (or a life distribution). Similarly, in the load sharing case, one again wants to relate the applied stress (or load) to life. The following figure graphically illustrates the probability density function ( $$pdf$$ ) for a standard item, where only a single distribution is required.

The next figure represents a load sharing item by using a 3-D surface that illustrates the $$pdf$$, load and time. The following figure shows the reliability curve for a load sharing item vs. the applied load.

Formulation
To formulate the model, a life distribution is combined with a life-stress relationship. The distribution choice is based on the product's failure characteristics while the life-stress relationship is based on how the stress affects the life characteristics. The following figure graphically shows these elements of the formulation. The next figure shows the combination of both an underlying distribution and a life-stress model by plotting a $$pdf$$  against both time and stress.

The assumed underlying life distribution can be any life distribution. The most commonly used life distributions include the Weibull, the exponential and the lognormal. The life-stress relationship describes how a specific life characteristic changes with the application of stress. The life characteristic can be any life measure such as the mean, median, $$R(x)$$,  $$F(x)$$ , etc. It is expressed as a function of stress. Depending on the assumed underlying life distribution, different life characteristics are considered. Typical life characteristics for some distributions are shown in the next table. For example, when considering the Weibull distribution, the scale parameter, $$\eta $$, is chosen to be the life characteristic that is stress-dependent while  $$\beta $$  is assumed to remain constant across different stress levels. A life-stress relationship is then assigned to $$\eta .$$
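As a sketch of this formulation, the following combines a Weibull distribution with a stress-dependent scale parameter $$\eta$$. The choice of an inverse power law life-stress relationship and all parameter values are assumptions made for illustration, not taken from the text:

```python
import math

# Illustrative sketch: Weibull life distribution whose scale parameter eta
# depends on stress via an assumed inverse power law, eta(V) = 1 / (K * V**n).
beta = 1.5        # shape parameter, assumed constant across stress levels
K, n = 1e-7, 2.0  # hypothetical inverse-power-law parameters

def eta(stress):
    # Life characteristic (characteristic life) as a function of stress
    return 1.0 / (K * stress ** n)

def reliability(t, stress):
    # Weibull reliability at time t under the given stress level
    return math.exp(-((t / eta(stress)) ** beta))

# A higher load shortens the characteristic life, lowering reliability
print(eta(100.0))  # 1000.0
print(reliability(1000.0, 100.0) > reliability(1000.0, 200.0))  # True
```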

For a detailed discussion of this topic, see ReliaSoft's Accelerated Life Testing Data Analysis Reference.