
Fisher Matrix Confidence Bounds

One of the methods that the application uses to estimate the different types of confidence bounds for Weibull data, the Fisher matrix method, is presented in this section. The complete derivations (for a general function) are presented in detail in the Confidence Bounds chapter.

Bounds on the Parameters

One of the properties of maximum likelihood estimators is that they are asymptotically normal, meaning that for large samples they are normally distributed. Additionally, since both the shape parameter estimate, [math]\displaystyle{ \hat{\beta } }[/math], and the scale parameter estimate, [math]\displaystyle{ \hat{\eta } }[/math], must be positive, [math]\displaystyle{ \ln \hat{\beta } }[/math] and [math]\displaystyle{ \ln \hat{\eta } }[/math] are treated as being normally distributed as well. The lower and upper bounds on the parameters are estimated from [30]:

[math]\displaystyle{ \beta _{U} =\hat{\beta }\cdot e^{\frac{K_{\alpha }\sqrt{Var(\hat{ \beta })}}{\hat{\beta }}}\text{ (upper bound)} }[/math]
[math]\displaystyle{ \beta _{L} =\frac{\hat{\beta }}{e^{\frac{K_{\alpha }\sqrt{Var(\hat{ \beta })}}{\hat{\beta }}}} \text{ (lower bound)} }[/math]

and:

[math]\displaystyle{ \eta _{U} =\hat{\eta }\cdot e^{\frac{K_{\alpha }\sqrt{Var(\hat{ \eta })}}{\hat{\eta }}}\text{ (upper bound)} }[/math]
[math]\displaystyle{ \eta _{L} =\frac{\hat{\eta }}{e^{\frac{K_{\alpha }\sqrt{Var(\hat{ \eta })}}{\hat{\eta }}}}\text{ (lower bound)} }[/math]

where [math]\displaystyle{ K_{\alpha} }[/math] is defined by:

[math]\displaystyle{ \alpha =\frac{1}{\sqrt{2\pi }}\int_{K_{\alpha }}^{\infty }e^{-\frac{t^{2}}{2} }dt=1-\Phi (K_{\alpha }) }[/math]

If δ is the confidence level, then [math]\displaystyle{ \alpha =\frac{1-\delta }{2} }[/math] for the two-sided bounds and α = 1 − δ for the one-sided bounds. The variances and covariances of [math]\displaystyle{ \hat{\beta } }[/math] and [math]\displaystyle{ \hat{\eta } }[/math] are estimated from the inverse local Fisher matrix, as follows:

[math]\displaystyle{ \left( \begin{array}{cc} \hat{Var}\left( \hat{\beta }\right) & \hat{Cov}\left( \hat{ \beta },\hat{\eta }\right) \\ \hat{Cov}\left( \hat{\beta },\hat{\eta }\right) & \hat{Var} \left( \hat{\eta }\right) \end{array} \right) =\left( \begin{array}{cc} -\frac{\partial ^{2}\Lambda }{\partial \beta ^{2}} & -\frac{\partial ^{2}\Lambda }{\partial \beta \partial \eta } \\ -\frac{\partial ^{2}\Lambda }{\partial \beta \partial \eta } & -\frac{ \partial ^{2}\Lambda }{\partial \eta ^{2}} \end{array} \right) _{\beta =\hat{\beta },\text{ }\eta =\hat{\eta }}^{-1} }[/math]
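As a minimal numerical sketch of these formulas (not Weibull++ output), the parameter bounds can be computed as shown below. The estimates, variances, and confidence level are assumed example values.

```python
# Minimal sketch of the parameter-bound formulas above; all inputs are
# assumed example values, not results from any particular data set.
import math
from scipy.stats import norm

beta_hat, eta_hat = 1.8, 450.0        # assumed MLE estimates
var_beta, var_eta = 0.05, 2500.0      # assumed diagonal of the inverse Fisher matrix

delta = 0.90                          # two-sided confidence level
alpha = (1 - delta) / 2
K_alpha = norm.ppf(1 - alpha)         # K such that 1 - Phi(K) = alpha

beta_U = beta_hat * math.exp(K_alpha * math.sqrt(var_beta) / beta_hat)
beta_L = beta_hat / math.exp(K_alpha * math.sqrt(var_beta) / beta_hat)
eta_U = eta_hat * math.exp(K_alpha * math.sqrt(var_eta) / eta_hat)
eta_L = eta_hat / math.exp(K_alpha * math.sqrt(var_eta) / eta_hat)

print(f"beta: [{beta_L:.3f}, {beta_U:.3f}]")
print(f"eta:  [{eta_L:.1f}, {eta_U:.1f}]")
```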


Fisher Matrix Confidence Bounds and Regression Analysis

Note that the variance and covariance of the parameters are obtained from the inverse Fisher information matrix as described in this section. The local Fisher information matrix is obtained from the second partials of the likelihood function, by substituting the solved parameter estimates into the particular functions. This method is based on maximum likelihood theory and is derived from the fact that the parameter estimates were computed using maximum likelihood estimation methods. When one uses least squares or regression analysis for the parameter estimates, this methodology is then theoretically not applicable. However, if one assumes that the variance and covariance of the parameters will be similar regardless of the underlying solution method (and that the two estimators have similar properties), then the above methodology can also be used in regression analysis.

The Fisher matrix is one of the methodologies that Weibull++ uses for both MLE and regression analysis. Specifically, Weibull++ uses the likelihood function and computes the local Fisher information matrix based on the estimates of the parameters and the current data. This gives consistent confidence bounds regardless of the underlying method of solution, i.e., MLE or regression. In addition, Weibull++ checks this assumption and proceeds only if it considers it to be acceptable. In some instances, Weibull++ will prompt you with an "Unable to Compute Confidence Bounds" message when using regression analysis; this is an indication that these assumptions were violated.

Bounds on Reliability

The bounds on reliability can easily be derived by first looking at the general extreme value distribution (EVD). Its reliability function is given by:

[math]\displaystyle{ R(t)=e^{-e^{\left( \frac{t-p_{1}}{p_{2}}\right) }} }[/math]

By substituting [math]\displaystyle{ \ln t }[/math] for t and setting [math]\displaystyle{ p_{1}=\ln \eta }[/math] and [math]\displaystyle{ p_{2}=\frac{1}{\beta } }[/math], the above equation becomes the Weibull reliability function:

[math]\displaystyle{ R(t)=e^{-e^{\beta \left( \ln t-\ln \eta \right) }}=e^{-e^{\ln \left( \frac{t }{\eta }\right) ^{\beta }}}=e^{-\left( \frac{t}{\eta }\right) ^{\beta }} }[/math]

with:

[math]\displaystyle{ R(T)=e^{-e^{\beta \left( \ln t-\ln \eta \right) }} }[/math]

set:

[math]\displaystyle{ u=\beta \left( \ln t-\ln \eta \right). }[/math]

The reliability function now becomes:

[math]\displaystyle{ R(T)=e^{-e^{u}} }[/math]

The next step is to find the upper and lower bounds on u. Using the equations derived in the Confidence Bounds chapter, the bounds on u are then estimated from [30]:

[math]\displaystyle{ u_{U} =\hat{u}+K_{\alpha }\sqrt{Var(\hat{u})} }[/math]
[math]\displaystyle{ u_{L} =\hat{u}-K_{\alpha }\sqrt{Var(\hat{u})} }[/math]

where:

[math]\displaystyle{ Var(\hat{u}) =\left( \frac{\partial u}{\partial \beta }\right) ^{2}Var( \hat{\beta })+\left( \frac{\partial u}{\partial \eta }\right) ^{2}Var( \hat{\eta }) +2\left( \frac{\partial u}{\partial \beta }\right) \left( \frac{\partial u }{\partial \eta }\right) Cov\left( \hat{\beta },\hat{\eta }\right) }[/math]

or:

[math]\displaystyle{ Var(\hat{u}) =\frac{\hat{u}^{2}}{\hat{\beta }^{2}}Var(\hat{ \beta })+\frac{\hat{\beta }^{2}}{\hat{\eta }^{2}}Var(\hat{\eta }) -\left( \frac{2\hat{u}}{\hat{\eta }}\right) Cov\left( \hat{\beta }, \hat{\eta }\right). }[/math]

The upper and lower bounds on reliability are:

[math]\displaystyle{ R_{U} =e^{-e^{u_{L}}}\text{ (upper bound)} }[/math]
[math]\displaystyle{ R_{L} =e^{-e^{u_{U}}}\text{ (lower bound)} }[/math]
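The following is a minimal sketch of this calculation, assuming example values for the parameter estimates, their variances and covariance, and the mission time; it is not output from any particular analysis.

```python
# Minimal sketch of the reliability-bound calculation above; all inputs
# are assumed example values.
import math
from scipy.stats import norm

beta_hat, eta_hat = 1.8, 450.0
var_beta, var_eta, cov_be = 0.05, 2500.0, 3.0   # assumed inverse Fisher matrix entries
t = 300.0                                       # mission time
K_alpha = norm.ppf(0.95)                        # two-sided 90% bounds

u_hat = beta_hat * (math.log(t) - math.log(eta_hat))
var_u = ((u_hat / beta_hat) ** 2) * var_beta \
      + ((beta_hat / eta_hat) ** 2) * var_eta \
      - (2 * u_hat / eta_hat) * cov_be

u_U = u_hat + K_alpha * math.sqrt(var_u)
u_L = u_hat - K_alpha * math.sqrt(var_u)

R_U = math.exp(-math.exp(u_L))                  # upper reliability bound uses u_L
R_L = math.exp(-math.exp(u_U))                  # lower reliability bound uses u_U
print(f"R({t:g}): [{R_L:.4f}, {R_U:.4f}]")
```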


Other Weibull Forms

Weibull++ makes the following assumptions/substitutions when using the three-parameter or one-parameter forms:


  • For the three-parameter case, substitute [math]\displaystyle{ \ln (t-\hat{\gamma }) }[/math] (where by definition γ < t) in place of ln t. (Note that this is an approximation, since it eliminates the third parameter and assumes that [math]\displaystyle{ Var( \hat{\gamma })=0. }[/math])
  • For the one-parameter case, [math]\displaystyle{ Var(\hat{\beta })=0, }[/math] thus:
[math]\displaystyle{ Var(\hat{u})=\left( \frac{\partial u}{\partial \eta }\right) ^{2}Var( \hat{\eta })=\left( \frac{\hat{\beta }}{\hat{\eta }}\right) ^{2}Var(\hat{\eta }) }[/math]

Also note that the time axis (x-axis) in the three-parameter Weibull plot in Weibull++ is not t but t − γ. This means that one must be cautious when obtaining confidence bounds from the plot. If one desires to estimate the confidence bounds on reliability for a given time t0 from the adjusted plotted line, then these bounds should be obtained for a t0 − γ entry on the time axis. A minimal sketch of these substitutions is shown below.
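The sketch below illustrates the substitutions above, assuming example values for the estimates (including a hypothetical [math]\displaystyle{ \hat{\gamma } }[/math]); it is not intended to reproduce Weibull++ output.

```python
# Minimal sketch of the three-parameter and one-parameter adjustments above;
# gamma_hat and the other inputs are assumed example values.
import math

beta_hat, eta_hat, gamma_hat = 1.8, 450.0, 50.0
var_eta = 2500.0
t = 300.0                                # mission time; must exceed gamma_hat

# Three-parameter case: use ln(t - gamma_hat) in place of ln(t),
# with Var(gamma_hat) taken to be zero.
u_hat_3p = beta_hat * (math.log(t - gamma_hat) - math.log(eta_hat))

# One-parameter case: Var(beta_hat) = 0, so only the eta term remains.
var_u_1p = (beta_hat / eta_hat) ** 2 * var_eta

print(u_hat_3p, var_u_1p)
```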

Bounds on Time

The bounds around the time estimate or reliable life estimate, for a given Weibull percentile (unreliability), are estimated by first solving the reliability equation with respect to time, as follows [24, 30]:

[math]\displaystyle{ \ln R =-\left( \frac{t}{\eta }\right) ^{\beta } }[/math]
[math]\displaystyle{ \ln (-\ln R) =\beta \ln \left( \frac{t}{\eta }\right) }[/math]
[math]\displaystyle{ \ln (-\ln R) =\beta (\ln t-\ln \eta ) }[/math]

or:

[math]\displaystyle{ u=\frac{1}{\beta }\ln (-\ln R)+\ln \eta }[/math]

where [math]\displaystyle{ u=\ln t }[/math].

The upper and lower bounds on u are estimated from:

[math]\displaystyle{ u_{U} =\hat{u}+K_{\alpha }\sqrt{Var(\hat{u})} }[/math]
[math]\displaystyle{ u_{L} =\hat{u}-K_{\alpha }\sqrt{Var(\hat{u})} }[/math]

where:

[math]\displaystyle{ Var(\hat{u})=\left( \frac{\partial u}{\partial \beta }\right) ^{2}Var( \hat{\beta })+\left( \frac{\partial u}{\partial \eta }\right) ^{2}Var( \hat{\eta })+2\left( \frac{\partial u}{\partial \beta }\right) \left( \frac{\partial u}{\partial \eta }\right) Cov\left( \hat{\beta },\hat{ \eta }\right) }[/math]

or:

[math]\displaystyle{ Var(\hat{u}) =\frac{1}{\hat{\beta }^{4}}\left[ \ln (-\ln R)\right] ^{2}Var(\hat{\beta })+\frac{1}{\hat{\eta }^{2}}Var(\hat{\eta })+2\left( -\frac{1}{\hat{\beta }^{2}}\right) \left( \frac{\ln (-\ln R)}{ \hat{\eta }}\right) Cov\left( \hat{\beta },\hat{\eta }\right) }[/math]

The upper and lower bounds are then found by:

[math]\displaystyle{ T_{U} =e^{u_{U}}\text{ (upper bound)} }[/math]
[math]\displaystyle{ T_{L} =e^{u_{L}}\text{ (lower bound)} }[/math]
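As before, the following is a minimal sketch of the time-bound calculation, assuming example parameter estimates, variances, covariance, and a target reliability.

```python
# Minimal sketch of the time (reliable life) bound calculation above;
# all inputs are assumed example values.
import math
from scipy.stats import norm

beta_hat, eta_hat = 1.8, 450.0
var_beta, var_eta, cov_be = 0.05, 2500.0, 3.0   # assumed inverse Fisher matrix entries
R = 0.90                                        # target reliability
K_alpha = norm.ppf(0.95)                        # two-sided 90% bounds

lnlnR = math.log(-math.log(R))
u_hat = lnlnR / beta_hat + math.log(eta_hat)

var_u = (lnlnR ** 2 / beta_hat ** 4) * var_beta \
      + var_eta / eta_hat ** 2 \
      + 2 * (-1.0 / beta_hat ** 2) * (lnlnR / eta_hat) * cov_be

u_U = u_hat + K_alpha * math.sqrt(var_u)
u_L = u_hat - K_alpha * math.sqrt(var_u)

T_U, T_L = math.exp(u_U), math.exp(u_L)         # bounds on the time estimate
print(f"T(R={R}): [{T_L:.1f}, {T_U:.1f}]")
```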