# Weibull Confidence Bounds

## Fisher Matrix Confidence Bounds

One of the methods used by the application in estimating the different types of confidence bounds for Weibull data, the Fisher matrix method, is presented in this section. The complete derivations were presented in detail (for a general function) in Confidence Bounds.

### Bounds on the Parameters

One of the properties of maximum likelihood estimators is that they are asymptotically normal, meaning that for large samples they are normally distributed. Additionally, since both the shape parameter estimate, $\displaystyle{ \hat{\beta } \,\! }$, and the scale parameter estimate, $\displaystyle{ \hat{\eta } \,\! }$, must be positive, $\displaystyle{ \ln \hat{\beta } \,\! }$ and $\displaystyle{ \ln \hat{\eta } \,\! }$ are treated as being normally distributed as well. The lower and upper bounds on the parameters are estimated from Nelson [30]:

$\displaystyle{ \beta _{U} =\hat{\beta }\cdot e^{\frac{K_{\alpha }\sqrt{Var(\hat{ \beta })}}{\hat{\beta }}}\text{ (upper bound)} \,\! }$
$\displaystyle{ \beta _{L} =\frac{\hat{\beta }}{e^{\frac{K_{\alpha }\sqrt{Var(\hat{ \beta })}}{\hat{\beta }}}} \text{ (lower bound)} \,\! }$

and:

$\displaystyle{ \eta _{U} =\hat{\eta }\cdot e^{\frac{K_{\alpha }\sqrt{Var(\hat{ \eta })}}{\hat{\eta }}}\text{ (upper bound)} \,\! }$
$\displaystyle{ \eta _{L} =\frac{\hat{\eta }}{e^{\frac{K_{\alpha }\sqrt{Var(\hat{ \eta })}}{\hat{\eta }}}}\text{ (lower bound)} \,\! }$

where $\displaystyle{ K_{\alpha}\,\! }$ is defined by:

$\displaystyle{ \alpha =\frac{1}{\sqrt{2\pi }}\int_{K_{\alpha }}^{\infty }e^{-\frac{t^{2}}{2} }dt=1-\Phi (K_{\alpha }) \,\! }$

If $\displaystyle{ \delta \,\! }$ is the confidence level, then $\displaystyle{ \alpha =\frac{1-\delta }{2} \,\! }$ for the two-sided bounds and $\displaystyle{ \alpha = 1 - \delta \,\! }$ for the one-sided bounds. The variances and covariances of $\displaystyle{ \hat{\beta }\,\! }$ and $\displaystyle{ \hat{\eta }\,\! }$ are estimated from the inverse local Fisher matrix, as follows:

$\displaystyle{ \left( \begin{array}{cc} \hat{Var}\left( \hat{\beta }\right) & \hat{Cov}\left( \hat{ \beta },\hat{\eta }\right) \\ \hat{Cov}\left( \hat{\beta },\hat{\eta }\right) & \hat{Var} \left( \hat{\eta }\right) \end{array} \right) =\left( \begin{array}{cc} -\frac{\partial ^{2}\Lambda }{\partial \beta ^{2}} & -\frac{\partial ^{2}\Lambda }{\partial \beta \partial \eta } \\ -\frac{\partial ^{2}\Lambda }{\partial \beta \partial \eta } & -\frac{ \partial ^{2}\Lambda }{\partial \eta ^{2}} \end{array} \right) _{\beta =\hat{\beta },\text{ }\eta =\hat{\eta }}^{-1} \,\! }$

**Fisher Matrix Confidence Bounds and Regression Analysis**

Note that the variances and covariance of the parameters are obtained from the inverse local Fisher information matrix, as described in this section. The local Fisher information matrix is obtained from the second partial derivatives of the log-likelihood function, evaluated at the solved parameter estimates. This methodology is based on maximum likelihood theory, so it is strictly applicable when the parameter estimates are computed using maximum likelihood estimation. When least squares (regression) analysis is used to estimate the parameters, the methodology is theoretically not applicable. However, if one assumes that the variances and covariance of the parameters, and the general properties of the estimators, will be similar regardless of the underlying solution method, then the same methodology can also be used in regression analysis.

The Fisher matrix is one of the methodologies that Weibull++ uses for both MLE and regression analysis. Specifically, Weibull++ uses the likelihood function and computes the local Fisher information matrix based on the estimates of the parameters and the current data. This gives consistent confidence bounds regardless of the underlying method of solution (i.e., MLE or regression). In addition, Weibull++ checks this assumption and proceeds only if it considers it to be acceptable. In some instances, Weibull++ will prompt you with an "Unable to Compute Confidence Bounds" message when using regression analysis. This is an indication that these assumptions were violated.
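As a concrete illustration, the parameter bounds above take only a few lines of code once $\displaystyle{ Var(\hat{\beta })\,\! }$ and $\displaystyle{ Var(\hat{\eta })\,\! }$ have been extracted from the inverse local Fisher matrix. The following is a minimal sketch (the function name and inputs are illustrative, not ReliaSoft code); $\displaystyle{ K_{\alpha }\,\! }$ is obtained from the standard normal quantile:

```python
import math
from statistics import NormalDist

def weibull_parameter_bounds(beta_hat, eta_hat, var_beta, var_eta,
                             cl=0.90, two_sided=True):
    """Fisher-matrix bounds on beta and eta (per Nelson [30]).

    var_beta and var_eta are assumed to come from the inverse
    local Fisher information matrix evaluated at the estimates.
    """
    # alpha = (1 - delta)/2 for two-sided bounds, 1 - delta for one-sided
    alpha = (1 - cl) / 2 if two_sided else 1 - cl
    k_alpha = NormalDist().inv_cdf(1 - alpha)  # K_alpha from the normal quantile

    # multiplicative half-width: exp(K_alpha * sqrt(Var) / estimate)
    w_beta = math.exp(k_alpha * math.sqrt(var_beta) / beta_hat)
    w_eta = math.exp(k_alpha * math.sqrt(var_eta) / eta_hat)
    return {"beta": (beta_hat / w_beta, beta_hat * w_beta),
            "eta": (eta_hat / w_eta, eta_hat * w_eta)}
```

Because the bounds are multiplicative about the estimate, the lower and upper bounds always bracket the estimate and their product equals the estimate squared.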

### Bounds on Reliability

The bounds on reliability can easily be derived by first looking at the general extreme value distribution (EVD). Its reliability function is given by:

$\displaystyle{ R(t)=e^{-e^{\left( \frac{t-p_{1}}{p_{2}}\right) }} \,\! }$

Substituting $\displaystyle{ \ln t\,\! }$ for $\displaystyle{ t\,\! }$, and setting $\displaystyle{ p_{1}=\ln \eta \,\! }$ and $\displaystyle{ p_{2}=\frac{1}{ \beta } \,\! }$, the above equation becomes the Weibull reliability function:

$\displaystyle{ R(t)=e^{-e^{\beta \left( \ln t-\ln \eta \right) }}=e^{-e^{\ln \left( \frac{t }{\eta }\right) ^{\beta }}}=e^{-\left( \frac{t}{\eta }\right) ^{\beta }} \,\! }$

with:

$\displaystyle{ R(t)=e^{-e^{\beta \left( \ln t-\ln \eta \right) }}\,\! }$

set:

$\displaystyle{ u=\beta \left( \ln t-\ln \eta \right) \,\! }$

The reliability function now becomes:

$\displaystyle{ R(t)=e^{-e^{u}} \,\! }$
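The transformation can be spot-checked numerically; the parameter values below are arbitrary, chosen only to exercise the identity:

```python
import math

# verify that e^{-e^{beta(ln t - ln eta)}} equals the Weibull
# reliability e^{-(t/eta)^beta} at a few arbitrary points
beta, eta = 1.8, 250.0
for t in (50.0, 250.0, 900.0):
    u = beta * (math.log(t) - math.log(eta))
    r_evd = math.exp(-math.exp(u))               # EVD form in u
    r_weibull = math.exp(-((t / eta) ** beta))   # standard Weibull form
    assert abs(r_evd - r_weibull) < 1e-12
```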

The next step is to find the upper and lower bounds on $\displaystyle{ u\,\! }$. Using the equations derived in Confidence Bounds, the bounds on reliability are then estimated from Nelson [30]:

$\displaystyle{ u_{U} =\hat{u}+K_{\alpha }\sqrt{Var(\hat{u})} \,\! }$
$\displaystyle{ u_{L} =\hat{u}-K_{\alpha }\sqrt{Var(\hat{u})} \,\! }$

where:

$\displaystyle{ Var(\hat{u}) =\left( \frac{\partial u}{\partial \beta }\right) ^{2}Var( \hat{\beta })+\left( \frac{\partial u}{\partial \eta }\right) ^{2}Var( \hat{\eta }) +2\left( \frac{\partial u}{\partial \beta }\right) \left( \frac{\partial u }{\partial \eta }\right) Cov\left( \hat{\beta },\hat{\eta }\right) \,\! }$

or:

$\displaystyle{ Var(\hat{u}) =\frac{\hat{u}^{2}}{\hat{\beta }^{2}}Var(\hat{ \beta })+\frac{\hat{\beta }^{2}}{\hat{\eta }^{2}}Var(\hat{\eta }) -\left( \frac{2\hat{u}}{\hat{\eta }}\right) Cov\left( \hat{\beta }, \hat{\eta }\right). \,\! }$

The upper and lower bounds on reliability are:

$\displaystyle{ R_{U} =e^{-e^{u_{L}}}\text{ (upper bound)}\,\! }$
$\displaystyle{ R_{L} =e^{-e^{u_{U}}}\text{ (lower bound)}\,\! }$
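The steps above can be sketched in code. This is an illustrative implementation (the function name and inputs are assumptions, not ReliaSoft code) that computes $\displaystyle{ \hat{u}\,\! }$, $\displaystyle{ Var(\hat{u})\,\! }$ from the error-propagation formula, and then the two-sided reliability bounds, noting that the bounds on $\displaystyle{ u\,\! }$ flip when mapped to $\displaystyle{ R\,\! }$:

```python
import math
from statistics import NormalDist

def weibull_reliability_bounds(t, beta_hat, eta_hat,
                               var_beta, var_eta, cov_beta_eta, cl=0.90):
    """Two-sided Fisher-matrix bounds on R(t) via u = beta(ln t - ln eta)."""
    k_alpha = NormalDist().inv_cdf(1 - (1 - cl) / 2)
    u = beta_hat * (math.log(t) - math.log(eta_hat))
    # Var(u) by error propagation with the covariance term
    var_u = ((u / beta_hat) ** 2 * var_beta
             + (beta_hat / eta_hat) ** 2 * var_eta
             - (2 * u / eta_hat) * cov_beta_eta)
    half = k_alpha * math.sqrt(var_u)
    # the bounds flip: the lower bound on u gives the upper bound on R
    r_upper = math.exp(-math.exp(u - half))
    r_lower = math.exp(-math.exp(u + half))
    return r_lower, r_upper
```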

**Other Weibull Forms**

Weibull++ makes the following assumptions/substitutions when using the three-parameter or one-parameter forms:

• For the 3-parameter case, substitute $\displaystyle{ \ln (t-\hat{\gamma }) \,\! }$ (and by definition $\displaystyle{ \gamma \lt t \,\! }$) for $\displaystyle{ \ln t\,\! }$. (Note that this is an approximation, since it eliminates the third parameter and assumes that $\displaystyle{ Var( \hat{\gamma })=0. \,\! }$)
• For the 1-parameter case, $\displaystyle{ Var(\hat{\beta })=0, \,\! }$ thus:
$\displaystyle{ Var(\hat{u})=\left( \frac{\partial u}{\partial \eta }\right) ^{2}Var( \hat{\eta })=\left( \frac{\hat{\beta }}{\hat{\eta }}\right) ^{2}Var(\hat{\eta }) \,\! }$

Also note that the time axis (x-axis) in the three-parameter Weibull plot in Weibull++ is not $\displaystyle{ {t}\,\! }$ but $\displaystyle{ t - \gamma\,\! }$. This means that one must be cautious when obtaining confidence bounds from the plot. If one desires to estimate the confidence bounds on reliability for a given time $\displaystyle{ {{t}_{0}}\,\! }$ from the adjusted plotted line, then these bounds should be obtained for a $\displaystyle{ {{t}_{0}} - \gamma\,\! }$ entry on the time axis.

### Bounds on Time

The bounds around the time estimate or reliable life estimate, for a given Weibull percentile (unreliability), are estimated by first solving the reliability equation with respect to time, as discussed in Lloyd and Lipow [24] and in Nelson [30]:

$\displaystyle{ \ln R =-\left( \frac{t}{\eta }\right) ^{\beta } \,\! }$
$\displaystyle{ \ln (-\ln R) =\beta \ln \left( \frac{t}{\eta }\right) \,\! }$
$\displaystyle{ \ln (-\ln R) =\beta (\ln t-\ln \eta ) \,\! }$

or:

$\displaystyle{ u=\frac{1}{\beta }\ln (-\ln R)+\ln \eta \,\! }$

where $\displaystyle{ u = \ln t\,\! }$.

The upper and lower bounds on $\displaystyle{ u\,\! }$ are estimated from:

$\displaystyle{ u_{U} =\hat{u}+K_{\alpha }\sqrt{Var(\hat{u})} \,\! }$
$\displaystyle{ u_{L} =\hat{u}-K_{\alpha }\sqrt{Var(\hat{u})} \,\! }$

where:

$\displaystyle{ Var(\hat{u})=\left( \frac{\partial u}{\partial \beta }\right) ^{2}Var( \hat{\beta })+\left( \frac{\partial u}{\partial \eta }\right) ^{2}Var( \hat{\eta })+2\left( \frac{\partial u}{\partial \beta }\right) \left( \frac{\partial u}{\partial \eta }\right) Cov\left( \hat{\beta },\hat{ \eta }\right) \,\! }$

or:

$\displaystyle{ Var(\hat{u}) =\frac{1}{\hat{\beta }^{4}}\left[ \ln (-\ln R)\right] ^{2}Var(\hat{\beta })+\frac{1}{\hat{\eta }^{2}}Var(\hat{\eta })+2\left( -\frac{\ln (-\ln R)}{\hat{\beta }^{2}}\right) \left( \frac{1}{ \hat{\eta }}\right) Cov\left( \hat{\beta },\hat{\eta }\right) \,\! }$

The upper and lower bounds are then found by:

$\displaystyle{ T_{U} =e^{u_{U}}\text{ (upper bound)} \,\! }$
$\displaystyle{ T_{L} =e^{u_{L}}\text{ (lower bound)} \,\! }$
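The time-bound calculation follows the same pattern as the reliability bounds. The sketch below (illustrative names and inputs, not ReliaSoft code) evaluates $\displaystyle{ \hat{u}=\frac{1}{\hat{\beta }}\ln (-\ln R)+\ln \hat{\eta }\,\! }$ and its variance, then exponentiates the bounds on $\displaystyle{ u\,\! }$:

```python
import math
from statistics import NormalDist

def weibull_time_bounds(R, beta_hat, eta_hat,
                        var_beta, var_eta, cov_beta_eta, cl=0.90):
    """Two-sided Fisher-matrix bounds on the time at reliability R."""
    k_alpha = NormalDist().inv_cdf(1 - (1 - cl) / 2)
    q = math.log(-math.log(R))               # ln(-ln R)
    u = q / beta_hat + math.log(eta_hat)     # u = ln t at reliability R
    # Var(u) by error propagation; note the sign of the covariance term
    var_u = (q ** 2 / beta_hat ** 4 * var_beta
             + var_eta / eta_hat ** 2
             - 2 * q / (beta_hat ** 2 * eta_hat) * cov_beta_eta)
    half = k_alpha * math.sqrt(var_u)
    return math.exp(u - half), math.exp(u + half)  # (T_L, T_U)
```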

## Likelihood Ratio Confidence Bounds

As covered in Confidence Bounds, the likelihood confidence bounds are calculated by finding values for $\displaystyle{ {{\theta}_{1}}\,\! }$ and $\displaystyle{ {{\theta}_{2}}\,\! }$ that satisfy:

$\displaystyle{ -2\cdot \text{ln}\left( \frac{L(\theta _{1},\theta _{2})}{L(\hat{\theta }_{1}, \hat{\theta }_{2})}\right) =\chi _{\alpha ;1}^{2} \,\! }$

This equation can be rewritten as:

$\displaystyle{ L(\theta _{1},\theta _{2})=L(\hat{\theta }_{1},\hat{\theta } _{2})\cdot e^{\frac{-\chi _{\alpha ;1}^{2}}{2}} \,\! }$

For complete data, the likelihood function for the Weibull distribution is given by:

$\displaystyle{ L(\beta ,\eta )=\prod_{i=1}^{N}f(x_{i};\beta ,\eta )=\prod_{i=1}^{N}\frac{ \beta }{\eta }\cdot \left( \frac{x_{i}}{\eta }\right) ^{\beta -1}\cdot e^{-\left( \frac{x_{i}}{\eta }\right) ^{\beta }} \,\! }$

For a given value of $\displaystyle{ \alpha\,\! }$, values for $\displaystyle{ \beta\,\! }$ and $\displaystyle{ \eta\,\! }$ can be found which represent the maximum and minimum values that satisfy the above equation. These represent the confidence bounds for the parameters at a confidence level $\displaystyle{ \delta\,\! }$, where $\displaystyle{ \alpha = \delta\,\! }$ for two-sided bounds and $\displaystyle{ \alpha = 2\delta - 1\,\! }$ for one-sided.
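One way to approximate these bounds numerically is a coarse grid search over the likelihood-ratio contour. The sketch below is an assumption-laden illustration, not how Weibull++ solves the contour: the maximum of the log-likelihood over the grid stands in for $\displaystyle{ L(\hat{\theta }_{1},\hat{\theta }_{2})\,\! }$, and the $\displaystyle{ \chi _{\alpha ;1}^{2}\,\! }$ quantile is obtained as the square of the standard normal quantile (an identity for one degree of freedom):

```python
import math
from statistics import NormalDist

def weibull_loglik(beta, eta, times):
    """Complete-data Weibull log-likelihood."""
    return sum(math.log(beta / eta) + (beta - 1) * math.log(t / eta)
               - (t / eta) ** beta for t in times)

def lr_parameter_bounds(times, cl=0.90, n=200,
                        beta_range=(0.2, 8.0), eta_scale=(0.3, 3.0)):
    """Grid-search sketch of the likelihood-ratio bounds on (beta, eta)."""
    mean_t = sum(times) / len(times)
    betas = [beta_range[0] + i * (beta_range[1] - beta_range[0]) / (n - 1)
             for i in range(n)]
    etas = [mean_t * (eta_scale[0] + i * (eta_scale[1] - eta_scale[0]) / (n - 1))
            for i in range(n)]
    ll = [[weibull_loglik(b, e, times) for e in etas] for b in betas]
    ll_max = max(max(row) for row in ll)             # grid stand-in for ln L-hat
    chi2 = NormalDist().inv_cdf((1 + cl) / 2) ** 2   # chi-square(1, cl) quantile
    thresh = ll_max - chi2 / 2                       # ln L >= ln L-hat - chi2/2
    # accepted parameter values are those inside the contour
    ok_beta = [b for b, row in zip(betas, ll) if max(row) >= thresh]
    ok_eta = [etas[j] for j in range(n)
              if max(ll[i][j] for i in range(n)) >= thresh]
    return (min(ok_beta), max(ok_beta)), (min(ok_eta), max(ok_eta))
```

The returned intervals are the extreme accepted values of each parameter, i.e., the extent of the joint contour projected onto each axis.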

Similarly, the bounds on time and reliability can be found by substituting the Weibull reliability equation into the likelihood function so that it is in terms of $\displaystyle{ \beta\,\! }$ and time or reliability, as discussed in Confidence Bounds. The likelihood ratio equation used to solve for bounds on time (Type 1) is:

$\displaystyle{ L(\beta ,t)=\prod_{i=1}^{N}\frac{\beta }{\left( \frac{t}{(-\text{ln}(R))^{ \frac{1}{\beta }}}\right) }\cdot \left( \frac{x_{i}}{\left( \frac{t}{(-\text{ ln}(R))^{\frac{1}{\beta }}}\right) }\right) ^{\beta -1}\cdot \text{exp}\left[ -\left( \frac{x_{i}}{\left( \frac{t}{(-\text{ln}(R))^{\frac{1}{\beta }}} \right) }\right) ^{\beta }\right] \,\! }$

The likelihood ratio equation used to solve for bounds on reliability (Type 2) is:

$\displaystyle{ L(\beta ,R)=\prod_{i=1}^{N}\frac{\beta }{\left( \frac{t}{(-\text{ln}(R))^{ \frac{1}{\beta }}}\right) }\cdot \left( \frac{x_{i}}{\left( \frac{t}{(-\text{ ln}(R))^{\frac{1}{\beta }}}\right) }\right) ^{\beta -1}\cdot \text{exp}\left[ -\left( \frac{x_{i}}{\left( \frac{t}{(-\text{ln}(R))^{\frac{1}{\beta }}} \right) }\right) ^{\beta }\right] \,\! }$

## Bayesian Confidence Bounds

### Bounds on Parameters

Bayesian bounds use non-informative prior distributions for both parameters. From Confidence Bounds, we know that if the prior distributions of $\displaystyle{ \eta\,\! }$ and $\displaystyle{ \beta\,\! }$ are independent, the posterior joint distribution of $\displaystyle{ \eta\,\! }$ and $\displaystyle{ \beta\,\! }$ can be written as:

$\displaystyle{ f(\eta ,\beta |Data)= \dfrac{L(Data|\eta ,\beta )\varphi (\eta )\varphi (\beta )}{\int_{0}^{\infty }\int_{0}^{\infty }L(Data|\eta ,\beta )\varphi (\eta )\varphi (\beta )d\eta d\beta } \,\! }$

The marginal distribution of $\displaystyle{ \eta\,\! }$ is:

$\displaystyle{ f(\eta |Data) =\int_{0}^{\infty }f(\eta ,\beta |Data)d\beta = \dfrac{\int_{0}^{\infty }L(Data|\eta ,\beta )\varphi (\eta )\varphi (\beta )d\beta }{\int_{0}^{\infty }\int_{0}^{\infty }L(Data|\eta ,\beta )\varphi (\eta )\varphi (\beta )d\eta d\beta } \,\! }$

where $\displaystyle{ \varphi (\beta )=\frac{1}{\beta } \,\! }$ is the non-informative prior of $\displaystyle{ \beta\,\! }$ and $\displaystyle{ \varphi (\eta )=\frac{1}{\eta } \,\! }$ is the non-informative prior of $\displaystyle{ \eta\,\! }$. Using these non-informative prior distributions, $\displaystyle{ f(\eta|Data)\,\! }$ can be rewritten as:

$\displaystyle{ f(\eta |Data)=\dfrac{\int_{0}^{\infty }L(Data|\eta ,\beta )\frac{1}{\beta } \frac{1}{\eta }d\beta }{\int_{0}^{\infty }\int_{0}^{\infty }L(Data|\eta ,\beta )\frac{1}{\beta }\frac{1}{\eta }d\eta d\beta } \,\! }$

The one-sided upper bound of $\displaystyle{ \eta\,\! }$ is:

$\displaystyle{ CL=P(\eta \leq \eta _{U})=\int_{0}^{\eta _{U}}f(\eta |Data)d\eta \,\! }$

The one-sided lower bound of $\displaystyle{ \eta\,\! }$ is:

$\displaystyle{ 1-CL=P(\eta \leq \eta _{L})=\int_{0}^{\eta _{L}}f(\eta |Data)d\eta \,\! }$

The two-sided bounds of $\displaystyle{ \eta\,\! }$ are:

$\displaystyle{ CL=P(\eta _{L}\leq \eta \leq \eta _{U})=\int_{\eta _{L}}^{\eta _{U}}f(\eta |Data)d\eta \,\! }$

The same method is used to obtain the bounds of $\displaystyle{ \beta\,\! }$.
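These posterior integrals can be approximated by brute-force grid integration. The following is a sketch under stated assumptions (complete data, an illustrative grid over both parameters, and log-sum shifting for numerical stability); it returns the one-sided upper bound $\displaystyle{ \eta _{U}\,\! }$ satisfying $\displaystyle{ CL=P(\eta \leq \eta _{U})\,\! }$:

```python
import math

def bayes_eta_upper(times, cl=0.90, n=200,
                    beta_range=(0.2, 8.0), eta_scale=(0.3, 3.0)):
    """One-sided upper posterior bound on eta (grid-integration sketch)."""
    def loglik(beta, eta):
        return sum(math.log(beta / eta) + (beta - 1) * math.log(x / eta)
                   - (x / eta) ** beta for x in times)
    mean_t = sum(times) / len(times)
    betas = [beta_range[0] + i * (beta_range[1] - beta_range[0]) / (n - 1)
             for i in range(n)]
    etas = [mean_t * (eta_scale[0] + i * (eta_scale[1] - eta_scale[0]) / (n - 1))
            for i in range(n)]
    # unnormalized log-posterior with priors 1/beta and 1/eta;
    # shift by the maximum before exponentiating for stability
    lp = [[loglik(b, e) - math.log(b * e) for e in etas] for b in betas]
    m = max(max(row) for row in lp)
    # marginalize over beta to get the (unnormalized) marginal of eta
    marg = [sum(math.exp(lp[i][j] - m) for i in range(n)) for j in range(n)]
    total = sum(marg)
    cum = 0.0
    for j, e in enumerate(etas):
        cum += marg[j] / total
        if cum >= cl:          # CL = P(eta <= eta_U)
            return e
    return etas[-1]
```

The lower bound and two-sided bounds follow by reading off the $\displaystyle{ 1-CL\,\! }$ and $\displaystyle{ \frac{1\pm CL}{2}\,\! }$ quantiles of the same cumulative sum.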

### Bounds on Reliability

From Confidence Bounds, the one-sided upper bound on reliability at time $\displaystyle{ T\,\! }$ is estimated from:

$\displaystyle{ CL=\Pr (R\leq R_{U})=\Pr (\eta \leq T\exp (-\frac{\ln (-\ln R_{U})}{\beta })) \,\! }$

From the posterior distribution of $\displaystyle{ \eta\,\! }$ we have:

$\displaystyle{ CL=\dfrac{\int\nolimits_{0}^{\infty }\int\nolimits_{0}^{T\exp (-\dfrac{\ln (-\ln R_{U})}{\beta })}L(\beta ,\eta )\frac{1}{\beta }\frac{1}{\eta }d\eta d\beta }{\int\nolimits_{0}^{\infty }\int\nolimits_{0}^{\infty }L(\beta ,\eta )\frac{1}{\beta }\frac{1}{\eta }d\eta d\beta } \,\! }$

The above equation is solved numerically for $\displaystyle{ {{R}_{U}}\,\! }$. The same method can be used to calculate the one-sided lower bounds and two-sided bounds on reliability.
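One simple numerical approach is to note that the posterior mass of the region $\displaystyle{ \{\eta \leq T\exp (-\frac{\ln (-\ln R_{U})}{\beta })\}\,\! }$ increases monotonically in $\displaystyle{ R_{U}\,\! }$, so $\displaystyle{ R_{U}\,\! }$ can be found by bisection. The sketch below makes the same illustrative assumptions as before (complete data, a finite grid over both parameters, priors $\displaystyle{ \frac{1}{\beta }\,\! }$ and $\displaystyle{ \frac{1}{\eta }\,\! }$) and is not how Weibull++ performs the integration:

```python
import math

def bayes_reliability_upper(times, t0, cl=0.90, n=150,
                            beta_range=(0.2, 8.0), eta_scale=(0.3, 3.0)):
    """Bisection sketch for the one-sided upper bound R_U at time t0."""
    def loglik(beta, eta):
        return sum(math.log(beta / eta) + (beta - 1) * math.log(x / eta)
                   - (x / eta) ** beta for x in times)
    mean_t = sum(times) / len(times)
    betas = [beta_range[0] + i * (beta_range[1] - beta_range[0]) / (n - 1)
             for i in range(n)]
    etas = [mean_t * (eta_scale[0] + i * (eta_scale[1] - eta_scale[0]) / (n - 1))
            for i in range(n)]
    # posterior weights on the grid, shifted by the max log value for stability
    lp = [[loglik(b, e) - math.log(b * e) for e in etas] for b in betas]
    m = max(max(row) for row in lp)
    w = [[math.exp(v - m) for v in row] for row in lp]
    total = sum(sum(row) for row in w)

    def mass(r):
        """Posterior mass of {eta <= t0 * (-ln r)^(-1/beta)}."""
        s = 0.0
        for i, b in enumerate(betas):
            lim = t0 * (-math.log(r)) ** (-1.0 / b)
            s += sum(w[i][j] for j, e in enumerate(etas) if e <= lim)
        return s / total

    lo, hi = 1e-6, 1.0 - 1e-6
    for _ in range(60):                 # bisect on R_U: mass(R_U) = CL
        mid = (lo + hi) / 2
        if mass(mid) < cl:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2
```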

### Bounds on Time

From Confidence Bounds, we know that:

$\displaystyle{ CL=\Pr (T\leq T_{U})=\Pr (\eta \leq T_{U}\exp (-\frac{\ln (-\ln R)}{\beta })) \,\! }$

From the posterior distribution of $\displaystyle{ \eta\,\! }$, we have:

$\displaystyle{ CL=\dfrac{\int\nolimits_{0}^{\infty }\int\nolimits_{0}^{T_{U}\exp (-\dfrac{ \ln (-\ln R)}{\beta })}L(\beta ,\eta )\frac{1}{\beta }\frac{1}{\eta }d\eta d\beta }{\int\nolimits_{0}^{\infty }\int\nolimits_{0}^{\infty }L(\beta ,\eta )\frac{1}{\beta }\frac{1}{\eta }d\eta d\beta } \,\! }$

The above equation is solved numerically for $\displaystyle{ {{T}_{U}}\,\! }$. The same method can be applied to calculate the one-sided lower bounds and two-sided bounds on time.