==Estimation of the Parameters==
 
{{ld probability plotting}}
 
===Rank Regression on Y===
Performing a rank regression on Y requires that a straight line be fitted to a set of data points such that the sum of the squares of the vertical deviations from the points to the line is minimized.
 
The least squares parameter estimation method (regression analysis) was discussed in Chapter 3, where the following equations for regression on Y were derived; they apply here as well:
 
::<math>\hat{a}=\bar{y}-\hat{b}\bar{x}=\frac{\underset{i=1}{\overset{N}{\mathop{\sum }}}\,{{y}_{i}}}{N}-\hat{b}\frac{\underset{i=1}{\overset{N}{\mathop{\sum }}}\,{{x}_{i}}}{N}</math>
 
:and:
 
::<math>\hat{b}=\frac{\underset{i=1}{\overset{N}{\mathop{\sum }}}\,{{x}_{i}}{{y}_{i}}-\tfrac{\underset{i=1}{\overset{N}{\mathop{\sum }}}\,{{x}_{i}}\underset{i=1}{\overset{N}{\mathop{\sum }}}\,{{y}_{i}}}{N}}{\underset{i=1}{\overset{N}{\mathop{\sum }}}\,x_{i}^{2}-\tfrac{{{\left( \underset{i=1}{\overset{N}{\mathop{\sum }}}\,{{x}_{i}} \right)}^{2}}}{N}}</math>
 
In our case the equations for  <math>{{y}_{i}}</math>  and <math>x_{i}</math> are:
 
::<math>{{y}_{i}}={{\Phi }^{-1}}\left[ F(T_{i}^{\prime }) \right]</math>
 
:and:
 
::<math>{{x}_{i}}=T_{i}^{\prime }</math>
 
where <math>F(T_{i}^{\prime })</math> is estimated from the median ranks. Once <math>\widehat{a}</math> and <math>\widehat{b}</math> are obtained, <math>\widehat{\sigma }</math> and <math>\widehat{\mu }</math> can easily be determined from Eqns. (aln) and (bln).
 
====The Correlation Coefficient====
The estimator of  <math>\rho </math>  is the sample correlation coefficient,  <math>\hat{\rho }</math> , given by:
 
::<math>\hat{\rho }=\frac{\underset{i=1}{\overset{N}{\mathop{\sum }}}\,({{x}_{i}}-\overline{x})({{y}_{i}}-\overline{y})}{\sqrt{\underset{i=1}{\overset{N}{\mathop{\sum }}}\,{{({{x}_{i}}-\overline{x})}^{2}}\cdot \underset{i=1}{\overset{N}{\mathop{\sum }}}\,{{({{y}_{i}}-\overline{y})}^{2}}}}</math>
 
 
====Example 2====
Fourteen units were reliability tested and the following life test data were obtained:
 
{|align="center" border=1 cellspacing=0
|-
|colspan="2" style="text-align:center"| Table 9.1 - Life Test Data for Example 2
|-
!Data point index
!Time-to-failure
|-
|1 ||5
|-
|2 ||10
|-
|3 ||15
|-
|4 ||20
|-
|5 ||25
|-
|6 ||30
|-
|7 ||35
|-
|8 ||40
|-
|9 ||50
|-
|10 ||60
|-
|11 ||70
|-
|12 ||80
|-
|13 ||90
|-
|14 ||100
|}
 
Assuming the data follow a lognormal distribution, estimate the parameters and the correlation coefficient,  <math>\rho </math> , using rank regression on Y.
 
=====Solution to Example 2=====
Construct Table 9.2, as shown next.
 
<center>Table 9.2 - Least Squares Analysis</center>
 
<center><math>\begin{matrix}
  N & T_{i} & F(T_{i}) & {T_{i}}' & y_{i} & {{T_{i}}'}^{2} & y_{i}^{2} & {T_{i}}' y_{i}  \\
  \text{1} & \text{5} & \text{0}\text{.0483} & \text{1}\text{.6094}& \text{-1}\text{.6619} & \text{2}\text{.5903} & \text{2}\text{.7619} & \text{-2}\text{.6747}  \\
  \text{2} & \text{10} & \text{0}\text{.1170} & \text{2.3026}& \text{-1.1901} & \text{5.3019} & \text{1.4163} & \text{-2.7403}  \\
  \text{3} & \text{15} & \text{0}\text{.1865} & \text{2.7080}&\text{-0.8908} & \text{7.3335} & \text{0.7935} & \text{-2.4123}  \\
  \text{4} & \text{20} & \text{0}\text{.2561} & \text{2.9957} &\text{-0.6552} & \text{8.9744} & \text{0.4292} & \text{-1.9627}  \\
  \text{5} & \text{25} & \text{0}\text{.3258} & \text{3.2189}& \text{-0.4512} & \text{10.3612} & \text{0.2036} & \text{-1.4524}  \\
  \text{6} & \text{30} & \text{0}\text{.3954} & \text{3.4012}& \text{-0.2647} & \text{11.5681} & \text{0.0701} & \text{-0.9004}  \\
  \text{7} & \text{35} & \text{0}\text{.4651} & \text{3.5553} & \text{-0.0873} & \text{12.6405} & \text{0.0076}& \text{-0.3102}  \\
  \text{8} & \text{40} & \text{0}\text{.5349} & \text{3.6889}& \text{0.0873} & \text{13.6078} & \text{0.0076} & \text{0.3219}  \\
  \text{9} & \text{50} & \text{0}\text{.6046} & \text{3.912} & \text{0.2647} & \text{15.3039} & \text{0.0701} &\text{1.0357}  \\
  \text{10} & \text{60} & \text{0}\text{.6742} & \text{4.0943} & \text{0.4512} & \text{16.7637} & \text{0.2036}&\text{1.8474}  \\
  \text{11} & \text{70} & \text{0}\text{.7439} & \text{4.2485} & \text{0.6552} & \text{18.0497}& \text{0.4292} & \text{2.7834} \\
  \text{12} & \text{80} & \text{0}\text{.8135} & \text{4.382} & \text{0.8908} & \text{19.2022} & \text{0.7935} & \text{3.9035}  \\
  \text{13} & \text{90} & \text{0}\text{.8830} & \text{4.4998} & \text{1.1901} & \text{20.2483}&\text{1.4163} & \text{5.3552}  \\
  \text{14} & \text{100}& \text{0.9517} & \text{4.6052} & \text{1.6619} & \text{21.2076} &\text{2.7619} & \text{7.6533}  \\
  \sum_{}^{} & \text{ } & \text{ } & \text{49.222} & \text{0} & \text{183.1531} & \text{11.3646} & \text{10.4473}  \\
 
\end{matrix}</math></center>
 
 
The median rank values (<math>F({{T}_{i}})</math>) can be found in rank tables or by using the Quick Statistical Reference in Weibull++.
 
The <math>{{y}_{i}}</math> values were obtained from the standardized normal distribution's area tables by entering the median rank value for <math>F(z)</math> and reading the corresponding <math>z</math> value (<math>{{y}_{i}}</math>).
 
Given the values in the table above, calculate  <math>\widehat{a}</math>  and  <math>\widehat{b}</math>  using Eqns. (aaln) and (bbln):
 
 
::<math>\begin{align}
  & \widehat{b}= & \frac{\underset{i=1}{\overset{14}{\mathop{\sum }}}\,T_{i}^{\prime }{{y}_{i}}-(\underset{i=1}{\overset{14}{\mathop{\sum }}}\,T_{i}^{\prime })(\underset{i=1}{\overset{14}{\mathop{\sum }}}\,{{y}_{i}})/14}{\underset{i=1}{\overset{14}{\mathop{\sum }}}\,T_{i}^{\prime 2}-{{(\underset{i=1}{\overset{14}{\mathop{\sum }}}\,T_{i}^{\prime })}^{2}}/14} \\
&  &  \\
& \widehat{b}= & \frac{10.4473-(49.2220)(0)/14}{183.1530-{{(49.2220)}^{2}}/14} 
\end{align}</math>
 
:or:
 
::<math>\widehat{b}=1.0349</math>
 
:and:
 
::<math>\widehat{a}=\frac{\underset{i=1}{\overset{N}{\mathop{\sum }}}\,{{y}_{i}}}{N}-\widehat{b}\frac{\underset{i=1}{\overset{N}{\mathop{\sum }}}\,T_{i}^{\prime }}{N}</math>
 
:or:
 
::<math>\widehat{a}=\frac{0}{14}-(1.0349)\frac{49.2220}{14}=-3.6386</math>
 
:Therefore, from Eqn. (bln):
 
::<math>{{\sigma }_{{{T}'}}}=\frac{1}{\widehat{b}}=\frac{1}{1.0349}=0.9663</math>
 
:and from Eqn. (aln):
 
::<math>{\mu }'=-\widehat{a}\cdot {{\sigma }_{{{T}'}}}=-(-3.6386)\cdot 0.9663</math>
 
:or:
 
::<math>{\mu }'=3.516</math>
 
The mean and the standard deviation of the lognormal distribution are obtained using Eqns. (mean) and (sdv):
 
::<math>\overline{T}=\mu ={{e}^{3.516+\tfrac{1}{2}{{0.9663}^{2}}}}=53.6707\text{ hours}</math>
 
:and:
 
::<math>{{\sigma }_{T}}=\sqrt{({{e}^{2\cdot 3.516+{{0.9663}^{2}}}})({{e}^{{{0.9663}^{2}}}}-1)}=66.69\text{ hours}</math>
 
The correlation coefficient can be estimated using Eqn. (RHOln):
 
::<math>\widehat{\rho }=0.9754</math>
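
For readers who want to reproduce these calculations outside of Weibull++, the following minimal Python sketch (illustrative only; it assumes NumPy and SciPy are available and approximates the median ranks with Benard's formula rather than the exact values in Table 9.2) carries out the rank regression on Y:

<pre>
import numpy as np
from scipy.stats import norm

# Failure times from Table 9.1
T = np.array([5, 10, 15, 20, 25, 30, 35, 40, 50, 60, 70, 80, 90, 100], dtype=float)
N = len(T)

# Median ranks via Benard's approximation (slightly different from the exact
# median ranks used in Table 9.2, so the results differ in the last decimals)
F = (np.arange(1, N + 1) - 0.3) / (N + 0.4)

x = np.log(T)        # T_i' = ln(T_i)
y = norm.ppf(F)      # y_i = Phi^{-1}[F(T_i')]

# Rank regression on Y: least squares fit of y on x
b_hat = (np.sum(x * y) - np.sum(x) * np.sum(y) / N) / (np.sum(x ** 2) - np.sum(x) ** 2 / N)
a_hat = np.mean(y) - b_hat * np.mean(x)

sigma_Tp = 1.0 / b_hat           # sigma_T' = 1/b,   close to 0.9663 above
mu_p = -a_hat * sigma_Tp         # mu' = -a*sigma_T', close to 3.516 above

rho = np.corrcoef(x, y)[0, 1]    # sample correlation coefficient, close to 0.9754

mean_T = np.exp(mu_p + 0.5 * sigma_Tp ** 2)                                       # ~ 53.7 hours
std_T = np.sqrt(np.exp(2 * mu_p + sigma_Tp ** 2) * (np.exp(sigma_Tp ** 2) - 1))   # ~ 66.7 hours
</pre>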
 
The above example can be repeated in Weibull++ using RRY.
 
[[Image:5folio.png|thumb|center|400px| ]]
 
The mean can be obtained from the QCP and both the mean and the standard deviation can be obtained from the Function Wizard.
 
===Rank Regression on X===
Performing a rank regression on X requires that a straight line be fitted to a set of data points such that the sum of the squares of the horizontal deviations from the points to the line is minimized.
 
Again, the first task is to bring our <math>cdf</math> function into a linear form. This step is exactly the same as in the regression on Y analysis, and Eqns. (lnorm), (yln), (aln) and (bln) apply in this case too. The deviation from the previous analysis begins with the least squares fit, where in this case we treat <math>x</math> as the dependent variable and <math>y</math> as the independent variable. The best-fitting straight line to the data, for regression on X (see Chapter 3), is the straight line:
 
::<math>x=\widehat{a}+\widehat{b}y</math>
 
The corresponding equations for <math>\widehat{a}</math> and <math>\widehat{b}</math> are:
 
::<math>\hat{a}=\overline{x}-\hat{b}\overline{y}=\frac{\underset{i=1}{\overset{N}{\mathop{\sum }}}\,{{x}_{i}}}{N}-\hat{b}\frac{\underset{i=1}{\overset{N}{\mathop{\sum }}}\,{{y}_{i}}}{N}</math>
 
:and:
 
::<math>\hat{b}=\frac{\underset{i=1}{\overset{N}{\mathop{\sum }}}\,{{x}_{i}}{{y}_{i}}-\tfrac{\underset{i=1}{\overset{N}{\mathop{\sum }}}\,{{x}_{i}}\underset{i=1}{\overset{N}{\mathop{\sum }}}\,{{y}_{i}}}{N}}{\underset{i=1}{\overset{N}{\mathop{\sum }}}\,y_{i}^{2}-\tfrac{{{\left( \underset{i=1}{\overset{N}{\mathop{\sum }}}\,{{y}_{i}} \right)}^{2}}}{N}}</math>
 
:where:
 
::<math>{{y}_{i}}={{\Phi }^{-1}}\left[ F(T_{i}^{\prime }) \right]</math>
 
:and:
 
::<math>{{x}_{i}}=T_{i}^{\prime }</math>
 
and <math>F(T_{i}^{\prime })</math> is estimated from the median ranks. Once <math>\widehat{a}</math> and <math>\widehat{b}</math> are obtained, solve Eqn. (xlineln) for the unknown <math>y</math>, which corresponds to:
 
::<math>y=-\frac{\widehat{a}}{\widehat{b}}+\frac{1}{\widehat{b}}x</math>
 
Solving for the parameters from Eqns. (bln) and (aln) we get:
 
::<math>a=-\frac{\widehat{a}}{\widehat{b}}=-\frac{{{\mu }'}}{{{\sigma }_{{{T}'}}}}</math>
 
:and:
 
::<math>b=\frac{1}{\widehat{b}}=\frac{1}{{{\sigma }_{{{T}'}}}}\text{ }</math>
 
The correlation coefficient is evaluated as before using Eqn. (RHOln).
 
====Example 3====
Using the data of Example 2 and assuming a lognormal distribution, estimate the parameters and estimate the correlation coefficient,  <math>\rho </math> , using rank regression on X.
 
=====Solution to Example 3=====
Table 9.2 constructed in Example 2 applies to this example as well. Using the values in this table we get:
 
::<math>\begin{align}
  & \hat{b}= & \frac{\underset{i=1}{\overset{14}{\mathop{\sum }}}\,T_{i}^{\prime }{{y}_{i}}-\tfrac{\underset{i=1}{\overset{14}{\mathop{\sum }}}\,T_{i}^{\prime }\underset{i=1}{\overset{14}{\mathop{\sum }}}\,{{y}_{i}}}{14}}{\underset{i=1}{\overset{14}{\mathop{\sum }}}\,y_{i}^{2}-\tfrac{{{\left( \underset{i=1}{\overset{14}{\mathop{\sum }}}\,{{y}_{i}} \right)}^{2}}}{14}} \\
&  &  \\
& \widehat{b}= & \frac{10.4473-(49.2220)(0)/14}{11.3646-{{(0)}^{2}}/14} 
\end{align}</math>
 
:or:
 
::<math>\widehat{b}=0.9193</math>
 
:and:
 
::<math>\hat{a}=\overline{x}-\hat{b}\overline{y}=\frac{\underset{i=1}{\overset{14}{\mathop{\sum }}}\,T_{i}^{\prime }}{14}-\widehat{b}\frac{\underset{i=1}{\overset{14}{\mathop{\sum }}}\,{{y}_{i}}}{14}</math>
 
:or:
 
::<math>\widehat{a}=\frac{49.2220}{14}-(0.9193)\frac{(0)}{14}=3.5159</math>
 
Therefore, from Eqn. (blnx):
 
::<math>{{\sigma }_{{{T}'}}}=\widehat{b}=0.9193</math>
 
and from Eqn. (alnx):
 
::<math>{\mu }'=\frac{\widehat{a}}{\widehat{b}}{{\sigma }_{{{T}'}}}=\frac{3.5159}{0.9193}\cdot 0.9193=3.5159</math>
 
Using Eqns. (mean) and (sdv) we get:
 
::<math>\overline{T}=\mu =51.3393\text{ hours}</math>
 
:and:
 
::<math>{{\sigma }_{T}}=59.1682\text{ hours}.</math>
 
 
The correlation coefficient is found using Eqn. (RHOln):
 
::<math>\widehat{\rho }=0.9754.</math>
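
A minimal Python sketch of the rank regression on X calculation (illustrative only; as before, the median ranks are approximated with Benard's formula, so the results are close to, but not exactly, those shown above):

<pre>
import numpy as np
from scipy.stats import norm

T = np.array([5, 10, 15, 20, 25, 30, 35, 40, 50, 60, 70, 80, 90, 100], dtype=float)
N = len(T)
F = (np.arange(1, N + 1) - 0.3) / (N + 0.4)   # Benard's approximation to the median ranks
x, y = np.log(T), norm.ppf(F)

# Rank regression on X: treat x as the dependent variable
b_hat = (np.sum(x * y) - np.sum(x) * np.sum(y) / N) / (np.sum(y ** 2) - np.sum(y) ** 2 / N)
a_hat = np.mean(x) - b_hat * np.mean(y)

sigma_Tp = b_hat                       # sigma_T' = b,             close to 0.9193 above
mu_p = (a_hat / b_hat) * sigma_Tp      # mu' = (a/b)*sigma_T' = a, close to 3.5159 above
</pre>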
 
Note that the regression on Y analysis is not necessarily the same as the regression on X analysis. The two regression types yield the same results (i.e., the same equation for a line) only when the data lie perfectly on a line.
 
Using Weibull++ with the Rank Regression on X option, the results are:
 
[[Image:5folio.png|thumb|center|400px| ]]
 
===Maximum Likelihood Estimation===
As outlined in Chapter 3, maximum likelihood estimation works by developing a likelihood function based on the available data and finding the values of the parameter estimates that maximize it. This can be achieved by using iterative methods to determine the parameter estimate values that maximize the likelihood function, but doing so can be rather difficult and time-consuming, particularly when dealing with the three-parameter distribution. Another method of finding the parameter estimates involves taking the partial derivatives of the likelihood function with respect to the parameters, setting the resulting equations equal to zero and solving simultaneously to determine the values of the parameter estimates. The log-likelihood functions and associated partial derivatives used to determine maximum likelihood estimates for the lognormal distribution are covered in Appendix C.
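
As an illustration only (not the Weibull++ implementation), the likelihood for complete data can also be maximized numerically with a general-purpose optimizer. The sketch below assumes NumPy and SciPy and uses the complete data of Example 2:

<pre>
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

def lognormal_neg_loglik(params, t):
    """Negative log-likelihood of the lognormal for complete (uncensored) data."""
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    # f(t) = 1/(t*sigma*sqrt(2*pi)) * exp(-0.5*((ln t - mu)/sigma)^2)
    return -np.sum(norm.logpdf(np.log(t), loc=mu, scale=sigma) - np.log(t))

T = np.array([5, 10, 15, 20, 25, 30, 35, 40, 50, 60, 70, 80, 90, 100], dtype=float)
start = [np.log(T).mean(), np.log(T).std()]
res = minimize(lognormal_neg_loglik, start, args=(T,), method="Nelder-Mead")
mu_hat, sigma_hat = res.x     # approx. 3.516 and 0.849 for this data (see Example 4 below)
</pre>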
 
===Confidence Bounds===
The method used by the application in estimating the different types of confidence bounds for lognormally distributed data is presented in this section. Note that there are closed-form solutions for both the normal and lognormal reliability that can be obtained without the use of the Fisher information matrix. However, these closed-form solutions only apply to complete data. To achieve consistent application across all possible data types, Weibull++ always uses the Fisher matrix in computing confidence intervals. The complete derivations were presented in detail for a general function in Chapter 5. For a discussion on exact confidence bounds for the normal and lognormal, see Chapter 8.
 
===Fisher Matrix Bounds===
====Bounds on the Parameters====
The lower and upper bounds on the mean,  <math>{\mu }'</math> , are estimated from:
 
 
::<math>\begin{align}
  & \mu _{U}^{\prime }= & {{\widehat{\mu }}^{\prime }}+{{K}_{\alpha }}\sqrt{Var({{\widehat{\mu }}^{\prime }})}\text{ (upper bound),} \\
& \mu _{L}^{\prime }= & {{\widehat{\mu }}^{\prime }}-{{K}_{\alpha }}\sqrt{Var({{\widehat{\mu }}^{\prime }})}\text{ (lower bound)}\text{.} 
\end{align}</math>
 
 
For the standard deviation,  <math>{{\widehat{\sigma }}_{{{T}'}}}</math> ,  <math>\ln ({{\widehat{\sigma }}_{{{T}'}}})</math>  is treated as normally distributed, and the bounds are estimated from:
 
 
::<math>\begin{align}
  & {{\sigma }_{U}}= & {{\widehat{\sigma }}_{{{T}'}}}\cdot {{e}^{\tfrac{{{K}_{\alpha }}\sqrt{Var({{\widehat{\sigma }}_{{{T}'}}})}}{{{\widehat{\sigma }}_{{{T}'}}}}}}\text{ (upper bound),} \\
& {{\sigma }_{L}}= & \frac{{{\widehat{\sigma }}_{{{T}'}}}}{{{e}^{\tfrac{{{K}_{\alpha }}\sqrt{Var({{\widehat{\sigma }}_{{{T}'}}})}}{{{\widehat{\sigma }}_{{{T}'}}}}}}}\text{ (lower bound),} 
\end{align}</math>
 
where  <math>{{K}_{\alpha }}</math>  is defined by:
 
::<math>\alpha =\frac{1}{\sqrt{2\pi }}\int_{{{K}_{\alpha }}}^{\infty }{{e}^{-\tfrac{{{t}^{2}}}{2}}}dt=1-\Phi ({{K}_{\alpha }})</math>
 
 
If  <math>\delta </math>  is the confidence level, then  <math>\alpha =\tfrac{1-\delta }{2}</math>  for the two-sided bounds and  <math>\alpha =1-\delta </math>  for the one-sided bounds.
 
The variances and covariances of  <math>{{\widehat{\mu }}^{\prime }}</math>  and  <math>{{\widehat{\sigma }}_{{{T}'}}}</math>  are estimated as follows:
 
 
::<math>\left( \begin{matrix}
  \widehat{Var}\left( {{\widehat{\mu }}^{\prime }} \right) & \widehat{Cov}\left( {{\widehat{\mu }}^{\prime }},{{\widehat{\sigma }}_{{{T}'}}} \right)  \\
  \widehat{Cov}\left( {{\widehat{\mu }}^{\prime }},{{\widehat{\sigma }}_{{{T}'}}} \right) & \widehat{Var}\left( {{\widehat{\sigma }}_{{{T}'}}} \right)  \\
\end{matrix} \right)=\left( \begin{matrix}
  -\tfrac{{{\partial }^{2}}\Lambda }{\partial {{({\mu }')}^{2}}} & -\tfrac{{{\partial }^{2}}\Lambda }{\partial {\mu }'\partial {{\sigma }_{{{T}'}}}}  \\
  {} & {}  \\
  -\tfrac{{{\partial }^{2}}\Lambda }{\partial {\mu }'\partial {{\sigma }_{{{T}'}}}} & -\tfrac{{{\partial }^{2}}\Lambda }{\partial \sigma _{{{T}'}}^{2}}  \\
\end{matrix} \right)_{{\mu }'={{\widehat{\mu }}^{\prime }},{{\sigma }_{{{T}'}}}={{\widehat{\sigma }}_{{{T}'}}}}^{-1}</math>
 
 
where  <math>\Lambda </math>  is the log-likelihood function of the lognormal distribution.
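
The following Python sketch (illustrative only; it assumes SciPy) evaluates these parameter bounds for given estimates and variances:

<pre>
import numpy as np
from scipy.stats import norm

def lognormal_parameter_bounds(mu, sigma, var_mu, var_sigma, delta=0.90, two_sided=True):
    """Fisher-matrix bounds on mu' and sigma_T' at confidence level delta."""
    alpha = (1.0 - delta) / 2.0 if two_sided else 1.0 - delta
    k = norm.ppf(1.0 - alpha)                        # K_alpha, since alpha = 1 - Phi(K_alpha)
    mu_lo, mu_hi = mu - k * np.sqrt(var_mu), mu + k * np.sqrt(var_mu)
    factor = np.exp(k * np.sqrt(var_sigma) / sigma)  # ln(sigma_T') treated as normal
    return (mu_lo, mu_hi), (sigma / factor, sigma * factor)

# e.g., with the Example 4 estimates given later in this chapter:
# mu' = 3.516, sigma_T' = 0.849, Var(mu') = 0.0515, Var(sigma_T') = 0.0258
print(lognormal_parameter_bounds(3.516, 0.849, 0.0515, 0.0258))
</pre>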
 
====Bounds on Reliability====
The reliability of the lognormal distribution is:
 
 
::<math>\hat{R}({T}';{\mu }',{{\sigma }_{{{T}'}}})=\int_{{{T}'}}^{\infty }\frac{1}{{{\widehat{\sigma }}_{{{T}'}}}\sqrt{2\pi }}{{e}^{-\tfrac{1}{2}{{\left( \tfrac{t-{{\widehat{\mu }}^{\prime }}}{{{\widehat{\sigma }}_{{{T}'}}}} \right)}^{2}}}}dt</math>
 
 
Let  <math>\widehat{z}(t;{{\hat{\mu }}^{\prime }},{{\hat{\sigma }}_{{{T}'}}})=\tfrac{t-{{\widehat{\mu }}^{\prime }}}{{{\widehat{\sigma }}_{{{T}'}}}},</math>  then  <math>\tfrac{d\widehat{z}}{dt}=\tfrac{1}{{{\widehat{\sigma }}_{{{T}'}}}}.</math> For  <math>t={T}'</math> ,  <math>\widehat{z}=\tfrac{{T}'-{{\widehat{\mu }}^{\prime }}}{{{\widehat{\sigma }}_{{{T}'}}}}</math> , and for  <math>t=\infty ,</math>  <math>\widehat{z}=\infty .</math>  The above equation then becomes:
 
 
::<math>\hat{R}(\widehat{z})=\int_{\widehat{z}({T}')}^{\infty }\frac{1}{\sqrt{2\pi }}{{e}^{-\tfrac{1}{2}{{z}^{2}}}}dz</math>
 
 
The bounds on  <math>z</math>  are estimated from:
 
::<math>\begin{align}
  & {{z}_{U}}= & \widehat{z}+{{K}_{\alpha }}\sqrt{Var(\widehat{z})} \\
& {{z}_{L}}= & \widehat{z}-{{K}_{\alpha }}\sqrt{Var(\widehat{z})} 
\end{align}</math>
 
:where:
 
::<math>\begin{align}
  & Var(\widehat{z})= & \left( \frac{\partial z}{\partial {\mu }'} \right)_{{{\widehat{\mu }}^{\prime }}}^{2}Var({{\widehat{\mu }}^{\prime }})+\left( \frac{\partial z}{\partial {{\sigma }_{{{T}'}}}} \right)_{{{\widehat{\sigma }}_{{{T}'}}}}^{2}Var({{\widehat{\sigma }}_{{{T}'}}}) \\
&  & +2{{\left( \frac{\partial z}{\partial {\mu }'} \right)}_{{{\widehat{\mu }}^{\prime }}}}{{\left( \frac{\partial z}{\partial {{\sigma }_{{{T}'}}}} \right)}_{{{\widehat{\sigma }}_{{{T}'}}}}}Cov\left( {{\widehat{\mu }}^{\prime }},{{\widehat{\sigma }}_{{{T}'}}} \right) 
\end{align}</math>
 
:or:
 
::<math>Var(\widehat{z})=\frac{1}{\widehat{\sigma }_{{{T}'}}^{2}}\left[ Var({{\widehat{\mu }}^{\prime }})+{{\widehat{z}}^{2}}Var({{\widehat{\sigma }}_{{{T}'}}})+2\cdot \widehat{z}\cdot Cov\left( {{\widehat{\mu }}^{\prime }},{{\widehat{\sigma }}_{{{T}'}}} \right) \right]</math>
 
 
The upper and lower bounds on reliability are:
 
::<math>\begin{align}
  & {{R}_{U}}= & \int_{{{z}_{L}}}^{\infty }\frac{1}{\sqrt{2\pi }}{{e}^{-\tfrac{1}{2}{{z}^{2}}}}dz\text{ (Upper bound)} \\
& {{R}_{L}}= & \int_{{{z}_{U}}}^{\infty }\frac{1}{\sqrt{2\pi }}{{e}^{-\tfrac{1}{2}{{z}^{2}}}}dz\text{ (Lower bound)} 
\end{align}</math>
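
A corresponding sketch for the reliability bounds (illustrative only; the time argument is in the original, untransformed units and the variance/covariance terms are taken as inputs):

<pre>
import numpy as np
from scipy.stats import norm

def lognormal_reliability_bounds(t, mu, sigma, var_mu, var_sigma, cov, delta=0.90):
    """Two-sided Fisher-matrix bounds on R(t) for the lognormal."""
    k = norm.ppf(1.0 - (1.0 - delta) / 2.0)          # K_alpha for two-sided bounds
    z = (np.log(t) - mu) / sigma                     # z-hat at T' = ln(t)
    var_z = (var_mu + z ** 2 * var_sigma + 2.0 * z * cov) / sigma ** 2
    z_lo, z_hi = z - k * np.sqrt(var_z), z + k * np.sqrt(var_z)
    # R = 1 - Phi(z); the lower bound on z gives the upper bound on reliability
    return norm.sf(z_hi), norm.sf(z), norm.sf(z_lo)  # (R_lower, R_hat, R_upper)
</pre>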
 
====Bounds on Time====
 
The bounds around time for a given lognormal percentile, or unreliability, are estimated by first solving the reliability equation with respect to time, as follows:
 
 
::<math>{T}'({{\widehat{\mu }}^{\prime }},{{\widehat{\sigma }}_{{{T}'}}})={{\widehat{\mu }}^{\prime }}+z\cdot {{\widehat{\sigma }}_{{{T}'}}}</math>
 
:where:
 
::<math>z={{\Phi }^{-1}}\left[ F({T}') \right]</math>
 
:and:
 
::<math>\Phi (z)=\frac{1}{\sqrt{2\pi }}\int_{-\infty }^{z({T}')}{{e}^{-\tfrac{1}{2}{{z}^{2}}}}dz</math>
 
 
The next step is to calculate the variance of  <math>{T}'({{\widehat{\mu }}^{\prime }},{{\widehat{\sigma }}_{{{T}'}}}):</math>
 
::<math>\begin{align}
  & Var({{{\hat{T}}}^{\prime }})= & {{\left( \frac{\partial {T}'}{\partial {\mu }'} \right)}^{2}}Var({{\widehat{\mu }}^{\prime }})+{{\left( \frac{\partial {T}'}{\partial {{\sigma }_{{{T}'}}}} \right)}^{2}}Var({{\widehat{\sigma }}_{{{T}'}}}) \\
&  & +2\left( \frac{\partial {T}'}{\partial {\mu }'} \right)\left( \frac{\partial {T}'}{\partial {{\sigma }_{{{T}'}}}} \right)Cov\left( {{\widehat{\mu }}^{\prime }},{{\widehat{\sigma }}_{{{T}'}}} \right) \\
&  &  \\
& Var({{{\hat{T}}}^{\prime }})= & Var({{\widehat{\mu }}^{\prime }})+{{\widehat{z}}^{2}}Var({{\widehat{\sigma }}_{{{T}'}}})+2\cdot \widehat{z}\cdot Cov\left( {{\widehat{\mu }}^{\prime }},{{\widehat{\sigma }}_{{{T}'}}} \right) 
\end{align}</math>
 
 
The upper and lower bounds are then found by:
 
::<math>\begin{align}
  & T_{U}^{\prime }= & \ln {{T}_{U}}={{{\hat{T}}}^{\prime }}+{{K}_{\alpha }}\sqrt{Var({{{\hat{T}}}^{\prime }})} \\
& T_{L}^{\prime }= & \ln {{T}_{L}}={{{\hat{T}}}^{\prime }}-{{K}_{\alpha }}\sqrt{Var({{{\hat{T}}}^{\prime }})} 
\end{align}</math>
 
 
Solving for  <math>{{T}_{U}}</math>  and  <math>{{T}_{L}}</math>  we get:
 
::<math>\begin{align}
  & {{T}_{U}}= & {{e}^{T_{U}^{\prime }}}\text{ (upper bound),} \\
& {{T}_{L}}= & {{e}^{T_{L}^{\prime }}}\text{ (lower bound)}\text{.} 
\end{align}</math>
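
And similarly for the bounds on time at a given reliability (illustrative only; assumes SciPy):

<pre>
import numpy as np
from scipy.stats import norm

def lognormal_time_bounds(R, mu, sigma, var_mu, var_sigma, cov, delta=0.90):
    """Two-sided Fisher-matrix bounds on the time at which reliability equals R."""
    k = norm.ppf(1.0 - (1.0 - delta) / 2.0)
    z = norm.ppf(1.0 - R)                 # z = Phi^{-1}[F(T')] with F = 1 - R
    T_prime = mu + z * sigma              # T' estimate (log time)
    var_T = var_mu + z ** 2 * var_sigma + 2.0 * z * cov
    return np.exp(T_prime - k * np.sqrt(var_T)), np.exp(T_prime + k * np.sqrt(var_T))
</pre>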
 
====Example 4====
Using the data of Example 2 and assuming a lognormal distribution, estimate the parameters using the MLE method.
=====Solution to Example 4=====
In this example we have only complete data. Thus, the partials reduce to:
 
::<math>\begin{align}
  & \frac{\partial \Lambda }{\partial {\mu }'}= & \frac{1}{\sigma _{{{T}'}}^{2}}\cdot \underset{i=1}{\overset{14}{\mathop \sum }}\,\left( \ln ({{T}_{i}})-{\mu }' \right)=0 \\
& \frac{\partial \Lambda }{\partial {{\sigma }_{{{T}'}}}}= & \underset{i=1}{\overset{14}{\mathop \sum }}\,\left( \frac{{{\left( \ln ({{T}_{i}})-{\mu }' \right)}^{2}}}{\sigma _{{{T}'}}^{3}}-\frac{1}{{{\sigma }_{{{T}'}}}} \right)=0 
\end{align}</math>
 
 
Substituting the values of  <math>{{T}_{i}}</math>  and solving the above system simultaneously, we get:
 
::<math>\begin{align}
  & {{{\hat{\sigma }}}_{{{T}'}}}= & 0.849 \\
& {{{\hat{\mu }}}^{\prime }}= & 3.516 
\end{align}</math>
 
 
Using Eqns. (mean) and (sdv) we get:
 
::<math>\overline{T}=\hat{\mu }=48.25\text{ hours}</math>
 
 
:and:
 
::<math>{{\hat{\sigma }}_{T}}=49.61\text{ hours}.</math>
 
The variance/covariance matrix is given by:
 
::<math>\left[ \begin{matrix}
  \widehat{Var}\left( {{{\hat{\mu }}}^{\prime }} \right)=0.0515 & {} & \widehat{Cov}\left( {{{\hat{\mu }}}^{\prime }},{{{\hat{\sigma }}}_{{{T}'}}} \right)=0.0000  \\
  {} & {} & {}  \\
  \widehat{Cov}\left( {{{\hat{\mu }}}^{\prime }},{{{\hat{\sigma }}}_{{{T}'}}} \right)=0.0000 & {} & \widehat{Var}\left( {{{\hat{\sigma }}}_{{{T}'}}} \right)=0.0258  \\
\end{matrix} \right]</math>
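
For complete data such as these, the MLE and the variance/covariance matrix can also be written in closed form. The sketch below (illustrative only, not the general Weibull++ procedure, which also handles censored data) reproduces the values above:

<pre>
import numpy as np

# Failure times of Example 2
T = np.array([5, 10, 15, 20, 25, 30, 35, 40, 50, 60, 70, 80, 90, 100], dtype=float)
x, N = np.log(T), len(T)

mu_hat = x.mean()                                 # 3.516
sigma_hat = np.sqrt(np.mean((x - mu_hat) ** 2))   # 0.849 (the MLE divides by N, not N - 1)

# Inverse of the local Fisher information for complete (log-transformed) normal data
var_mu = sigma_hat ** 2 / N                       # 0.0515
var_sigma = sigma_hat ** 2 / (2 * N)              # 0.0258
cov = 0.0                                         # the off-diagonal term vanishes at the MLE
</pre>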
 
====Note About Bias====
See the discussion of bias with the normal distribution in Chapter 8 for information regarding parameter bias in the lognormal distribution.
 
===Likelihood Ratio Confidence Bounds===
 
====Bounds on Parameters====
As covered in Chapter 5, the likelihood confidence bounds are calculated by finding values for  <math>{{\theta }_{1}}</math>  and  <math>{{\theta }_{2}}</math>  that satisfy:
 
 
::<math>-2\cdot \text{ln}\left( \frac{L({{\theta }_{1}},{{\theta }_{2}})}{L({{\widehat{\theta }}_{1}},{{\widehat{\theta }}_{2}})} \right)=\chi _{\alpha ;1}^{2}</math>
 
This equation can be rewritten as:
 
 
::<math>L({{\theta }_{1}},{{\theta }_{2}})=L({{\widehat{\theta }}_{1}},{{\widehat{\theta }}_{2}})\cdot {{e}^{\tfrac{-\chi _{\alpha ;1}^{2}}{2}}}</math>
 
For complete data, the likelihood formula for the lognormal distribution is given by:
 
 
::<math>L({\mu }',{{\sigma }_{{{T}'}}})=\underset{i=1}{\overset{N}{\mathop \prod }}\,f({{x}_{i}};{\mu }',{{\sigma }_{{{T}'}}})=\underset{i=1}{\overset{N}{\mathop \prod }}\,\frac{1}{{{x}_{i}}\cdot {{\sigma }_{{{T}'}}}\cdot \sqrt{2\pi }}\cdot {{e}^{-\tfrac{1}{2}{{\left( \tfrac{\text{ln}({{x}_{i}})-{\mu }'}{{{\sigma }_{{{T}'}}}} \right)}^{2}}}}</math>
 
where the  <math>{{x}_{i}}</math>  values represent the original time-to-failure data.  For a given value of  <math>\alpha </math> , values for  <math>{\mu }'</math>  and  <math>{{\sigma }_{{{T}'}}}</math>  can be found which represent the maximum and minimum values that satisfy Eqn. (lratio3). These represent the confidence bounds for the parameters at a confidence level  <math>\delta ,</math>  where  <math>\alpha =\delta </math>  for two-sided bounds and  <math>\alpha =2\delta -1</math>  for one-sided.
 
====Example 5====
Five units are put on a reliability test and experience failures at 45, 60, 75, 90, and 115 hours. Assuming a lognormal distribution, the MLE parameter estimates are calculated to be  <math>{{\widehat{\mu }}^{\prime }}=4.2926</math>  and  <math>{{\widehat{\sigma }}_{{{T}'}}}=0.32361.</math>  Calculate the two-sided 75% confidence bounds on these parameters using the likelihood ratio method.
=====Solution to Example 5=====
The first step is to calculate the likelihood function for the parameter estimates:
 
<center><math>\begin{align}
  L({{\widehat{\mu }}^{\prime }},{{\widehat{\sigma }}_{{{T}'}}})= & \underset{i=1}{\overset{N}{\mathop \prod }}\,f({{x}_{i}};{{\widehat{\mu }}^{\prime }},{{\widehat{\sigma }}_{{{T}'}}}), \\
  = & \underset{i=1}{\overset{N}{\mathop \prod }}\,\frac{1}{{{x}_{i}}\cdot {{\widehat{\sigma }}_{{{T}'}}}\cdot \sqrt{2\pi }}\cdot {{e}^{-\tfrac{1}{2}{{\left( \tfrac{\text{ln}({{x}_{i}})-{{\widehat{\mu }}^{\prime }}}{{{\widehat{\sigma }}_{{{T}'}}}} \right)}^{2}}}} \\
  L({{\widehat{\mu }}^{\prime }},{{\widehat{\sigma }}_{{{T}'}}})= & \underset{i=1}{\overset{5}{\mathop \prod }}\,\frac{1}{{{x}_{i}}\cdot 0.32361\cdot \sqrt{2\pi }}\cdot {{e}^{-\tfrac{1}{2}{{\left( \tfrac{\text{ln}({{x}_{i}})-4.2926}{0.32361} \right)}^{2}}}} \\
  L({{\widehat{\mu }}^{\prime }},{{\widehat{\sigma }}_{{{T}'}}})= & 1.115256\times {{10}^{-10}} 
\end{align}</math></center>
 
where  <math>{{x}_{i}}</math>  are the original time-to-failure data points. We can now rearrange Eqn. (lratio3) to the form:
 
::<math>L({\mu }',{{\sigma }_{{{T}'}}})-L({{\widehat{\mu }}^{\prime }},{{\widehat{\sigma }}_{{{T}'}}})\cdot {{e}^{\tfrac{-\chi _{\alpha ;1}^{2}}{2}}}=0</math>
 
Since our specified confidence level,  <math>\delta </math> , is 75%, we can calculate the value of the chi-squared statistic,  <math>\chi _{0.75;1}^{2}=1.323303.</math>  We can now substitute this information into the equation:
 
::<math>\begin{align}
  & L({\mu }',{{\sigma }_{{{T}'}}})-L({{\widehat{\mu }}^{\prime }},{{\widehat{\sigma }}_{{{T}'}}})\cdot {{e}^{\tfrac{-\chi _{\alpha ;1}^{2}}{2}}}= & 0 \\
& L({\mu }',{{\sigma }_{{{T}'}}})-1.115256\times {{10}^{-10}}\cdot {{e}^{\tfrac{-1.323303}{2}}}= & 0 \\
& L({\mu }',{{\sigma }_{{{T}'}}})-5.754703\times {{10}^{-11}}= & 0 
\end{align}</math>
 
It now remains to find the values of  <math>{\mu }'</math>  and  <math>{{\sigma }_{{{T}'}}}</math>  which satisfy this equation. This is an iterative process that requires setting the value of  <math>{{\sigma }_{{{T}'}}}</math>  and finding the appropriate values of  <math>{\mu }'</math> , and vice versa.
 
The following table gives the values of  <math>{\mu }'</math>  based on given values of  <math>{{\sigma }_{{{T}'}}}</math> .
 
 
<center><math>\begin{matrix}
  {{\sigma }_{{{T}'}}} & \mu _{1}^{\prime } & \mu _{2}^{\prime } & {{\sigma }_{{{T}'}}} & \mu _{1}^{\prime } & \mu _{2}^{\prime }  \\
  0.24 & 4.2421 & 4.3432 & 0.37 & 4.1145 & 4.4708  \\
  0.25 & 4.2115 & 4.3738 & 0.38 & 4.1152 & 4.4701  \\
  0.26 & 4.1909 & 4.3944 & 0.39 & 4.1170 & 4.4683  \\
  0.27 & 4.1748 & 4.4105 & 0.40 & 4.1200 & 4.4653  \\
  0.28 & 4.1618 & 4.4235 & 0.41 & 4.1244 & 4.4609  \\
  0.29 & 4.1509 & 4.4344 & 0.42 & 4.1302 & 4.4551  \\
  0.30 & 4.1419 & 4.4434 & 0.43 & 4.1377 & 4.4476  \\
  0.31 & 4.1343 & 4.4510 & 0.44 & 4.1472 & 4.4381  \\
  0.32 & 4.1281 & 4.4572 & 0.45 & 4.1591 & 4.4262  \\
  0.33 & 4.1231 & 4.4622 & 0.46 & 4.1742 & 4.4111  \\
  0.34 & 4.1193 & 4.4660 & 0.47 & 4.1939 & 4.3914  \\
  0.35 & 4.1166 & 4.4687 & 0.48 & 4.2221 & 4.3632  \\
  0.36 & 4.1150 & 4.4703 & {} & {} & {}  \\
\end{matrix}</math></center>
 
These points are represented graphically in the following contour plot:
 
[[Image:ldachp9ex5.gif|thumb|center|400px| ]]
 
(Note that this plot is generated with degrees of freedom <math>k=1</math>, as we are only determining bounds on one parameter. The contour plots generated in Weibull++ are done with degrees of freedom <math>k=2</math>, for use in comparing both parameters simultaneously.) As can be determined from the table, the lowest calculated value for <math>{\mu }'</math> is 4.1145, while the highest is 4.4708. These represent the two-sided 75% confidence limits on this parameter. Since solutions to the equation do not exist for values of <math>{{\sigma }_{{{T}'}}}</math> below 0.24 or above 0.48, these can be considered the two-sided 75% confidence limits for this parameter. To obtain more accurate values for the confidence limits on <math>{{\sigma }_{{{T}'}}}</math>, we can perform the same procedure as before, but instead find the two values of <math>{{\sigma }_{{{T}'}}}</math> that correspond to a given value of <math>{\mu }'</math>. Using this method, we find that the 75% confidence limits on <math>{{\sigma }_{{{T}'}}}</math> are 0.23405 and 0.48936, which are close to the initial estimates of 0.24 and 0.48.
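
The iterative search described above can be sketched in Python (illustrative only; it assumes SciPy): for each fixed value of <math>{{\sigma }_{{{T}'}}}</math>, a root finder locates the two values of <math>{\mu }'</math> at which the likelihood falls to the target value.

<pre>
import numpy as np
from scipy.stats import norm, chi2
from scipy.optimize import brentq

t = np.array([45.0, 60.0, 75.0, 90.0, 115.0])   # failure times of Example 5
x = np.log(t)

def log_L(mu, sigma):
    # log-likelihood of the lognormal for complete data
    return np.sum(norm.logpdf(x, loc=mu, scale=sigma) - np.log(t))

mu_hat = x.mean()                                 # 4.2926
sigma_hat = np.sqrt(np.mean((x - mu_hat) ** 2))   # 0.32361
target = log_L(mu_hat, sigma_hat) - chi2.ppf(0.75, 1) / 2.0   # ln of L-hat * exp(-chi2/2)

sigma = 0.30                                      # one row of the table above
g = lambda mu: log_L(mu, sigma) - target
mu_1 = brentq(g, mu_hat - 2.0, mu_hat)            # ~ 4.1419
mu_2 = brentq(g, mu_hat, mu_hat + 2.0)            # ~ 4.4434
</pre>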
 
====Bounds on Time and Reliability====
In order to calculate the bounds on a time estimate for a given reliability, or on a reliability estimate for a given time, the likelihood function needs to be rewritten in terms of one parameter and time/reliability, so that the maximum and minimum values of the time can be observed as the parameter is varied. This can be accomplished by substituting a form of the normal reliability equation into the likelihood function. The normal reliability equation can be written as:
 
::<math>R=1-\Phi \left( \frac{\text{ln}(t)-{\mu }'}{{{\sigma }_{{{T}'}}}} \right)</math>
 
This can be rearranged to the form:
 
::<math>{\mu }'=\text{ln}(t)-{{\sigma }_{{{T}'}}}\cdot {{\Phi }^{-1}}(1-R)</math>
 
where <math>{{\Phi }^{-1}}</math> is the inverse standard normal. This equation can now be substituted into Eqn. (lognormlikelihood) to produce a likelihood equation in terms of <math>{{\sigma }_{{{T}'}}}</math>, <math>t</math> and <math>R</math>:
 
::<math>L({{\sigma }_{{{T}'}}},t/R)=\underset{i=1}{\overset{N}{\mathop \prod }}\,\frac{1}{{{x}_{i}}\cdot {{\sigma }_{{{T}'}}}\cdot \sqrt{2\pi }}\cdot {{e}^{-\tfrac{1}{2}{{\left( \tfrac{\text{ln}({{x}_{i}})-\left( \text{ln}(t)-{{\sigma }_{{{T}'}}}\cdot {{\Phi }^{-1}}(1-R) \right)}{{{\sigma }_{{{T}'}}}} \right)}^{2}}}}</math>
 
The unknown variable  <math>t/R</math>  depends on what type of bounds are being determined.  If one is trying to determine the bounds on time for a given reliability, then  <math>R</math>  is a known constant and  <math>t</math>  is the unknown variable. Conversely, if one is trying to determine the bounds on reliability for a given time, then  <math>t</math>  is a known constant and  <math>R</math>  is the unknown variable. Either way, Eqn. (lognormliketr) can be used to solve Eqn. (lratio3) for the values of interest.
 
====Example 6====
For the data given in Example 5, determine the two-sided 75% confidence bounds on the time estimate for a reliability of 80%.  The ML estimate for the time at  <math>R(t)=80%</math>  is 55.718.
=====Solution to Example 6=====
In this example, we are trying to determine the two-sided 75% confidence bounds on the time estimate of 55.718. This is accomplished by substituting  <math>R=0.80</math>  and  <math>\alpha =0.75</math>  into Eqn. (lognormliketr), and varying  <math>{{\sigma }_{{{T}'}}}</math>  until the maximum and minimum values of  <math>t</math>  are found. The following table gives the values of  <math>t</math>  based on given values of  <math>{{\sigma }_{{{T}'}}}</math> .
 
 
<center><math>\begin{matrix}
  {{\sigma }_{{{T}'}}} & {{t}_{1}} & {{t}_{2}} & {{\sigma }_{{{T}'}}} & {{t}_{1}} & {{t}_{2}}  \\
  0.24 & 56.832 & 62.879 & 0.37 & 44.841 & 64.031  \\
  0.25 & 54.660 & 64.287 & 0.38 & 44.494 & 63.454  \\
  0.26 & 53.093 & 65.079 & 0.39 & 44.200 & 62.809  \\
  0.27 & 51.811 & 65.576 & 0.40 & 43.963 & 62.093  \\
  0.28 & 50.711 & 65.881 & 0.41 & 43.786 & 61.304  \\
  0.29 & 49.743 & 66.041 & 0.42 & 43.674 & 60.436  \\
  0.30 & 48.881 & 66.085 & 0.43 & 43.634 & 59.481  \\
  0.31 & 48.106 & 66.028 & 0.44 & 43.681 & 58.426  \\
  0.32 & 47.408 & 65.883 & 0.45 & 43.832 & 57.252  \\
  0.33 & 46.777 & 65.657 & 0.46 & 44.124 & 55.924  \\
  0.34 & 46.208 & 65.355 & 0.47 & 44.625 & 54.373  \\
  0.35 & 45.697 & 64.983 & 0.48 & 45.517 & 52.418  \\
  0.36 & 45.242 & 64.541 & {} & {} & {}  \\
\end{matrix}</math></center>
 
 
This data set is represented graphically in the following contour plot:
 
[[Image:ldachp9ex6.gif|thumb|center|400px| ]]
 
As can be determined from the table, the lowest calculated value for  <math>t</math>  is 43.634, while the highest is 66.085. These represent the two-sided 75% confidence limits on the time at which reliability is equal to 80%.
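
The same root-finding idea, applied to the substituted likelihood of Eqn. (lognormliketr), can be sketched in Python (illustrative only; it assumes SciPy). The snippet below reproduces one row of the table above:

<pre>
import numpy as np
from scipy.stats import norm, chi2
from scipy.optimize import brentq

t_data = np.array([45.0, 60.0, 75.0, 90.0, 115.0])   # failure times of Example 5
x = np.log(t_data)

def log_L(mu, sigma):
    return np.sum(norm.logpdf(x, loc=mu, scale=sigma) - np.log(t_data))

def log_L_sub(sigma, t, R):
    # substituted likelihood: mu' = ln(t) - sigma_T' * Phi^{-1}(1 - R)
    return log_L(np.log(t) - sigma * norm.ppf(1.0 - R), sigma)

mu_hat = x.mean()
sigma_hat = np.sqrt(np.mean((x - mu_hat) ** 2))
target = log_L(mu_hat, sigma_hat) - chi2.ppf(0.75, 1) / 2.0

R, sigma = 0.80, 0.30                                 # the 0.30 row of the table above
t_peak = np.exp(mu_hat + sigma * norm.ppf(1.0 - R))   # time at which the substituted likelihood peaks
g = lambda t: log_L_sub(sigma, t, R) - target
t_1 = brentq(g, 0.3 * t_peak, t_peak)                 # ~ 48.88
t_2 = brentq(g, t_peak, 3.0 * t_peak)                 # ~ 66.09
</pre>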
 
====Example 7====
For the data given in Example 5, determine the two-sided 75% confidence bounds on the reliability estimate for  <math>t=65</math> .  The ML estimate for the reliability at  <math>t=65</math>  is 64.261%.
=====Solution to Example 7=====
In this example, we are trying to determine the two-sided 75% confidence bounds on the reliability estimate of 64.261%. This is accomplished by substituting  <math>t=65</math>  and  <math>\alpha =0.75</math>  into Eqn. (lognormliketr), and varying  <math>{{\sigma }_{{{T}'}}}</math>  until the maximum and minimum values of  <math>R</math>  are found. The following table gives the values of  <math>R</math>  based on given values of  <math>{{\sigma }_{{{T}'}}}</math> .
 
 
<center><math>\begin{matrix}
  {{\sigma }_{{{T}'}}} & {{R}_{1}} & {{R}_{2}} & {{\sigma }_{{{T}'}}} & {{R}_{1}} & {{R}_{2}}  \\
  0.24 & 61.107% & 75.910% & 0.37 & 43.573% & 78.845%  \\
  0.25 & 55.906% & 78.742% & 0.38 & 43.807% & 78.180%  \\
  0.26 & 55.528% & 80.131% & 0.39 & 44.147% & 77.448%  \\
  0.27 & 50.067% & 80.903% & 0.40 & 44.593% & 76.646%  \\
  0.28 & 48.206% & 81.319% & 0.41 & 45.146% & 75.767%  \\
  0.29 & 46.779% & 81.499% & 0.42 & 45.813% & 74.802%  \\
  0.30 & 45.685% & 81.508% & 0.43 & 46.604% & 73.737%  \\
  0.31 & 44.857% & 81.387% & 0.44 & 47.538% & 72.551%  \\
  0.32 & 44.250% & 81.159% & 0.45 & 48.645% & 71.212%  \\
  0.33 & 43.827% & 80.842% & 0.46 & 49.980% & 69.661%  \\
  0.34 & 43.565% & 80.446% & 0.47 & 51.652% & 67.789%  \\
  0.35 & 43.444% & 79.979% & 0.48 & 53.956% & 65.299%  \\
  0.36 & 43.450% & 79.444% & {} & {} & {}  \\
\end{matrix}</math></center>
 
 
This data set is represented graphically in the following contour plot:
 
[[Image:ldachp9ex7.gif|thumb|center|400px| ]]  
 
As can be determined from the table, the lowest calculated value for  <math>R</math>  is 43.444%, while the highest is 81.508%. These represent the two-sided 75% confidence limits on the reliability at  <math>t=65</math> .
