===Bayesian Confidence Bounds===
====Bounds on Parameters====
From the [[Parameter Estimation]] chapter, we know that the marginal posterior distribution of the parameter  <math>{\mu }'</math>  is:
 
::<math>\begin{align}
  f({\mu }'|Data)= & \int_{0}^{\infty }f({\mu }',{{\sigma'}}|Data)d{{\sigma'}} \\
  = & \frac{\int_{0}^{\infty }L(Data|{\mu }',{{\sigma'}})\varphi ({\mu }')\varphi ({{\sigma'}})d{{\sigma'}}}{\int_{0}^{\infty }\int_{-\infty }^{\infty }L(Data|{\mu }',{{\sigma'}})\varphi ({\mu }')\varphi ({{\sigma'}})d{\mu }'d{{\sigma'}}} 
\end{align}</math>
 
where:
::<math>\varphi ({{\sigma '}})</math>  is  <math>\tfrac{1}{{{\sigma '}}}</math> , the non-informative prior of  <math>{{\sigma '}}</math> .
::<math>\varphi ({\mu }')</math>  is a uniform distribution from  <math>-\infty </math>  to  <math>+\infty </math> , the non-informative prior of  <math>{\mu }'</math> .
With the above prior distributions,  <math>f({\mu }'|Data)</math>  can be rewritten as:
 
 
::<math>f({\mu }'|Data)=\frac{\int_{0}^{\infty }L(Data|{\mu }',{{\sigma '}})\tfrac{1}{{{\sigma '}}}d{{\sigma '}}}{\int_{0}^{\infty }\int_{-\infty }^{\infty }L(Data|{\mu }',{{\sigma '}})\tfrac{1}{{{\sigma '}}}d{\mu }'d{{\sigma '}}}</math>
 
 
The one-sided upper bound of  <math>{\mu }'</math>  is:
 
 
::<math>CL=P({\mu }'\le \mu _{U}^{\prime })=\int_{-\infty }^{\mu _{U}^{\prime }}f({\mu }'|Data)d{\mu }'</math>
 
 
The one-sided lower bound of  <math>{\mu }'</math>  is:
 
 
::<math>1-CL=P({\mu }'\le \mu _{L}^{\prime })=\int_{-\infty }^{\mu _{L}^{\prime }}f({\mu }'|Data)d{\mu }'</math>
 
 
The two-sided bounds of  <math>{\mu }'</math>  are:
 
 
::<math>CL=P(\mu _{L}^{\prime }\le {\mu }'\le \mu _{U}^{\prime })=\int_{\mu _{L}^{\prime }}^{\mu _{U}^{\prime }}f({\mu }'|Data)d{\mu }'</math>
 
 
The same method can be used to obtain the bounds of  <math>{{\sigma '}}</math> .
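
The integrals above generally have no closed form and must be evaluated numerically. The following is a minimal numerical sketch (not the Weibull++ implementation) of the one-sided upper bound on  <math>{\mu }'</math> , assuming a small hypothetical set of complete (exact) failure times; the data, integration limits, and function names are illustrative only.

<syntaxhighlight lang="python">
import numpy as np
from scipy import integrate, optimize, stats

# Hypothetical complete (exact) failure times, for illustration only.
times = np.array([45.0, 60.0, 75.0, 90.0, 115.0])
x = np.log(times)  # ln(t_i)

def likelihood(mu, sigma):
    # L(Data | mu', sigma') for exact data; the 1/t_i Jacobian factors of the
    # lognormal pdf are constant in (mu', sigma') and cancel in the posterior ratio.
    return np.prod(stats.norm.pdf(x, loc=mu, scale=sigma))

def joint(mu, sigma):
    # Un-normalized joint posterior: likelihood times the non-informative prior 1/sigma'.
    return likelihood(mu, sigma) / sigma

# Wide finite limits standing in for the infinite integration limits
# (chosen to contain essentially all of the posterior mass for this data set).
s_lo, s_hi = 1e-3, 3.0
m_lo, m_hi = x.mean() - 5.0, x.mean() + 5.0

# Normalizing constant: the denominator of f(mu' | Data).
norm_const, _ = integrate.dblquad(joint, s_lo, s_hi, lambda s: m_lo, lambda s: m_hi)

def cl_of_upper(mu_U):
    # CL = P(mu' <= mu_U | Data): integrate the joint posterior up to mu_U.
    num, _ = integrate.dblquad(joint, s_lo, s_hi, lambda s: m_lo, lambda s: mu_U)
    return num / norm_const

# Solve CL(mu_U) = 0.90 for the 90% one-sided upper bound on mu'.
mu_U = optimize.brentq(lambda m: cl_of_upper(m) - 0.90, x.mean() - 2.0, x.mean() + 2.0)
print("90% one-sided upper bound on mu':", mu_U)
</syntaxhighlight>

The one-sided lower and two-sided bounds, and the bounds on  <math>{{\sigma '}}</math> , follow from the same scheme by changing the integration limits and the parameter being marginalized.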
 
====Bounds on Time (Type 1)====
The reliable life of the lognormal distribution is:
 
 
::<math>\ln T={\mu }'+{{\sigma '}}{{\Phi }^{-1}}(1-R)</math>
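
For example, with hypothetical point estimates  <math>{\mu }'=4.5</math>  and  <math>{{\sigma '}}=0.3</math> , the reliable life at  <math>R=0.90</math>  can be evaluated directly; the sketch below uses SciPy's standard normal inverse CDF for  <math>{{\Phi }^{-1}}</math> .

<syntaxhighlight lang="python">
import numpy as np
from scipy.stats import norm

mu, sigma, R = 4.5, 0.3, 0.90          # hypothetical values for illustration
ln_T = mu + sigma * norm.ppf(1 - R)    # ln T = mu' + sigma' * Phi^(-1)(1 - R)
print(np.exp(ln_T))                    # reliable life, about 61.3 time units
</syntaxhighlight>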
 
 
The one-sided upper bound on time is given by:
 
 
::<math>CL=\underset{}{\overset{}{\mathop{\Pr }}}\,(\ln t\le \ln {{t}_{U}})=\underset{}{\overset{}{\mathop{\Pr }}}\,({\mu }'+{{\sigma '}}{{\Phi }^{-1}}(1-R)\le \ln {{t}_{U}})</math>
 
 
The above equation can be rewritten in terms of  <math>{\mu }'</math>  as:
 
 
::<math>CL=\underset{}{\overset{}{\mathop{\Pr }}}\,({\mu }'\le \ln {{t}_{U}}-{{\sigma '}}{{\Phi }^{-1}}(1-R))</math>
 
 
From the posterior distribution of  <math>{\mu }'</math> , we get:
 
 
::<math>CL=\frac{\int_{0}^{\infty }\int_{-\infty }^{\ln {{t}_{U}}-{{\sigma '}}{{\Phi }^{-1}}(1-R)}L({{\sigma '}},{\mu }')\tfrac{1}{{{\sigma '}}}d{\mu }'d{{\sigma '}}}{\int_{0}^{\infty }\int_{-\infty }^{\infty }L({{\sigma '}},{\mu }')\tfrac{1}{{{\sigma '}}}d{\mu }'d{{\sigma '}}}</math>
 
 
The above equation is solved for  <math>{{t}_{U}}</math> . The same method can be applied to obtain the one-sided lower bounds and two-sided bounds on time.
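
The solution again requires numerical integration and root finding. Continuing the sketch from the parameter-bounds section above (reusing the hypothetical data, the <code>joint</code> posterior, the normalizing constant <code>norm_const</code>, and the truncated integration limits defined there), the one-sided upper bound on time at  <math>R=0.90</math>  could be found as follows; the root-search bracket is illustrative.

<syntaxhighlight lang="python">
def cl_of_tU(tU, R=0.90):
    # CL = P(mu' <= ln(t_U) - sigma' * Phi^(-1)(1 - R)); the inner (mu')
    # integration limit depends on sigma', so it is passed as a function.
    upper = lambda s: min(np.log(tU) - s * stats.norm.ppf(1 - R), m_hi)
    num, _ = integrate.dblquad(joint, s_lo, s_hi, lambda s: m_lo, upper)
    return num / norm_const

# Solve CL(t_U) = 0.90 for the 90% one-sided upper bound on time at R = 0.90.
t_U = optimize.brentq(lambda t: cl_of_tU(t) - 0.90, 10.0, 500.0)
print("90% upper bound on time at R = 0.90:", t_U)
</syntaxhighlight>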
 
====Bounds on Reliability (Type 2)====
 
The one-sided upper bound on reliability is given by:
 
 
::<math>CL=\underset{}{\overset{}{\mathop{\Pr }}}\,(R\le {{R}_{U}})=\underset{}{\overset{}{\mathop{\Pr }}}\,({\mu }'\le \ln t-{{\sigma '}}{{\Phi }^{-1}}(1-{{R}_{U}}))</math>
 
 
From the posterior distribution of  <math>{\mu }'</math> , we get:
 
 
::<math>CL=\frac{\int_{0}^{\infty }\int_{-\infty }^{\ln t-{{\sigma '}}{{\Phi }^{-1}}(1-{{R}_{U}})}L({{\sigma'}},{\mu }')\tfrac{1}{{{\sigma'}}}d{\mu }'d{{\sigma '}}}{\int_{0}^{\infty }\int_{-\infty }^{\infty }L({{\sigma '}},{\mu }')\tfrac{1}{{{\sigma '}}}d{\mu }'d{{\sigma '}}}</math>
 
 
The above equation is solved for  <math>{{R}_{U}}</math> . The same method is used to calculate the one-sided lower bounds and two-sided bounds on reliability.
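
Continuing the same sketch, the one-sided upper bound on reliability at a hypothetical time  <math>t=100</math>  could be found by solving the CL equation for  <math>{{R}_{U}}</math>  numerically:

<syntaxhighlight lang="python">
def cl_of_RU(RU, t=100.0):
    # CL = P(mu' <= ln(t) - sigma' * Phi^(-1)(1 - R_U)) under the posterior.
    upper = lambda s: min(np.log(t) - s * stats.norm.ppf(1 - RU), m_hi)
    num, _ = integrate.dblquad(joint, s_lo, s_hi, lambda s: m_lo, upper)
    return num / norm_const

# Solve CL(R_U) = 0.90 for the 90% one-sided upper bound on reliability at t = 100.
R_U = optimize.brentq(lambda r: cl_of_RU(r) - 0.90, 1e-6, 1.0 - 1e-6)
print("90% upper bound on reliability at t = 100:", R_U)
</syntaxhighlight>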
 
 
'''Example 8:'''
{{Example: Lognormal Distribution Bayesian Bound (Parameters)}}
