Estimating Probability Distributions by Computer Experiments
Doctoral thesis, 2000
In almost every field of engineering, advanced computer programs are used. In many cases only the probability distributions of the input variables to the programs are known. Sometimes stochastic processes are used as input, for example when the stresses in a structure due to an earthquake are to be calculated. In such cases it is only possible to estimate the distributions of the calculated quantities. This can be done by computer simulation, i.e. performing many calculations with different configurations of the input variables. As the computing time for each run can be long, it is important to obtain an accurate estimate of the distribution of the output variable with a small number of runs. The organization of such computer simulations is called the planning of computer experiments.
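The approach described above can be illustrated with a minimal sketch, not taken from the thesis: a cheap stand-in function plays the role of the expensive engineering code, inputs are drawn from their known distributions, and the distribution of the output is estimated from the empirical cumulative distribution function of the simulated results. All names and the model itself are illustrative assumptions.

```python
import random

# Hypothetical stand-in for an expensive engineering code:
# a simple nonlinear response of two input variables.
def model(x1, x2):
    return x1 ** 2 + 3.0 * x2

def simulate(n_runs, seed=0):
    """Plain Monte Carlo: draw inputs from their known
    distributions, run the model, and collect the outputs."""
    rng = random.Random(seed)
    outputs = []
    for _ in range(n_runs):
        x1 = rng.gauss(0.0, 1.0)      # input 1 ~ N(0, 1)
        x2 = rng.uniform(0.0, 1.0)    # input 2 ~ U(0, 1)
        outputs.append(model(x1, x2))
    return outputs

def empirical_cdf(outputs, y):
    """Estimate P(Y <= y) from the simulated outputs."""
    return sum(1 for v in outputs if v <= y) / len(outputs)

out = simulate(1000)
p = empirical_cdf(out, 2.0)
```

With a genuinely expensive code, each call to `model` could take hours, which is precisely why sampling plans that reduce the estimator's variance for a fixed number of runs are of interest.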
This thesis consists of three papers. The first paper describes how synthetic earthquake time-histories that on average fulfil given target spectra can be generated. The second paper proposes a new sampling plan, the level-based stratified sampling plan, for use when the probability distribution of a system described by a computer code is to be estimated by computer simulation. It is shown that estimates from this sampling plan have the lowest variance among estimates from unbiased sampling plans. The third paper discusses the problem of estimating the variance of estimates from the Latin hypercube sampling plan. This sampling plan is widely used, and it is well known that its estimates generally have low variance. The paper outlines similarities between field survey sampling and computer simulations and examines whether methods used for estimating the variance in field survey sampling can also be used in computer simulations. The difficulty of estimating the variance from a single sample is clearly demonstrated, and some methods that can be used in certain applications are suggested.
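To make the Latin hypercube idea concrete, here is a minimal sketch (not the thesis's implementation) of the standard construction on the unit hypercube: each dimension is split into n equal-probability strata, and the strata are combined through independent random permutations so that every stratum of every dimension contains exactly one sample point.

```python
import random

def latin_hypercube(n, d, seed=0):
    """Draw n points in [0,1)^d by Latin hypercube sampling:
    each of the n equal-probability strata of every dimension
    contains exactly one sample point."""
    rng = random.Random(seed)
    # One random permutation of the n strata per dimension.
    perms = [rng.sample(range(n), n) for _ in range(d)]
    sample = []
    for i in range(n):
        point = []
        for j in range(d):
            stratum = perms[j][i]
            # Uniform draw inside the selected stratum.
            point.append((stratum + rng.random()) / n)
        sample.append(point)
    return sample

pts = latin_hypercube(10, 2)
```

The points would then be mapped through the inverse distribution functions of the actual input variables before being fed to the computer code. Because each run depends on the others through the shared permutations, the runs are not independent, which is exactly what makes variance estimation from a single Latin hypercube sample difficult.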
Keywords: response spectrum prediction, Latin hypercube sampling