System identification of large-scale linear and nonlinear structural dynamic models
Doctoral thesis, 2016
System identification is a powerful technique for building a model from measurement data, using methods from different fields such as stochastic inference, optimization and linear algebra. It consists of three steps: collecting data, constructing a mathematical model and estimating its parameters. The available data often do not contain enough information, or contain too much noise, to enable estimation of all uncertain model parameters with sufficient precision. These are examples of challenges in the field of system identification.

To construct a mathematical model, one must decide upon a model structure and then estimate its associated parameters. The model structure may be built with a clear physical interpretation of its parameters, as in a parameterized finite element model, or built merely to fit test data, as in a general state-space or modal model. Each such model class has its own identification challenges. For the former, the complexity of finite element models can be an obstacle because of their time-consuming simulation. Furthermore, if a linear model does not represent the data with reasonable accuracy, nonlinear models need to be employed, and their modeling and parameterization pose even greater challenges. For the latter, selecting a proper model order is a challenge, and the physical relevance of the identified states is an important issue. Deciding upon the physical relevance of states is presently a highly judgmental task, and performing such a classification in an automated fashion is a formidable challenge.

In-depth studies of such modeling and computational challenges are presented here and suitable tools are suggested. They specifically target problems encountered in the identification of large-scale linear and nonlinear structures. An experimental design strategy is proposed to increase the information content of test data for linear structures. By combining new correlation metrics with a bootstrap data resampling technique, an automated procedure is developed that selects a model order which properly represents the test data. The procedure focuses on the physical relevance of the identified states and on uncertainty quantification of the parameter estimates. A method for stochastic parameter calibration of linear finite element models is developed using a damping equalization method. Bootstrapping is also used here to estimate the uncertainty of the model parameters and response predictions. For the identification of nonlinear systems, a method is developed in which the information content of the data is increased by incorporating multiple harmonics of the response spectra. The parameter uncertainty is here estimated using a cross-validation technique. A fast higher-order time-integration method is developed that combines the well-known pseudo-force method with exponential time-integration methods. High-order-hold interpolation schemes are derived to increase the method's stability. As an alternative, to speed up computations for large-scale linear models, a surrogate model for frequency response functions is developed based on sparse polynomial chaos expansion.
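To illustrate the kind of scheme referred to above, the sketch below shows a minimal exponential time-integrator with a pseudo-force treatment of the nonlinearity, written in Python. It assumes only a zero-order hold on the load within each step; the higher-order-hold interpolation schemes derived in the thesis are not reproduced here. The function name exp_integrate_zoh, the Duffing-type example and all parameter values are hypothetical and serve as illustration only, not as the thesis implementation.

```python
import numpy as np
from scipy.linalg import expm

def exp_integrate_zoh(M, C, K, g_nl, f_ext, x0, v0, h, n_steps):
    """Exponential integration of M*xdd + C*xd + K*x + g_nl(x, xd) = f_ext(t).

    The nonlinear restoring force g_nl is moved to the right-hand side as a
    pseudo-force and, like the external load, held constant over each step
    (zero-order hold). Requires a nonsingular K so that A below is invertible.
    """
    n = M.shape[0]
    Minv = np.linalg.inv(M)
    # First-order (state-space) form: z = [x; v], zdot = A z + B u
    A = np.block([[np.zeros((n, n)), np.eye(n)],
                  [-Minv @ K,        -Minv @ C]])
    B = np.vstack([np.zeros((n, n)), Minv])
    # Exact discrete operators for a piecewise-constant load over one step h
    Phi = expm(A * h)                                        # state transition
    Gamma = np.linalg.solve(A, Phi - np.eye(2 * n)) @ B      # load operator

    z = np.concatenate([x0, v0])
    X = [x0.copy()]
    for k in range(n_steps):
        x, v = z[:n], z[n:]
        u = f_ext(k * h) - g_nl(x, v)      # external load minus pseudo-force
        z = Phi @ z + Gamma @ u
        X.append(z[:n].copy())
    return np.array(X)

# Hypothetical usage: a single-DOF Duffing-type oscillator
M = np.array([[1.0]]); C = np.array([[0.05]]); K = np.array([[1.0]])
g = lambda x, v: 0.5 * x**3                     # cubic pseudo-force
f = lambda t: np.array([0.1 * np.sin(1.2 * t)]) # harmonic excitation
X = exp_integrate_zoh(M, C, K, g, f, np.zeros(1), np.zeros(1), h=0.01, n_steps=5000)
```

Because the linear part is propagated exactly through the matrix exponential, the step size in such a scheme is limited by the accuracy of the load interpolation rather than by the stiffness of the linear dynamics, which is what motivates the higher-order-hold variants discussed in the thesis.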
Finite element model
Surrogate modeling
Bootstrapping
Polynomial chaos expansion
System identification
Exponential integration
Uncertainty quantification