Workshop ONE SIMULATION MODEL IS NOT ENOUGH!
Tuesday, 23rd April, Room 110, LL&M Department, University of Rostock, Campus South, Albert-Einstein-Str. 25
09:00 - 09:05
Adelinde Uhrmacher, University of Rostock
09:05 - 10:10
Bernard Zeigler, Professor Emeritus, University of Arizona and Chief Scientist, RTSync Corp.
The conventional approach to model construction for simulation is to focus on a single model and follow a more or less structured development cycle. Why should we put in twice the time and effort to develop two models rather than one? The answer is that, as with most greedy heuristics, short-sightedness at the beginning may prove much more costly in the end. This talk champions the cause of the pairs-of-models approach. We show how this approach eventually leads to better results than initially attempting to construct a complex model, only to revert later to a simpler model when increasing complexity makes progress hard to achieve. We show how pairs-of-models development can be supported by computational tools for relating structure and behavior between models. Benefits of pairs of models, and eventually families of models, include the ability to perform mutual cross-calibration, avoid the usual difficulties in harmonizing the underlying ontologies, and narrow the search for plausible parameter assignments.
10:10 - 10:30
Thorsten Pawletta, University of Applied Sciences, Technology, Business and Design, Wismar
Modeling and Simulation (M&S) is one of the core methods of Computer Aided Engineering. Initially, M&S was mainly used in the engineering development process for individual technical systems. Today, M&S is used to study systems of systems with high variability, from system design to system operation. Thus, models that are configurable in structure and parameters must be developed for whole system families. The complexity, the variability, and the use of models in different engineering phases also influence the experimentation with the models. The objectives and basic conditions of the experiments differ between the individual phases. For example, the simulation method must be combined with other numerical methods, or reactive studies must be carried out under real-time conditions. After a short summary of the current requirements for M&S, some current work of the research group CEA will be discussed briefly in this context. An infrastructure for the modeling and simulation of versatile technical systems will be shown. The specification of model families and associated experiment scenarios, as well as the automated execution of simulation experiments, will be discussed with regard to the suggested infrastructure.
10:30 - 11:00
11:00 - 11:20
André Grow, Max Planck Institute for Demographic Research
Explaining why the members of some social categories (e.g., men and whites) are accorded higher status than members of other categories (e.g., women and non-whites) has preoccupied sociologists since the beginning of the discipline. Status construction theory (SCT) offers one prominent account of the interaction processes that can create status inequality. The social dynamics that SCT describes are complex, and it is difficult to assess the theory's implications by verbal reasoning alone. Existing SCT research has therefore relied on analytical and computational approaches to explore the macro-level implications of status construction processes. In this presentation, I will talk about my own experiences with developing agent-based computational models of SCT. The models that I have developed are all similar in their structure, but they focus on different levels of analysis (small group contexts vs. society) and involve different time scales (short-lived developments vs. long-term societal change).
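To give a flavor of the kind of interaction process SCT describes, the following drastically simplified sketch (not any of the models from the talk; all names and parameters are illustrative assumptions) shows how repeated local pairwise encounters, in which agents partially adopt each other's assessment, turn initially dispersed individual status beliefs into a shared macro-level belief:

```python
import random
import statistics

random.seed(1)

# Each agent holds a belief about the status value of a social category,
# initially drawn at random (no shared belief exists yet).
beliefs = [random.uniform(-1.0, 1.0) for _ in range(100)]
spread_before = statistics.pstdev(beliefs)

for _ in range(20000):
    # Two agents meet at random.
    i, j = random.sample(range(len(beliefs)), 2)
    mean = (beliefs[i] + beliefs[j]) / 2.0
    # Each partner moves halfway toward the pair's shared assessment,
    # a crude stand-in for mutual influence in an encounter.
    beliefs[i] += 0.5 * (mean - beliefs[i])
    beliefs[j] += 0.5 * (mean - beliefs[j])

spread_after = statistics.pstdev(beliefs)
print(spread_before, spread_after)  # dispersion collapses toward consensus
```

The actual SCT models additionally distinguish nominal categories and make influence depend on the emerging status ordering; the sketch only illustrates the micro-to-macro logic of encounter-driven belief convergence.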
11:20 - 11:40
Daniel Ciganda, Max Planck Institute for Demographic Research
The recuperation of fertility rates in some of the most advanced economies in the world triggered a reinterpretation of the link between development and fertility among researchers interested in demographic dynamics. The reversal of the long-term negative correlation between fertility rates and a series of development indicators at the macro level has been well documented for female labor force participation, the Human Development Index, and GDP per capita. Recent fertility theories have tried to explain the emergence of the U-shaped pattern in the relationship between development and fertility, suggesting that the same set of factors that reshaped women's roles and pushed fertility to low levels, namely increasing education and labor force participation, will drive a renewed increase of fertility levels as we move from a male-breadwinner equilibrium to a dual-earner dual-carer equilibrium. In this paper we present a computational model of reproductive decision-making that allows us to reproduce the observed pattern of decline and recuperation of fertility in the context of rather linear increases in the proportions of working women and women with higher education. We exploit the potential of the agent-based approach by modeling aggregate fertility patterns as the result of the interconnected decisions of individuals over the life course and across cohorts. We estimate the parameters of our model from macro-level data with the aid of a Gaussian process emulator, using data on the evolution of age-specific fertility rates in France and Spain, two countries that exhibit the pattern of decline and recuperation of period fertility but present very different economic, institutional, and policy contexts.
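The emulator-based calibration step can be sketched as follows. This is a minimal illustration, not the authors' pipeline: the one-parameter `simulator` function and all numbers are hypothetical stand-ins for the expensive agent-based model, and the Gaussian process serves as its cheap statistical surrogate:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Hypothetical stand-in for an expensive simulation model: it maps one
# parameter theta to one summary statistic of the simulated output.
def simulator(theta):
    return theta ** 2 + 0.1 * np.sin(5 * theta)

# Run the (expensive) simulator only at a small design of parameter values.
design = np.linspace(0.0, 2.0, 12).reshape(-1, 1)
outputs = np.array([simulator(t) for t in design.ravel()])

# Fit the emulator: a cheap surrogate trained on the design runs.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)
gp.fit(design, outputs)

# Calibrate: search densely over the parameter space using the emulator
# instead of the simulator, matching an "observed" statistic (generated
# here at the known value 1.3 so the answer can be checked).
observed = simulator(1.3)
candidates = np.linspace(0.0, 2.0, 2001).reshape(-1, 1)
predictions = gp.predict(candidates)
best = candidates[np.argmin((predictions - observed) ** 2), 0]
```

The real study matches whole age-specific fertility schedules rather than a single statistic, but the logic is the same: the emulator makes a dense parameter search affordable.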
11:40 - 12:00
Oliver Reinhardt, University of Rostock
Migration processes are influenced not only by changes in the physical, social, and political environment, which alone are already difficult to predict, but also by sequences of individual decisions that depend on the individuals themselves, their experiences, and the exchange of experiences between migrants. We apply agent-based modeling to gain a better understanding of how individual decisions and the exchange of information lead to the establishment of migration routes. To identify the specific challenges of this model, we have implemented two prototypes: one in a general-purpose programming language, and one in a domain-specific modeling language we developed earlier. From this, we have gained valuable insights both for modeling and for modeling language development.
12:00 - 13:15
13:20 - 13:40
Julius Zimmermann, University of Rostock
In different areas of medicine, electric stimulation is used to improve healing processes or to contribute to a constant alleviation of a disease's symptoms. Common examples are Deep Brain Stimulation or Electrical Muscle Stimulation. Less known are approaches in the relatively young field of tissue engineering. By exposing cell cultures or tissue samples to external electric fields, a range of positive effects is expected, spanning increased production of extracellular matrix proteins, increased proliferation, or differentiation of stem cells in a well-defined manner. To link the experimental findings to the employed electric stimulus, numerical simulations are needed. Depending on the research question, the simulation models have to include different scales. For that, different methods have to be coupled. To obtain a full picture of the stimulation effect, multi-physics models have to be employed.
In our talk, we will focus mainly on simulations using the Finite Element method. The model development will be discussed starting from the wet-lab scale and going down to the cellular scale.
13:40 - 14:00
Stefanie Kreß, University of Rostock
In recent years, various geometrical models have been developed to analyze the temperature distribution inside the human body, which is described by the bioheat equation of Pennes. Most of them exhibit simplified geometries consisting of, e.g., multi-segmented cylinders, each representing a different tissue layer. Only a few bioheat models consider a more realistic human geometry, though still with a limited number of tissue types.
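For reference, Pennes' bioheat equation balances heat conduction, blood perfusion, and metabolic heat generation in tissue:

```latex
\rho c \frac{\partial T}{\partial t}
  = \nabla \cdot \left( k \nabla T \right)
  + \rho_b c_b \omega_b \left( T_a - T \right)
  + Q_m
```

where \(\rho\), \(c\), and \(k\) are the tissue density, specific heat, and thermal conductivity; \(\rho_b\) and \(c_b\) the corresponding blood properties; \(\omega_b\) the blood perfusion rate; \(T_a\) the arterial blood temperature; and \(Q_m\) the metabolic heat source term.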
In this work, we consider a realistic human torso geometry, based on segmented magnetic resonance imaging data, and conduct a bioheat analysis using a finite element (FE) approach. Our tissue model consists of three different layers (skin, fat, and muscle) and assumes realistic material parameters. It accounts for blood perfusion, metabolic heat generation, and convection and radiation at the skin surface. Furthermore, we implement sweating as a thermoregulatory process. A newly designed thermoelectric generator (TEG) is embedded in this torso model; it is being developed to support electrically active implants within the Collaborative Research Centre 1270 ELAINE.
As a full-scale FE model is too large for efficient design optimization of the TEG, we apply mathematical methods of model order reduction to create compact but still accurate numerical models, which can be employed within a system-level simulation.
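The idea behind projection-based model order reduction can be sketched in a few lines. This is an illustrative example under simplifying assumptions (a generic 1-D diffusion system and a plain POD/Galerkin projection, not the specific reduction method or FE model from the talk):

```python
import numpy as np

# Full-order model: x' = A x, with A a 1-D discrete Laplacian, a crude
# stand-in for a large heat-conduction FE system.
n = 200
A = -2.0 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)

# Collect snapshots of the full model with explicit Euler time stepping.
dt, steps = 0.1, 100
x = np.sin(np.pi * np.arange(n) / (n - 1))
snapshots = [x.copy()]
for _ in range(steps):
    x = x + dt * (A @ x)
    snapshots.append(x.copy())
S = np.array(snapshots).T  # columns are states at successive times

# POD basis: leading left singular vectors of the snapshot matrix.
U, _, _ = np.linalg.svd(S, full_matrices=False)
r = 5
V = U[:, :r]            # n x r orthonormal projection basis
A_r = V.T @ A @ V       # r x r reduced operator (Galerkin projection)

# Simulate the compact model and lift the result back to full dimension.
z = V.T @ snapshots[0]
for _ in range(steps):
    z = z + dt * (A_r @ z)
x_rom = V @ z

# Relative error of the 5-dimensional model against the 200-dimensional one.
err = np.linalg.norm(x_rom - snapshots[-1]) / np.linalg.norm(snapshots[-1])
print(err)
```

A 200-state system is replaced by a 5-state one that reproduces the trajectory closely; the same projection principle is what makes system-level simulation with an embedded TEG model tractable.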
14:00 - 14:20
Fiete Haack, University of Rostock
Model development is a successive process of validating, revising, extending, and composing models, and requires the iterative execution of simulation experiments. The automatic generation, execution, and analysis of these simulation experiments provides valuable support in the process of developing models. A prerequisite is the explicit specification of simulation experiments and of model behavioral properties. To illustrate this, we present a case study in which a comprehensive cell-biological model of the canonical Wnt signaling pathway is developed by successively integrating individual model components that characterize specific spatio-temporal aspects of canonical Wnt signaling. Each individual model needed to be validated and probed for the expected spatio-temporal properties. Accordingly, the corresponding simulation experiments were annotated in the declarative domain-specific language SESSL (Simulation Experiment Specification via a Scala Layer), and statistical model checking was used to specify the expected behavioral properties (in this case, the specific spatio-temporal dynamics) of each model component. Thereby, the original experiment and model property specifications could be reused, adapted, and automatically applied to the extended model in each of the successive model extension steps.
14:20 - 14:40
Dagmar Waltemath, Institute of Community Medicine, University Medicine, University of Greifswald
A systems biology project creates heterogeneous, distributed, and often large sets of data, including experimental data, computational models, simulation setups, textual descriptions, data outputs, and analyses. The data items may be generated within and across multi-disciplinary research projects or reused from previous scientific studies. Results are typically published in scientific journals and model repositories. To handle this situation efficiently and to ensure confirmability and reproducibility of results, it is recommended to use standardised data formats and documentation whenever possible. The COMBINE initiative (http://co.mbine.org/) coordinates the development of such standards and guidelines for computational modeling in biology. In this talk, I will present the COMBINE initiative and showcase how a simulation study can be prepared for publication in an open model repository such that the results can be reproduced and thus easily reused by other scientists.
14:40 - 15:00
15:00 - 16:00
World Café - Methodological Challenges of Developing more than one Model
16:00 - 17:00
Presentation and Discussion of World Café Results
Closing remarks and end of workshop