IMA Annual Program Year Workshop
Development and Analysis of Multiscale Methods
November 3-7, 2008

Frank Brown, University of California
Anne Chaka, National Institute of Standards and Technology
Gero Friesecke, Technical University of Munich
Kurt Kremer, Max-Planck Institut für Polymerforschung
Yousef Saad, University of Minnesota, Twin Cities

Theoretical, computational, and experimental approaches to problems in the natural sciences typically focus on particular aspects of the phenomena or systems under study. This is linked to the need to structure the questions with respect to the most relevant length and time scales, a need that arises from the limited range of applicability of specific experimental and theoretical tools. In the past this focus has driven enormous progress and is the basis of our current understanding of physical, chemical, and biological systems. For example, in the area of phase transitions and critical phenomena, renormalization group theory has shown that many properties, such as critical exponents or ratios of critical amplitudes, do not depend on the microscopic details of the studied system. Within each universality class it is therefore sufficient, for many properties, to study highly idealized model systems. Details of the models do, however, determine the transition temperature and the absolute amplitudes. Similar examples could be given in many other areas: the mechanical responses of bulk solids, thin films, or biological membranes, for instance, are to a large extent governed by a small number of universal models, but the constitutive parameters depend crucially on the details of the underlying microstructure. In a computer simulation it would in principle be possible to study systems on huge length scales and for long times (e.g., fracture mechanics based on an all-atom simulation, or the function of a membrane protein in a fully fluctuating membrane) if all interactions were treated in full and infinite CPU time were available. Neither is the case, and such an ansatz would probably also produce too much information, obscuring a more general understanding.

Out of this need, scale-bridging or multiscale simulation methods have been under development at many places for several years now. They are still in their infancy, and the many different ideas have not yet converged into one or several generally accepted and validated schemes. Because of this, this fairly young and critical area of computational science can benefit greatly from advances in mathematics. Conversely, emerging computational experience with truly multiscale systems can serve as a great stimulus to mathematical understanding, which at present remains at its most thorough for two-scale systems (as treated, e.g., in classical and stochastic homogenization theory or Gamma-convergence).

Examples of truly multiscale systems include biological ion channels, proteins, emulsions, functional materials, and quantum dots. They require new methods to address challenges such as hierarchies of structural organization, fluctuating (electrostatic) fields, the simultaneous treatment and interdependence of very short-range and very long-range interactions, and approximate Hamiltonians to model the dynamics and reactivity of tens of thousands of atoms. To achieve this, coupling schemes between different scales have to be developed, which include systematic coarse-graining strategies, appropriate interaction potentials and force fields, and methods to link studies on different scales and tune the resolution of the computer model to coarser and finer levels as needed. Ultimately such schemes have to include classical as well as quantum methods. All this requires new approaches beyond conventional computer modeling.
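To make the idea of systematic coarse graining concrete, the simplest such mapping replaces groups of atoms by single beads placed at the group's center of mass. The sketch below is a minimal, hypothetical illustration of that step (the function name, the toy coordinates, and the two-beads-from-four-atoms mapping are all assumptions for illustration, not a method discussed at the workshop):

```python
import numpy as np

def coarse_grain(positions, masses, mapping):
    """Map an all-atom configuration onto coarse-grained beads.

    positions: (N, 3) array of atom coordinates
    masses:    (N,) array of atom masses
    mapping:   list of index lists, one per bead; each bead is placed
               at the center of mass of its atom group
    """
    beads = []
    for group in mapping:
        m = masses[group]
        # mass-weighted average position of the group
        beads.append(m @ positions[group] / m.sum())
    return np.array(beads)

# toy example: 4 atoms mapped onto 2 beads
pos = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0],
                [4.0, 0.0, 0.0], [6.0, 0.0, 0.0]])
mass = np.array([1.0, 1.0, 2.0, 2.0])
cg = coarse_grain(pos, mass, [[0, 1], [2, 3]])
```

The hard part, which the workshop questions below address, is not this geometric mapping but deriving interactions between the resulting beads that reproduce the statistics of the underlying atomistic system.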

The workshop aims to address a number of exemplary questions. How does one parameterize coarse-grained interaction potentials for bonded and nonbonded interactions? The latter is especially delicate for soft matter because of the huge size of the molecules. What is the best point or regime in parameter and phase space at which to hand over from one level of description to another? How do errors propagate from one level to the next, and what are the consequences when one wants to fine-grain again? How specific or transferable are models and methods, or are there general strategies to follow? Do we have strategies and general criteria for validation beyond trivial tests? Coarse graining is a mapping of scales, but how does this work for nonequilibrium systems and time scales, i.e., for studying dynamics? All these questions will be addressed and discussed in terms of basic concepts as well as specific applications.
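One standard first answer to the parameterization question for nonbonded interactions is direct Boltzmann inversion, U(r) = -kT ln g(r), which turns a measured radial distribution function g(r) into a first-guess pair potential; iterative schemes then refine it against the target structure. A minimal sketch, assuming a toy g(r) and a temperature of roughly 300 K (the function name and the numerical values are illustrative assumptions):

```python
import numpy as np

KT = 2.494  # k_B * T in kJ/mol at ~300 K (assumed for illustration)

def boltzmann_invert(g_r, kt=KT):
    """Direct Boltzmann inversion: U(r) = -kT ln g(r).

    Yields a first-guess coarse-grained pair potential from a radial
    distribution function; iterative refinement would be needed to
    correct for correlations between distances.
    """
    g = np.clip(g_r, 1e-10, None)  # avoid log(0) where g(r) vanishes
    return -kt * np.log(g)

# toy RDF: depleted core, mild first peak, ideal-gas-like tail
g = np.array([0.1, 0.8, 1.2, 1.05, 1.0])
U = boltzmann_invert(g)
```

By construction, regions where g(r) < 1 map to repulsive (positive) potential values, the first peak maps to an attractive well, and g(r) = 1 maps to U = 0, as expected for large separations.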
