C.3.2 Multiscale spatial simulation of landscape processes based on the duality of particles and fields

Landscape process simulations will provide methods for analysis of the current state of the landscape, predictions for evaluation of current and planned land use impacts, and tools for intelligent landscape manipulation and optimization, with the potential to develop innovative conservation strategies.

Process-based modeling of geospatial phenomena is difficult and involves much more uncertainty than the highly developed computational methods of physics or chemistry. Key reasons include the complexity of landscape phenomena, the multitude of processes acting across a range of scales, and the frequent lack of data. Practical solutions rely on the best available combination of physical models, empirical evidence, experience from previous studies, and measured data. Landscape processes are therefore often described by a combination of physically-based and empirical models, and integration with field measurements (data adaptive simulation) is especially important for increasing the reliability of predictions. Most physically-based models are built around a set of governing equations that must be solved numerically.

Spatial processes described by the governing differential equations are usually solved by discretization methods. Typical approaches include finite difference (REF: Julien and Saghafian, ...) and finite element methods (REF: Burnett, 1987; SMS, GMS, WMS). Besides these approaches we also plan to explore alternatives such as path sampling (similar methods are known as path integrals in physics and as random walks in stochastic processes). Path sampling is based on the property that a field can be represented by an ensemble of sampling points: for example, a scalar field can be defined by the density of sampling points in space. The correspondence is also valid in the opposite direction, and ensembles of particles are very often described by continuous fields. This duality of field <=> particle density is routinely employed in physics to reformulate and solve complicated problems involving interacting systems with many degrees of freedom (Gardiner 1985).

The path sampling method has been successfully used for linear or weakly nonlinear transport or time propagation problems which involve processes such as diffusion, advection, rate (proliferation/decay), reactions and others. The method (under various names and modifications) has been applied in physics, chemistry, finance and other disciplines (REF LUBOS add). It has several important advantages over more traditional approaches: it is very robust, easily extended to an arbitrary dimension, mesh-free, and very efficient on parallel architectures (as an example of "embarrassing parallelism", it is efficient even on loosely coupled heterogeneous clusters of PCs and workstations). The method is also rather straightforward to implement in a multi-scale framework with data adaptive capabilities. For landscape applications, the method has been used very successfully for distributed modeling of overland water and sediment flow and erosion/deposition (Mitas and Mitasova 1998), including multi-scale applications (XXX). Path sampling approaches also have promising potential for a number of other geoscientific applications.
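As an illustration of the field <=> particle duality, the following minimal sketch (in Python, with hypothetical parameter values; not the project's implementation) propagates an ensemble of walkers through a 2D advection-diffusion process and recovers the scalar field as the walker density on a raster:

    import numpy as np

    # Minimal sketch (hypothetical parameters, not the project's code):
    # path sampling for a 2D advection-diffusion process. The field is
    # represented by an ensemble of walkers and recovered as their density.
    rng = np.random.default_rng(42)

    n_walkers = 100_000              # ensemble size controls sampling noise
    diffusion = 0.1                  # diffusion coefficient D
    velocity = np.array([1.0, 0.5])  # constant advection velocity v
    dt, n_steps = 0.01, 200

    pos = np.zeros((n_walkers, 2))   # all walkers start at a point source

    for _ in range(n_steps):
        # advection = deterministic drift v*dt;
        # diffusion = Gaussian step with std sqrt(2*D*dt)
        pos += velocity * dt + rng.normal(0.0, np.sqrt(2 * diffusion * dt),
                                          pos.shape)

    # field <=> particle density: histogram the ensemble onto a raster
    field, _, _ = np.histogram2d(pos[:, 0], pos[:, 1], bins=100,
                                 range=[[-1.0, 5.0], [-2.0, 4.0]])
    field /= n_walkers               # normalize counts to per-cell fractions

Rate processes (proliferation/decay) would be treated by splitting or removing walkers, or by carrying per-walker weights; because the walkers do not interact in this linear regime, the loop parallelizes trivially, which is the source of the "embarrassing parallelism" noted above.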

Many natural processes involve more than a single scale and exhibit multi-scale, multi-process phenomena (Steyaert 1993, Green et al. 2000). The problem of multiple scales now permeates several scientific disciplines such as materials research, geosciences, and biology. Some multiscale problems can be partitioned into a hierarchy of nested models in the direction from fine to coarse scales. The model on a given scale incorporates simplified or "smoothed out" effects coming from the finer, more accurate levels. In the direction from coarse to fine scales, one develops a set of effective embeddings which determine boundary and/or external conditions for the processes at finer scales. The high accuracy, resolution and processes of the fine scales are then used only in the "hot spots" of the studied system which require such a treatment.
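A hedged sketch of this coarse-to-fine nesting is given below; run_model, the hot-spot threshold, and the refinement factor are hypothetical stand-ins for an actual process model:

    import numpy as np

    # Hedged sketch of coarse-to-fine nesting; run_model is a hypothetical
    # stand-in for an actual process model (here: slope as a proxy flux).
    def run_model(elevation, cellsize):
        gy, gx = np.gradient(elevation, cellsize)
        return np.hypot(gx, gy)

    coarse_dem = np.random.default_rng(0).random((50, 50))
    coarse_flux = run_model(coarse_dem, cellsize=30.0)

    # "hot spots": cells whose flux exceeds the 99th percentile
    hot = np.argwhere(coarse_flux > np.percentile(coarse_flux, 99))

    for i, j in hot:
        # embed a fine-scale model in a window around each hot spot;
        # the coarse solution supplies the boundary/external conditions
        window = coarse_dem[max(i - 2, 0):i + 3, max(j - 2, 0):j + 3]
        fine_dem = np.kron(window, np.ones((10, 10)))  # naive 10x refinement
        fine_flux = run_model(fine_dem, cellsize=3.0)
        # ...fine results would be smoothed back into the coarse field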

To improve the accuracy and reliability of the models, the simulations will be coupled with field data. We will investigate the impact of the location and temporal interval of sampling, and the possibilities of finding an optimal, cost effective relation between models and measurements: identifying the point at which additional measurements are no longer necessary, and adapting the sampling interval to the monitored conditions (e.g., short intervals during storms, long intervals during stable dry weather; short intervals/higher resolution during construction, long intervals/lower resolution after development is finished). The models will use the field data to adapt the simulation by changing the simulation conditions (terrain shape, transport parameters), by calibrating the simulated water depth against the measured one, and by similar combinations of computational and real experiments.
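As a simple illustration of such data-adaptive monitoring, the rule below (thresholds are purely illustrative assumptions, not calibrated values) selects a sampling interval from the observed conditions:

    def sampling_interval(rain_rate_mm_h, construction_active):
        # Illustrative data-adaptive rule; thresholds are assumptions.
        if construction_active or rain_rate_mm_h > 5.0:
            return 10        # minutes: dense sampling (storm/construction)
        if rain_rate_mm_h > 0.0:
            return 60        # moderate sampling during light rain
        return 24 * 60       # daily sampling in stable dry weather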

Based on the experience with the previous successful development and application of the path sampling method, the underlying algorithms will be further enhanced and implemented as robust simulation modules/components which will make it possible to build more sophisticated models of various spatial processes.

Figure 3: Multiscale path sampling simulations illustrating different effects at different scales.
 

C.3.3 Spatial optimization

We plan to develop intelligent tools for landscape manipulation and for the optimization of complex spatial systems, with applications to land use optimization, improving conservation measures while minimizing cost, and similar tasks, seeking solutions within prescribed constraints.

One of the key ingredients is a framework for the optimization of multi-variate objects (e.g., the spatial distribution of conservation measures). In order to formulate, quantify and solve these tasks we will use the formulation of a 'cost or penalty' functional with two types of inputs. The first is a set of fields f_k(r,t) which represent the natural processes and phenomena of interest (for example, the sediment flux resulting from a rainfall event). The set of fields g_l(r,t), on the other hand, describes the spatial distribution of quantities or measures which represent the anthropogenic activities to be optimized (e.g., the distribution of grass strips/buffers to prevent the formation of highly concentrated flows). We assume that once the fields g_l(r,t) are specified we can find the corresponding f_k(r,t) by solving the appropriate model, by using measured data, or both.

The cost functional is then written as an integral over the spatial domain \Omega and time interval T

C[f_k(r,t), g_l(r,t), p_m] = \int_{\Omega,T} F[f_k(r,t), g_l(r,t), p_m] \, dr \, dt           (1)

where the function F determines the local cost for a given configuration of the fields f_k(r,t) and g_l(r,t), p_m are additional scalar parameters, and k, l, m are enumerating indices. The optimal solution is given by the configuration of fields g_l^{opt}(r,t) which fulfills the minimization condition

C[f_k(r,t), g_l^{opt}(r,t), p_m^{opt}] = min            (2)

and at the same time fulfills any additional prescribed constraints. The constraints can have various forms, such as non-negativity of a particular field, g_l(r,t) >= 0, a prescribed interval of values, or continuity or compactness of the spatial distribution. For the actual minimization (2) of the functional (1), it is important to specify how the fields g_l(r,t) can be varied efficiently and to choose appropriate optimization techniques.
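In practice the functional (1) is evaluated numerically. On a raster with cell area ΔA and time step Δt it reduces, for example, to the sum

C ≈ \sum_{n} \sum_{i} F[f_k(r_i,t_n), g_l(r_i,t_n), p_m] \, ΔA \, Δt

so that each evaluation of C for a trial configuration of g_l generally requires a model run (or a lookup of measured data) to obtain the corresponding fields f_k.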

In order to transform the variation of multi-variate fields into a variation of simple variables, we will employ expansions in several types of basis sets. The multi-variate field is expressed as a linear combination of basis functions, and the expansion coefficients become the variational variables. The basis sets we plan to explore include a raster or block-raster basis and local continuous functions such as Gaussians and the regularized spline with tension. The choice will depend on the actual application and will be determined by considerations such as the required resolution and accuracy, continuity, or the character of the constraints.
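For instance, with a Gaussian basis the field is evaluated from a small coefficient vector; the sketch below (illustrative names and sizes, not a prescribed implementation) shows the expansion that turns field variation into coefficient variation:

    import numpy as np

    # Illustrative sketch: express a measure field g(r) as a linear
    # combination of fixed Gaussian basis functions, so the optimizer
    # varies only the coefficient vector. Names and sizes are assumptions.
    def gaussian_basis(points, centers, width):
        # matrix phi[i, j] = j-th basis function evaluated at point i
        d2 = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-d2 / (2.0 * width ** 2))

    x = np.linspace(0.0, 1.0, 50)
    points = np.array([(xi, yi) for xi in x for yi in x])      # raster cells
    centers = np.array([(xi, yi)
                        for xi in np.linspace(0.0, 1.0, 5)
                        for yi in np.linspace(0.0, 1.0, 5)])   # 25 basis fns

    phi = gaussian_basis(points, centers, width=0.15)  # shape (2500, 25)
    coeffs = np.zeros(len(centers))   # the variational variables
    g = phi @ coeffs                  # the field g(r) on the raster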

The optimization of a large number of expansion coefficients can quickly become intractable, so the number of basis functions, the desired accuracy and the optimization methods will have to be chosen judiciously. There are essentially two limiting types of "cost landscape", which also determine the choice of appropriate optimization strategies. In the case of a single or a few local minima, standard methods from optimization libraries, such as efficient quasi-Newton methods, can be used. At the other extreme is a cost landscape with a large number of local extremes of almost the same cost, or with hierarchical structure. For these cases we plan to use robust minimization techniques based on simulated annealing or genetic algorithms (Kirkpatrick 1983, Goldberg 1989). Most likely, in real situations one will need a combination of both types of methods to achieve the desired generality and robustness.
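A minimal simulated-annealing loop over the expansion coefficients might look as follows; the cost function is a toy stand-in for evaluating the functional (1), and all parameters are illustrative assumptions:

    import numpy as np

    # Minimal simulated-annealing loop over expansion coefficients
    # (in the spirit of Kirkpatrick 1983). cost() is a toy stand-in
    # for evaluating the functional (1); all parameters are illustrative.
    def cost(coeffs):
        return np.sum((coeffs - 0.3) ** 2)   # toy single-minimum surface

    rng = np.random.default_rng(1)
    coeffs = rng.random(25)                  # initial coefficient vector
    c, temperature = cost(coeffs), 1.0

    for step in range(10_000):
        trial = coeffs + rng.normal(0.0, 0.05, coeffs.shape)
        trial = np.clip(trial, 0.0, None)    # enforce g >= 0 constraints
        c_trial = cost(trial)
        # Metropolis rule: accept improvements, occasionally accept uphill
        # moves so the search can escape local minima
        if c_trial < c or rng.random() < np.exp(-(c_trial - c) / temperature):
            coeffs, c = trial, c_trial
        temperature *= 0.9995                # slow geometric cooling

The clipping step shows one simple way to enforce non-negativity constraints of the type discussed above; more complex constraints (continuity, compactness) would require penalty terms in the cost functional itself.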

In addition, we will explore the combination of computer optimization with human-computer interactive optimization steering. It is a common problem in large scale optimizations that even the combination of the best strategies is inefficient in finding the right "basin" where the optimal or acceptable solutions are located. Sometimes such a region can emerge on a qualitatively different level of detail and representation than the one the optimization methods deal with (i.e., large vectors of variational parameters). We plan to investigate whether human cognitive capabilities, coupled with the computer through appropriate visualization, can quickly extract such qualitatively higher-level information. Subsequent fast feedback to the optimizer would then speed up the convergence towards an acceptable solution by a significant factor. In our preliminary trials on similar problems we have found that the combination of human brain and computer is surprisingly efficient overall, especially when one accounts for the necessity of searching for new methods when the implemented approaches fail.