C. Project description
ITR/AP(BCS/GRS):
Computational dynamic landscape manipulation/optimization for OPEN source GIS
C.1 Background
New advanced mapping and monitoring technologies generate large volumes of high resolution spatio-temporal data and offer a unique opportunity to dramatically improve land management, reduce environmental pressures and protect biodiversity. The major challenge for the rapidly evolving Geographic Information Science (GIS) technology is to provide tools for efficient data analysis and for environmentally and economically sound land use management, with the focus on prevention rather than just remediation of problems. Huge resources are being invested in watershed and coastal management (for example, Illinois is spending $500 million on Illinois River conservation, REF, $1.8 billion is planned for the North Carolina coast, REF, and $7 billion for Everglades restoration, REF) with significant uncertainty about the effectiveness and long term impact of the ongoing programs. There are also numerous federal, state and local government regulations aimed at environmental protection which often pose a significant challenge for land owners and developers because of difficulties with the localization and control of non-point source pollution. It is therefore crucial to speed up the relevant landscape modeling/management research so that it can provide the tools to evaluate the impact of changes before they are implemented and to support the design of new, effective conservation and pollution prevention measures.
Landscape-scale land management strategies are extremely difficult to test experimentally because of the cost and time requirements of such experiments and because of the limited possibilities to evaluate their functioning under extreme conditions. Experimental models in laboratories often behave quite differently from large scale real-world systems, limiting the applicability of their results. Field experiments can support the study of certain properties of conservation measures; however, they are constrained both spatially and temporally and cannot capture the long range interactions typical for fluxes in complex landscapes. According to the NRC report "New Strategies for America's Watersheds", "One area of special promise to address these issues is simulation modeling which can give decision makers interactive tools for both understanding the system and judging how management actions might affect that system" (NRC, 1999). Computational spatial simulations belong to the "young" methodologies which were developed only over the past few decades and whose progress is closely tied to advances in computer performance. As such, they have their own rules, challenges, successes and limitations. Model development and verification, algorithms, data structures, advanced visualization and parallel computing play crucial roles and require close collaboration between traditional research disciplines and computational science. In spite of significant progress in GIS technology and environmental modeling (Goodchild et al. 1993, 1996; Parks et al. 2001; ...; Boyle et al. 1998; Kellershohn et al. 1999) there are persistent problems with the accuracy and reliability of spatial simulations (NRC 1999), preparation of data is often time consuming, and running the models requires substantial expertise. Also, the modeling efforts have focused on assessment and prediction, while their use for the development of new land management and conservation strategies remains largely unexplored.
C.2 Objectives
The proposed project aims at the development of a new type of landscape modeling and management support. To a certain extent, we plan to build upon the experience with the design of new materials in physics and chemistry based on virtual environments and simulations. In these disciplines new materials and structures are investigated by "virtually experimenting" with a prototype atomic structure which is modified by direct human input with subsequent computer refinement. Such a cycle is repeated until a stable structure is found. In the case of landscape management and design the manipulated object is "too big" for direct experiment and we therefore propose to use its virtual representation and simulations to find favorable "landscape structures" and to interactively investigate various land management alternatives to achieve an evolving, yet sustainable landscape.
This approach to computational land use management requires new developments in several areas of geographic information science, with this project focusing on the following (Figure 1):
GIS supported multiscale, 3D dynamic representation of a studied landscape directly linked with field sensors providing continuous updates on the landscape state and its changes;
spatial simulation of landscape processes at multiple spatial and temporal scales with data adaptive capabilities;
optimization of complex spatial systems;
an intelligent virtual environment for interaction with landscapes (IVESIL?);
open source GIS implementation.
The developed methods and tools will be tested using two different study areas: a) the North Carolina State University Centennial Campus, a new campus site for a "Technopolis" community which combines education, research, business and recreation activities in an environmentally sensitive and attractive manner, and b) a section of the highly dynamic North Carolina coast. Development and application of the proposed methods will make it possible to carry out landscape simulations in 3D space over a range of seamlessly embedded scales, with the modeled processes adapting to scale and to real time field measurements. The project aims to reach beyond risk assessment and prediction, towards planning and landscape-scale design through data adaptive simulations and spatial optimization.
C.3 Methods - general plan, design of activities
The research will be based on a multidisciplinary approach involving close collaboration between computer science, earth science, GIScience and computational physics.
Relation to work in progress by the PIs and long-term goals of the PIs' projects.
C.3.1 Multiresolution, 3D dynamic landscape model linked with field sensors
A GIS supported model of the studied landscape will provide the basis for landscape monitoring, simulation and manipulation. The multiresolution landscape model will be based on the combination of a regional scale model with embedded high resolution models for "hot spots". The low resolution models will be created using the widely available USGS data at 10-30m resolution, while rapidly developing areas will be mapped and monitored at much higher levels of detail using state-of-the-art mapping technology (such as LIDAR, RTK GPS, etc.) (multiresolution, Figure 1). The model will be spatio-temporal, to capture the dynamics of landscape evolution at different temporal scales.
An important component of the landscape model will be its two way link with mobile mapping and field monitoring units, with the model providing support for field observations and with the field data "feeding"/updating the model. The concept of moving the 3D GIS from the workstation to a mobile computational unit will be tested by exploring the use of augmented reality for effective selection of monitoring and high resolution mapping sites: when selecting these sites the researcher will be able to view a virtual representation of the landscape at different future stages of development on a mobile computer and identify locations where monitoring would be critical during construction and after the development (we can start with a laptop and then move to wearable units - IBM is reportedly developing wearable computers with a scaled-down Linux). The model will be continuously updated by field mapping and monitoring of selected landscape features/fields, supported by leveraged projects (ARO, WRRI, NRC), using RTK GPS, ancillary laser-rangefinder technologies, geo-referenced video surveys and automated water, sediment and pollutant samplers.
Expand and insert in the proper place: accuracy (spatial, temporal and quantitative) is a continuing problem - models still rely on a large number of empirical parameters which are not generally applicable.... One promising approach (used in meteorology, Dangermeier; for groundwater pollution, MIT) is the integration of measurements into models (check what exactly data assimilation and the variational approach to modeling are ....); rather than measuring a number of additional empirical parameters, the modeled phenomenon itself is measured and ...
Figure 2: Map with the locations of monitoring stations.
Tom, please add your input here; for this project the linkage between the desktop and the field will be important, and a concrete example of what is realistically possible with the instruments available at NCSU should be included.
C.3.2 Multiscale spatial simulation of landscape processes based on the duality of particles and fields
Landscape process simulations will provide insights into the current state of landscape processes, predictions for evaluating the impacts of current and planned land use, and tools for intelligent landscape manipulation and optimization, with the potential to develop innovative conservation strategies.
Process-based modeling of geospatial phenomena is difficult and involves a much higher level of uncertainty than, for example, simulations of microscopic phenomena in physics or chemistry, which are based on virtually exact theories (statistical and quantum mechanics). The key reasons are the complexity of landscape phenomena, the multitude of processes across a range of scales, nonequilibrium phenomena and/or the lack of experimental data. Practical solutions rely on the best available combination of physical models, empirical evidence, experience from previous studies and available measurements. Landscape processes are therefore often described by a combination of physically-based and empirical models, and integration with field measurements (data adaptive simulation) is especially important for increasing the reliability of predictions. Typically, the physically-based models have the following components:
Model constituents with corresponding physical quantities such as concentration, density, velocity, etc. In general, the physical quantities depend on position in space and on time and can be characterized either as fields, particles or ensembles of particles.
Configuration space and its range of validity for fields and/or particles. This includes specification of initial, external or boundary conditions, as well as physical conditions and parameters.
Interactions between the constituents such as impact of one field on another, interactions between particles and fields, etc.
Governing equations derived from natural laws which describe the behavior of the system in space and time. Typical examples are continuity, mass and momentum conservation, diffusion-advection, reaction kinetics and similar types of equations.
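As a simple illustration (the actual set of governing equations will depend on the modeled process), a typical diffusion-advection equation with rate and source terms for a field c(r,t), such as a sediment or pollutant concentration, can be written as

\partial c(r,t)/\partial t = \nabla \cdot [D \nabla c(r,t)] - \nabla \cdot [v(r,t) c(r,t)] + k c(r,t) + S(r,t)

where D is a diffusion coefficient, v(r,t) is the flow velocity field, k is a rate (proliferation/decay) constant and S(r,t) is a source/sink term; initial and boundary conditions complete the specification.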
Spatial processes described by the governing partial differential equations are usually solved by discretization methods. Typical approaches include finite difference (REF, Julien and Saghafian, ...) and finite element methods (REF: Burnett, 1987, SMS, GMS, WMS). Besides these approaches we also plan to explore alternatives such as path sampling (similar methods are known as path integrals in physics or random walks in stochastic processes). Path sampling is based on the property that essentially any field can be represented by an ensemble of sampling points. For example, a scalar field can be determined by the distribution (density) of sampling points in the corresponding region of space. This relation is also valid in the opposite sense, and very often ensembles of particles are described by continuous fields. The duality of field <=> particle density is routinely employed in physics and helps to reformulate and solve complicated problems involving interacting systems with many degrees of freedom. Path sampling methods have been successfully used for linear or weakly nonlinear transport or time propagation problems which involve processes such as diffusion, advection, rate (proliferation/decay), reactions and others.
The method (under various names and modifications) is being used in physics, chemistry, finance and other disciplines (REF - Lubos to add). It has several important advantages when compared with more traditional approaches. The method is very robust, can be easily extended to arbitrary dimensions, is mesh-free, and is very efficient on parallel architectures including heterogeneous clusters of PCs and workstations. The method is also rather straightforward to implement in a multi-scale framework with data adaptive capabilities. For landscape applications, the method has been used very successfully for distributed modeling of overland water and sediment flow and for erosion/deposition studies (Mitas and Mitasova 1998), including multi-scale applications (XXX).
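For illustration, a minimal sketch of the path sampling idea for a diffusion-advection problem is given below; the grid size, time step, diffusion coefficient and velocity are invented for the example, and the proposed modules will be considerably more general (variable velocity fields, rate and source terms, multiscale nesting):

# Minimal sketch of path sampling for 2D diffusion-advection: a field is
# represented by an ensemble of sampling points (walkers) whose density
# recovers the continuous field. All parameters are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

nx, ny = 100, 100           # grid cells for the recovered field
dt, nsteps = 0.1, 200       # time step and number of steps
D = 0.5                     # diffusion coefficient
vx, vy = 1.0, 0.3           # uniform advection velocity (assumed)

# release the ensemble of sampling points at a "source" location
walkers = np.tile(np.array([10.0, 50.0]), (20000, 1))

for _ in range(nsteps):
    drift = np.array([vx, vy]) * dt
    noise = rng.normal(scale=np.sqrt(2.0 * D * dt), size=walkers.shape)
    walkers = walkers + drift + noise
    # absorbing boundaries: drop walkers that leave the region
    inside = ((walkers[:, 0] >= 0) & (walkers[:, 0] < nx) &
              (walkers[:, 1] >= 0) & (walkers[:, 1] < ny))
    walkers = walkers[inside]

# field <=> particle density: recover the field from the walker distribution
density, _, _ = np.histogram2d(walkers[:, 0], walkers[:, 1],
                               bins=(nx, ny), range=[[0, nx], [0, ny]])
print("walkers remaining:", len(walkers), "peak cell count:", density.max())

Because the walkers evolve independently, such a simulation parallelizes trivially, which underlies the efficiency on heterogeneous clusters noted above.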
Many natural processes involve more than a single scale and exhibit multi-scale, multi-process phenomena (Steyaert 1993, Green et al. 2000). The problem of multiple scales now permeates several scientific disciplines such as materials research, physics, and biology. Many multiscale problems can be partitioned into a hierarchy of effective models which are nested in the direction from fine to coarse scales. The model on a given scale incorporates simplified or "smoothed out" effects coming from the finer, more accurate levels. In the direction from coarse to fine scales, one develops a set of effective embeddings which determine boundary and/or external conditions for the processes at the finer scales. The high accuracy, resolution and processes of the fine scales are then used only in the "hot spots" of the studied system which require such a treatment. Implementation of the multiscale approach: to be adapted from the Vienna paper, REF.
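The coarse-to-fine embedding can be sketched as follows (a toy 1D diffusion example with invented grid sizes and parameters; the actual implementation will follow the approach referenced above): the coarse model supplies boundary conditions to a refined model covering a "hot spot", and the smoothed fine-scale solution is fed back to the coarse grid.

# Toy sketch of nested multiscale embedding: a coarse 1D diffusion model
# provides boundary conditions for a 10x finer model inside a "hot spot".
import numpy as np

D, dt = 1.0, 0.001
Lc, Lf = 100, 10                                 # coarse cells; refinement factor
coarse = np.zeros(Lc); coarse[Lc // 2] = 100.0   # initial "pulse"
hot_lo, hot_hi = 40, 60                          # hot-spot extent in coarse cells
fine = np.repeat(coarse[hot_lo:hot_hi], Lf)      # fine grid inside the hot spot
dx_c, dx_f = 1.0, 1.0 / Lf

def diffuse(u, dx, d, dt):
    # explicit finite-difference diffusion step (interior points only)
    un = u.copy()
    un[1:-1] += d * dt / dx**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    return un

for step in range(500):
    coarse = diffuse(coarse, dx_c, D, dt)
    # embedding: boundary values of the fine model come from the coarse model
    fine[0], fine[-1] = coarse[hot_lo], coarse[hot_hi - 1]
    for _ in range(Lf):                          # sub-cycle the fine model
        fine = diffuse(fine, dx_f, D, dt / Lf)
    # upscaling: feed the smoothed fine solution back to the coarse cells
    coarse[hot_lo:hot_hi] = fine.reshape(hot_hi - hot_lo, Lf).mean(axis=1)

print("coarse total after nesting:", round(coarse.sum(), 2))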
To improve the accuracy and reliability of the models, the simulations will be coupled with field data. We will investigate the impact of the location and temporal interval of sampling and the possibilities of finding an optimal/cost effective relation between the models and measurements (at which point additional measurements are not necessary; adapting the time interval of sampling to the monitored conditions - e.g., a short interval during storms and a long interval during stable dry weather, a short interval/higher resolution during construction and a long interval/lower resolution after development is finished). The models will use the field data to adapt the simulation by changing the conditions of the simulation (terrain shape, transport parameters), calibrating the simulated water depth against the measured one, and similar combinations of computational and real experiments. The path sampling methods are particularly appropriate for adaptive strategies because the positions of sampling points can be easily biased through translations, rotations or space-deforming transformations. Since there is no need to rebuild any underlying finite element mesh, such procedures are very fast and straightforward to implement.
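A deliberately simple sketch of one such data adaptive step is shown below: a simulated water depth grid is relaxed toward sparse gauge measurements. The gauge locations, weights and relaxation scheme are invented placeholders; the project will investigate proper assimilation strategies rather than this naive nudging.

# Sketch of a simple data-adaptive step: nudge a simulated water-depth grid
# toward sparse gauge measurements (illustrative stand-in for assimilation).
import numpy as np

depth_sim = np.random.default_rng(1).uniform(0.0, 0.5, size=(50, 50))  # depth in m
gauges = {(10, 12): 0.42, (30, 8): 0.15, (44, 40): 0.30}  # (row, col): measured depth

alpha = 0.5          # relaxation factor toward the measurements
radius = 5           # cells over which a gauge influences the field

rows, cols = np.indices(depth_sim.shape)
for (gr, gc), measured in gauges.items():
    dist = np.hypot(rows - gr, cols - gc)
    w = np.clip(1.0 - dist / radius, 0.0, 1.0)      # linear taper to zero
    depth_sim += alpha * w * (measured - depth_sim)

print("adjusted depth at gauge (10, 12):", round(depth_sim[10, 12], 3))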
Based on the experience with the previous successful development and application of the path sampling method, the underlying algorithms will be further enhanced and implemented as robust simulation modules/components which will make it possible to build more sophisticated models of various spatial processes.
Figure 3: Multiscale path sampling simulations - different effects at different scales.
C.3.3 Spatial optimization
We plan to develop intelligent tools for the manipulation and optimization of complex spatial systems, with applications to land use optimization, optimization of conservation measures and similar land management tasks.
One of the key ingredients is an appropriate framework for the optimization of multi-variate objects (e.g., the spatial distribution of conservation measures). In order to formulate, quantify and solve these tasks we will use a 'cost or penalty' functional with the following two types of inputs. The first is a set of fields {f_k(r,t)} which represent the natural processes and phenomena of interest (for example, the sediment flux resulting from a rainfall event). The second is a set of fields {g_l(r,t)} which describe the spatial distribution of quantities or measures representing the anthropogenic activities to be optimized (e.g., the distribution of grass strips/buffers to prevent the formation of highly concentrated flows, etc.). We assume that once the fields g_l(r,t) are specified we can find the corresponding f_k(r,t) either by solving the appropriate model, by using measured data, or both.
The cost functional is then written as an integral over the spatial domain \Omega and time interval T

C[f_k(r,t), g_l(r,t), p_m] = \int_{\Omega,T} F[f_k(r,t), g_l(r,t), p_m] \, dr \, dt     (1)

where the function F determines the local cost for a given configuration of the fields f_k(r,t) and g_l(r,t) and parameters p_m, and k, l, m are enumerating indices. The optimal solution is given by the configuration of fields g_l^{opt}(r,t) and parameters p_m^{opt} which fulfills the minimization condition

C[f_k(r,t), g_l^{opt}(r,t), p_m^{opt}] = min     (2)

subject to any additional constraints, such as the non-negativity of a particular field (g_l(r,t) > 0), a prescribed interval of values, or continuity or compactness of the spatial distribution.
In order to transform the variation of the distributed fields into a variation of ordinary variables we will employ expansions in several types of basis sets. A multi-variate field can be expressed as a linear combination of basis functions and the expansion coefficients then assume the role of variational variables. The basis sets we plan to explore include a raster or block-raster basis and local continuous functions such as Gaussians and regularized splines with tension (all of these can be used in an arbitrary dimension). The choice will depend on the actual application and will be determined by considerations such as the required resolution and accuracy, continuity, the character of the constraints and computational feasibility.
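For a single optimized field the expansion takes the form (the specific basis functions \phi_j will be chosen per application, as discussed above)

g_l(r,t) \approx \sum_{j=1}^{N} c_{lj} \phi_j(r,t)

so that minimizing the cost functional (2) reduces to a minimization over the finite set of expansion coefficients c_{lj}, which become the variables passed to the optimizer.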
The optimization over a large number of expansion coefficients can quickly become intractable, so the number of basis functions, the desired accuracy and the optimization methods will have to be chosen judiciously. There are essentially two limiting types of "cost landscapes" which also determine the choice of appropriate optimization strategies. In the case of a single or a few local minima, standard methods from optimization libraries, such as efficient quasi-Newton methods, can be used. In the other limit the cost landscape has a large number of local extrema with almost the same cost and/or with hierarchical structure. For these cases we plan to use robust minimization techniques based on simulated annealing or genetic algorithms (Kirkpatrick 1983, Goldberg 1989). Most likely, in real situations we will use a combination of both types of methods to achieve the desired generality and robustness.
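A minimal sketch of the simulated annealing option over expansion coefficients is given below; the cost() function is a made-up rugged surface standing in for an evaluation of the functional (1) after running the process model, and the cooling schedule and step sizes are placeholders.

# Hedged sketch of simulated annealing over basis-expansion coefficients.
import numpy as np

rng = np.random.default_rng(2)

def cost(c):
    # placeholder: a rugged multi-minimum surface standing in for C[f, g, p]
    return np.sum(c**2) + 2.0 * np.sum(np.sin(5.0 * c) ** 2)

n_coeff = 20                       # number of expansion coefficients c_j
c = rng.uniform(-1.0, 1.0, n_coeff)
best_c, best_cost = c.copy(), cost(c)

T, cooling = 1.0, 0.995            # initial temperature and cooling rate
for step in range(5000):
    trial = c + rng.normal(scale=0.1, size=n_coeff)   # random perturbation
    delta = cost(trial) - cost(c)
    if delta < 0 or rng.random() < np.exp(-delta / T):
        c = trial                  # accept downhill, or uphill with Boltzmann probability
        if cost(c) < best_cost:
            best_c, best_cost = c.copy(), cost(c)
    T *= cooling                   # gradually reduce the temperature

print("best cost found:", round(best_cost, 3))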
In addition, we will explore the combination of computer optimization with human-computer interactive optimization steering. It is a common problem in large scale optimizations that even the combination of the best strategies is inefficient in finding the right "basin" where the optimal or acceptable solutions are located. Sometimes such a region can emerge on a qualitatively different level of detail and representation than the one the optimization methods deal with (i.e., large vectors of variational parameters). We plan to investigate whether human cognitive capabilities, coupled with the computer through appropriate visualization, would be able to quickly extract such patterns. Subsequent fast feedback to the optimizer would then speed up the convergence towards an acceptable solution by a significant factor. Our preliminary trials on similar problems indicate that a combination of human capabilities with the computer is surprisingly robust and efficient overall, especially when one accounts for the necessity of searching for new methods when the implemented approaches fail, implementation and testing, etc.
C.3.4 Intelligent virtual environment for interaction with evolving landscapes
The main focus of the proposed project will be the development of an intelligent virtual environment which will allow users to visualize and interact with 3D dynamic landscapes by introducing modifications of the landscape during an ongoing simulation. The interaction with landscape processes will allow the users to "feel" the impact of their actions and explore various solutions. This will be especially important for the decision making process, where stakeholders with very specific, and often differing, views of the problem (e.g., developers, local government, landowners, environmental groups) will be able to present their proposals and concerns and the entire group will be able to view and evaluate the impacts.
The development will involve research in three related areas.
a) Data management tools.
The foundation of any intelligent environment lies in the power and flexibility of the tools it provides to the user. In a landscape simulation and visualization environment, users will have access to data management tools that can handle large, spatio-temporal multiresolution datasets. Support will include the ability to carry out interactive analyses of data in and across different temporal and spatial scales, to focus attention on localized regions, to filter and combine data and results, and comparable activities. For example, the SE Triangle watershed will be modeled at 10-30m resolution to provide an overall framework for exploration and analysis. Centennial Campus will be modeled within this framework to capture flows into and out of the study area. The embedded Centennial Campus area will be modeled at 2m resolution, with the stream areas embedded at 10-30cm. For an example of the kinds of analysis possible in this framework, consider a simulation that combines general changes at a construction site, modeled at a daily interval, with water and sediment flow during a storm on the construction site, modeled at a minute interval. The tools to construct and manage such a simulation will facilitate user action as well as intelligent assistance in the process.
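A minimal sketch of the bookkeeping behind such nested resolutions is shown below; the Region class, coordinates and time steps are invented for illustration, while the resolutions mirror the watershed/campus/stream example above.

# Illustrative sketch (invented class and names) of nested multiresolution
# regions, each with its own spatial resolution and simulation time step.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Region:
    name: str
    bounds: tuple          # (xmin, ymin, xmax, ymax) in map units
    resolution: float      # cell size in meters
    time_step: float       # seconds between simulation updates
    children: List["Region"]

    def finest_region_at(self, x: float, y: float) -> Optional["Region"]:
        # return the most detailed nested region containing the point
        xmin, ymin, xmax, ymax = self.bounds
        if not (xmin <= x <= xmax and ymin <= y <= ymax):
            return None
        for child in self.children:
            hit = child.finest_region_at(x, y)
            if hit is not None:
                return hit
        return self

# resolutions and time steps follow the example in the text; coordinates are invented
stream = Region("stream corridor", (2000, 2000, 2500, 2300), 0.3, 60.0, [])
campus = Region("Centennial Campus", (1000, 1000, 4000, 4000), 2.0, 3600.0, [stream])
watershed = Region("SE Triangle watershed", (0, 0, 20000, 20000), 30.0, 86400.0, [campus])

print(watershed.finest_region_at(2100, 2100).name)   # -> "stream corridor"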
b) Techniques for interactive simulation control.
The control of complex simulations is an active area of research in AI, particularly in the area of mixed-initiative systems. In a mixed-initiative system, a human user and an automated system contribute to a problem solution--formulation, development, analysis, repair--without the need for constant exchange of information. Ideally, a mixed-initiative system supports the dynamic assignment of system/user responsibility for directing and analyzing a complex process, such as a simulation. In a landscape context, this begins with an environment that contains representations of basic processes and their parameters: fluxes (e.g., water and sediment flow rates), sources and sinks (e.g., rainfall intensity), and so forth. In addition, an intelligent assistant simplifies the user's control of the simulation by allowing delegation of repetitive or demanding tasks. Such an assistant contributes in a number of ways: systematic consideration of alternative analysis procedures, autonomous exploration of action consequences (lookahead), and automated analysis based on general and domain-specific evaluation rules, among many other possibilities (Pegram et al., 1999, St. Amant, 1997, St. Amant and Cohen, 1998a, 1998b). As an example in conservation design, the exploration of the landscape might involve the user's selecting an abstract object such as a stream buffer, sedimentation pond, or dry dam, and then selecting a general location. The system's automatic contribution would involve the exploration of plausible alternatives for the shape, properties, and exact placement of the object, based on a topographic analysis and simulation of relevant processes. The overall goal for such interactions is to integrate human judgment into an automated problem-solving process, relying on an explicit representation of strategic knowledge of the domain. Translating mixed-initiative techniques to landscape simulation will involve some knowledge engineering, but much of the conceptual work should generalize in a straightforward way.
c) Techniques for flexible visualization.
A key factor in effective simulation control is an appropriate visual representation of landscape properties and behaviors. The PIs have
developed a theoretical framework to explain the role of interactive software tools; the framework helps designers identify and categorize
system properties that facilitate appropriate action on the part of the user (St. Amant, 1999; St. Amant and Riedl, in press). To continue the conservation design example, the system might identify a dozen plausible refinements to the user's abstract action. Rather than presenting these results sequentially, or attempting to reduce them to a more concise (but less expressive) textual representation, the system can constrain and organize the visual environment to facilitate the user's exploration within a restricted space of "good" solutions. Just as graphical user interfaces gray out inappropriate menu selections and highlight windows and buttons needing immediate attention, a simulation system can restrict user operations appropriately and provide visual cues as to the best solutions available. In the conservation example, the system might fix the set
of properties common to the best solutions and then visually highlight a decision (e.g., presenting three "ghost" objects in different locations, requiring a user selection) that most effectively reduces the remaining choices. The role of visualization in such examples is to provide an external representation that allows the user to perceive, evaluate, and potentially modify the system's actions. Appropriate techniques draw on several areas of our expertise: support for navigation in information spaces (St. Amant, 1997); end-user programming through visual techniques (St. Amant et al., 2000), and visualization support for intelligent data analysis (Healey and St. Amant, 1999).
C.3.5 Open source implementation
In the fall of 2000 Dr. St. Amant initiated the first honors course in the Department of Computer Science at NCSU. Student projects during
the course supported a long-term goal of generating an open source environment tailored to the needs and interests of CSC students. This
course fits into a larger effort in the College of Engineering to incorporate Open Source concepts and software into the engineering
curriculum. This effort has received support and software contributions from RedHat and IBM.
The GRASS5 visualization tool NVIZ2.2 will be used as a basis: it is an OpenGL-based 3D visualization tool with a Tcl/Tk interface which supports multiple surfaces, vector and site data. It will be extended with multiscale, spatially variable resolution, improved efficiency, and interactive manipulation, query, distance measurement and flow tracing. Interactive manipulation can be implemented through calls to map algebra and other GIS functions (buffers, site symbols, ...). Effectiveness will be further improved by adding manipulation modules and an enhanced interface.
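As a hedged illustration of the map-algebra route (the map names are invented and the exact invocation of the GRASS commands will be settled during implementation), an interactive terrain edit could be applied roughly as follows:

# Sketch only: apply a user-drawn cut/fill raster to the elevation model by
# calling standard GRASS commands (g.region, r.mapcalc) from the environment.
# The map names elev_2m, cut_fill and elev_design are hypothetical.
import subprocess

def apply_terrain_edit(elev="elev_2m", edit="cut_fill", result="elev_design"):
    # set the computational region to match the elevation map
    subprocess.run(["g.region", f"rast={elev}"], check=True)
    # add the cut/fill values where they are defined, keep the elevation elsewhere
    expression = f"{result} = if(isnull({edit}), {elev}, {elev} + {edit})"
    subprocess.run(["r.mapcalc", expression], check=True)

apply_terrain_edit()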
The prototype virtual landscape environment for creating and manipulating 3D dynamic landscapes and the spatial simulation modules will be implemented in an open source environment supported by open source GIS.
The open source GIS used and enhanced by this project will be based on the latest release of GRASS5. GRASS (Geographic Resources Analysis Support System) is one of the top ten open source projects (link) and has played a pioneering role in integrating GIS and environmental modeling, with numerous environmental models linked to it (ANSWERS, AGNPS, SWAT, CASC2d, TOPMODEL, ...). Links to the R statistics package and OSSIM will also be explored.
This type of development and implementation will contribute to the rapidly growing open source geospatial computing infrastructure and will benefit from modifications and enhancements by the international team of GIS developers.
C.3.6 Study areas
The interactive computational landscape manipulation environment will be developed and tested using two different landscapes: the rapidly developing Centennial Campus (http://centennial.ncsu.edu/) and a monitored, dynamic section of the North Carolina coast.
Centennial Campus is North Carolina State University's vision of the village of the future - a "technopolis" of university, corporate and government R&D facilities and business incubators, with an exciting town center, executive conference center and hotel, housing, and recreational amenities. This 1,192-acre site, adjacent to NC State's main campus, is quickly emerging as the Research Triangle Area's fastest growing development. A futuristic fixed guideway transportation system will link NC State's main campus with this environmentally sensitive, mixed-use, academic village.
The Centennial Campus area is now partially forested (add % land use composition) and includes XX buildings with about 4,000 employees and students. The projected build-out will include XX buildings with a planned 25,000 employees and 7,000 housing residents, and most of the forested area will be transformed into a championship golf course developed as a demonstration of environmentally sustainable design, construction and operation (to be opened in 2002) (Figure 2: comparison of current and future land use).
The field measurements (supported by ARO, NCSU CALS and WRRI) will address both long-term, watershed-scale impacts and short-term, sub-watershed scale impacts. Long-term impacts will be estimated using measurements of accumulated sediment in ponds, sediment retention reservoirs, and larger impoundments using high-resolution swath bathymetry equipment. Short-term impacts will be quantified using high-spatial-resolution measurements of channels, gullies and large rills using RTK GPS, ancillary laser-rangefinder technologies and geo-referenced video surveys. Video surveys will be conducted at the outset and at the end of the study, and sites exhibiting extensive changes will be surveyed more often. NCSU is also monitoring storm flows and water quality at several small streams. Detailed surveys will be conducted in areas of active earthmoving, including the locations of temporary and permanent sediment control features.
The monitoring program will capture the transformation of the Campus from a mostly forested, natural area to an anthropogenic landscape with substantial urban-type land use, a golf course and parks, and will evaluate the impact of this change on the environment (including the areas beyond the Campus). The changing impact of the ongoing development on landscape processes will be simulated, and if unexpected or potentially damaging impacts are predicted, feedback will be provided to the stakeholders (developers, contractors). Interactive land use design and optimization tools will be used to improve the conservation and pollution prevention measures and to ensure the sustainability of the new, futuristic type of village being built in the Triangle.
The development of this community provides a unique opportunity (add specifics here) to contribute to what could become a model for other technopolis developments across the country.
North Carolina coast - Tom, can you please advise on this (which area and what problems/features we should study)? This project should leverage the NRC effort, and I will be working on it if I get the NRC fellowship.
C.4 Impact/benefits, data sharing, education and outreach
The proposed project will bring the ideas of using simulations and virtual object representations for the design and manipulation of objects that are difficult to experiment with due to scale and temporal constraints - an approach used in physics and materials science (nanotubes) - and extend them to landscapes, land use management, and the design of conservation and environmental protection measures.
The proposed virtual landscape environment will also be used as a demonstration and educational tool for stakeholders in watershed and coastal communities, local government and NC museums(?), providing visual representations of human impacts on landscape processes. The environment will also be used to train students and professionals in using advanced GIS, simulation, visualization and mobile measurements to support land use management.
Rob, should we include your open source class here?
The project will also contribute to the open source infrastructure by developing new capabilities for open source GIS.
In the long term it is possible to envision a full analogy with nanomanipulation, where the landscape would be modified using robotic/automated machinery; experiments in precision agriculture (REF) and construction (REF?) already point in this direction. The approach may also have an impact on real-time response to natural disasters, etc.
C.5 Participants and international collaboration, qualifications of the investigator and the grantee organization
Rob, Tom: please add here whatever is appropriate.
The project will be performed as a collaborative effort among the GIScience team led by Dr. Helena Mitasova, the computer science team led by Dr. Robert St. Amant, the earth science team led by Dr. Tom Drake, and the physics team led by Dr. Lubos Mitas. We will also collaborate with the Soil Science department team led by Dr. Rich McLaughlin.
Co-PI Dr. Helena Mitasova has been active in open source GIS development for over 10 years and was awarded an Excellence in Development award in 1994 from the OPEN GIS foundation. She has served as PI on terrain and landscape process modeling projects for DoD and Illinois over the past 8 years.
NCSU is building open source ... , sponsored by IBM and Red Hat; a Linux-based environment is already used by students and faculty.
International collaboration: we will coordinate the open source GIS development with the international team of developers led by Markus Neteler of the University of Hannover, Germany, and collaborate on the development of landscape modeling tools with Geomodel s.k. in Slovakia (Jaroslav Hofierka).