Out-of-Core: Visualizing Massive Data Using External Memory
Members: Steffen Prohaska, Andrei Hutanu
Links: www.zib.de
Datasets of 10-100 gigabytes are becoming increasingly common.
For experimental data this is mainly due to advances in detector technology that
continuously increase spatial and temporal resolution.
Besides data acquisition systems, numerical simulations are another important source of large datasets.
With the advent of commodity-based cluster computing, cheap processing power is becoming
widely available, bringing large datasets in its wake. Data from simulations may be
time-dependent, or even higher-dimensional if several external parameters are varied.
Such massive datasets typically cannot be loaded completely into main memory,
and a practically useful visualization system must take this fact into account.
External memory algorithms and data structures address this problem.
The general goal is to redesign algorithms so that they run with
minimal performance loss even though the data is stored out of core.
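The following is a minimal sketch of the underlying access pattern, not the project's actual implementation: instead of loading a whole volume, a fixed-size block is streamed through a small in-core buffer and per-block statistics are collected, the kind of index an out-of-core visualizer can use to decide which blocks to fetch later. The file name, voxel type, and block size are illustrative assumptions.

```c
/* Block-wise out-of-core pass over a raw volume of 8-bit voxels that
 * is assumed to be far larger than main memory. All names and sizes
 * are illustrative. */
#include <stdio.h>
#include <stdlib.h>

#define BLOCK_BYTES (4u * 1024u * 1024u)  /* 4 MB working buffer */

int main(void)
{
    FILE *f = fopen("volume.raw", "rb");  /* hypothetical dataset */
    if (!f) { perror("volume.raw"); return 1; }

    unsigned char *block = malloc(BLOCK_BYTES);
    if (!block) { fclose(f); return 1; }

    size_t n, block_index = 0;
    while ((n = fread(block, 1, BLOCK_BYTES, f)) > 0) {
        unsigned char lo = 255, hi = 0;
        for (size_t i = 0; i < n; i++) {
            if (block[i] < lo) lo = block[i];
            if (block[i] > hi) hi = block[i];
        }
        /* A real system would store these per-block statistics in an
         * index, e.g., to skip empty blocks during rendering. */
        printf("block %zu: min=%u max=%u\n", block_index++, lo, hi);
    }

    free(block);
    fclose(f);
    return 0;
}
```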
In cooperation with the GridLab project,
remote access to the datasets is implemented using a modified
HDF5 library.
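The modifications to the HDF5 library are not detailed here, but the essential capability they build on is partial access: a client requests only a subvolume, and only that subvolume crosses the I/O (or network) boundary. The sketch below shows this with the standard HDF5 C API via a hyperslab selection; the file name, dataset path, and extents are illustrative assumptions.

```c
/* Reading a single brick from a large HDF5 dataset via a hyperslab
 * selection, using the standard HDF5 C API. File name, dataset path,
 * and extents are illustrative assumptions. */
#include <stdio.h>
#include "hdf5.h"

int main(void)
{
    hid_t file = H5Fopen("data.h5", H5F_ACC_RDONLY, H5P_DEFAULT);
    if (file < 0) return 1;

    hid_t dset = H5Dopen2(file, "/volume", H5P_DEFAULT);
    hid_t fspace = H5Dget_space(dset);

    /* Select a 64^3 brick starting at (128, 128, 128) in the file. */
    hsize_t start[3] = {128, 128, 128};
    hsize_t count[3] = {64, 64, 64};
    H5Sselect_hyperslab(fspace, H5S_SELECT_SET, start, NULL, count, NULL);

    /* Matching in-core buffer; only this brick is read, never the
     * whole dataset. */
    hid_t mspace = H5Screate_simple(3, count, NULL);
    static float brick[64][64][64];

    herr_t status = H5Dread(dset, H5T_NATIVE_FLOAT, mspace, fspace,
                            H5P_DEFAULT, brick);
    if (status >= 0)
        printf("first voxel of brick: %f\n", (double)brick[0][0][0]);

    H5Sclose(mspace);
    H5Sclose(fspace);
    H5Dclose(dset);
    H5Fclose(file);
    return status >= 0 ? 0 : 1;
}
```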