## Tensor Field Visualization

### Tensor Field Exploration

The focus of this work is the analysis and visualization of stress tensor data, which appear, for example, in the geosciences and materials science. Stress tensors express the response of a material to applied forces. In materials science, we are interested in how a material behaves under load and whether it withstands the applied forces.
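As a minimal illustration of this decomposition, the sketch below recovers principal stresses and directions from a symmetric 2D stress tensor; the tensor values are purely illustrative, and the sign convention (positive eigenvalue = tension, negative = compression) is the usual one for stress tensors:

```python
import numpy as np

# Hypothetical 2D stress tensor in MPa; the numbers are illustrative only.
sigma = np.array([[30.0, 10.0],
                  [10.0, -20.0]])

# Principal stresses are the eigenvalues of the symmetric stress tensor;
# the eigenvectors give the principal stress directions.
principal_stresses, principal_dirs = np.linalg.eigh(sigma)

# Positive principal stress -> tension, negative -> compression.
in_tension = principal_stresses > 0
```

For this example one principal stress is tensile and one compressive, i.e., the material is stretched along one principal direction and squeezed along the other.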

### Topology Aware Tensor Interpolation

Interpolation is an essential step in the visualization process. While most data from simulations or experiments are discrete, many visualization methods are based on smooth, continuous data approximation or interpolation. We introduce an interpolation method for symmetric two-dimensional tensor fields given on a triangulated domain. In contrast to standard tensor field interpolation, which operates on the tensor components, we interpolate tensor invariants. This minimizes the number of eigenvector and eigenvalue computations by restricting them to the mesh vertices and makes an exact integration of the tensor lines possible.
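The difference can be sketched with a simple invariant set, the two eigenvalues plus the major-eigenvector angle; this is only an illustration of the principle, and the naive angle averaging below ignores wraparound, which a robust scheme would have to handle:

```python
import numpy as np

def invariants(T):
    # Eigen-decomposition happens only here, i.e., at mesh vertices.
    w, v = np.linalg.eigh(T)                    # eigenvalues ascending
    theta = np.arctan2(v[1, 1], v[0, 1]) % np.pi  # major-eigenvector angle
    return w[1], w[0], theta                    # (lambda_max, lambda_min, theta)

def from_invariants(lmax, lmin, theta):
    # Reassemble a symmetric 2x2 tensor from interpolated invariants.
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    return R @ np.diag([lmax, lmin]) @ R.T

# Two hypothetical vertex tensors: same eigenvalues, frames rotated 90 deg.
TA = np.diag([2.0, 1.0])
TB = np.diag([1.0, 2.0])

# Component-wise midpoint: the eigenvalues collapse to (1.5, 1.5),
# creating a spurious degenerate point between the vertices.
T_comp = 0.5 * (TA + TB)

# Invariant-based midpoint: eigenvalues (2.0, 1.0) are preserved and only
# the frame rotates halfway (naive averaging of the angles, see above).
la, lb = invariants(TA), invariants(TB)
T_inv = from_invariants(0.5 * (la[0] + lb[0]),
                        0.5 * (la[1] + lb[1]),
                        0.5 * (la[2] + lb[2]))
```

The component-wise result is isotropic at the midpoint, while the invariant-based result keeps the anisotropy of the vertex tensors, which is exactly the behavior the method exploits.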

### Tensor Segmentation

Segmenting a complex field into regions of similar behavior can emphasize fundamental field structures and ease their comprehension. The goal of this effort is to develop a method for computing segmentations of two-dimensional symmetric tensor fields. The resulting cells should respect directional as well as scalar characteristics of the underlying field. The directional structure of the field is captured by the integral topological skeleton, which is therefore well suited to serve as a pre-segmentation: its cells are bounded by tensorlines and already delineate regions of equivalent eigenvector behavior. Subsequently, the eigenvalue field guides a further subdivision and simplification of the topological skeleton into regions of similar behavior.
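A much-simplified stand-in for the eigenvalue-guided part of such a subdivision is to label each sample by the signs of its eigenvalues (compression, expansion, or mixed); the field `T(x, y)` below is a hypothetical example, not data from the project:

```python
import numpy as np

def eigen_labels(T):
    # Label each sample by its eigenvalue signs: 2 = both positive
    # (expansion), 0 = both negative (compression), 1 = mixed. This is a
    # crude scalar criterion standing in for the eigenvalue-guided
    # subdivision of the topological skeleton.
    w = np.linalg.eigvalsh(T)          # stacked eigenvalues, shape (..., 2)
    return (w > 0).sum(axis=-1)

# Hypothetical tensor field T(x, y) = [[x, 0.2], [0.2, y]] on [-1, 1]^2.
g = np.linspace(-1.0, 1.0, 65)
X, Y = np.meshgrid(g, g, indexing='ij')
T = np.empty(X.shape + (2, 2))
T[..., 0, 0] = X
T[..., 1, 1] = Y
T[..., 0, 1] = T[..., 1, 0] = 0.2

labels = eigen_labels(T)               # piecewise-constant segmentation
```

The resulting label image splits the domain into expansion, compression, and mixed regions, which is the kind of scalar behavior the eigenvalue field contributes on top of the directional pre-segmentation.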

### Anisotropic Sampling

Surface sampling plays an important role in a variety of applications such as glyph placement, mesh generation, and non-photorealistic rendering. The goal of this project is to generate stochastic anisotropic samples exhibiting a generalized Poisson-disk characteristic, where size and density of the spot samples are determined by a given anisotropic metric tensor. The central step of the algorithm is a generalized anisotropic Lloyd relaxation. Our method supports automatic packing of the elliptical samples, resulting in textures similar to those generated by anisotropic reaction-diffusion methods. The samples satisfy a blue-noise property, i.e., low frequencies in the power spectrum are reduced to a minimum.
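The following sketch runs a generalized Lloyd relaxation under a constant, hypothetical metric tensor `M`, with a dense grid approximating the continuous domain; a real implementation would use a spatially varying metric and a proper discrete Voronoi computation rather than this brute-force assignment:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical constant metric: distances count 3x along x, so the
# relaxed samples pack like ellipses rather than disks.
M = np.array([[9.0, 0.0],
              [0.0, 1.0]])

# Dense grid standing in for the continuous domain [0, 1]^2.
g = np.linspace(0.0, 1.0, 64)
grid = np.stack(np.meshgrid(g, g, indexing='ij'), axis=-1).reshape(-1, 2)

sites = rng.random((16, 2))
for _ in range(20):
    # Anisotropic Voronoi assignment: d(p, q)^2 = (p - q)^T M (p - q).
    diff = grid[None, :, :] - sites[:, None, :]
    d2 = np.einsum('spi,ij,spj->sp', diff, M, diff)
    labels = d2.argmin(axis=0)
    # One generalized Lloyd step: move each site to its cell centroid.
    for k in range(len(sites)):
        cell = grid[labels == k]
        if len(cell):
            sites[k] = cell.mean(axis=0)
```

After relaxation the sites are evenly spread with respect to the metric, i.e., more tightly packed along the stretched axis, which is the generalized Poisson-disk behavior the project targets.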

### Textures for Tensor Visualization

Textures provide a means for dense representation of continuous fields. Texture definitions often involve numerous free parameters, which makes them well suited to represent complex fields such as tensor fields. In this work we define a texture that conveys the physical meaning of tensor fields with properties similar to stress tensors. Such tensor fields play an important role in many application areas, e.g., structural mechanics and solid-state physics. Central features are the principal stress directions as well as regions of compression and expansion. The proposed texture resembles a piece of fabric distorted by the stress tensor field and supports an intuitive distinction between positive and negative eigenvalues.

## Vector Field Visualization

### Combinatorial Vector Field Topology

Analysis and comparison of vector fields can be done via a segmentation of the flow field into regions of similar behavior and a subsequent extraction of the topological skeleton. Our goals are a numerically stable computation of the topological skeleton including the extraction and classification of all closed streamlines, and a simple but consistent simplification of the topological skeleton to allow for a multi-scale vector field analysis. To achieve these goals we are investigating the applicability of a combinatorial approach to vector field topology.
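For orientation, the continuous counterpart of this analysis, classifying a first-order critical point by the determinant, trace, and discriminant of the local Jacobian, can be sketched as follows; the combinatorial approach aims to make exactly this kind of skeleton extraction numerically robust, so the sketch illustrates what is being classified, not the combinatorial method itself:

```python
import numpy as np

def classify_critical_point(J, eps=1e-12):
    # Classify an isolated first-order critical point of a 2D vector
    # field from its Jacobian J. det < 0 identifies saddles; otherwise
    # the trace and the discriminant separate sources, sinks, spirals,
    # and centers. (det == 0 would mean a non-generic critical point.)
    det = np.linalg.det(J)
    tr = np.trace(J)
    if det < 0:
        return 'saddle'
    disc = tr * tr - 4.0 * det
    if disc < 0:
        if abs(tr) < eps:
            return 'center'
        return 'spiral source' if tr > 0 else 'spiral sink'
    return 'source' if tr > 0 else 'sink'

saddle = classify_critical_point(np.array([[1.0, 0.0], [0.0, -1.0]]))
source = classify_critical_point(np.array([[1.0, 0.0], [0.0, 1.0]]))
center = classify_critical_point(np.array([[0.0, -1.0], [1.0, 0.0]]))
```

Centers and closed streamlines are precisely the cases where such pointwise numerical classification becomes unstable, which motivates the combinatorial treatment.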

### Finite-Time Topology

The increasing size and complexity of flow datasets raise new visualization challenges. To understand such datasets it is important to characterize, distill, and visually represent salient flow structures across spatial and temporal scales. Current methods like vector field topology can only generate snapshot topologies, and these typically exhibit a high feature density if appropriate filtering mechanisms are not applied. Further research on time-dependent fields, e.g., the development of algorithms for computing the finite-time Lyapunov exponent (FTLE), has provided helpful analysis tools.
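A basic FTLE computation follows a fixed recipe: differentiate the flow map, form the Cauchy-Green tensor, and take the logarithm of the largest stretch over the advection time. The sketch below assumes the flow map is sampled on a regular grid and uses a saddle flow v = (ax, -ay) as a hypothetical test case, since its flow map and FTLE (constant, equal to a) are known in closed form:

```python
import numpy as np

def ftle(phi_x, phi_y, h, T):
    # FTLE from a sampled flow map phi = (phi_x, phi_y) on a grid with
    # spacing h: build the flow-map gradient, the Cauchy-Green tensor
    # C = (dPhi)^T dPhi, and return log(sqrt(lambda_max(C))) / |T|.
    dxdX, dxdY = np.gradient(phi_x, h, h)
    dydX, dydY = np.gradient(phi_y, h, h)
    c11 = dxdX**2 + dydX**2
    c12 = dxdX * dxdY + dydX * dydY
    c22 = dxdY**2 + dydY**2
    # Largest eigenvalue of the symmetric 2x2 tensor [[c11, c12], [c12, c22]].
    lam_max = 0.5 * (c11 + c22) + np.sqrt((0.5 * (c11 - c22))**2 + c12**2)
    return np.log(np.sqrt(lam_max)) / abs(T)

# Saddle flow v = (a x, -a y): flow map x -> x e^{aT}, y -> y e^{-aT}.
a, T, h = 0.5, 2.0, 0.02
g = np.arange(-1.0, 1.0, h)
X, Y = np.meshgrid(g, g, indexing='ij')
f = ftle(X * np.exp(a * T), Y * np.exp(-a * T), h, T)
```

In practice the flow map comes from numerically advecting particles through the time-dependent field; ridges of the resulting FTLE field mark the salient transport barriers.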

### Dual Streamline Seeding

Streamlines are still one of the most popular methods for a first vector field inspection. When sparsely seeded they can be combined with other visualizations and thus serve as a basis for comparing different fields. Dual Streamline Seeding is a novel streamline seeding technique based on dual streamlines that are orthogonal to the flow field instead of tangential to it. A greedy algorithm produces a net of orthogonal streamlines that is iteratively refined, resulting in good domain coverage and a high degree of continuity and uniformity. The algorithm is easy to implement and efficient. Its particular strength is that it naturally extends to curved surfaces.
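The core trick, rotating the normalized velocity by 90 degrees before integration, can be sketched with a plain Euler tracer; the circular test flow below is a hypothetical example in which dual streamlines are exactly the radial lines orthogonal to the circular streamlines:

```python
import numpy as np

def trace(p0, field, orthogonal=False, h=0.01, steps=500):
    # Euler tracing of a streamline; with orthogonal=True the normalized
    # velocity is rotated by 90 degrees, so the curve crosses the flow at
    # right angles (a "dual" streamline) instead of following it.
    p = np.array(p0, dtype=float)
    pts = [p.copy()]
    for _ in range(steps):
        v = field(p)
        n = np.linalg.norm(v)
        if n < 1e-12:           # stop at critical points
            break
        v = v / n
        if orthogonal:
            v = np.array([-v[1], v[0]])
        p = p + h * v
        pts.append(p.copy())
    return np.array(pts)

# Circular flow around the origin: streamlines are circles, dual
# streamlines are radial lines.
circular = lambda p: np.array([-p[1], p[0]])
stream = trace([1.0, 0.0], circular)
dual = trace([1.0, 0.0], circular, orthogonal=True)
```

The full method seeds a whole net of such curves and refines it greedily for coverage and uniformity; only the rotation step is shown here.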

### Invariant Flow Moments as Feature Descriptors

Feature extraction has become a central task for visual data exploration. A core challenge for feature-based methods is the proper definition of features and of the extraction method. Moment invariants are commonly used as shape descriptors in computer vision applications. We extended this concept to two-dimensional vector fields: a new class of moment invariants allows the definition of flow patterns that are invariant under translation, scaling, and rotation. After pre-computing a multi-scale "moment pyramid" of the flow, interactive feature selection and extraction are possible. Different similarity measures can be defined based on these moments. Using invariant vector moments it is possible to compare different flow datasets as well as to follow interesting structures over time.
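Encoding the 2D vector field as a complex function w = v_x + i v_y, complex moments and their rotation behavior can be sketched as follows; the test field, the disk domain, and the moment order (p, q) = (2, 1) are arbitrary choices for illustration:

```python
import numpy as np

def complex_moment(p, q, Z, W, dA):
    # m_pq = sum over the domain of z^p * conj(z)^q * w(z) * dA, where the
    # vector field is encoded as the complex function w = v_x + i v_y.
    return np.sum(Z**p * np.conj(Z)**q * W) * dA

# Hypothetical smooth test field sampled on the unit disk.
n = 201
g = np.linspace(-1.0, 1.0, n)
X, Y = np.meshgrid(g, g, indexing='ij')
Z = X + 1j * Y
inside = np.abs(Z) <= 1.0
dA = (g[1] - g[0]) ** 2

field = lambda z: z**2 + 0.5 * np.conj(z)

# Rotating a vector field means rotating domain and vectors together:
# w'(z) = e^{i a} w(e^{-i a} z). The moment then only picks up a phase
# e^{i a (p - q + 1)}, so its magnitude |m_pq| is a rotation invariant.
alpha = 0.7
W = np.where(inside, field(Z), 0.0)
W_rot = np.where(inside, np.exp(1j * alpha) * field(np.exp(-1j * alpha) * Z), 0.0)

m = complex_moment(2, 1, Z, W, dA)
m_rot = complex_moment(2, 1, Z, W_rot, dA)
```

Collections of such magnitudes over positions and scales form the kind of descriptor a "moment pyramid" stores, so that pattern matching reduces to comparing invariant vectors.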

### Priority Streamlines

When analyzing flow fields, not only the central vector field but often also other related fields, e.g., temperature, pressure, or vorticity, are of high relevance. Modulating the density of streamlines offers the possibility of visualizing multiple fields simultaneously. "Priority streamlines" is a context-based method for visualizing vector fields in two and three dimensions. A scalar importance field controls the density of the streamlines; color and the free space between them can be used to visualize further field characteristics. The algorithm uses an image-based uniformity measure to steer a greedy streamline seeding: streamlines are added one by one in the regions with the highest deviation from the target density until the image is saturated.
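A stripped-down version of such a greedy loop, with a hypothetical Gaussian importance field and a coarse coverage grid standing in for the image-based uniformity measure, might look like this:

```python
import numpy as np

rng = np.random.default_rng(1)
res = 32
coverage = np.zeros((res, res))   # streamline "ink" deposited per cell

# Hypothetical importance field: a Gaussian bump in the domain center,
# scaled to a target density per cell.
centers = (np.stack(np.meshgrid(np.arange(res), np.arange(res),
                                indexing='ij'), -1) + 0.5) / res
target = 5.0 * np.exp(-20.0 * ((centers - 0.5) ** 2).sum(-1))

# Hypothetical test flow, mostly left-to-right with a gentle wave.
field = lambda p: np.array([1.0, 0.3 * np.sin(6.0 * p[0])])

def trace(p0, h=0.01, steps=100):
    # Plain Euler integration, clipped to the unit square.
    p, pts = np.array(p0, dtype=float), []
    for _ in range(steps):
        v = field(p)
        p = p + h * v / np.linalg.norm(v)
        if np.any(p < 0.0) or np.any(p > 1.0):
            break
        pts.append(p.copy())
    return pts

# Greedy loop: seed the next streamline in the cell with the largest
# deficit between target density and current coverage.
for _ in range(40):
    i, j = np.unravel_index(np.argmax(target - coverage), coverage.shape)
    seed = (np.array([i, j]) + rng.random(2)) / res
    for p in trace(seed):
        ci, cj = np.minimum((p * res).astype(int), res - 1)
        coverage[ci, cj] += 1.0
```

Streamlines accumulate where importance is high and stay sparse elsewhere, leaving free space for encoding the additional fields.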

## Other Projects

### Generalized Heat Kernel Signatures

In this work we propose a generalization of the Heat Kernel Signature (HKS). The HKS is a point signature derived from the heat kernel of the Laplace-Beltrami operator of a surface. In the theory of exterior calculus on a Riemannian manifold, the Laplace-Beltrami operator is a special case of the Hodge Laplacian, which acts on r-forms; i.e., the Hodge Laplacian on 0-forms (functions) is the Laplace-Beltrami operator. We investigate the usefulness of the heat kernel of the Hodge Laplacian on 1-forms (which can be seen as the vector Laplacian) for deriving new point signatures that are invariant under isometric mappings.
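For the classical case on 0-forms, the signature is k_t(x, x) = Σ_i e^{-λ_i t} φ_i(x)², computed from an eigendecomposition of a Laplacian. The sketch below uses a combinatorial graph Laplacian as a crude stand-in for the discretized Laplace-Beltrami operator; on a cycle graph, where every vertex is equivalent under the graph's symmetries, the signature must be constant, illustrating the isometry invariance:

```python
import numpy as np

def hks(L, t):
    # Heat kernel signature at scale t:
    #   k_t(x, x) = sum_i exp(-lambda_i * t) * phi_i(x)^2,
    # with (lambda_i, phi_i) the eigenpairs of the Laplacian L. The
    # generalization in this work replaces L by the Hodge Laplacian
    # on 1-forms; here L is a plain graph Laplacian for illustration.
    lam, phi = np.linalg.eigh(L)
    return (np.exp(-lam * t) * phi**2).sum(axis=1)

# Combinatorial Laplacian of a cycle graph with n vertices.
n = 12
I = np.eye(n)
L = 2 * I - np.roll(I, 1, axis=0) - np.roll(I, -1, axis=0)

sig = hks(L, t=0.5)
```

Evaluating the signature at several scales t yields a multi-scale descriptor per point, which is how the HKS is used for shape comparison.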

### Adaptive Volume Rendering

This project is concerned with adaptive volume rendering. We develop a higher-order algorithm that solves the emission-absorption model, the prevalent optical model in volume rendering. The goal is to produce high-quality images at high resolutions (e.g., 2048 x 2048) while remaining interactive. To this end, the volume is rendered at low resolution while it is being transformed or while the transfer function is changed. Once user interaction ceases, the image is refined adaptively until it converges to the exact solution of the emission-absorption model. The algorithm is based on an adaptive higher-order method commonly used in the field of finite element methods (hp-FEM); a simplified version of hp-FEM was adapted to GPU-based volume rendering to reach the goals listed above.
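One ray of the emission-absorption model and the refine-until-convergence idea can be sketched as follows; the emission and absorption profiles are hypothetical stand-ins for interpolated volume samples, and simple step-doubling replaces the actual hp-adaptivity:

```python
import numpy as np

# Hypothetical emission and absorption along one viewing ray through the
# volume (smooth stand-ins for transfer-function-mapped samples).
emission = lambda s: 0.5 + 0.5 * np.sin(3.0 * s)
absorption = lambda s: 1.0 + 0.5 * np.cos(2.0 * s)

def render_ray(n):
    # Discrete front-to-back compositing of the emission-absorption model
    #   I = int_0^1 c(s) tau(s) exp(-int_0^s tau(u) du) ds
    # with n midpoint samples along the ray.
    h = 1.0 / n
    s = (np.arange(n) + 0.5) * h
    alpha = 1.0 - np.exp(-absorption(s) * h)              # per-sample opacity
    trans = np.cumprod(np.concatenate(([1.0], 1.0 - alpha[:-1])))
    return float(np.sum(emission(s) * alpha * trans))

# Refine until convergence: double the sample count until the ray value
# stops changing, mimicking the adaptive phase after interaction ends.
intensity, n = render_ray(16), 32
while True:
    refined = render_ray(n)
    if abs(refined - intensity) < 1e-6:
        break
    intensity, n = refined, 2 * n
```

During interaction only the coarse pass would run; the refinement loop here is per ray, whereas the actual method refines the whole image adaptively and with higher-order accuracy.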

### Cerebral Aneurysms

This project focuses on the development of a methodology for the patient-specific analysis of cerebral aneurysms. A cerebral aneurysm is potentially life-threatening: every year, about 10 out of 100,000 people suffer a spontaneous rupture, with a mortality of 50%. For patients and their attending physicians, a sound estimation of the rupture risk and of the best treatment strategy is of great importance. We support these investigations by providing an analysis and visualization framework to study the interplay of the aneurysm geometry and the associated blood flow. Mapping individual patient-specific geometries to a reference shape facilitates the comparison of different aneurysms.