2012
Defining high-level features, detecting them, tracking them, and deriving quantities based on them is an integral aspect of modern data analysis and visualization. In combustion simulations, for example, burning regions, which are characterized by high fuel consumption, are a possible feature of interest. Detecting these regions makes it possible to derive statistics about their size and to track them over time. However, features of interest in scientific simulations are extremely varied, making it challenging to develop cross-domain feature definitions. Topology-based techniques offer an extremely flexible means for general feature definitions and have proven useful in a variety of scientific domains. This paper will provide a brief introduction to topological structures such as the contour tree and the Morse-Smale complex and show how to apply them to define features in different science domains such as combustion. The overall goal is to provide an overview of these powerful techniques and to start a discussion of how they can aid in the analysis of astrophysical simulations.
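As a minimal sketch of the kind of threshold-based feature definition described above (burning regions as connected components where fuel consumption exceeds a cutoff), the following Python snippet computes such features and simple size statistics. The field, grid, and threshold are synthetic placeholders; a real analysis would read simulation output and choose the threshold from the chemistry.

```python
# Minimal sketch: threshold-based "burning region" features on a 3D scalar field.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
fuel_consumption = ndimage.gaussian_filter(rng.random((64, 64, 64)), sigma=3)

threshold = np.percentile(fuel_consumption, 95)   # hypothetical "burning" cutoff
mask = fuel_consumption > threshold               # superlevel set of the field
labels, num_features = ndimage.label(mask)        # connected components = features

sizes = np.bincount(labels.ravel())[1:]           # cells per feature (skip background)
print(f"{num_features} burning regions, mean size {sizes.mean():.1f} cells")
```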
2011
We describe a combinatorial streaming algorithm to extract features which identify regions of locally intense mixing rates in two terascale turbulent combustion simulations. Our algorithm allows simulation data comprised of scalar fields represented on 728x896x512 or 2025x1600x400 grids to be processed on a single, relatively lightweight machine. The turbulence-induced mixing governs the rate of reaction and hence is of principal interest in these combustion simulations. We use our feature extraction algorithm to compare two very different simulations and find that in both the thickness of the extracted features grows with decreasing turbulence intensity. Simultaneous consideration of the results of applying the algorithm to the HO2 mass fraction field indicates that autoignition kernels near the base of a lifted flame tend not to overlap with the high mixing rate regions.
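The following is a generic sketch of streaming, slab-by-slab extraction of threshold-based features, in the spirit of processing very large grids on a lightweight machine: only one 2D slab (plus the previous one) is held in memory, and components are stitched across slabs with union-find. This is an illustration of the general idea, not the paper's combinatorial algorithm; the field and threshold are synthetic.

```python
import numpy as np
from scipy import ndimage

class UnionFind:
    def __init__(self):
        self.parent = {}
    def find(self, x):
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]   # path halving
            x = self.parent[x]
        return x
    def union(self, a, b):
        self.parent[self.find(a)] = self.find(b)

def count_streamed_features(slabs, threshold):
    """Count 3D connected components of cells above `threshold`, one 2D slab at a time."""
    uf = UnionFind()
    prev = None             # labels of the previous slab, offset to be globally unique
    next_label = 1
    for slab in slabs:
        mask = slab > threshold
        local, n = ndimage.label(mask)
        local = np.where(mask, local + (next_label - 1), 0)  # globally unique labels
        for lab in range(next_label, next_label + n):
            uf.find(lab)                                     # register new components
        if prev is not None:
            touching = (prev > 0) & (local > 0)              # face-adjacent across slabs
            for a, b in set(zip(prev[touching].tolist(), local[touching].tolist())):
                uf.union(a, b)
        prev = local
        next_label += n
    return len({uf.find(lab) for lab in list(uf.parent)})

# Hypothetical usage: stream a synthetic 32x128x128 volume one z-slab at a time.
rng = np.random.default_rng(1)
volume = ndimage.gaussian_filter(rng.random((32, 128, 128)), sigma=2)
slabs = (volume[k] for k in range(volume.shape[0]))
print(count_streamed_features(slabs, threshold=np.percentile(volume, 90)))
```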
2009 Fifth IEEE International Conference on e-Science, 2009
The advent of highly accurate, large scale volumetric simulations has made data analysis and visualization techniques an integral part of the modern scientific process. To develop new insights from raw data, scientists need the ability to define features of interest in a flexible manner and to understand how changes in the feature definition impact the subsequent analysis of the data. Therefore, simply exploring the raw data is not sufficient.
IEEE Transactions on Visualization and Computer Graphics, 2000
Large-scale simulations are increasingly being used to study complex scientific and engineering phenomena. As a result, advanced visualization and data analysis are also becoming an integral part of the scientific process. Often, a key step in extracting insight from these large simulations involves the definition, extraction, and evaluation of features in the space and time coordinates of the solution. However, in many applications these features involve a range of parameters and decisions that will affect the quality and direction of the analysis. Examples include particular level sets of a specific scalar field, or local inequalities between derived quantities. A critical step in the analysis is to understand how these arbitrary parameters/decisions impact the statistical properties of the features, since such a characterization will help to evaluate the conclusions of the analysis as a whole. We present a new topological framework that in a single pass extracts and encodes entire families of possible feature definitions as well as their statistical properties. For each time step we construct a hierarchical merge tree, a highly compact yet flexible feature representation. While this data structure is more than two orders of magnitude smaller than the raw simulation data, it allows us to extract a set of features for any given parameter selection in a post-processing step. Furthermore, we augment the trees with additional attributes, making it possible to gather a large number of useful global, local, as well as conditional statistics that would otherwise be extremely difficult to compile. We also use this representation to create tracking graphs that describe the temporal evolution of the features. Our system provides a linked-view interface to explore the time-evolution of the graph interactively alongside the segmentation, thus making it possible to perform extensive data analysis in a very efficient manner. We demonstrate our framework by extracting and analyzing burning cells from a large-scale turbulent combustion simulation. In particular, we show how the statistical analysis enabled by our techniques provides new insight into the combustion process.
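A compact, hedged sketch of the core data structure described above: a merge (join) tree of superlevel sets built in one sorted sweep with union-find, where every branch is augmented with its birth value, the value at which it ends, and a sample count. From such a summary, threshold-based features and their statistics can be read off afterwards for any threshold without revisiting the raw field. A 1D domain and a minimal attribute set stand in for the paper's hierarchical, segmented trees.

```python
import numpy as np

def augmented_merge_tree(values):
    """Branches of the join tree of a 1D field as (birth value, end value, sample count)."""
    order = np.argsort(values)[::-1]          # sweep from the global maximum downwards
    parent, birth, count = {}, {}, {}
    branches = []

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]     # path halving
            i = parent[i]
        return i

    for i in order:
        nbr_roots = {find(j) for j in (i - 1, i + 1) if j in parent}
        if not nbr_roots:                     # local maximum: a new branch is born
            parent[i], birth[i], count[i] = i, values[i], 1
            continue
        keep = max(nbr_roots, key=lambda r: birth[r])   # older branch survives the merge
        parent[i] = keep
        count[keep] += 1
        for r in nbr_roots - {keep}:          # younger branches end (merge) at values[i]
            branches.append((birth[r], values[i], count[r]))
            parent[r] = keep
            count[keep] += count[r]
    root = find(order[0])
    branches.append((birth[root], values[order[-1]], count[root]))  # root ends at the minimum
    return branches

f = np.array([0.1, 0.9, 0.3, 0.7, 0.2, 0.8, 0.1])
for b, e, c in augmented_merge_tree(f):
    print(f"branch: birth {b:.1f}, ends at {e:.1f}, {c} samples")
```

On a 3D grid the neighbourhood `(i - 1, i + 1)` would be replaced by the 6 or 26 grid neighbours; everything else stays the same.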
IEEE Transactions on Visualization and Computer Graphics, 2000
Our framework provides a natural and intuitive workflow for the exploration of global trends in feature-based statistics. By efficiently encoding hierarchical meta-data in a pre-processing step, interactive data exploration of the equivalent of one terabyte of simulation data is performed on a commodity desktop.
Topological Methods in Data Analysis and Visualization IV, 2017
Many scientific applications deal with data from a multitude of different sources, e.g., measurements, imaging, and simulations. Each source provides an additional perspective on the phenomenon of interest, but also comes with specific limitations, e.g., regarding accuracy and spatial and temporal availability. Effectively combining and analyzing such multimodal and partially incomplete data of limited accuracy in an integrated way is challenging. In this work, we outline an approach for an integrated analysis and visualization of the atmospheric impact of volcano eruptions. The data sets comprise observation and imaging data from satellites as well as results from numerical particle simulations. To analyze the clouds from the volcano eruption in the spatiotemporal domain, we apply topological methods. Extremal structures reveal features in the data that support clustering and comparison. We further discuss the robustness of these methods with respect to different properties of the data and different parameter setups. Finally, we outline open challenges for effective integrated visualization using topological methods.
IEEE VIS Tutorials, 2019
Figure 1: TTK is a software platform for topological data analysis in scientific visualization. It is both easily accessible to end users (ParaView plugins (a), VTK-based generic GUIs (b), or command-line programs (c)) and flexible for developers (Python (d), VTK/C++ (e), or dependence-free C++ (f) bindings). TTK provides an efficient and unified approach to topological data representation and simplification, which in this example enables a discrete Morse-Smale complex (a) to comply with the level of simplification dictated by a piecewise linear persistence diagram (bottom-right linked view, a). Code snippets are provided (d-f) to reproduce this pipeline.
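A hedged pvpython sketch in the spirit of the TTK pipeline referenced in the figure: load a scalar field, compute its persistence diagram with the TTK ParaView plugins, and save the result. The filter and property names ("TTKPersistenceDiagram", "ScalarField") follow recent TTK/ParaView releases but may differ in your installation; the file and array names are placeholders.

```python
from paraview.simple import *

field = XMLImageDataReader(FileName=["field.vti"])     # hypothetical input data set
diagram = TTKPersistenceDiagram(Input=field)           # TTK ParaView plugin filter
diagram.ScalarField = ["POINTS", "scalars"]            # hypothetical point-data array name
SaveData("diagram.vtu", proxy=diagram)                 # persistence diagram as a VTK file
```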
There are two kinds of computer graphics: illustrative and cognitive. Appropriate cognitive pictures not only make the sense of complex and difficult scientific concepts evident and clear, but also, not so rarely, promote the birth of new knowledge. On the basis of the cognitive graphics concept, we developed the Space Walker (SW) system for visualization and analysis. It allows researchers to train and sharpen their intuition, to raise their interest in and motivation for creative scientific cognition, and to discover new relations in multidimensional spaces of observable parameters. Modern observational cosmology produces ever larger volumes of experimental data; the astrophysical observation archives alone now hold up to petabytes. Analyzing these enormous data sets is becoming a huge problem for investigators. We offer a rather natural way to resolve this complex of difficult scientific problems. These problems can be resolved with the h...
Discrete Geometry for Computer Imagery, 1995
Our previous work in image analysis, dealing with image representations by means of adjacency or boundary graphs, led us to the need for a coherent representation model. In fact, classical approaches to this problem seem to be insufficient or even incoherent; for example, they are unclear with respect to the well-known connectivity paradox or the border description of regions. Different works, like those of Kovalevsky, Herman, and Malandain, pointed out the advantages of cellular-complex-based topologies [Kovalevsky89, Herman90, Malandain93]. But none of them suggested a formalism that can be applied to any type of image. In this work we propose a topological representation for any type of image, colour or grey-level, of whatever dimension. It is based on convex complexes, and looks closely at the elements realizing the connectivity within complexes and later within regions. Furthermore, it remains coherent with pixel- and voxel-only based representations. An important feature is that it still maintains a direct correspondence with the classical R^n topology. Finally, we suggest a characterization of regions, their borders, and boundaries which is useful as a basic tool for segmentation.
Journal of Physics: Conference Series, 2009
We segment the stabilization region in a simulation of a lifted jet flame based on the topology induced by the OH mass fraction field (Y_OH). Our segmentation method yields regions that correspond to the flame base and to potential auto-ignition kernels. We apply a region-overlap-based tracking method to follow the flame base and the kernels over time, to study the evolution of kernels, and to detect when the kernels merge with the flame. The combination of our segmentation and tracking methods allows us to observe flame stabilization via merging between the flame base and kernels; we also obtain Y_CH2O histories inside the kernels and detect a distinct decrease in radical concentration during the transition to a developed flame.
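A minimal sketch of region-overlap tracking between two labelled segmentations (e.g., kernels and flame-base regions at consecutive time steps). The correspondence rule here, any non-zero voxel overlap creating a tracking edge, is a simplified stand-in for the paper's method.

```python
import numpy as np

def overlap_edges(labels_t, labels_t1):
    """Return {(feature at t, feature at t+1): overlap size} for overlapping pairs."""
    both = (labels_t > 0) & (labels_t1 > 0)
    pairs, counts = np.unique(
        np.stack([labels_t[both], labels_t1[both]]), axis=1, return_counts=True)
    return {(int(a), int(b)): int(c) for (a, b), c in zip(pairs.T, counts)}

# Hypothetical 2D example: feature 1 splits, features 2 and 3 merge into one.
t0 = np.array([[1, 1, 0, 2, 2],
               [0, 0, 0, 3, 3]])
t1 = np.array([[1, 2, 0, 3, 3],
               [0, 0, 0, 3, 3]])
print(overlap_edges(t0, t1))   # {(1, 1): 1, (1, 2): 1, (2, 3): 2, (3, 3): 2}
```

Splits, merges, births, and deaths then fall out of the edge structure: a feature at t with several outgoing edges splits, a feature at t+1 with several incoming edges is a merge, and features with no edges appear or disappear.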
Journal of Mathematical Imaging and …, 2003
We present a new method for preprocessing and organizing discrete scalar volume data of any dimension on external storage. We describe our implementation of a visual navigation system using our method. The techniques have important applications for out-of-core visualization of volume data sets and image understanding. The applications include extracting isosurfaces in a manner that helps reduce both I/O and disk seek time, a priori topologically correct isosurface simplification (prior to extraction), and producing a visual atlas of all topologically distinct objects in the data set. The preprocessing algorithm computes regions of space that we call topological zone components, so that any isosurface component (contour) is completely contained in a zone component and all contours contained in a zone component are topologically equivalent. The algorithm also constructs a criticality tree that is related to the recently studied contour tree. However, unlike the contour tree, the zones and the criticality tree hierarchically organize the data set. Using the mathematical analysis we call Digital Morse Theory (DMT), we demonstrate that the techniques work on both irregularly and regularly gridded data and can be extended to data sets with non-unique values, so that perturbation of the data set is not required. We present the results of our initial experiments with three-dimensional volume data (CT) and describe future extensions of our DMT organizing technology.
2004
The Contour Tree of a scalar field is the graph obtained by contracting all the connected components of the level sets of the field into points. This is a powerful abstraction for representing the structure of the field with an explicit description of the topological changes of its level sets. It has proven effective as a data structure for fast extraction of isosurfaces, and its application has been advocated as a user-interface component guiding interactive data exploration sessions. We propose a new metaphor for visualizing the Contour Tree, borrowed from the classical design of a mechanical orrery (see Figure 1(a)) reproducing a hierarchy of orbits of the planets around the sun or moons around a planet. In the toporrery (see Figure 1(b)), the hierarchy of stars, planets, and moons is replaced with a hierarchy of maxima, minima, and saddles that can be interactively filtered, both uniformly and adaptively, by importance with respect to a given metric.
SC14: International Conference for High Performance Computing, Networking, Storage and Analysis, 2014
The ever-increasing amount of data generated by scientific simulations coupled with system I/O constraints are fueling a need for in-situ analysis techniques. Of particular interest are approaches that produce reduced data representations while maintaining the ability to redefine, extract, and study features in a post-process to obtain scientific insights. This paper presents two variants of in-situ feature extraction techniques using segmented merge trees, which encode a wide range of threshold-based features. The first approach is a fast, low communication cost technique that generates an exact solution but has limited scalability. The second is a scalable, local approximation that nevertheless is guaranteed to correctly extract all features up to a predefined size. We demonstrate both variants using some of the largest combustion simulations available on leadership class supercomputers. Our approach allows state-of-the-art, feature-based analysis to be performed in-situ at significantly higher frequency than currently possible and with negligible impact on the overall simulation runtime.
Journal of Physics: Conference Series, 2007
Scientific datasets obtained by measurement or produced by computational simulations must be analyzed to understand the phenomenon under study. The analysis typically requires a mathematically sound definition of the features of interest and robust algorithms to identify these features, compute statistics about them, and often track them over time. Because scientific datasets often capture phenomena with multi-scale behaviour and almost always contain noise, the definitions and algorithms must be designed with sufficient flexibility and care to allow multi-scale analysis and noise removal. In this paper, we present some recent work on topological feature extraction and tracking with applications in molecular analysis, combustion simulation, and structural analysis of porous materials.
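The multi-scale/noise-removal point can be illustrated with a hedged one-dimensional sketch: peak prominence (closely related to topological persistence for 1D maxima) acts as the scale parameter that separates real features from noise. The signal and thresholds below are synthetic.

```python
import numpy as np
from scipy.signal import find_peaks

x = np.linspace(0, 6 * np.pi, 2000)
rng = np.random.default_rng(2)
signal = np.sin(x) + 0.15 * rng.standard_normal(x.size)   # 3 real peaks plus noise

for prominence in (0.05, 0.5, 1.0):                        # fine -> coarse scale
    peaks, _ = find_peaks(signal, prominence=prominence)
    print(f"prominence >= {prominence}: {peaks.size} maxima")
```

At low prominence many spurious, noise-induced maxima survive; at higher prominence only the three underlying peaks remain, without any smoothing of the data itself.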
Multi-Resolution Computation and Presentation of Contour Trees. Figure caption: (a) (left) Orrery reproducing the hierarchical relationship between the orbits of the sun, the planets, and their moons; original design (1812) by A. Janvier, recently reprinted by E. Tufte [25]. (center) Contour Tree drawn as an orrery, where stars/planets/moons are replaced by critical points and the nested orbits represent the hierarchy of topological features; the scalar field in this particular example is the inverse air density distribution α of a fuel injection into a combustion chamber. (right) Semi-transparent level sets of α displayed together with the Contour Tree drawn with its critical points in their original positions. (b) (left) Contour Tree at three levels of resolution for the electron density distribution ρ computed with an ab initio simulation of water molecules at high pressure: at the coarse scale (top) the tree has only one maximum (blue sphere) per water molecule; at medium scale (middle) the topology reconstructs one dipole per molecule, with one maximum and one minimum (red sphere); at fine scale (bottom) only the noise is removed and three extrema per molecule reconstruct each atom in the simulation. (middle) Semi-transparent level sets of ρ together with the fine-scale topology; adaptive refinement in the area marked with a rectangle can be used to refine the topology for two particular molecules, as shown on the right at coarse, medium, and fine scale, both for the Contour Tree and for the embedding (shown side by side).
IEEE Transactions on Visualization and Computer Graphics, 2000
This paper presents topology-based methods to robustly extract, analyze, and track features defined as subsets of isosurfaces. First, we demonstrate how features identified by thresholding isosurfaces can be defined in terms of the Morse complex. Second, we present a specialized hierarchy that encodes the feature segmentation independent of the threshold while still providing a flexible multi-resolution representation. Third, for a given parameter selection we create detailed tracking graphs representing the complete evolution of all features in a combustion simulation over several hundred time steps. Finally, we discuss a user interface that correlates the tracking information with interactive rendering of the segmented isosurfaces, enabling an in-depth analysis of the temporal behavior. We demonstrate our approach by analyzing three numerical simulations of lean hydrogen flames subject to different levels of turbulence. Due to their unstable nature, lean flames burn in cells separated by locally extinguished regions. The number, area, and evolution over time of these cells provide important insights into the impact of turbulence on the combustion process. Utilizing the hierarchy, we can perform an extensive parameter study without re-processing the data for each set of parameters. The resulting statistics enable scientists to select appropriate parameters and provide insight into the sensitivity of the results with respect to the choice of parameters. Our method makes it possible, for the first time, to quantitatively correlate the turbulence of the burning process with the distribution of properly segmented and selected burning regions. In particular, our analysis shows that, counter-intuitively, stronger turbulence leads to larger cell structures, which burn more intensely than expected. This behavior suggests that flames could be stabilized under much leaner conditions than previously anticipated.
2011
Feature-based conditional statistical methods are essential for the analysis of complex, large-scale data. We introduce two shape-based conditional analysis algorithms that can be deployed in complementary settings: local methods are required when the phenomena under study comprise many small intermittent features, while global shape methods are required to study large-scale structures. We present the algorithms in context with their motivating combustion science case studies, but note that the methods are applicable to a broad class of physics-based phenomena.
Mathematics and Visualization, 2011
Mathematics and Visualization, 2007
IEEE Transactions on Visualization and Computer Graphics, 2019
With the rapid adoption of machine learning techniques for large-scale applications in science and engineering comes the convergence of two grand challenges in visualization. First, the utilization of black box models (e.g., deep neural networks) calls for advanced techniques in exploring and interpreting model behaviors. Second, the rapid growth in computing has produced enormous datasets that require techniques that can handle millions or more samples. Although some solutions to these interpretability challenges have been proposed, they typically do not scale beyond thousands of samples, nor do they provide the high-level intuition scientists are looking for. Here, we present the first scalable solution to explore and analyze high-dimensional functions often encountered in the scientific data analysis pipeline. By combining a new streaming neighborhood graph construction, the corresponding topology computation, and a novel data aggregation scheme, namely topology aware datacubes, we enable interactive exploration of both the topological and the geometric aspect of high-dimensional data. Following two use cases from high-energy-density (HED) physics and computational biology, we demonstrate how these capabilities have led to crucial new insights in both applications.
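A hedged sketch of the basic building block behind graph-based topology for high-dimensional functions: approximate the sampled domain with a k-nearest-neighbour graph and detect local maxima of the function on that graph. The streaming construction and topology-aware datacubes of the paper are not reproduced here; the data, k, and the function are synthetic.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(3)
X = rng.uniform(-2, 2, size=(5000, 5))                # 5D sample points
f = np.exp(-np.sum((X - 1) ** 2, axis=1)) \
    + np.exp(-np.sum((X + 1) ** 2, axis=1))           # two-mode scalar function

k = 15
nbrs = NearestNeighbors(n_neighbors=k + 1).fit(X)
_, idx = nbrs.kneighbors(X)                           # idx[:, 0] is the point itself

is_max = np.all(f[:, None] >= f[idx[:, 1:]], axis=1)  # value >= all graph neighbours
print(f"{is_max.sum()} local maxima found on the {k}-NN graph")
```

In a full pipeline these graph extrema would feed a topological summary (e.g., an extremum graph or Morse-Smale segmentation) whose branches can then be simplified by persistence, analogous to the volumetric cases above.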