2000, IEEE Transactions on Visualization and Computer Graphics
Our framework provides a natural and intuitive workflow for the exploration of global trends in feature-based statistics. By efficiently encoding hierarchical metadata in a pre-processing step, interactive data exploration of the equivalent of one terabyte of simulation data is performed on a commodity desktop.
Computing in Science and Engineering, 2007
To understand dynamic mechanisms, scientists need intuitive and convenient ways to validate known relationships and reveal hidden ones among multiple variables.
2011
We describe a combinatorial streaming algorithm to extract features that identify regions of locally intense mixing in two terascale turbulent combustion simulations. Our algorithm allows simulation data comprising scalar fields represented on 728×896×512 or 2025×1600×400 grids to be processed on a single, relatively lightweight machine. The turbulence-induced mixing governs the rate of reaction and hence is of principal interest in these combustion simulations. We use our feature extraction algorithm to compare two very different simulations and find that in both the thickness of the extracted features grows with decreasing turbulence intensity. Simultaneous consideration of the results of applying the algorithm to the HO2 mass fraction field indicates that autoignition kernels near the base of a lifted flame tend not to overlap with the high mixing rate regions.
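The paper's algorithm is a combinatorial, slab-based streaming method, but for a fixed threshold the features it identifies reduce to connected components of a superlevel set. A minimal sketch of that core step, assuming NumPy/SciPy are available (the field name chi and the 0.95 threshold are illustrative, not from the paper):

```python
import numpy as np
from scipy import ndimage

def extract_threshold_features(scalar, threshold):
    """Label connected regions where the scalar exceeds a threshold.

    A stand-in for threshold-based feature extraction: the paper's
    streaming algorithm processes the grid slab by slab, but for a
    fixed threshold the features it produces are the connected
    components computed here in one shot.
    """
    mask = scalar > threshold
    labels, n = ndimage.label(mask)        # default 6-connectivity in 3-D
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
    return labels, sizes

# e.g. on a random stand-in for a scalar dissipation rate field:
chi = np.random.rand(64, 64, 64)
labels, sizes = extract_threshold_features(chi, threshold=0.95)
print(f"{sizes.size} high-mixing regions, mean size {sizes.mean():.1f} voxels")
```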
SC14: International Conference for High Performance Computing, Networking, Storage and Analysis, 2014
The ever-increasing amount of data generated by scientific simulations, coupled with system I/O constraints, is fueling a need for in-situ analysis techniques. Of particular interest are approaches that produce reduced data representations while maintaining the ability to redefine, extract, and study features in a post-process to obtain scientific insights. This paper presents two variants of in-situ feature extraction techniques using segmented merge trees, which encode a wide range of threshold-based features. The first approach is a fast, low-communication-cost technique that generates an exact solution but has limited scalability. The second is a scalable, local approximation that is nevertheless guaranteed to correctly extract all features up to a predefined size. We demonstrate both variants using some of the largest combustion simulations available on leadership-class supercomputers. Our approach allows state-of-the-art, feature-based analysis to be performed in-situ at significantly higher frequency than currently possible and with negligible impact on the overall simulation runtime.
2009 Fifth IEEE International Conference on e-Science, 2009
The advent of highly accurate, large-scale volumetric simulations has made data analysis and visualization techniques an integral part of the modern scientific process. To develop new insights from raw data, scientists need the ability to define features of interest in a flexible manner and to understand how changes in the feature definition impact the subsequent analysis of the data. Simply exploring the raw data is therefore not sufficient.
IEEE Computer Graphics and Applications, 2010
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2009
Knowledge extraction from data volumes of ever-increasing size requires ever more flexible tools to facilitate interactive query. Interactivity enables real-time hypothesis testing and scientific discovery, but generally cannot be achieved without some level of data reduction. The approach described in this paper combines multi-resolution access, region-of-interest extraction, and structure identification in order to provide interactive spatial and statistical analysis of a terascale data volume. Unique aspects of our approach include the incorporation of both local and global statistics of the flow structures, and iterative refinement facilities, which combine geometry, topology, and statistics to allow the user to effectively tailor the analysis and visualization to the science. Working together, these facilities allow a user to focus the spatial scale and domain of the analysis and perform an appropriately tailored multivariate visualization of the corresponding data. All of these ideas and algorithms are instantiated in a deployed visualization and analysis tool called VAPOR, which is in routine use by scientists internationally. In data from a 1024³ simulation of a forced turbulent flow, VAPOR allowed us to perform a visual data exploration of the flow properties at interactive speeds, leading to the discovery of novel scientific properties of the flow, in the form of two distinct vortical structure populations. These structures would have been very difficult (if not impossible) to find with statistical overviews or other existing visualization-driven analysis approaches. This kind of intelligent, focused analysis/refinement approach will become even more important as computational science moves towards petascale applications.
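To make the multi-resolution access pattern concrete, here is a toy sketch: a 2× average-downsampling pyramid plus a region-of-interest fetch at a chosen coarseness. This is only an illustration of the access pattern; VAPOR itself uses a wavelet decomposition, and the power-of-two extents are an assumption of this sketch:

```python
import numpy as np

def build_pyramid(volume, levels=3):
    """Progressively coarsen a volume by averaging 2x2x2 blocks."""
    pyramid = [volume]
    for _ in range(levels):
        v = pyramid[-1]
        # assumes power-of-two extents at every level
        v = v.reshape(v.shape[0] // 2, 2, v.shape[1] // 2, 2,
                      v.shape[2] // 2, 2).mean(axis=(1, 3, 5))
        pyramid.append(v)
    return pyramid

def region_of_interest(pyramid, level, lo, hi):
    """Fetch a region at a coarse level; lo/hi are full-resolution voxel indices."""
    s = 2 ** level
    return pyramid[level][lo[0]//s:hi[0]//s, lo[1]//s:hi[1]//s, lo[2]//s:hi[2]//s]

vol = np.random.rand(128, 128, 128)
pyr = build_pyramid(vol)
roi = region_of_interest(pyr, level=2, lo=(0, 0, 0), hi=(64, 64, 64))
print(roi.shape)   # (16, 16, 16): a coarse preview of the requested region
```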
2019
In-situ visualization and analysis is a powerful concept that aims to give users the ability to process data while it is still resident in memory, thereby vastly reducing the amount of data left for post-hoc analysis. The problem of having too much data for post-hoc analysis is exacerbated in large-scale high-performance computing applications such as Nek5000, a massively parallel CFD (Computational Fluid Dynamics) code used primarily for thermal hydraulics problems. Specifically, one problem users of Nek5000 often face is validating the mesh, that is, identifying the exact location of problematic mesh elements within the whole mesh. Employing the standard post-hoc approach to address this problem is both time consuming and requires vast storage space. In this paper, we demonstrate how in-situ visualization, produced with SENSEI, a generic in-situ platform, helps users quickly validate the mesh. We also provide a bridge between Nek5000 and SENSEI that enables users to use any existing...
2011
Feature-based conditional statistical methods are essential for the analysis of complex, large-scale data. We introduce two shape-based conditional analysis algorithms that can be deployed in complementary settings: local methods are required when the phenomena under study comprise many small intermittent features, while global shape methods are required to study large-scale structures. We present the algorithms in context with their motivating combustion science case studies, but note that the methods are applicable to a broad class of physics-based phenomena.
IEEE Transactions on Visualization and Computer Graphics, 2000
This paper presents topology-based methods to robustly extract, analyze, and track features defined as subsets of isosurfaces. First, we demonstrate how features identified by thresholding isosurfaces can be defined in terms of the Morse complex. Second, we present a specialized hierarchy that encodes the feature segmentation independent of the threshold while still providing a flexible multi-resolution representation. Third, for a given parameter selection we create detailed tracking graphs representing the complete evolution of all features in a combustion simulation over several hundred time steps. Finally, we discuss a user interface that correlates the tracking information with interactive rendering of the segmented isosurfaces, enabling an in-depth analysis of the temporal behavior. We demonstrate our approach by analyzing three numerical simulations of lean hydrogen flames subject to different levels of turbulence. Due to their unstable nature, lean flames burn in cells separated by locally extinguished regions. The number, area, and evolution over time of these cells provide important insights into the impact of turbulence on the combustion process. Utilizing the hierarchy, we can perform an extensive parameter study without re-processing the data for each set of parameters. The resulting statistics enable scientists to select appropriate parameters and provide insight into the sensitivity of the results with respect to the choice of parameters. Our method makes it possible, for the first time, to quantitatively correlate the turbulence of the burning process with the distribution of burning regions, properly segmented and selected. In particular, our analysis shows that, counter-intuitively, stronger turbulence leads to larger cell structures, which burn more intensely than expected. This behavior suggests that flames could be stabilized under much leaner conditions than previously anticipated.
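A threshold-independent hierarchy makes this kind of parameter study cheap. Assuming each cell has been summarized as a (birth, death) pair, the value at which it appears and the value at which it merges into a stronger cell, a sketch of which is given further below, cell counts for any burning threshold follow from the pairs alone, without touching the raw data. The pair values here are made up for illustration:

```python
def cells_at_threshold(pairs, global_max, t):
    """Count cells alive at threshold t from precomputed (birth, death)
    pairs. The component containing the global maximum never merges,
    so it is counted separately whenever t is reachable at all."""
    alive = sum(1 for birth, death in pairs if death < t <= birth)
    return alive + (1 if t <= global_max else 0)

pairs = [(0.8, 0.2), (0.7, 0.3)]          # illustrative values only
for t in (0.1, 0.25, 0.5, 0.95):
    print(t, cells_at_threshold(pairs, global_max=0.9, t=t))
# prints 1, 2, 3, 0 cells respectively: re-thresholding costs nothing
```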
IEEE Computer Graphics and Applications, 2012
Advanced combustion research is essential to designing more efficient engines. Next-generation engines will operate in nonconventional, mixed-mode, and turbulent conditions. Combustion processes in such an environment, combined with new physical and chemical fuel properties, feature complicated interactions that are poorly understood at a fundamental level. Recently, Sandia National Laboratories scientists have instrumented their simulations with particles to capture and better understand the turbulent dynamics in combustion processes. How to analyze and visualize these particles' temporal behaviors from different aspects is therefore critical to understanding combustion. When visualizing a large number of moving particles, we confront two main issues. The first is what properties of the particle data to visualize; the other is how to deal with the large data. To conduct a comprehensive study of particle behaviors, a visualization system must be able to present the temporal...
IEEE Transactions on Visualization and Computer Graphics, 2000
Large-scale simulations are increasingly being used to study complex scientific and engineering phenomena. As a result, advanced visualization and data analysis are also becoming an integral part of the scientific process. Often, a key step in extracting insight from these large simulations involves the definition, extraction, and evaluation of features in the space and time coordinates of the solution. However, in many applications these features involve a range of parameters and decisions that will affect the quality and direction of the analysis. Examples include particular level sets of a specific scalar field, or local inequalities between derived quantities. A critical step in the analysis is to understand how these arbitrary parameters/decisions impact the statistical properties of the features, since such a characterization will help to evaluate the conclusions of the analysis as a whole. We present a new topological framework that in a single pass extracts and encodes entire families of possible feature definitions as well as their statistical properties. For each time step we construct a hierarchical merge tree, a highly compact yet flexible feature representation. While this data structure is more than two orders of magnitude smaller than the raw simulation data, it allows us to extract a set of features for any given parameter selection in a post-processing step. Furthermore, we augment the trees with additional attributes, making it possible to gather a large number of useful global, local, and conditional statistics that would otherwise be extremely difficult to compile. We also use this representation to create tracking graphs that describe the temporal evolution of the features. Our system provides a linked-view interface to explore the time-evolution of the graph interactively alongside the segmentation, thus making it possible to perform extensive data analysis in a very efficient manner. We demonstrate our framework by extracting and analyzing burning cells from a large-scale turbulent combustion simulation. In particular, we show how the statistical analysis enabled by our techniques provides new insight into the combustion process.
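The single-pass construction behind such merge trees can be illustrated on a 1-D field: sweep vertices from high to low values and union components as neighbors connect, emitting a (birth, death) pair whenever a weaker maximum's component merges into a stronger one. These are exactly the pairs consumed by the threshold-counting sketch above. This is an illustrative sketch only; the paper builds augmented merge trees over 3-D grids, where the 3-D grid neighbors replace i-1 and i+1:

```python
import numpy as np

def merge_tree_1d(values):
    """Sweep a 1-D scalar field from high to low with union-find,
    recording (birth, death) pairs of merged components."""
    order = np.argsort(values)[::-1]          # visit vertices in descending order
    parent = {}                               # union-find forest
    peak = {}                                 # component root -> its maximum value
    pairs = []                                # (birth, death) of merged components

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]     # path halving
            i = parent[i]
        return i

    for i in order:
        parent[i] = i
        peak[i] = values[i]
        for j in (i - 1, i + 1):              # neighbors already swept lie above
            if j in parent:
                ri, rj = find(i), find(j)
                if ri == rj:
                    continue
                lo_, hi_ = sorted((ri, rj), key=lambda r: peak[r])
                if peak[lo_] > values[i]:     # a genuine (non-trivial) merge
                    pairs.append((peak[lo_], values[i]))
                parent[lo_] = hi_             # weaker component joins stronger one
    return pairs

f = np.array([0.1, 0.9, 0.3, 0.7, 0.2, 0.8, 0.1])
print(merge_tree_1d(f))   # the two non-trivial merges: (0.7, 0.3) and (0.8, 0.2)
```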
2011 IEEE Pacific Visualization Symposium, 2011
Current simulations of turbulent flames are instrumented with particles to capture the dynamic behavior of combustion in next-generation engines. Categorizing the set of many millions of particles, each of which carries a history of its positions and changing thermo-chemical states, helps in understanding the turbulence mechanism. We introduce a dual-space method to analyze such data, starting by clustering the time series curves in the phase space of the data, and then visualizing the corresponding trajectories of each cluster in the physical space. To cluster the time series curves, we adopt a model-based clustering technique in a two-stage scheme. In the first stage, the curves are classified by their shape and relative position; in the second stage, within each group of curves, clustering is further conducted based on how the curves change over time. In our work, we perform the model-based clustering in a semi-supervised manner. Users' domain knowledge is integrated through intuitive interaction tools to steer the clustering process. Our dual-space method has been used to analyze particle data in combustion simulations and can also be applied to other scientific simulations involving particle trajectory analysis.
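A hypothetical sketch of the two-stage idea: stage 1 groups curves by coarse shape/position summaries, stage 2 refines each group by temporal change. The paper's method is model-based and semi-supervised (user-steered); plain Gaussian mixtures from scikit-learn stand in for it here, and all feature choices are illustrative:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def two_stage_cluster(curves, k_shape=4, k_dynamics=3):
    """Two-stage clustering of particle time-series curves (sketch)."""
    # stage 1: per-curve summaries of shape and relative position
    feats = np.column_stack([curves.mean(axis=1), curves.std(axis=1),
                             curves[:, 0], curves[:, -1]])
    stage1 = GaussianMixture(n_components=k_shape, random_state=0).fit_predict(feats)

    labels = np.empty(len(curves), dtype=int)
    for g in range(k_shape):
        idx = np.where(stage1 == g)[0]
        if len(idx) == 0:
            continue
        # stage 2: cluster within-group temporal change (finite differences)
        deltas = np.diff(curves[idx], axis=1)
        k = min(k_dynamics, len(idx))            # guard very small groups
        sub = GaussianMixture(n_components=k, covariance_type="diag",
                              random_state=0).fit_predict(deltas)
        labels[idx] = g * k_dynamics + sub
    return labels

curves = np.cumsum(np.random.randn(200, 50), axis=1)   # synthetic trajectories
print(np.bincount(two_stage_cluster(curves)))
```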
Journal of Physics: Conference Series, 2008
Computational science is paramount to the understanding of underlying processes in internal combustion engines of the future that will utilize non-petroleum-based alternative fuels, including carbon-neutral biofuels, and burn in new combustion regimes that will attain high efficiency while minimizing emissions of particulates and nitrogen oxides. Next-generation engines will likely operate at higher pressures, with greater amounts of dilution, and utilize alternative fuels that exhibit a wide range of chemical and physical properties. Therefore, there is a significant role for high-fidelity simulations, specifically direct numerical simulations (DNS), designed to capture key turbulence-chemistry interactions in these relatively uncharted combustion regimes and, in particular, able to discriminate the effects of differences in fuel properties. In DNS, all of the relevant turbulence and flame scales are resolved numerically using high-order accurate numerical algorithms. As a consequence, terascale DNS are computationally intensive, require massive amounts of computing power, and generate tens of terabytes of data. Recent results from terascale DNS of turbulent flames are presented here, illustrating the role of DNS in elucidating flame stabilization mechanisms in a lifted turbulent hydrogen/air jet flame in a hot air coflow, and the flame structure of a fuel-lean turbulent premixed jet flame. Computing at this scale requires close collaborations between computer and combustion scientists to provide optimized scalable algorithms and software for terascale simulations, efficient collective parallel I/O, tools for volume visualization of multiscale, multivariate data, and automation of the combustion workflow. The enabling computer science, applied here to combustion, is also required in many other terascale physics and engineering simulations. In particular, performance monitoring is used to identify the performance of key kernels in the DNS code, S3D, and especially memory-intensive loops in the code. Through the careful application of loop transformations, data reuse in cache is exploited, thereby reducing memory bandwidth needs and hence improving S3D's nodal performance. To enhance collective parallel I/O in S3D, an MPI-I/O caching design is used to construct a two-stage write-behind method for improving the performance of write-only operations. The simulations generate tens of terabytes of data requiring analysis. Interactive exploration of the simulation data is enabled by multivariate time-varying volume visualization. The visualization highlights spatial and temporal correlations between multiple reactive scalar fields using an intuitive user interface based on parallel coordinates and time histograms. Finally, an automated combustion workflow is designed using Kepler to manage large-scale data movement, data morphing, and archival, and to provide a graphical display of run-time diagnostics.
2018
Repositories for scientific and scholarly data are valuable resources to share, search, and reuse data by the community. Such repositories are essential in data-driven research based on experimental data. In this paper we focus on the case of combustion kinetic modeling, where the goal is to design models typically validated by means of comparisons with a large number of experiments.
Verläßliche Informationssysteme, 2000
Visualization can be an important tool for displaying, categorizing, and digesting large quantities of inter-related information during laboratory and simulation experiments. Summary visualizations that compare and represent data sets in the context of a collection are particularly valuable. Applicable visualizations used in these settings must be fast (near real time) and should allow the addition of data sets as...
Direct numerical simulations of premixed combustion produce terabytes of raw data, which are prohibitively large to store yet must be analyzed and visualized. A simultaneous and integrated treatment of data storage, data analysis, and data visualization is required. To this end, we introduce a sparse representation tailored to DNS data that can be used directly for both analysis and visualization. The method is based on the observation that most information is located in narrow-band regions where the chemical reactions take place, although these regions are not well defined. An approach for the visual investigation of feature surfaces of the scalar fields involved in the simulation is shown as a possible application. We demonstrate our approach on multiple real datasets.
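A minimal sketch of the narrow-band idea: keep only the voxels whose scalar value falls in the reaction band, in a COO-style coordinate/value layout. The band limits [0.45, 0.55] and the layout are illustrative assumptions, not the paper's exact representation:

```python
import numpy as np

def to_narrow_band(field, lo, hi):
    """Encode a DNS scalar field sparsely, keeping only in-band voxels."""
    mask = (field >= lo) & (field <= hi)
    idx = np.argwhere(mask).astype(np.int32)   # (n, 3) voxel coordinates
    vals = field[mask].astype(np.float32)      # same C ordering as argwhere
    return idx, vals, field.shape

def to_dense(idx, vals, shape, fill=0.0):
    """Re-expand for analysis or visualization; out-of-band voxels get `fill`."""
    out = np.full(shape, fill, dtype=np.float32)
    out[tuple(idx.T)] = vals
    return out

c = np.random.rand(128, 128, 128)              # stand-in progress variable
idx, vals, shape = to_narrow_band(c, 0.45, 0.55)
print(f"kept {vals.size / c.size:.1%} of voxels")
dense = to_dense(idx, vals, shape)
```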
Energy, Environment, and Sustainability
Journal of Imaging Science and Technology, 2015
Simulation and modeling of turbulent flow, and of turbulent reacting flow in particular, involve solving for and analyzing time-dependent and spatially dense tensor quantities, such as turbulent stress tensors. The interactive visual exploration of these tensor quantities can effectively steer the computational modeling of combustion systems. In this article, the authors analyze the challenges in dense symmetric-tensor visualization as applied to turbulent combustion calculation; most notable among these challenges are the dataset size and density. They analyze, together with domain experts, the feasibility of using several established tensor visualization techniques in this application domain. They further examine and propose visual descriptors for volume rendering of the data. One of these novel descriptors is a density-gradient descriptor that produces Schlieren-style images; another is a classification descriptor inspired by machine-learning techniques. The result is a hybrid visual analysis tool to be utilized in the debugging, benchmarking, and verification of models and solutions in turbulent combustion. The authors demonstrate this analysis tool on two example configurations, report feedback from combustion researchers, and summarize the design lessons learned.
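The density-gradient descriptor can be sketched in a few lines: map each voxel to the magnitude of the density gradient, which a transfer function can then drive for opacity or intensity. The log-style normalization below is an assumption of this sketch, not the article's exact mapping:

```python
import numpy as np

def schlieren_descriptor(density):
    """Schlieren-style scalar descriptor: normalized density-gradient magnitude."""
    gx, gy, gz = np.gradient(density)          # central differences per axis
    mag = np.sqrt(gx**2 + gy**2 + gz**2)
    norm = np.log1p(mag)                       # compress the dynamic range
    return norm / norm.max()                   # scale to [0, 1] for a transfer function

rho = np.random.rand(64, 64, 64)               # stand-in density field
s = schlieren_descriptor(rho)
print(s.min(), s.max())
```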