Cloud computing offers a powerful abstraction that provides a scalable, virtualized infrastructure as a service where the complexity of fine-grained resource management is hidden from the end-user. Running data analytics applications in...
Abstract - 3-D optical data storage technology is one of the modern methods of storing large volumes of data. This paper discusses in detail the fundamentals of 3D optical data storage. This includes the features of the 3D optical data...
System availability is one of the crucial properties of a dependable knowledge repository system, enabling it to withstand and recover from minor outages within a short timespan through an automated process. National Marine Bioinformatics System...
Optical engineering continues to impact more and more modern devices for communication, color display, data storage, illumination, and remote sensing in the new millennium. The amalgamation of modern optics and associated microelectronic...
Traditional storage management methods are becoming less effective for handling this huge volume of information because of the growth of data centers and the unforeseen rise in storage requirements. By eliminating the storage...
Because of its superior information processing capability, previous authors have proposed that phase conjugation holography offers a feasible mechanism to explain various aspects of human...
An image-processing system based on four-wave mixing in a film of polyacetylene less than 100 nm thick is demonstrated with a picosecond processing cycle time. Image phase conjugation and cross correlation are performed with a resolution...
Hadoop Distributed File System (HDFS) is the core component of Apache Hadoop project. In HDFS, the computation is carried out in the nodes where relevant data is stored. Hadoop also implemented a parallel computational paradigm named as...
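The parallel computational paradigm referred to above is MapReduce. Its data flow can be illustrated with a minimal, self-contained sketch in plain Python (not Hadoop's actual Java API; all function names here are illustrative):

```python
from collections import defaultdict

# Illustrative MapReduce flow: map emits (key, value) pairs from each input
# split, a shuffle groups the pairs by key, and reduce aggregates each group.

def map_phase(line):
    # Emit (word, 1) for every word in an input line.
    for word in line.split():
        yield word, 1

def shuffle(pairs):
    # Group emitted values by key, as the framework does between phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    # Aggregate all values emitted for one key.
    return key, sum(values)

def word_count(lines):
    pairs = [p for line in lines for p in map_phase(line)]
    return dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())

print(word_count(["big data", "big files"]))  # {'big': 2, 'data': 1, 'files': 1}
```

In HDFS the map tasks would be scheduled on the nodes holding each data block, so only the much smaller intermediate pairs cross the network.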
File-sharing semantics are used by file systems to share data among concurrent client processes in a consistent manner. Session semantics is a widely used file-sharing semantics in Distributed File Systems (DFSs). The main...
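Under session semantics, a client's writes become visible to other clients only after the file is closed. A hypothetical in-memory model can sketch this behavior (class and method names are illustrative, not any real DFS API):

```python
# Minimal sketch of session semantics: writes go to a private session copy
# and are published to the shared store only on close().

class SessionFS:
    def __init__(self):
        self.store = {}  # globally visible file contents

    def open(self, path):
        return Session(self, path)

class Session:
    def __init__(self, fs, path):
        self.fs, self.path = fs, path
        self.copy = fs.store.get(path, "")  # snapshot taken at open()

    def write(self, data):
        self.copy += data                   # visible only to this session

    def close(self):
        self.fs.store[self.path] = self.copy  # published atomically on close

fs = SessionFS()
a = fs.open("/f")
a.write("hello")
# Nothing is published yet; other clients still see the old (empty) contents.
a.close()
# After close(), the store holds "hello" and new sessions observe it.
```

Sessions that opened the file before the close keep working on their own snapshot, which is exactly the consistency relaxation that makes session semantics cheap to implement in a DFS.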
A method is examined to obtain the same interference pattern from separate holographic recordings as from recordings superposed on the same region of the plate. Experimental results are shown.
To improve the checkpoint bandwidth of critical applications at LANL, we developed the Parallel Log Structured File System (PLFS) [1]. PLFS is a transformative I/O middleware layer placed within our storage stack. It transforms a...
A simple method of pseudocoloring gray-level images is proposed. It is based on Young's fringes modulated speckle encoding of images. Reconstruction from a photographic register is obtained using two light beams with different wavelengths...
The main characteristics of five distributed file systems required for big data: A comparative study
In recent years, the amount of data generated by information systems has exploded. It is not only the quantities of information that are now estimated in exabytes, but also the variety of these data, which is more and more structurally...
Modern-day systems are facing an avalanche of data, and they are being forced to handle more and more data-intensive use cases. These data come in many forms and shapes: Sensors (RFID, Near Field Communication, Weather Sensors),...
Background and Purpose: One major challenge encountered during crime investigation via automated systems is the inability of conventional data analysis techniques to adequately handle the enormous data that are made available during the...
Background and Purpose: The need for better and faster decision-making based on data is stronger than ever, and being able to use massive amounts of data is a necessary competitive advantage. This has necessitated the urgent need for a...
The cloud computing model aims to make large-scale data-intensive computing affordable even for users with limited financial resources, who cannot invest in the expensive infrastructures necessary to run them. In this context, MapReduce is...
With the emergence of Cloud Computing, the amount of data generated in different fields such as physics, medicine, and social networks is growing exponentially. This increase in the volume of data and their large scale make the problem...
In this era of developing technologies, one of the most promising is cloud computing, which has been in use for years by individuals and large enterprises to provide different kinds of services to the world. Cloud computing...
A large class of modern distributed file systems treats metadata services as an independent system component, separate from data servers. The availability of the metadata service is key to the availability of the overall system. Given...
As scientific research becomes more data intensive, there is an increasing need for scalable, reliable, and high-performance storage systems. Such data repositories must provide both data archival services and rich metadata, and cleanly...
Virtualization enables the consolidation of virtual machines (VMs) to increase the utilization of physical servers in Infrastructure-as-a-Service (IaaS) cloud providers. Unfortunately, our quantification of storage I/O performance across...
As data production and collection increase daily, Hadoop has emerged as a platform for processing big data on a distributed system. A master node globally manages running jobs, whereas worker nodes process partitions of the data locally....
Resource management is a key factor in the performance and efficient utilization of cloud systems, and many research works have proposed efficient policies to optimize such systems. However, these policies have traditionally managed the...
AbFS is a distributed file system that makes it possible to efficiently share the inexpensive devices attached to the commodity computers of a cluster. The implementation of AbFS offers high-performance metadata management by combining...
This study involves recording of transmission holograms of various objects, which are processed to obtain phase holograms. Good quality holograms are obtained by optimizing the necessary parameters as presented in [29]. After replaying the...
We have investigated the dynamics of the record-erase process of holograms in photochromic glass using continuous-wave Nd:YVO4 laser radiation (λ = 532 nm). A bidimensional microgrid pattern was formed and visualized in photochromic glass, and...
With new database systems and new data types emerging, many applications no longer use a monolithic, simple client/server structure, but instead use more than one type of database system to store heterogeneous data. In this...
The efficiency of opto-optical light deflection by nondegenerate four-wave mixing can be increased significantly by placing the grating in a resonant cavity. Theory and experiment are presented for linear and ring cavities. Efficiency...
Object-Based Storage Devices (OSDs) offer an object-based data layout instead of using the traditional block based design. Object storage ideas are increasingly being used in parallel file systems and cloud storage. Previous work has...
There has been considerable interest recently in the use of highly-available configuration management services based on the Paxos family of algorithms to address long-standing problems in the management of large-scale heterogeneous...
Two-photon holography with continuous-wave laser sources is accomplished by using carbazole dissolved in a polymethyl methacrylate host matrix as a recording medium. Gating of the holographic sensitivity for 4880-Å radiation by...
This special issue of Optical Engineering addresses a number of critical issues in the continuing invention, development, and characterization of components for optical information processing and computing applications. This is the second...
In computational flow visualization, integration-based geometric flow visualization is often used to explore the flow field structure. A typical time-varying dataset from a Computational Fluid Dynamics (CFD) simulation can easily require...
Conducting digital forensic investigations in a big data distributed file system environment presents significant challenges to an investigator given the high volume of physical data storage space. Presented is an approach from which the...
Lippmann photography is an interferometric process, more than a century old, invented for recording colored images in thick black-and-white photographic emulsions. After a comparison between this photographic process and Denisyuk...
Cloud computing is an emerging computing platform and service model that organizes and schedules services over the Internet. Cloud storage is one of these services, providing storage resources and services based on remote storage...
Nonvolatile photorefractive gratings in both bulk and waveguide samples of some electrooptic crystals, such as lithium niobate (LiNbO3) or strontium–barium niobate (SBN), are promising elements for different applications in photonics. For...
Distributed storage systems have become the core of many large-scale applications today. Much research has focused on the distribution of data in such systems, with two main approaches of tree-based and hash-based algorithms. Hash-based...
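One widely used hash-based placement scheme is consistent hashing, sketched below under the assumption of virtual nodes for load balancing (class names and parameters are illustrative, not any particular system's API):

```python
import hashlib
from bisect import bisect_right

# Minimal consistent-hashing ring: nodes are hashed onto a ring, and each
# key is placed on the first node clockwise from the key's hash.

def h(key):
    # Deterministic hash of a string onto a large integer ring.
    return int(hashlib.md5(key.encode()).hexdigest(), 16)

class Ring:
    def __init__(self, nodes, vnodes=64):
        # Each physical node gets several virtual points to even out load.
        self.points = sorted(
            (h(f"{node}#{i}"), node) for node in nodes for i in range(vnodes)
        )

    def locate(self, key):
        # First virtual point clockwise from the key's hash (wrapping around).
        keys = [p for p, _ in self.points]
        i = bisect_right(keys, h(key)) % len(self.points)
        return self.points[i][1]

ring = Ring(["node-a", "node-b", "node-c"])
placement = {k: ring.locate(k) for k in ("blockA", "blockB", "blockC")}
```

The appeal over tree-based schemes is that placement is computed, not looked up: any client can locate a block from the node list alone, and adding or removing one node remaps only the keys adjacent to its points on the ring.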
Permanent two-dimensional optical waveguide arrays are demonstrated by exposing diffusion-mediated photopolymer with a multiple-beam interference pattern. A fiber-based phase control system ensures a stable interference pattern during...
The Oak Ridge Leadership Computing Facility (OLCF) has deployed multiple large-scale parallel file systems (PFS) to support its operations. During this process, OLCF acquired significant expertise in large-scale storage system design,...
HPC and Big Data stacks are completely separated today. The storage layer offers opportunities for convergence, as the challenges associated with HPC and Big Data storage are similar: trading versatility for performance. This motivates a...