Papers by Brajendra Panda

Emerald Publishing Limited eBooks, Dec 4, 2020
Data Science is an emerging discipline that, by its nature, integrates traditional disciplines. The degree program leverages prior investments in the computing disciplines across campuses and colleges within each campus. The MU Institute for Data Science & Informatics coordinates this collaborative degree program with other MU departments to deliver hands-on, problem-based learning, core, and emphasis area courses. The learning objectives for these novel courses were informed by our industry review board, and the courses are taught by dedicated faculty who support a high-touch, interactive learning environment. The confluence of big data, massively powerful cloud computing platforms, and the need of businesses from all sectors to leverage their data repositories has created a high-growth environment and demand for data scientists. Data scientists routinely leverage tools and techniques from computer science, information systems, advanced statistics, and machine learning. To satisfy the growing need for data scientists who can transform large collections of data into actionable decision-making products for their employers, we are offering the Master of Science in Data Science and Analytics. This multidisciplinary Data Science and Analytics (DSA) degree program consists of 34 credit hours of learning that can be completed on campus or as an executive online program. The online students visit campus once each academic year for an intensive on-site learning experience. The academic program has 19 credit hours of core, fundamental data science courses, followed by 9 credits of emphasis-area-specific courses (Geospatial Analytics, BioHealth Analytics, High-Performance Computing, Human-Centered Science Design, Data Journalism & Strategic Communication) and 6 credits of industry-relevant case studies and capstone project courses; on-campus students may instead complete 6 hours of thesis in place of the case study and capstone.
Communications of the ACM, Jul 1, 1999

International Conference on Emerging Security Information, Systems and Technologies, Nov 14, 2021
Recently, critical infrastructure systems have become increasingly vulnerable to attacks on their data systems. If an attacker succeeds in breaching a system's defenses, it is imperative that operations are restored as quickly as possible. This research focuses on damage assessment and recovery following an attack. We review work done in both database protection and critical infrastructure protection. Then, we propose a model that uses a graph construction to show the cascading effects within a system after an attack. We also present an algorithm that uses our graph to compute an optimal recovery plan, prioritizing the most important damaged components first so that the vital modules of the system become functional as soon as possible. This allows the most critical operations of a system to resume while recovery of less important components is still being performed.
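
To make the graph idea above concrete, here is a minimal sketch assuming a hypothetical component graph in which an edge points from a component to the components that depend on it, and each component carries an assumed criticality weight; the propagation and the priority ordering are illustrative, not the paper's exact algorithm.

```python
from collections import deque

# Hypothetical component dependency graph: edges point from a component
# to the components whose correct operation depends on it.
dependents = {
    "power_bus": ["scada_master", "pump_ctrl"],
    "scada_master": ["hmi"],
    "pump_ctrl": [],
    "hmi": [],
}
criticality = {"power_bus": 10, "scada_master": 8, "pump_ctrl": 6, "hmi": 3}

def assess_damage(initially_damaged):
    """Propagate damage along dependency edges (the cascading effect)."""
    damaged = set(initially_damaged)
    queue = deque(initially_damaged)
    while queue:
        node = queue.popleft()
        for dep in dependents.get(node, []):
            if dep not in damaged:
                damaged.add(dep)
                queue.append(dep)
    return damaged

def recovery_plan(damaged):
    """Order repairs so the most critical damaged components come first."""
    return sorted(damaged, key=lambda n: criticality[n], reverse=True)

if __name__ == "__main__":
    damaged = assess_damage({"power_bus"})
    print(recovery_plan(damaged))
    # ['power_bus', 'scada_master', 'pump_ctrl', 'hmi']
```

A fuller plan would also respect dependency order, repairing a component only after the components it relies on; the sketch orders purely by criticality for brevity.
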
Computers and Their Applications, 2002
Trust and shared interest are the building blocks for most relationships in human society. A deceptive action and its associated risks can affect many people. Although trust relationships in virtual communities can be built up more quickly and easily, they are also more fragile. This research concentrates on analyzing information quality in open rating systems, particularly the way deceptive data spread in virtual communities. In this paper, we propose several novel ideas on assessing deceptive actions and on how the structure of the virtual community affects the information flow among subjects in the web of trust. Furthermore, our experiments illustrate how deceptive data would spread and to what extent they would affect subjects in virtual communities.

For the past few years, research in multilevel secure database systems has received a great deal of attention. Such systems are essential in military as well as many commercial applications where data are classified according to their sensitivity and where each user has a clearance level. Users access the data as per the system's security policy. A system is most secure if it guards against an unauthorized flow of information, either direct or indirect. In this research, the issue of query processing among the various base relations in a kernelized multilevel secure database system was analyzed. Specifically, the SeaView model, a research prototype developed as a joint effort by SRI International and Gemini Computer, was followed since it is the only model that uses element-level (i.e., the finest granularity) classification of data. Although the SeaView model aims at achieving a class A1 system classification, it has two major drawbacks. First, the query response time is high, due to the large number of outer join operations performed during the reconstruction of multilevel relations. Second, the reconstruction process results in some spurious tuples that provide misleading information to users with higher clearances, which is dangerous since decision-making at higher levels plays a crucial role in every application. In this research, a model called Protected Information System Manager, or PRISM for short, was developed. Different security parameters, the access control policies, the multilevel relation, different integrity constraints, and the decomposition and reconstruction algorithms for multilevel relations were designed. For the recovery process, the domain vector accelerator technique was utilized. It has been shown that by using the domain vector accelerator mechanism, a significant performance improvement is achieved over the SeaView model. At the same time, this protocol avoids the generation of spurious tuples, giving users correct and fast results. In addition, a concurrency control protocol for PRISM was proposed in this work. Finally, an integrated mechanism for both concurrency control and query processing was presented in order to preserve the concurrent execution of transactions at all levels while improving their response time.
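
As a rough illustration of what element-level (finest-granularity) classification means in the SeaView/PRISM setting described above, the sketch below stores a classification label with every attribute value and filters a relation for a user's clearance. The labels, relation, and filtering view are assumptions for illustration; the decomposition, reconstruction, and domain vector accelerator algorithms of PRISM are not modeled here.

```python
# Clearance ordering for a simple linear lattice (illustrative labels).
LEVELS = {"U": 0, "C": 1, "S": 2, "TS": 3}

# A multilevel relation with element-level labels: each attribute value is
# stored as a (value, classification) pair.
employees = [
    {"name": ("Alice", "U"), "project": ("Skylark", "S"), "salary": (90000, "C")},
    {"name": ("Bob", "U"),   "project": ("Puffin", "TS"), "salary": (85000, "C")},
]

def view_for(relation, clearance):
    """Return the instance a user with the given clearance may see.

    Elements classified above the clearance are masked with None, so no
    information flows directly to insufficiently cleared users.
    """
    limit = LEVELS[clearance]
    view = []
    for row in relation:
        visible = {}
        for attr, (value, label) in row.items():
            visible[attr] = value if LEVELS[label] <= limit else None
        view.append(visible)
    return view

if __name__ == "__main__":
    print(view_for(employees, "C"))
    # [{'name': 'Alice', 'project': None, 'salary': 90000}, ...]
```
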
Although several algorithms have been developed for transaction management in multilevel secure database systems, most of them abort high-level transactions or require that the read and write sets of transactions be predeclared. In this paper, an algorithm based on the Serialization Graph Testing (SGT) protocol is presented that eliminates these problems in multilevel secure database systems and provides concurrent secure executions of transactions at each security level.
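
A minimal sketch of the Serialization Graph Testing idea the algorithm builds on: conflicting operations add edges to a graph over transactions, and an operation is rejected if accepting it would close a cycle. The scheduler below is a generic, single-level illustration; the security-level handling that is the paper's contribution is not shown.

```python
from collections import defaultdict

class SGTScheduler:
    def __init__(self):
        self.ops = []                  # accepted operations: (txn, action, item)
        self.edges = defaultdict(set)  # serialization graph: txn -> set of txns

    def _reachable(self, src, dst):
        """DFS: is dst reachable from src in the serialization graph?"""
        seen, stack = set(), [src]
        while stack:
            node = stack.pop()
            if node == dst:
                return True
            if node in seen:
                continue
            seen.add(node)
            stack.extend(self.edges[node])
        return False

    def submit(self, txn, action, item):
        """Try to schedule a read ('r') or write ('w'); True means accepted."""
        new_edges = {
            (prev_txn, txn)
            for prev_txn, prev_action, prev_item in self.ops
            if prev_item == item and prev_txn != txn
            and (action == "w" or prev_action == "w")
        }
        # Reject if any new conflict edge would close a cycle in the graph.
        if any(self._reachable(dst, src) for src, dst in new_edges):
            return False
        for src, dst in new_edges:
            self.edges[src].add(dst)
        self.ops.append((txn, action, item))
        return True

if __name__ == "__main__":
    s = SGTScheduler()
    print(s.submit("T1", "w", "x"))  # True
    print(s.submit("T2", "w", "x"))  # True, adds edge T1 -> T2
    print(s.submit("T1", "r", "x"))  # False: T2 -> T1 would close a cycle
```
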

In case of an information attack on databases, it is vital to start the recovery effort as soon as the attack is identified. The existing recovery techniques first undo all malicious and affected transactions and then redo all affected transactions, which is too time-consuming. In this work, we have developed a model to fuse groups of malicious and affected transactions. During the undo process, all new fused transactions are undone. Then, only fused affected transactions are re-executed to bring the database to a consistent state. This offers the following major benefits. First, by combining transactions, the number of operations such as start, commit, read, and write is reduced. As a result, several data items that had to be accessed multiple times across individual transactions are now accessed only once in a fused transaction. Moreover, the amount of log access is also reduced. This helps accelerate the recovery process.
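
A hedged sketch of the fusion idea described above, assuming each logged transaction is a list of (action, item, value) entries and that within a fused group each item needs at most one read and one final write; the paper's exact fusion rules may differ.

```python
def fuse(transactions):
    """Collapse a group of transactions into one fused transaction."""
    read_set, write_values, write_order = set(), {}, []
    for txn in transactions:
        for action, item, value in txn:
            if action == "r":
                read_set.add(item)              # one read per item suffices
            elif action == "w":
                if item not in write_values:
                    write_order.append(item)
                write_values[item] = value      # keep only the final write
    fused = [("r", item, None) for item in sorted(read_set)]
    fused += [("w", item, write_values[item]) for item in write_order]
    return fused

if __name__ == "__main__":
    malicious_and_affected = [
        [("r", "x", None), ("w", "x", 5)],
        [("r", "x", None), ("r", "y", None), ("w", "x", 7), ("w", "z", 1)],
    ]
    print(fuse(malicious_and_affected))
    # Each item is read once and written once in the fused transaction.
```
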
2022 IEEE/ACM 15th International Conference on Utility and Cloud Computing (UCC)

2020 International Conference on Computational Science and Computational Intelligence (CSCI)
During the last decades, not only has the number of cyberattacks increased significantly, but attacks have also become more sophisticated. Hence, designing a cyber-resilient approach is of paramount importance. Traditional security methods are not adequate to prevent data breaches in the case of cyberattacks. Cybercriminals have learned how to use new techniques and robust tools to hack, attack, and breach data. Fortunately, Artificial Intelligence (AI) technologies have been introduced into cyberspace to construct smart models for defending systems from attacks. Since AI technologies can rapidly evolve to address complex situations, they can be used as fundamental tools in the field of cybersecurity. AI-based techniques can provide efficient and powerful cyber defense tools to recognize malware attacks, network intrusions, phishing and spam emails, and data breaches, to name a few, and to raise alerts when security incidents occur. In this paper, we review the impact of AI in cybersecurity and summarize existing research in terms of the benefits of AI in cybersecurity.

The advancement of information technology in coming years will bring significant changes to the way healthcare data is processed. Technologies such as cloud computing, fog computing, and the Internet of Things (IoT) will offer healthcare providers and consumers opportunities to obtain effective and efficient services via real-time data exchange. However, as with any computer system, these services are not without risks. There is the possibility that systems might be infiltrated by malicious users and, as a result, data could be corrupted, which is a cause for concern. Once an attacker damages a set of data items, the damage can spread through the database. When valid transactions read corrupted data, they can update other data items based on the value read. Given the sensitive nature of healthcare data and the critical need to provide real-time access for decision-making, it is vital that any damage done by a malicious transaction and spread by valid transactions must be corrected i...
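
The damage-spreading behavior described above can be sketched as a single scan over a serial log: a transaction that reads a damaged item becomes affected, and the items it writes afterwards are marked damaged. The log format and names below are assumptions for illustration only.

```python
def find_damage(log, malicious):
    """log: ordered list of (txn, action, item); malicious: set of txn ids."""
    damaged_items, affected_txns = set(), set(malicious)
    for txn, action, item in log:
        if action == "w" and txn in affected_txns:
            damaged_items.add(item)              # writes by affected txns spread damage
        elif action == "r" and item in damaged_items:
            affected_txns.add(txn)               # reading damaged data taints the txn
    return damaged_items, affected_txns

if __name__ == "__main__":
    log = [("T1", "w", "hr_record"), ("T2", "r", "hr_record"),
           ("T2", "w", "billing"), ("T3", "r", "vitals")]
    print(find_damage(log, {"T1"}))
    # ({'hr_record', 'billing'}, {'T1', 'T2'})
```
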

2015 IEEE 2nd International Conference on Cyber Security and Cloud Computing, 2015
Cloud computing has brought many advantages to organizations and computer users. It allows different service providers to distribute many applications as services in an economical way. Therefore, many users and companies have begun using cloud computing. However, they are concerned about their data when they store it on a third-party server, the cloud. The private data of individual users and companies is stored and managed by the service providers on the cloud, which delivers services from the other side of the Internet relative to its users and consequently raises privacy concerns [1]. In this paper, a technique has been explored to encrypt the data on the cloud and to execute and run SQL queries on the cloud over the encrypted data. The strategy is to process the query at the service provider's site without having to decrypt the data. Also, to achieve efficiency, no more than the exact set of requested data is returned to the client. Data decryption is performed at the client site to prevent any leakage at the cloud or during transmission. Two techniques have been provided to effectively store the encrypted data, and an experimental evaluation has been provided to compare the two techniques.
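
As a generic illustration of answering queries over encrypted data at the provider without decryption (not necessarily the technique the paper uses), the sketch below stores a deterministic keyed token next to each encrypted record so the server can answer equality predicates on tokens alone; the payload encryption itself is elided.

```python
import hashlib
import hmac

KEY = b"client-secret-key"          # held by the data owner only

def token(value: str) -> str:
    """Deterministic keyed token for equality matching on the server."""
    return hmac.new(KEY, value.encode(), hashlib.sha256).hexdigest()

# What the cloud stores: (search token, encrypted payload). Payloads are
# shown as placeholders; a real system would encrypt them client-side.
cloud_table = [
    (token("cardiology"), "<ciphertext of record 1>"),
    (token("oncology"),   "<ciphertext of record 2>"),
    (token("cardiology"), "<ciphertext of record 3>"),
]

def server_select(table, query_token):
    """Runs at the provider: exact match on tokens, no decryption needed."""
    return [payload for tok, payload in table if tok == query_token]

if __name__ == "__main__":
    # Client rewrites  SELECT * WHERE dept = 'cardiology'  into a token lookup.
    results = server_select(cloud_table, token("cardiology"))
    print(results)   # the client decrypts these two ciphertexts locally
```
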

Information sharing is crucial for various organizations operating in a global environment. A variety of existing virtual organizations support some form of information sharing. Since the scope of a virtual organization can span multiple administrative domains, information assurance is challenging. While trust plays a key role in eliminating the scalability restrictions of traditional security mechanisms and provides more than mere security, existing trust models focus on subject trust management. But studying a subject's trustworthiness alone offers only part of the solution to ensuring the quality and security of the information the subject produces. Furthermore, most current research on information assurance and security for a virtual organization focuses on information confidentiality and on protecting information from unauthorized modification. Very little work has been done on ensuring the quality and security features of external information. Taking these issues into consideration, this dissertation proposes a framework for information assurance using object trust management, where users evaluate external information based on how they trust the security and quality features of the information. The framework is designed for an information-oriented virtual organization, called a virtual information consortium (VIC). The proposed component-based approach and the hierarchical decision model allow a user to evaluate and select external information based on its intrinsic and extrinsic trust features. Only information with the required level of quality and security is made available for internal use and is appropriately grouped and assigned different privileges to access local resources. This dissertation contributes in the following areas: (1) the first to define object trust management and introduce the concepts of object, object version, and trust-related attributes; (2) the first to specify object trust principles such as trust combination, trust composition, similarity trust, trust multi-dimension, and trust testimony; (3) the first to use a component-based approach to evaluate the quality of a given object version; (4) the first to present systematic methods of calculating trust values (primary and secondary) of an object version; (5) a study of information quality and security from the perspective of object trust management; and (6) the development of a two-level decision model applying the object trust principles to information selection and organization.

Electronics
The world has experienced a huge advancement in computing technology. People prefer outsourcing their confidential data for storage and processing in cloud computing because of the auspicious services provided by cloud service providers. As promising as this paradigm is, it creates issues ranging from data security to time latency in data computation and delivery to end-users. In response to these challenges, the fog computing paradigm was proposed as an extension of cloud computing to overcome the time latency and communication overhead and to bring computing and storage resources close to both the ground and the end-users. However, fog computing inherits the same security and privacy challenges encountered by traditional cloud computing. This paper proposes a fine-grained data access control approach that integrates the ciphertext policy attribute-based encryption (CP-ABE) algorithm and blockchain technology to secure end-users’ data against rogue fog nodes...
Proceedings of the 5th Annual Workshop on Cyber Security and Information Intelligence Research: Cyber Security and Information Intelligence Challenges and Strategies, 2009
In this paper, we present an insider attack detection model that is designed to profile traceability links, based on document dependencies and calendar-based file usage patterns, for detecting insider threats. This model is used to detect insiders' malicious activities aimed at tampering with the contents of files for various purposes. We apply the concept of traceability links from the software engineering field to this research. Our approach mainly employs document dependency traceability links for constructing the insider attack detection model.
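
A small sketch of how the two profiling signals mentioned above could be combined, using hypothetical data: document-dependency traceability links (files normally edited together) and calendar-based usage hours. An access that matches neither profile is flagged. This is an illustration of the idea, not the paper's detection model.

```python
from datetime import datetime

# Hypothetical profiles built from historical activity.
trace_links = {"design.doc": {"spec.doc", "review.doc"}}   # files linked to design.doc
usual_hours = {"design.doc": range(9, 18)}                 # normally edited 09:00-17:59

def suspicious(access):
    """access: dict with keys 'file', 'time', and 'session_files'."""
    f = access["file"]
    linked_ok = bool(trace_links.get(f, set()) & access["session_files"])
    hour_ok = access["time"].hour in usual_hours.get(f, range(24))
    return not (linked_ok or hour_ok)      # flag if both profiles are violated

if __name__ == "__main__":
    late_isolated_edit = {
        "file": "design.doc",
        "time": datetime(2024, 3, 2, 2, 30),   # 02:30, outside usual hours
        "session_files": {"payroll.xls"},       # no traceability link touched
    }
    print(suspicious(late_isolated_edit))       # True -> raise an alert
```
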
The web of trust is the foundation of reputation systems, recommendation systems, and the semantic Web. Most existing research on the web of trust concentrates on aggregating trust ratings on subjects and objects in the web of trust. The problem with this approach is that an adversarial subject can accumulate reputation gradually and become highly trusted by many other subjects. If this subject later deliberately releases deceptive data, the effect may be disastrous. Not only can the deceptive data greatly affect people who directly trust this individual, but its consequences may also reach many other subjects in the network. Our model illustrates how structural analysis of the Web can help evaluate the deleterious effect of deceptive data.
2017 International Conference on Computational Science and Computational Intelligence (CSCI), 2017
While the user base of cloud computing is growing rapidly, data owners worry about the security of the data they store on clouds. Lack of appropriate control over the data might cause security violations. Therefore, all sensitive data stored in cloud databases must be protected at all times. This paper outlines how data owners can keep their data secure and trustworthy and how they can verify the integrity of data in a cloud computing environment. The proposed model uses data partitioning to reach this goal. We have carried out performance analyses of the model through simulation, and the results demonstrate its effectiveness.
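
A brief sketch in the spirit of partition-based integrity verification: the owner splits the data into partitions, keeps a digest of each partition, and later checks whatever the cloud returns against the stored digests. Partition size, hashing, and the tampering example are illustrative assumptions, not the paper's exact model.

```python
import hashlib

def partition(records, size):
    """Split the records into fixed-size partitions."""
    return [records[i:i + size] for i in range(0, len(records), size)]

def digest(part):
    """Hash a partition's contents in order."""
    h = hashlib.sha256()
    for rec in part:
        h.update(rec.encode())
    return h.hexdigest()

if __name__ == "__main__":
    data = [f"row-{i}" for i in range(10)]
    parts = partition(data, 4)
    owner_digests = [digest(p) for p in parts]      # retained by the data owner

    # Later: the cloud returns partition 1, which has been tampered with.
    returned = parts[1].copy()
    returned[0] = "row-4-tampered"
    print(digest(returned) == owner_digests[1])     # False -> integrity violation
```
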