Papers by Dr. Jitendra Kurmi

SAMRIDDHI: A Journal of Physical Sciences, Engineering and Technology, May 15, 2023
Cloud databases provide flexible, affordable, and scalable database systems. Although a cloud database can be secured with Transport Layer Security (TLS), TLS introduces a performance overhead, measured here as latency, when executing operations on one of the major NoSQL databases, MongoDB. To explore this TLS performance overhead for MongoDB, we performed two tests simulating common database usage patterns. We first investigated connection pooling, where an application reuses a single connection for many database operations. We then considered one request per connection, in which an application opens a connection, executes an operation, and closes the connection immediately after completing it. Our experimental results show that applications that cannot tolerate significant performance overhead should be deployed within a properly segmented network rather than enabling TLS, and that applications using TLS should use a connection pool rather than a connection per request.
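The two usage patterns compared above can be made concrete with a short sketch. The snippet below is an illustration only, assuming a hypothetical TLS-enabled MongoDB endpoint and the pymongo driver; it is not the paper's test harness, but it shows why the per-request pattern pays the TLS handshake cost on every operation while the pooled client pays it once.

```python
# Minimal sketch (assumed setup): pooled client vs. connection-per-request over TLS.
# The URI, database, and collection names are placeholders, not the paper's setup.
import time
from pymongo import MongoClient

URI = "mongodb://db.example.com:27017/?tls=true"  # hypothetical TLS-enabled server

def pooled_workload(n_ops: int) -> float:
    """Reuse one client (and its internal connection pool) for all operations."""
    client = MongoClient(URI)
    coll = client.testdb.items
    start = time.perf_counter()
    for i in range(n_ops):
        coll.find_one({"_id": i})
    elapsed = time.perf_counter() - start
    client.close()
    return elapsed

def per_request_workload(n_ops: int) -> float:
    """Open a fresh client for every operation, paying the TLS handshake each time."""
    start = time.perf_counter()
    for i in range(n_ops):
        client = MongoClient(URI)
        client.testdb.items.find_one({"_id": i})
        client.close()
    return time.perf_counter() - start

if __name__ == "__main__":
    print("pooled:     ", pooled_workload(100))
    print("per-request:", per_request_workload(100))
```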

Cloud databases provide flexible, affordable, and versatile database frameworks. Although a cloud database can be secured with Transport Layer Security (TLS), TLS introduces a performance overhead, measured here as latency, when executing operations on five major NoSQL databases: MongoDB, Apache Cassandra, Amazon DynamoDB, Redis, and CouchDB. We propose a Multiple Replica Database Architecture (MRD-ARC) to investigate the TLS overhead with forward secrecy for these five NoSQL databases, performing two tests that mimic common database usage patterns with the TLS cipher suite ECDHE-prime256v1. We examined connection pooling, where an application reuses a single connection for many database operations, and one request per connection, in which an application opens a connection, executes an operation, and closes the connection immediately after completing it, evaluated in terms of read-only throughput, read-only response, and connection throughput…
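As a rough illustration of the cipher-suite constraint mentioned above, the sketch below configures a client-side TLS context restricted to ECDHE key exchange over the prime256v1 (P-256) curve using Python's ssl module (an OpenSSL binding). The endpoint is a placeholder and the snippet is not part of MRD-ARC itself.

```python
# Illustrative sketch only: a client TLS context whose key exchange is limited to
# ECDHE suites with forward secrecy on the prime256v1 curve.
import socket
import ssl

context = ssl.create_default_context()
context.set_ciphers("ECDHE+AESGCM")     # restrict key exchange to ECDHE cipher suites
context.set_ecdh_curve("prime256v1")    # forward secrecy on the P-256 curve

# hypothetical database endpoint
with socket.create_connection(("db.example.com", 27017)) as sock:
    with context.wrap_socket(sock, server_hostname="db.example.com") as tls_sock:
        print("negotiated cipher:", tls_sock.cipher())
```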

IJARCCE, 2019
Data mining is a method of extracting desired and useful knowledge from a pool of information. Clustering in data mining is the grouping of data points that share some similarity, and it is a vital aspect of data mining: it simply partitions a data set into a given number of clusters. Many methods are used for data clustering, among which K-means is the most widely used clustering algorithm. In this paper we briefly review work done by different researchers using the K-means clustering algorithm. As a partition-based clustering algorithm, K-means is widely employed in many areas because of its efficiency and simplicity. However, it is well documented that the K-means algorithm can reach suboptimal solutions depending on the choice of the initial cluster centers. The reviewed work proposes a projection-based K-means initialization algorithm. The algorithm first uses standard Gaussian kernel density estimation to find the high-density regions of the data in one dimension; the projection step then iteratively applies density estimation from the lower-variance dimensions to the higher-variance ones until all dimensions are covered. Experiments on real datasets show that this technique obtains results similar to other standard methods with less computation.
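A minimal sketch of the initialization idea summarized above, simplified to a single projection: seed K-means with points taken from high-density regions of a 1-D Gaussian kernel density estimate instead of random centers. The use of scipy and scikit-learn, and the reduction to one dimension, are assumptions for illustration; the reviewed method proceeds dimension by dimension.

```python
# Rough sketch: KDE-seeded K-Means on one projection (illustrative simplification).
import numpy as np
from scipy.stats import gaussian_kde
from sklearn.cluster import KMeans

def kde_seeded_kmeans(X: np.ndarray, k: int) -> KMeans:
    dim = int(np.argmax(X.var(axis=0)))          # project onto the highest-variance dimension
    proj = X[:, dim]
    density = gaussian_kde(proj)(proj)           # estimated density at each projected point
    seeds = X[np.argsort(density)[-k:]]          # k points from the densest regions
    # a fuller implementation would also enforce spacing between the chosen seeds
    return KMeans(n_clusters=k, init=seeds, n_init=1).fit(X)

X = np.random.rand(500, 4)
model = kde_seeded_kmeans(X, k=3)
print(model.cluster_centers_)
```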

Journal of Pharmacognosy and Phytochemistry, 2018
Vegetable cultivation is being adopted by farmers for regular income generation and is also emerging as an important diversification for economic growth in rural India. Improved demand and balanced health and dietary habits call for a regular supply of vegetables round the year. This scenario gives a fair chance for entrepreneurship development among rural youth. The present study was conducted in Rewa district, Madhya Pradesh, which comprises 9 blocks. Gangeo block was selected purposively because this block has the largest area under vegetable crops. The study found that, of the total vegetable growers, 49.16 per cent had medium entrepreneurial behaviour. The study also revealed that non-availability of improved seeds (rank I), followed by lack of insurance of vegetables (rank II) and lack of training in scientific vegetable production technology (rank III), are the major constraints in vegetable production as an enterprise.
Journal of Scientific Research
Security in computer network communication is of great importance because unauthorized users attempt to steal, modify, misuse, interrupt, and destabilize our network systems. To some extent, secure communication is provided by the Transport Layer Security (TLS) protocol, and researchers have designed distinct libraries implementing TLS, each with broad support for encryption algorithms. Yet security can still be compromised, and attacks in the digital world remain the main challenge in communication. In this paper, a performance analysis of six of the most established libraries: OpenSSL, AWS s2n, GnuTLS, NSS, BoringSSL, and Cryptlib is performed to find appropriate TLS libraries for uncompromised communication, based on throughput and CPU usage in different virtual operating environments.
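A sketch of the kind of measurement such a comparison rests on is given below: timing repeated TLS handshakes through Python's ssl module, which binds OpenSSL. The host, port, and trial count are placeholders, and measuring the other libraries would require separate client builds; this only illustrates the methodology for one library.

```python
# Minimal handshake-latency measurement sketch (assumed endpoint and trial count).
import socket
import ssl
import time

HOST, PORT, TRIALS = "example.com", 443, 50

context = ssl.create_default_context()
total = 0.0
for _ in range(TRIALS):
    start = time.perf_counter()
    with socket.create_connection((HOST, PORT)) as sock:
        with context.wrap_socket(sock, server_hostname=HOST):
            pass                      # the TLS handshake completes inside wrap_socket
    total += time.perf_counter() - start

print(f"mean handshake latency: {total / TRIALS * 1000:.2f} ms")
```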

This paper presents the novel idea of turning a loss-making factor of one technology into a useful output for another. According to prior research, the variation in the calibrated values of these losses from one water medium to another is due to changes in the chemical and physical properties of the water, such as temperature, salinity, and density. One proposed solution is therefore to use these losses to our benefit, namely to calculate the chemical properties of the water medium. A further aim is to achieve a higher range for sending and receiving data under water. Essentially, the idea concerns the effect of the chemical and physical properties of water on communication within it. There are two possible approaches: either rectify those losses or use them to our advantage. In this paper I pursue the second approach and try to arrive at new outputs and analytical data.

Data mining is the procedure of discovering new and interesting patterns from existing data repositories. It aims to determine useful information with the help of different algorithms. Data mining is applicable to almost every field today, for example banking, hospitals, market basket analysis, education, CRM, fraud detection, and tourism. Tourism is a key element in the economy of any country; about 12% of the country's economy comes from tourism. It is a sector that is for people and from people. A number of experiments are going on in data mining related to tourism, but compared with other sectors it is still at an early stage of development. Therefore, there is a need to focus more on the tourism sector from a research perspective. This paper elucidates the use of data mining in the tourism sector to date.
International Journal of Computer Sciences and Engineering
International Journal of Advanced Research in Computer Science, Apr 30, 2017
International Journal of Advanced Research in Computer Science, Apr 30, 2017
These days, the World Wide Web has become a gigantic information resource, and search engines have become popular tools that help users find the required information quickly. Because of the enormous number of websites, and even more web pages, search engines play an essential role nowadays. One of the fundamental elements that distinguishes one search engine from another is its ranking mechanism (page ranking algorithm). In this paper, we summarize some notable page ranking algorithms and then present our implementations of two of them: PageRank and the Weighted PageRank algorithm.
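For reference, a compact sketch of the basic PageRank iteration (the baseline the weighted variant modifies) is given below; the toy link graph and damping factor are illustrative, and the weighted variant would additionally scale each page's share by in-link and out-link weights.

```python
# Basic PageRank power iteration: redistribute each page's rank over its out-links
# with damping factor d. The link graph below is a toy example.
def pagerank(links: dict, d: float = 0.85, iters: int = 50) -> dict:
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new = {p: (1 - d) / len(pages) for p in pages}
        for p, outs in links.items():
            share = rank[p] / len(outs) if outs else 0.0
            for q in outs:
                new[q] += d * share
        rank = new
    return rank

graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
print(pagerank(graph))
```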

Abstract- The travelling salesman problem (TSP) is one of the most popular real-world combinatorial optimization problems, in which we have to find the shortest possible tour that visits each city exactly once and returns to the starting city. It is an NP-hard problem, so it is often used as a benchmark for optimization techniques. In this paper a hybrid of the Ant Colony Optimization (ACO) and Cuckoo Search (CS) algorithms is proposed for the travelling salesman problem. ACO is a good metaheuristic algorithm, but its drawback is that ants walk along the paths where the density of the chemical substance called pheromone is high, which slows the whole process; hence CS is employed to carry out the local search for ACO. Cuckoo search uses a single parameter apart from the population size, and for this reason it works efficiently and performs local search more efficiently. The performance of the new hybrid algorithm is compared with ACO. The results show that the new hybrid algorithm…
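The division of labour described above can be sketched roughly as follows: ants construct tours from pheromone and heuristic information, and a cuckoo-style perturbation (a simple 2-opt move standing in for a Lévy-flight local search) refines the incumbent best tour before the pheromone update. All parameter values and the distance matrix are illustrative assumptions, not the paper's settings.

```python
# Hedged sketch of an ACO + Cuckoo Search hybrid for TSP (toy instance).
import random

def tour_length(tour, dist):
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def build_tour(n, pher, dist, alpha=1.0, beta=2.0):
    """An ant builds a tour, preferring high-pheromone, short edges."""
    tour, unvisited = [0], list(range(1, n))
    while unvisited:
        cur = tour[-1]
        weights = [(pher[cur][j] ** alpha) * ((1.0 / dist[cur][j]) ** beta) for j in unvisited]
        nxt = random.choices(unvisited, weights=weights)[0]
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def cuckoo_move(tour, dist):
    """Random 2-opt style perturbation used here as the cuckoo local-search step."""
    i, j = sorted(random.sample(range(1, len(tour)), 2))
    cand = tour[:i] + tour[i:j][::-1] + tour[j:]
    return cand if tour_length(cand, dist) < tour_length(tour, dist) else tour

def aco_cs(dist, ants=10, iters=100, rho=0.5, q=100.0):
    n = len(dist)
    pher = [[1.0] * n for _ in range(n)]
    best = list(range(n))
    for _ in range(iters):
        tours = [build_tour(n, pher, dist) for _ in range(ants)]
        tours.append(cuckoo_move(best, dist))            # cuckoo refinement of the incumbent
        best = min(tours + [best], key=lambda t: tour_length(t, dist))
        for row in pher:                                 # pheromone evaporation
            for j in range(n):
                row[j] *= (1 - rho)
        for i in range(n):                               # reinforce edges of the best tour
            a, b = best[i], best[(i + 1) % n]
            pher[a][b] += q / tour_length(best, dist)
            pher[b][a] = pher[a][b]
    return best, tour_length(best, dist)

dist = [[0, 2, 9, 10], [2, 0, 6, 4], [9, 6, 0, 8], [10, 4, 8, 0]]
print(aco_cs(dist))
```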
Abstract: Job scheduling is an NP-hard problem in which we have to minimize the makespan. Scheduling is the process of assigning resources to jobs in such a way that all jobs get the required resources fairly, without affecting one another. In this paper we propose a hybrid algorithm for job scheduling using the genetic and cuckoo search algorithms. The proposed algorithm combines the advantages of both the genetic algorithm and the cuckoo search algorithm. The genetic algorithm is an evolutionary algorithm that provides optimal solutions for optimization problems, but its disadvantage is that it can easily be trapped in local optima; to overcome this difficulty we use the cuckoo search algorithm. Index Terms: job scheduling, cuckoo search algorithm, genetic algorithm, hybrid algorithm
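A hedged sketch of how such a hybrid might be wired together is shown below, under assumed encodings: a chromosome assigns each job to one machine, fitness is the makespan, and a cuckoo-style step abandons the worst solutions each generation so the GA is less likely to stall in a local optimum. The job data and parameters are placeholders, not the paper's experimental setup.

```python
# Illustrative GA + Cuckoo Search hybrid for makespan minimization (assumed encoding).
import random

JOBS = [3, 7, 2, 5, 4, 6, 1]     # processing times (hypothetical)
MACHINES = 3

def makespan(assign):
    load = [0] * MACHINES
    for job, m in zip(JOBS, assign):
        load[m] += job
    return max(load)

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(a, rate=0.1):
    return [random.randrange(MACHINES) if random.random() < rate else m for m in a]

def ga_cs(pop_size=20, generations=200, abandon=5):
    pop = [[random.randrange(MACHINES) for _ in JOBS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=makespan)
        parents = pop[:pop_size // 2]
        # GA step: breed children from the better half
        children = [mutate(crossover(*random.sample(parents, 2)))
                    for _ in range(pop_size // 2)]
        pop = parents + children
        # cuckoo step: abandon the worst nests and replace them with fresh random solutions
        pop.sort(key=makespan)
        pop[-abandon:] = [[random.randrange(MACHINES) for _ in JOBS] for _ in range(abandon)]
    return min(pop, key=makespan)

best = ga_cs()
print(best, makespan(best))
```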

International Journal of Advanced Research in Computer Science, 2018
Unmanned Surface Vehicles (USVs) are the extension of robotics and wireless sensor networks onto the water surface. Many countries are continuously developing USVs for defense and research purposes in water bodies. This paper presents a year-wise study of some popular USVs. As computers developed across their generations, the efficiency of USVs was also enhanced. Broadly, water is categorized into two types, salt water and fresh water, and the two differ in properties such as size, salinity, pH level, and ecology. Accordingly, a vehicle's design, navigation sensors, motor control, and other parameters change. The USA has played an important role in developing unmanned surface vehicles. This paper outlines the gradual improvement in the field of USVs with respect to developments in computers and their generations. It is an interesting fact that the application domain of USVs has shifted from the war zone toward human benefit.
International Journal of Innovative Research in Science, Engineering and Technology, 2017
This paper presents a design of WINS (Wireless Integrated Network Sensors) in a distributed network using multi-hop communication, operated at low power and low frequency. In WINS, the PIR (passive infrared) sensor detects a human body at around 200 feet and uses the concept of black-body radiation. The combined effect of sensing, signal processing, computation, and decision capability is described with greater advancement and accuracy in an integrated system. WINS has an advantage over other security systems as it is cheaper, faster, compact, scalable, and implemented using micro-power CMOS circuits.

Authentication is the primary process by which you can verify whether someone is a legitimate user. The identification of an entity or person is based on the username and password provided to that entity. In security systems, authentication plays an important role: it grants an entity access to the system based on its identity. Authentication only ensures that the entity is who it claims to be; it does not convey any information about the entity's access rights. A zero-knowledge protocol is used to provide data security and zero knowledge transfer during authentication. The proposed model for node authentication using a zero-knowledge proof for secure login is much faster than the existing model in terms of execution time, CPU usage, time complexity, and performance. It also provides security features such as confidentiality, integrity, authentication, and non-repudiation.
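To make the idea concrete, the sketch below runs one round of a standard zero-knowledge identification protocol (Schnorr's); the paper's own scheme and parameters are not reproduced, and the toy modulus is far too small for real use. The prover convinces the verifier that it knows the secret x behind y = g^x mod p without ever sending x.

```python
# One round of Schnorr-style zero-knowledge identification (toy parameters).
import secrets

p = 2**127 - 1            # a Mersenne prime; far too small for real security
g = 3

x = secrets.randbelow(p - 1) + 1     # prover's secret
y = pow(g, x, p)                     # prover's public key

r = secrets.randbelow(p - 1) + 1     # fresh random value for the commitment
t = pow(g, r, p)                     # commitment sent to the verifier

c = secrets.randbelow(p - 1)         # verifier's random challenge

s = r + c * x                        # response (a real scheme reduces this mod the group order)

# verifier checks g^s == t * y^c (mod p); the secret x is never transmitted
assert pow(g, s, p) == (t * pow(y, c, p)) % p
print("prover authenticated without revealing the secret")
```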

International Journal of Advanced Research in Computer Science, 2017
Nowadays many unauthorized users try to obtain protected information, and it is therefore necessary to secure our data. There are also scenarios in which data hiding needs to be done in the encrypted domain or combined with encryption, especially in the age of big data and cloud computing. In previous methods, encryption was performed first and room was then vacated to hide the data, but this led to errors in data extraction and image recovery. In this paper a novel technique termed reversible data hiding (RDH) is proposed. It is an excellent technique by which we can hide our data. RDH is applied to encrypted images, by which we can properly recover both the data and the cover image. The hidden data can be in the form of text or an image. In the proposed method we first vacate room to hide the data; after the image is encrypted using an encryption key, the data hider reversibly hides the data, whether text or image, using a data-hiding key. In previous methods only text was hidden…
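A toy sketch of the reserve-room-before-encryption workflow is given below, with a byte array standing in for an image and an XOR keystream standing in for the encryption; a real RDH scheme stores the displaced LSBs reversibly inside the image itself rather than in a side variable, so this only illustrates the order of operations.

```python
# Toy "vacate room, encrypt, embed, recover" workflow (illustrative only).
import os

original = bytearray(os.urandom(64))       # stand-in for pixel values
reserved = list(range(16))                 # positions whose LSBs will carry data
saved_lsbs = [original[i] & 1 for i in reserved]   # kept aside here; embedded in-image in a real scheme

# content owner: vacate room (clear reserved LSBs), then encrypt with a keystream
cover = bytearray(original)
for i in reserved:
    cover[i] &= 0xFE
enc_key = os.urandom(64)
encrypted = bytearray(b ^ k for b, k in zip(cover, enc_key))

# data hider: embed payload bits into the reserved LSBs of the encrypted image
payload = [1, 0, 1, 1, 0, 0, 1, 0]
for i, bit in zip(reserved, payload):
    encrypted[i] = (encrypted[i] & 0xFE) | bit

# receiver: extract the payload, decrypt, and restore the displaced LSBs
extracted = [encrypted[i] & 1 for i in reserved[:len(payload)]]
decrypted = bytearray(b ^ k for b, k in zip(encrypted, enc_key))
for i, bit in zip(reserved, saved_lsbs):
    decrypted[i] = (decrypted[i] & 0xFE) | bit

assert extracted == payload and decrypted == original
print("payload and cover image recovered exactly")
```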

Zero-knowledge proofs are cryptographic protocols that do not disclose the information or the secret itself during the protocol. Zero-knowledge proofs play an important role in the design of cryptographic protocols, with applications in authentication, identification, key exchange, and other basic cryptographic operations. A zero-knowledge proof can be implemented without exposing any secret information during the conversation and with smaller computational requirements than comparable public-key protocols. Many cryptographic problems can be solved with the help of zero-knowledge protocols, and they can be the best solution on many occasions. Zero-knowledge proof protocols are very lightweight and therefore require little memory, which is why they are widely used, especially in authentication. This paper presents an overview of zero-knowledge protocols used for authentication, identification…
Nowadays, wireless sensor networks are affected by an attack called the wormhole attack. Widely used approaches to deal with wormhole attacks either require specialized hardware or impose extra overhead on the network in order to capture a specific pattern. This paper presents an efficient and reliable wormhole detection and localization scheme based on the key observation that a large amount of network traffic will be attracted by the wormholes. Our aim is to minimize the cost of detecting a wormhole attack.
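The key observation can be illustrated with a small sketch: on a synthetic sensor graph, a wormhole shortcut tends to lie on an outsized fraction of shortest paths, so edges with unusually high betweenness are natural suspects. The use of networkx and the simple threshold rule are assumptions for illustration, not the paper's detection procedure.

```python
# Flag links that attract a disproportionate share of shortest paths (illustrative).
import networkx as nx

G = nx.random_geometric_graph(60, 0.2, seed=1)   # stand-in sensor network
G.add_edge(3, 50)                                # hypothetical wormhole shortcut

betweenness = nx.edge_betweenness_centrality(G)  # fraction of shortest paths per link
mean = sum(betweenness.values()) / len(betweenness)
suspects = [e for e, b in betweenness.items() if b > 5 * mean]
print("suspected wormhole links:", suspects)
```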