Papers by Dr. Soubhik Chakraborty

Research Square, Oct 20, 2022
Analysis of algorithms is an interesting topic in theoretical computer science. An algorithm analyzed with respect to its run-time complexity gives an overall idea of how the algorithm behaves in practice. Empirical complexity is a comparatively newer analysis approach which focuses on running the algorithm on varying and increasing inputs and then predicting the complexity of the algorithm; this is followed by a statistical analysis of the empirical estimate to give a clearer bound on the empirical complexity. Cryptographic algorithms bear the sole responsibility of providing security for an application and hiding confidential information of national or personal importance. In today's fast-running world, besides security, the run time of the algorithm is also of utmost importance, and a cryptographic algorithm consists of many complex mathematical functions. In this paper, we analyze the most sought-after cryptographic algorithm, AES-128, in terms of its computational complexity and then empirically derive an asymptotic bound through the empirical-complexity approach. A regression analysis on the resulting bound is then done to formally settle on the empirical complexity of AES-128.
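Below is a minimal sketch of the empirical-complexity procedure the abstract describes: time AES-128 encryption over increasing input sizes and fit a regression to the observed times. The PyCryptodome package, the size grid, and the linear fit are illustrative assumptions, not the paper's actual experimental design.

```python
# Empirical-complexity sketch: time AES-128 over growing inputs, then
# fit a model to the observed times. Assumes PyCryptodome is installed.
import os
import time
import numpy as np
from Crypto.Cipher import AES

key = os.urandom(16)                      # 128-bit key
sizes = [2**k for k in range(10, 21)]     # 1 KiB .. 1 MiB (multiples of 16)
times = []

for n in sizes:
    data = os.urandom(n)                  # n is a power of two, so 16 | n
    cipher = AES.new(key, AES.MODE_ECB)
    t0 = time.perf_counter()
    cipher.encrypt(data)
    times.append(time.perf_counter() - t0)

# Least-squares fit time ≈ a*n + b; the slope is the kind of empirical
# bound such a regression would suggest for AES over n input bytes.
a, b = np.polyfit(sizes, times, 1)
print(f"fitted: time ≈ {a:.3e} * n + {b:.3e}")
```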
International Journal of Mathematical Archive, 2011
The note gives an application of the binomial theorem for multiplying any two-digit number by itself any number of times, which we call Parallel Binomial Expansion. The result has an easy extension to the multinomial case.
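As an illustration of the underlying idea, a two-digit number 10a + b can be raised to any power with the binomial theorem; the sketch below (our own, not the note's code) checks the expansion against direct exponentiation.

```python
# (10a + b)^n = sum_k C(n, k) * (10a)^(n-k) * b^k
from math import comb

def power_by_binomial(a: int, b: int, n: int) -> int:
    """Compute (10*a + b)**n via the binomial expansion."""
    return sum(comb(n, k) * (10 * a) ** (n - k) * b ** k for k in range(n + 1))

assert power_by_binomial(4, 7, 3) == 47 ** 3   # 103823
```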

The bisection method is an iterative approach used in numerical analysis to find solutions to nonlinear equations. The main purpose of this paper is to study how the parameters of a probability distribution characterizing the coefficients of a cubic polynomial can influence the convergence of the bisection method. The study covers discrete and continuous distributions, including the discrete uniform, continuous uniform, and normal distributions. It was found that for both types of uniform-distribution input, a second-degree polynomial equation can predict the average iteration count for a given parameter r, where r indicates the distribution interval [−r, r]. Interestingly, the coefficients of the second-degree polynomial are nearly identical for the discrete and continuous uniform distributions. For normal-distribution input, the average iteration count does not depend on the standard deviation when the mean is fixed and the standard deviation is varied; but when the standard deviation is fixed and the mean is varied, a second-degree polynomial is still the best fit, meaning the average iteration count depends on the mean of the normal distribution. Overall, our paper concludes that: I. For uniform-distribution input, the average iteration count does not depend on whether the distribution is discrete or continuous but rather on the range of the distribution, which is its parameter. II. For non-uniform-distribution input, the average iteration count depends on the mean of the distribution (location parameter) but not on the standard deviation (scale parameter). Finally, a curtain is raised on a future direction of research in which we propose to combine the bisection method with the regula falsi and Newton-Raphson methods to increase the rate of convergence.
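A minimal simulation in the spirit of the study is sketched below: cubic coefficients are drawn from a continuous uniform U(−r, r), bisection is run on a bracketing interval, and the iteration count is averaged. The bracketing strategy, tolerance, and trial count are our assumptions, not the paper's setup.

```python
import random

def bisection_iterations(f, lo, hi, tol=1e-6):
    """Count bisection steps until the bracket is narrower than tol."""
    count = 0
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
        count += 1
    return count

def average_iterations(r, trials=1000):
    total = 0
    for _ in range(trials):
        a, b, c, d = (random.uniform(-r, r) for _ in range(4))
        f = lambda x, a=a, b=b, c=c, d=d: a*x**3 + b*x**2 + c*x + d
        # A cubic (a != 0) always has a real root; expand until bracketed.
        lo, hi = -1.0, 1.0
        while f(lo) * f(hi) > 0:
            lo, hi = 2 * lo, 2 * hi
        total += bisection_iterations(f, lo, hi)
    return total / trials

for r in (5, 10, 20):
    print(r, average_iterations(r))
```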

Agricultural planning relying on evapotranspiration suffers due to inaccuracy in its estimation. The non-availability of the meteorological parameters required for accurate estimation of reference evapotranspiration (ETo) has resulted in the development of different methods of ETo estimation. The present study compares various universally accepted methods of ETo estimation, taking the Penman-Monteith method as the standard. Comparative analysis indicated the suitability of the Hargreaves (1985) method, followed by the Christiansen (1968) method and the Pan Evaporation method (1977). The improvement in ETo estimation was carried out through transformation of the standard equations using a single- or multi-parametric approach, after analyzing the dependency and sensitivity of different meteorological parameters on ETo. The developed transformed models indicated that in ETo estimation, morning relative humidity (RH1) can play the dominant role (99%). ETo estimation by a combination of bright sunshine hours and wind speed (WV) exhibits better performance (98.8%) than the combination of minimum temperature (Tmin) and WV (98.6%).
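For concreteness, here is a small sketch of one of the compared methods, the Hargreaves (1985) equation, ETo = 0.0023 · Ra · (Tmean + 17.8) · √(Tmax − Tmin), with Ra the extraterrestrial radiation expressed in mm/day of evaporation equivalent. The sample values below are illustrative only, not the study's data.

```python
from math import sqrt

def hargreaves_eto(t_mean: float, t_max: float, t_min: float, ra: float) -> float:
    """Reference evapotranspiration (mm/day) via Hargreaves (1985).

    ra: extraterrestrial radiation in mm/day evaporation equivalent.
    """
    return 0.0023 * ra * (t_mean + 17.8) * sqrt(t_max - t_min)

print(hargreaves_eto(t_mean=25.0, t_max=32.0, t_min=18.0, ra=15.0))
```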
Journal of Mathematical & Computer Applications
The slope of a line measures the change in the y-coordinate with respect to the x-coordinate; the line itself is represented by the equation y = mx + c, known as the equation of a line.
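A trivial illustration of the relation: the slope m and intercept c recovered from two points on a line.

```python
def line_through(x1: float, y1: float, x2: float, y2: float):
    m = (y2 - y1) / (x2 - x1)   # change in y per unit change in x
    c = y1 - m * x1             # intercept, from y = m*x + c
    return m, c

assert line_through(0.0, 1.0, 2.0, 5.0) == (2.0, 1.0)
```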

MANET, a wireless, infrastructure-less, ad-hoc network, uses a dynamic topology, rendering the establishment of connections much easier under any given conditions. The black hole attack is a malicious activity performed in a MANET that causes depletion of the network. It poses a threat to the MANET by incorporating malicious nodes into the network which give fake routing information to the source node. As the black hole attack acquires the data from the source node, it drops the data packets from the network, resulting in the depletion of the network. In this paper, to tackle the black hole attack, we propose a node-credibility-based approach that uses an Andrews plot to project node credibility after a few transactions, the inference being that a node with high credibility is the more trusted node in the network. We used Network Simulator software to create the MANET scenarios.
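A hedged sketch of the visualisation step follows: plotting per-node behaviour with an Andrews plot via pandas. The feature names (packets forwarded, packets dropped, credibility score) and the data are our illustrative assumptions, not the paper's simulated dataset.

```python
import pandas as pd
import matplotlib.pyplot as plt
from pandas.plotting import andrews_curves

# Toy per-node statistics after a few transactions (invented values).
nodes = pd.DataFrame({
    "forwarded":   [95, 90, 88, 10, 5],
    "dropped":     [5, 10, 12, 90, 95],
    "credibility": [0.95, 0.90, 0.88, 0.10, 0.05],
    "label":       ["trusted", "trusted", "trusted", "suspect", "suspect"],
})

andrews_curves(nodes, "label")   # one curve per node, grouped by label
plt.title("Andrews plot of node behaviour after a few transactions")
plt.show()
```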
arXiv (Cornell University), Feb 27, 2012
arXiv (Cornell University), Sep 17, 2008
God created the natural numbers. All the rest is the work of man.
The note analyzes a code for multiplying large positive integers of arbitrary lengths from first principles and suggests how this may be used for factoring large positive integers of arbitrary lengths.
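A minimal sketch of what "from first principles" multiplication can look like: schoolbook digit-by-digit multiplication on digit arrays rather than on machine words. The representation (least-significant digit first) is our choice for the illustration, not necessarily the note's.

```python
def multiply(x: str, y: str) -> str:
    """Multiply two positive decimal integers given as digit strings."""
    a = [int(d) for d in reversed(x)]    # least-significant digit first
    b = [int(d) for d in reversed(y)]
    out = [0] * (len(a) + len(b))
    for i, da in enumerate(a):
        for j, db in enumerate(b):
            out[i + j] += da * db
    for k in range(len(out) - 1):        # propagate carries
        out[k + 1] += out[k] // 10
        out[k] %= 10
    s = "".join(map(str, reversed(out))).lstrip("0")
    return s or "0"

assert multiply("123456789", "987654321") == str(123456789 * 987654321)
```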

This paper involves a simulation study to predict the maximum and minimum eigenvalues of a random matrix whose elements come from N(µ, σ²). In the first study, we fix σ and, for different values of µ, generate 100 matrices of order 10×10 in MATLAB. Then, by plotting a graph of the mean maximum eigenvalue against µ, a pattern is detected and we obtain the equation of the best curve fit using MS Excel. However, no pattern is detected for the mean minimum eigenvalue with respect to µ. In the second study, the same procedure is repeated except that here we fix µ and vary σ. Interestingly, the reverse happens: a pattern is detected for the mean minimum eigenvalue with respect to σ, but no pattern is detected for the mean maximum eigenvalue. Both studies are repeated for random matrices of order 5×5 with results identical to the 10×10 case, except that the magnitude of the maximum eigenvalue is reduced by about half when the order of the matrices is halved, while the magnitude of the minimum eigenvalue is not significantly affected. The paper also includes a theoretical analysis predicting the range of the sum of all the eigenvalues of a diagonalizable random matrix with the help of its trace and Chebyshev's inequality. The paper is organised as follows: Section 1 is the introduction; Section 2 is the literature review; Section 3 gives the methodology; Section 4 gives the experimental results and discussion; Section 5 provides some theoretical results for a diagonalizable random square matrix; finally, Section 6 gives the concluding remarks.
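A compact reproduction of the simulation design, in Python rather than the paper's MATLAB: generate matrices with i.i.d. N(µ, σ²) entries and average the extreme eigenvalues over replications. The replication count and the grid of µ values are illustrative, and real parts are taken where eigenvalues are complex — an assumption on our part.

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_extreme_eigs(mu, sigma, order=10, reps=100):
    """Average the max and min eigenvalues over reps random matrices."""
    max_sum = min_sum = 0.0
    for _ in range(reps):
        m = rng.normal(mu, sigma, size=(order, order))
        eigs = np.linalg.eigvals(m).real   # real parts, for ranking
        max_sum += eigs.max()
        min_sum += eigs.min()
    return max_sum / reps, min_sum / reps

for mu in (1, 2, 3, 4, 5):
    print(mu, mean_extreme_eigs(mu, sigma=1.0))
```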
Electronic Journal of Applied Statistical Analysis, Apr 26, 2011
No statistical model is right or wrong, true or false, in a strict sense; we only evaluate and compare their contributions. Based on this theme, Jorma Rissanen has written a short but beautiful book titled "Information and Complexity in Statistical Modeling" (Springer, 2007), where modeling is done primarily by extracting the information from the data that can be learned with suggested classes of probability models. The note reviews this book and, on the way, rediscovers the chain information-knowledge-wisdom.
Let i, j and k be three distinct positive integers (we call this a triplet) such that the sum of every pair in the triplet is a perfect square. For example, (2, 34, 47) is a permissible triplet. Permuting the elements of a triplet among themselves, we get six triplets for each combination. The note raises the nontrivial problem of finding a functional relationship between c and n, where c is the number of such combination triplets with i, j, k ≤ n for a positive integer n.
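A brute-force tabulation of c against n is straightforward, as sketched below; the note's open question of a closed-form relationship between c and n of course remains.

```python
from math import isqrt

def is_square(m: int) -> bool:
    r = isqrt(m)
    return r * r == m

def count_triplets(n: int) -> int:
    """Count combinations i < j < k <= n with all pairwise sums square."""
    count = 0
    for i in range(1, n - 1):
        for j in range(i + 1, n):
            if not is_square(i + j):
                continue
            for k in range(j + 1, n + 1):
                if is_square(i + k) and is_square(j + k):
                    count += 1
    return count

print(count_triplets(50))   # the triplet (2, 34, 47) is among those counted
```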
The present paper examines the behavior of Shift-insertion sort (insertion sort with shifting) for normal-distribution inputs and is a continuation of our earlier work on this new algorithm for discrete-distribution inputs, namely the negative binomial. Shift-insertion sort is found to be more sensitive to main effects, but not to all interaction effects, compared to conventional insertion sort.
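As we read the description, shift-insertion sort is ordinary insertion sort whose inner loop shifts elements one step right and places the key once, instead of performing repeated pairwise swaps; below is a sketch under that assumption.

```python
def shift_insertion_sort(a: list) -> list:
    """Insertion sort with shifting rather than swapping."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]   # shift, don't swap
            j -= 1
        a[j + 1] = key        # single placement of the key
    return a

assert shift_insertion_sort([5, 2, 4, 6, 1, 3]) == [1, 2, 3, 4, 5, 6]
```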

International Journal on Computational Science & Applications, 2012
In the present paper, an attempt is made to describe the responses to a questionnaire by a multinomial model. The questionnaire was designed for undergraduate students, to learn whether students had agility implicitly without knowing the agile software development model. In order to instill agile software development practices among students, data were collected about the effectiveness of a few parameters in student groups. There are 116 respondents and six questions, each with the possible answers yes, neutral, or no. While the individual responses can be assumed to be independent of one another, it is not clear whether the probabilities of yes, neutral, or no are fixed for every individual for a particular question. This is therefore tested, for each question separately, using a chi-square goodness-of-fit test, which confirms the multinomial model. However, the probabilities of the possible responses are found to vary from question to question.
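A hedged sketch of the per-question test: a chi-square goodness-of-fit test of observed yes/neutral/no counts against a hypothesised probability vector (equal probabilities here, purely for illustration). The counts below are invented, though they sum to the paper's 116 respondents.

```python
from scipy.stats import chisquare

observed = [60, 26, 30]            # yes / neutral / no counts (invented)
n = sum(observed)                  # 116, as in the paper
expected = [n / 3] * 3             # equal probabilities, for illustration
stat, p = chisquare(f_obs=observed, f_exp=expected)
print(f"chi-square = {stat:.2f}, p = {p:.4f}")
```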
This note shows that Des Raj's ordered estimator can easily be constructed for SRSWOR (simple random sampling without replacement). We then prove its unbiasedness and give its variance expression. A C code for a popular SRSWOR algorithm that takes order into account, with run-time results, is also provided.
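An illustrative draw-by-draw SRSWOR that preserves the order of selection, which is what an ordered estimator such as Des Raj's requires; this generic Python sketch stands in for, and is not, the paper's C code.

```python
import random

def ordered_srswor(population: list, n: int) -> list:
    """Draw n units without replacement, recording the order of draws."""
    pool = list(population)
    sample = []
    for _ in range(n):
        idx = random.randrange(len(pool))   # each remaining unit equally likely
        sample.append(pool.pop(idx))
    return sample

print(ordered_srswor(list(range(1, 21)), 5))
```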

Journal of Colloid and Interface Science, 2000
A data-driven, risk-based approach is being pursued by the Royal National Lifeboat Institution (RNLI) to guide the selection of beaches for new lifeguard services around the UK coast. In this contribution, life risk to water-users is quantified from the number and severity of life-threatening incidents at a beach during the peak summer tourist season, and this predictand is modelled using both multiple linear regression and Bayesian belief network approaches. First, the underlying levels of hazard and water-user exposure at each beach were quantified, and a dataset of 77 potential predictor variables was collated at 113 lifeguarded beaches. These data were used to develop exposure and hazard sub-models, and a final prediction of peak-season life risk was made at each beach from the product of the exposure and hazard predictions. Both the regression and Bayesian network algorithms identified that intermediate morphology is associated with increased hazard, while beaches with a slipway were predicted to be less hazardous than those without. Beaches with increased car parking area and beaches enclosed by headlands were associated with higher water-user numbers by both algorithms, and beach morphology type was seen either to increase water-user numbers (intermediate morphology in the regression model) or to decrease them (reflective morphology in the Bayesian network). Overall, intermediate beach morphology can be considered the most crucial contributor to water-user life risk, as it was linked to both higher hazard and higher water-user exposure. The predictive skill of the regression and Bayesian network models is compared, and the benefits that each approach provides to beach risk managers are discussed.
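A schematic sketch of the regression half of the approach, predicting a risk score from beach attributes with ordinary least squares; the feature names and data are placeholders rather than the RNLI dataset, and scikit-learn's LinearRegression stands in for the paper's model.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
# 113 beaches, 3 placeholder predictors (e.g. morphology index,
# car parking area, enclosure by headlands) — invented for illustration.
X = rng.random((113, 3))
y = X @ np.array([0.6, 0.3, 0.1]) + 0.05 * rng.standard_normal(113)

model = LinearRegression().fit(X, y)
print(model.coef_, model.intercept_)
```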
Applied Mathematics Letters, 2000
In the statistical analysis of a bubble sort program, we compute its execution times with various parameters. The statistical analysis endorses the specific quadratic pattern of the execution time in the number of items to be sorted. Next, a pointer along a future direction of research is indicated.
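A minimal version of the experiment: time bubble sort for increasing n and fit a quadratic in n to the observed times. The size grid and single-run timings are simplifications of the study's design.

```python
import random
import time
import numpy as np

def bubble_sort(a: list) -> None:
    for i in range(len(a) - 1):
        for j in range(len(a) - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]

sizes = [500, 1000, 1500, 2000, 2500]
times = []
for n in sizes:
    data = [random.random() for _ in range(n)]
    t0 = time.perf_counter()
    bubble_sort(data)
    times.append(time.perf_counter() - t0)

# Quadratic least-squares fit time ≈ a*n^2 + b*n + c — the pattern the
# statistical analysis endorses.
a, b, c = np.polyfit(sizes, times, 2)
print(f"time ≈ {a:.3e}*n^2 + {b:.3e}*n + {c:.3e}")
```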