Papers by Rameswar Debnath
Kernel Selection for the Support Vector Machine
IEICE Transactions on Information and Systems
The choice of kernel is an important issue in the support vector machine algorithm, and its performance largely depends on the kernel. Up to now, no general rule is available as to which kernel should be used. In this paper we investigate two kernels: the Gaussian RBF kernel and the polynomial kernel. So far the Gaussian RBF kernel has been the best choice for practical applications. This paper shows that the polynomial kernel in the normalized feature space performs as well as or better than the Gaussian RBF kernel, making it the best alternative to the Gaussian RBF kernel.
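The "normalized feature space" in this abstract corresponds to the standard kernel normalization K(x, y) / sqrt(K(x, x) K(y, y)), which gives the polynomial kernel a unit diagonal like the Gaussian RBF kernel. A minimal pure-Python sketch of the two kernels being compared (function names and parameter values are illustrative, not from the paper):

```python
import math

def gaussian_rbf(x, y, gamma=1.0):
    # Gaussian RBF kernel: exp(-gamma * ||x - y||^2); note K(x, x) == 1.
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * sq_dist)

def poly(x, y, degree=3, c=1.0):
    # Polynomial kernel: (<x, y> + c)^degree.
    return (sum(a * b for a, b in zip(x, y)) + c) ** degree

def poly_normalized(x, y, degree=3, c=1.0):
    # Polynomial kernel evaluated in the normalized feature space:
    # K(x, y) / sqrt(K(x, x) * K(y, y)).  Like the Gaussian RBF kernel,
    # the normalized kernel satisfies K(x, x) == 1 for every x.
    return poly(x, y, degree, c) / math.sqrt(
        poly(x, x, degree, c) * poly(y, y, degree, c))

x, y = [1.0, 2.0], [0.5, 1.5]
print(poly_normalized(x, x))  # 1.0 on the diagonal, as for the RBF kernel
print(0.0 < poly_normalized(x, y) <= 1.0)  # True, by Cauchy-Schwarz
```

By Cauchy–Schwarz in the feature space, the normalized kernel is bounded by 1 in absolute value, so its values live on the same scale as the Gaussian RBF kernel's.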
An Evolutionary Gene Selection Method for Microarray Data Based on SVM Error Bound Theories
Bioinformatics & Computational Biology, 2009
SVM Training: Second-Order Cone Programming versus Quadratic Programming
International Joint Conference on Neural Networks, 2006
The support vector machine (SVM) problem is a convex quadratic programming problem whose size scales with the training data size. If the training set is large, the problem cannot be solved by straightforward methods; large-scale SVM problems are instead tackled by applying a chunking (decomposition) technique. The quadratic programming problem involves a square matrix, called the kernel matrix, which is positive semi-definite.
An improved working set selection method for SVM decomposition method
2004 2nd International IEEE Conference on 'Intelligent Systems'. Proceedings (IEEE Cat. No.04EX791), 2004
The support vector machine learning problem is a convex quadratic programming problem. For large learning tasks with many training examples, general quadratic programs quickly become intractable in their memory and time requirements, so the decomposition method is essential for support vector machine learning. Working set selection is the most important issue of the decomposition method.
Learning Capability: Classical RBF Network vs. SVM with Gaussian Kernel
Lecture Notes in Computer Science, 2002
The Support Vector Machine (SVM) has recently been introduced as a new learning technique, based on statistical learning theory, for solving a variety of real-world applications. The classical Radial Basis Function (RBF) network has a structure similar to that of the SVM with a Gaussian kernel. In this paper we compare the generalization performance of the RBF network and the SVM on classification problems.
A new ensemble learning with support vector machines
2010 International Conference on Computer and Information Application, 2010
A cascade of classifiers can, in general, improve the performance of any given classifier. In this paper, we present a new cascade classifier constructed from support vector machine (SVM) classifiers, where a set of SVMs is learned repeatedly from the bounded support vectors of the previous SVM. A binary decision tree is formed from the learned classifiers to decide the class of a new example. Experimental results show that the proposed method can improve the generalization performance over a single SVM.
DWT Based Digital Watermarking Technique and its Robustness on Image Rotation, Scaling, JPEG compression, Cropping and Multiple Watermarking
2007 International Conference on Information and Communication Technology, 2007
By S. M. Mohidul Islam, Rameswar Debnath, and S. K. Alamgir Hossain.
A New Model for Large Margin Classifiers by Second Order Cone Programming
A Fast Learning Decision-Based SVM for Multi-Class Problems
An efficient method for tuning kernel parameter of the support vector machine
Communications and Information …, 2004
... give a good set of kernel parameters. In the following section we will show our experimental results that follow this idea. ... Thus the number of support vectors will be large, and the support vector machine may implement a rapidly oscillating function rather than a smooth mapping. ...

Pattern Analysis and Applications, 2004
The support vector machine (SVM) has a high generalisation ability to solve binary classification problems, but its extension to multi-class problems is still an ongoing research issue. Among the existing multi-class SVM methods, the one-against-one method is one of the most suitable for practical use. This paper presents a new multi-class SVM method that can reduce the number of hyperplanes of the one-against-one method and thus returns fewer support vectors. The proposed algorithm works as follows. While producing the boundary of a class, no more hyperplanes are constructed if the discriminating hyperplanes of neighbouring classes happen to separate the rest of the classes. We present a large number of experiments showing that the training time of the proposed method is the least among the existing multi-class SVM methods. The experimental results also show that the testing time of the proposed method is less than that of the one-against-one method because of the reduction of hyperplanes and support vectors. The proposed method can resolve unclassifiable regions and alleviate the over-fitting problem in a much better way than the one-against-one method by reducing the number of hyperplanes. We also present a directed acyclic graph SVM (DAGSVM) based testing methodology that improves the testing time of the DAGSVM method.
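The baseline the abstract improves on is the standard one-against-one scheme, which trains one binary SVM per unordered pair of classes, i.e. k(k-1)/2 hyperplanes for k classes. A tiny sketch of that baseline count (illustrative only; the paper's pruning rule for skipping hyperplanes is not reproduced here):

```python
def one_against_one_count(k):
    # Number of pairwise hyperplanes the one-against-one method trains:
    # one binary SVM for every unordered pair of the k classes.
    return k * (k - 1) // 2

for k in (3, 5, 10):
    print(k, "classes ->", one_against_one_count(k), "hyperplanes")
# 3 -> 3, 5 -> 10, 10 -> 45
```

The quadratic growth in k is why skipping hyperplanes that are already separated by neighbouring classes' boundaries pays off in both training and testing time.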
Implementation Issues of Second-Order Cone Programming Approaches for Support Vector Machine Learning Problems
IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences, 2009
The core of the support vector machine (SVM) problem is a quadratic programming problem with a linear constraint and bounded variables. This problem can be transformed into second-order cone programming (SOCP) problems. An interior-point method (IPM) for the SOCP problems can be designed to be efficient in both storage requirements and computational complexity if the kernel matrix has low rank.
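The low-rank condition is easiest to see for the linear kernel, where the n×n kernel matrix is K = X Xᵀ and X is the n×d data matrix: when d ≪ n, an IPM can store and work with X (n·d entries) instead of K (n² entries), recomputing kernel entries on demand. A pure-Python sketch of this idea (sizes illustrative, not from the paper):

```python
def linear_kernel_entry(X, i, j):
    # K[i][j] = <x_i, x_j>: any entry of the kernel matrix can be
    # recomputed on demand from the low-rank factor, which for the
    # linear kernel is the data matrix X itself (K = X X^T).
    return sum(a * b for a, b in zip(X[i], X[j]))

def storage_entries(n, d):
    # Entries needed to store the dense kernel matrix vs. the factor X.
    return n * n, n * d

X = [[1.0, 0.0], [0.0, 2.0], [1.0, 1.0]]  # n = 3 samples, d = 2 features
print(linear_kernel_entry(X, 0, 2))  # <x_0, x_2> = 1.0
print(storage_entries(10000, 50))    # (100000000, 500000)
```

For nonlinear kernels the same trick applies whenever an approximate factorization K ≈ G Gᵀ with a thin G is available (e.g. via incomplete Cholesky decomposition), which is the sense in which the abstract's "low-rank" condition is typically exploited.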

Biosystems, 2010
Microarrays have thousands to tens-of-thousands of gene features, but only a few hundred patient samples are available. The fundamental problem in microarray data analysis is identifying genes whose disruption causes congenital or acquired disease in humans. In this paper, we propose a new evolutionary method that can efficiently select a subset of potentially informative genes for support vector machine (SVM) classifiers. The proposed evolutionary method uses SVM with a given subset of gene features to evaluate the fitness function, and new subsets of features are selected based on the estimates of generalization error of SVMs and the frequency of occurrence of the features in the evolutionary approach. Thus, in theory, selected genes reflect to some extent the generalization performance of SVM classifiers. We compare our proposed method with several existing methods and find that the proposed method can obtain better classification accuracy with a smaller number of selected genes than the existing methods.
An Efficient Support Vector Machine Learning Method with Second-Order Cone Programming for Large-Scale Problems
Applied Intelligence, 2005
In this paper we propose a new fast learning algorithm for the support vector machine (SVM). The proposed method is based on the technique of second-order cone programming: we reformulate the SVM's quadratic programming problem as a second-order cone programming problem. The proposed method needs to decompose the kernel matrix of the SVM optimization problem.
An Improved Interleaver Design for Turbo Codes
2007 International Conference on Information and Communication Technology, 2007
This paper is aimed at designing an effective interleaver for parallel concatenated coding schemes, using the weight distribution as the design criterion. We present a method for designing an interleaver that achieves a minimum effective free distance of the code word so that the error floor can be decreased. The method is expected to achieve a separation of √N in the output code word. The proposed design has the advantage that its complexity grows linearly with the interleaver length.