INTERNATIONAL JOURNAL OF ENGINEERING …, 2008
Stochastic models such as mixture models, graphical models, Markov random fields and hidden Markov models play a key role in probabilistic data analysis. Image segmentation divides a picture into different classes or regions; for example, a picture of geometric shapes contains classes of different colors such as 'circle', 'rectangle', 'triangle' and so on. Each class can therefore be assumed to follow a normal distribution with its own mean and variance, so that the picture as a whole can be modeled as a Gaussian mixture. In this paper, we fit a Gaussian mixture model to the pixels of an image, treated as training data, and estimate the model parameters with the EM algorithm. Pixel labels corresponding to each pixel of the true image are assigned by Bayes' rule; this hidden, labeled image is constructed while the EM algorithm runs. In effect, we introduce a new numerical method for finding the maximum a posteriori estimate using the EM algorithm and a Gaussian mixture model, which we call the EM-MAP algorithm. The algorithm builds a sequence of priors and posteriors that converges to a posterior probability called the reference posterior probability. The maximum a posteriori estimate determined by this reference posterior yields the labeled image, which is our segmented image with reduced noise. The method is demonstrated in several experiments.
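The core pipeline this abstract describes — fit a Gaussian mixture to pixel intensities with EM, then label each pixel by its maximum-posterior component — can be sketched as follows. This is a minimal illustrative sketch of a 1-D GMM with quantile-based initialization, not the authors' EM-MAP implementation; all function and variable names are my own.

```python
import numpy as np

def em_gmm_segment(pixels, k=3, iters=50):
    """Fit a k-component 1-D Gaussian mixture to pixel intensities
    with EM, then label each pixel by its maximum posterior (Bayes rule)."""
    x = pixels.ravel().astype(float)
    # init: means spread across the intensity quantiles, shared variance
    mu = np.quantile(x, np.linspace(0.1, 0.9, k))
    var = np.full(k, x.var() + 1e-6)
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities r[n, j] ∝ pi_j * N(x_n | mu_j, var_j)
        logp = (np.log(pi) - 0.5 * np.log(2 * np.pi * var)
                - (x[:, None] - mu) ** 2 / (2 * var))
        logp -= logp.max(axis=1, keepdims=True)   # stabilize the exp
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, variances
        nj = r.sum(axis=0) + 1e-12
        pi = nj / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nj
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nj + 1e-6
    labels = r.argmax(axis=1)                     # MAP label per pixel
    return labels.reshape(pixels.shape), mu, var, pi

# usage: a toy two-class "image" with mild noise
img = np.vstack([np.full((4, 8), 10.0), np.full((4, 8), 200.0)])
img += np.random.default_rng(1).normal(0, 2, img.shape)
labels, mu, var, pi = em_gmm_segment(img, k=2)
```

The segmented image is just the per-pixel argmax of the posterior, so noise reduction comes from the mixture's pooled estimates of each class's mean and variance.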
… Conference on Pattern …, 2010
In this paper, a parametric and unsupervised histogram-based image segmentation method is presented. The histogram is assumed to be a mixture of asymmetric generalized Gaussian distributions. The mixture parameters are estimated by using the Expectation Maximization algorithm. Histogram fitting and region uniformity measures on synthetic and real images reveal the effectiveness of the proposed model compared to the generalized Gaussian mixture model.
Rahman Farnoosh and Behnam Zarpak. Abstract: Stochastic models such as mixture models, graphical models, Markov random fields and hidden Markov models play a key role in probabilistic data analysis. In this paper, we fit a Gaussian mixture model to the pixels of an image. The parameters of the model were estimated by the EM algorithm, and pixel labels corresponding to each pixel of the true image were assigned by Bayes' rule. In effect, a new numerical method was introduced for finding the maximum a posteriori estimate using the EM algorithm and a Gaussian mixture distribution. The algorithm builds a sequence of priors and posteriors that converges to a posterior probability called the reference posterior probability. The maximum a posteriori estimate determined by this reference posterior yields the labeled image, which shows our segmented image with reduced noise. We present this method in several experiments.
2012
The Expectation-Maximization algorithm has been classically used to find the maximum likelihood estimates of parameters in probabilistic models with unobserved data, for instance, mixture models. A key issue in such problems is the choice of the model complexity. The higher the number of components in the mixture, the higher will be the data likelihood, but also the higher will be the computational burden and data overfitting. In this work we propose a clustering method based on the expectation maximization algorithm that adapts on-line the number of components of a finite Gaussian mixture model from multivariate data. Our method estimates the number of components and their means and covariances sequentially, without requiring any careful initialization. Our methodology starts from a single mixture component covering the whole data set and splits it incrementally during expectation maximization steps. The coarse-to-fine nature of the algorithm reduces the overall number of computations needed to reach a solution, which makes the method particularly suited to image segmentation applications whenever computational time is an issue. We show the effectiveness of the method in a series of experiments and compare it with a state-of-the-art alternative technique both with synthetic data and real images, including experiments with images acquired from the iCub humanoid robot.
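The coarse-to-fine idea — start from one component covering all the data and split components until the likelihood stops improving — can be sketched in one dimension as below. This is an illustrative reconstruction under assumed choices (split the widest component at mean ± one standard deviation, stop when the per-point log-likelihood gain falls below a threshold), not the paper's actual algorithm or split criterion.

```python
import numpy as np

def em_steps(x, mu, var, pi, iters=30):
    """A few EM iterations for a 1-D Gaussian mixture; returns updated
    parameters and the last computed log-likelihood (helper sketch)."""
    for _ in range(iters):
        logp = (np.log(pi) - 0.5 * np.log(2 * np.pi * var)
                - (x[:, None] - mu) ** 2 / (2 * var))
        m = logp.max(axis=1, keepdims=True)
        r = np.exp(logp - m)
        s = r.sum(axis=1, keepdims=True)
        loglik = (m + np.log(s)).sum()
        r /= s
        nj = r.sum(axis=0) + 1e-12
        pi = nj / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nj
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nj + 1e-6
    return mu, var, pi, loglik

def split_em(x, max_k=5, tol=0.05):
    """Coarse-to-fine EM: one component covering the whole data set,
    then split the widest component until the per-point log-likelihood
    gain drops below tol (an assumed stopping rule)."""
    mu = np.array([x.mean()]); var = np.array([x.var() + 1e-6]); pi = np.array([1.0])
    mu, var, pi, best = em_steps(x, mu, var, pi)
    while len(mu) < max_k:
        j = np.argmax(pi * var)                        # widest component
        d = np.sqrt(var[j])
        mu2 = np.concatenate([np.delete(mu, j), [mu[j] - d, mu[j] + d]])
        var2 = np.concatenate([np.delete(var, j), [var[j], var[j]]])
        pi2 = np.concatenate([np.delete(pi, j), [pi[j] / 2, pi[j] / 2]])
        mu2, var2, pi2, ll = em_steps(x, mu2, var2, pi2)
        if (ll - best) / len(x) < tol:                 # no useful gain: stop
            break
        mu, var, pi, best = mu2, var2, pi2, ll
    return mu, var, pi

# usage: bimodal data should end with exactly two components
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 500), rng.normal(10, 1, 500)])
mu, var, pi = split_em(x)
```

Because early iterations run with very few components, most of the computation is avoided relative to fitting the final mixture size from the start, which matches the abstract's efficiency claim.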
EUROCON, 2007. The …, 2007
In this paper we study an unsupervised algorithm for radiographic image segmentation, based on Gaussian mixture models (GMMs). Gaussian mixture models constitute a well-known type of probabilistic neural networks, and one of their many successful applications is in image segmentation. Mixture model parameters have been trained using the expectation maximization (EM) algorithm. Numerical experiments using radiographic images illustrate the superior performance of the EM method in terms of segmentation accuracy compared to the fuzzy c-means algorithm.
2011
Abstract: The Expectation Maximization (EM) algorithm and the Fuzzy C-Means (FCM) clustering method are widely used in image segmentation. However, the major drawback of these methods is their sensitivity to noise. In this paper, we propose a variant of these methods that aims at resolving this problem. Our approaches characterize each pixel by two features: the first describes the intrinsic properties of the pixel, and the second characterizes its neighborhood. Classification is then made on the basis of an adaptive distance that privileges one feature or the other according to the spatial position of the pixel in the image. The obtained results show a significant performance improvement of our approaches compared to the standard versions of EM and FCM, respectively, especially regarding robustness to noise and the accuracy of the edges between regions.
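The two-feature characterization described here can be sketched as follows: each pixel gets its own intensity plus a neighborhood descriptor. The 3x3 neighborhood mean used below is an assumed, minimal choice of neighborhood feature, not necessarily the descriptor the paper uses.

```python
import numpy as np

def pixel_features(img):
    """Return a (H, W, 2) array: feature 0 is the pixel's own intensity,
    feature 1 is the mean of its 3x3 neighborhood (edge-padded)."""
    p = np.pad(img.astype(float), 1, mode="edge")
    h, w = img.shape
    # sum the nine shifted views of the padded image, then average
    neigh = sum(p[i:i + h, j:j + w]
                for i in range(3) for j in range(3)) / 9.0
    return np.stack([img.astype(float), neigh], axis=-1)

# usage: on a 3x3 ramp, the center pixel's neighborhood mean is the global mean
feats = pixel_features(np.arange(9.0).reshape(3, 3))
```

A clustering step (EM or FCM) would then run on these feature pairs, weighting the neighborhood feature more heavily inside homogeneous regions and the intrinsic feature more heavily near edges, which is the role of the adaptive distance.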
IEEE transactions on neural networks / a publication of the IEEE Neural Networks Council, 2005
Gaussian mixture models (GMMs) constitute a well-known type of probabilistic neural networks. One of their many successful applications is in image segmentation, where spatially constrained mixture models have been trained using the expectation-maximization (EM) framework. In this letter, we elaborate on this method and propose a new methodology for the M-step of the EM algorithm that is based on a novel constrained optimization formulation. Numerical experiments using simulated images illustrate the superior performance of our method in terms of the attained maximum value of the objective function and segmentation accuracy compared to previous implementations of this approach.
Expert Systems with Applications, 2012
Finite mixture models are one of the most widely and commonly used probabilistic techniques for image segmentation. Although the best-known and most commonly used distribution in mixture models is the Gaussian, it is certainly not the best approximation for image segmentation and other related image processing problems. In this paper, we propose and investigate the use of several other mixture models, based on the Dirichlet, generalized Dirichlet and Beta-Liouville distributions, which offer more flexibility in data modeling for image segmentation. A maximum likelihood (ML) based algorithm is applied for estimating the resulting segmentation model's parameters. Spatial information is also employed for determining the number of regions in an image, and several color spaces are investigated and compared. The experimental results show that the proposed segmentation framework yields good overall performance on various color scenes, better than comparable techniques.
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2010
Image segmentation is a critical low-level visual routine for robot perception. However, most image segmentation approaches are still too slow to allow real-time robot operation. In this paper we explore a new method for image segmentation based on the expectation maximization algorithm applied to Gaussian mixtures. Our approach is fully automatic in the choice of the number of mixture components, the initialization parameters and the stopping criterion. The rationale is to start with a single Gaussian in the mixture, covering the whole data set, and split it incrementally during expectation maximization steps until a good data likelihood is reached. Since the method starts with a single Gaussian, it is more computationally efficient than others, especially in the initial steps. We show the effectiveness of the method in a series of simulated experiments both with synthetic and real images, including experiments with the iCub humanoid robot.
Statistics and Computing, 2008
In this paper, we propose a model for image segmentation based on a finite mixture of Gaussian distributions. For each pixel of the image, prior probabilities of class memberships are specified through a Gibbs distribution, where association between labels of adjacent pixels is modeled by a class-specific term allowing for different interaction strengths across classes. We show how model parameters can be estimated in a maximum likelihood framework using Mean Field theory. Experimental performance on perturbed phantom and on real benchmark images shows that the proposed method performs well in a wide variety of empirical situations.
Lecture Notes in Computer Science, 2011
IEEE Transactions on Image Processing, 1997
IEEE transactions on image processing : a publication of the IEEE Signal Processing Society, 2010
International Journal of Signal Processing, Image Processing and Pattern Recognition, 2016
International Journal of Advanced Computer Science and Applications, 2011
International Journal of Scientific Research in Science and Technology, 2021
2007 IEEE Conference on Computer Vision and Pattern Recognition, 2007
IEEE Transactions on Image Processing, 1997
Neurocomputing, 2018
IEEE transactions on image processing : a publication of the IEEE Signal Processing Society, 2007