2022, Selecciones Matemáticas
https://doi.org/10.17268/sel.mat.2022.01.05…
This work has three main purposes: first, to study Markov chains; second, to show that Markov chains have a variety of applications; and finally, to model a process that behaves in this way. Throughout this work we describe what a Markov chain is, what these processes are used for, and how these chains are classified. We also analyze the primary elements that make up a Markov chain, among other topics.
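The primary elements the abstract refers to are a state space and a transition matrix giving the probability of each next state conditioned only on the current state. A minimal sketch, using a hypothetical two-state weather chain (the states and probabilities are illustrative assumptions, not taken from the paper):

```python
import random

random.seed(42)

# Primary elements of a Markov chain: a state space and a transition
# matrix P(next state | current state). Hypothetical weather example.
states = ["sunny", "rainy"]
transition = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(current: str) -> str:
    """Sample the next state using only the current state (the Markov property)."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in transition[current].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point round-off

# Simulate a short trajectory: how we arrived at a state never matters,
# only which state we are currently in.
state = "sunny"
path = [state]
for _ in range(5):
    state = step(state)
    path.append(state)
print(path)
```

Each row of the transition table sums to 1, and the sampler consults only the current state, which is exactly the memoryless behavior the abstract describes.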
International Journal of Trend in Scientific Research and Development, 2019
Copyright © 2019 by author(s) and International Journal of Trend in Scientific Research and Development Journal. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (CC BY 4.0) (http://creativecommons.org/licenses/by/4.0) ABSTRACT The Markov chain is one of the techniques used in operations research, with the prospect that managers can use it in organizational (industrial and commercial) decision making. Markov processes arise in probability and statistics in one of two ways. The Markov process is a predictive tool that supports logical and accurate decisions about various aspects of management in the future.
2006
Mathematics
Probability resembles the ancient Roman god Janus since, like Janus, probability also has a face with two different sides, which correspond to the metaphorical gateways and transitions between the past and the future [...]
1997
In this paper we rederive some well-known results for continuous time Markov processes that live on a finite state space. Martingale techniques are used throughout the paper. Special attention is paid to the construction of a continuous time Markov process, when we start from a discrete time Markov chain. The Markov property here holds with respect to filtrations that need not be minimal.
Energy, 1990
This paper introduces the use of Markov models for engineering-economic planning. Markov models capture the uncertainty and dynamics in the engineering-economic decision environment.
Computers and Biomedical Research, 1986
This paper examines the use of the Markov chain model to study the condition of asthma patients with respect to seasonal variations. The model can be utilized for predicting the health status of these patients. © 1986 Academic Press, Inc.
This paper introduces a general class of mathematical models, Markov chain models, which are appropriate for modeling phenomena in the physical and life sciences, medicine, engineering, and the social sciences. Applications of Markov chains are quite common and have become a standard tool of decision making. What matters in predicting the future of the system is its present state, and not the path by which the system got to its present state. Two methods are presented that exemplify the flexibility of this approach: the regular Markov chain and the absorbing Markov chain. The long-term trend in absorbing Markov chains depends on the initial state; in addition, changing the initial state can change the final result. This property distinguishes absorbing Markov chains from regular Markov chains, where the final result is independent of the initial state. The problems are formulated by using the Wolfram Mathematical Programming System.
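The contrast the abstract draws can be verified numerically: raising a regular transition matrix to a high power makes every row converge to the same stationary distribution, while an absorbing chain's long-run outcome depends on where it starts. A minimal sketch with hypothetical matrices (not the paper's examples), using NumPy rather than Mathematica:

```python
import numpy as np

# Hypothetical 2-state regular chain: every state can reach every other.
P_regular = np.array([[0.9, 0.1],
                      [0.5, 0.5]])

# Raise P to a high power; both rows converge to the same stationary
# vector (5/6, 1/6), so the final result is independent of the start state.
P_inf = np.linalg.matrix_power(P_regular, 200)
print(P_inf[0])  # ~[0.833, 0.167], same as P_inf[1]

# Hypothetical absorbing chain: states 0 and 2 absorb, state 1 is transient.
P_absorbing = np.array([[1.0, 0.0, 0.0],
                        [0.5, 0.3, 0.2],
                        [0.0, 0.0, 1.0]])

A_inf = np.linalg.matrix_power(P_absorbing, 200)
# Starting in state 0 stays in state 0 forever; starting in state 2 stays
# in state 2; starting in state 1 splits 5/7 vs 2/7 between the two
# absorbing states. Changing the initial state changes the final result.
print(A_inf[0])
print(A_inf[1])
```

The absorption probabilities 5/7 and 2/7 follow from renormalizing the one-step exit probabilities 0.5 and 0.2 out of the transient state, which is the standard fundamental-matrix calculation for absorbing chains.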
2006
We describe the life, times and legacy of Andrei Andreevich Markov (1856-1922), and his writings on what became known as Markov chains. One focus is on his first paper [27] of 1906 on this topic, which already contains important contractivity principles embodied in the Markov-Dobrushin coefficient of ergodicity, which in fact makes an explicit appearance in that paper. The contractivity principles are shown directly to underpin a number of results of the later theory. The coefficient is especially useful as a condition number in measuring the effect of perturbation of a stochastic matrix on the stationary distribution (sensitivity analysis). Some recent work in this direction is reviewed from the standpoint of the paper [53], presented at the first of the present series of conferences [63].
One hundred years removed from A. A. Markov's development of his chains, we take stock of the field he generated and the mathematical impression he left. As a tribute to Markov, we present what we consider to be the five greatest applications of Markov chains.
Universiti Utara Malaysia, 2006
arXiv (Cornell University), 2016
Annals of the University of Craiova - Mathematics and Computer Science Series, 2003
Jahresbericht Der Deutschen Mathematiker-vereinigung, 2010
Probability in the Engineering and Informational Sciences, 2012
Cornell University - arXiv, 2021
arXiv (Cornell University), 2022
Stochastic Processes and their Applications, 2017
International Journal of Systems Science, 2014