2008, Operations Research
Information measures arise in many disciplines, including forecasting (where scoring rules are used to provide incentives for probability estimation), signal processing (where information gain is measured in physical units of relative entropy), decision analysis (where new information can lead to improved decisions), and finance (where investors optimize portfolios based on their private information and risk preferences). In this paper, we generalize the two most commonly used parametric families of scoring rules and demonstrate their relation to well-known generalized entropies and utility functions, shedding new light on the characteristics of alternative scoring rules as well as duality relationships between utility maximization and entropy minimization. In particular, we show that weighted forms of the pseudospherical and power scoring rules correspond exactly to measures of relative entropy (divergence) with convenient properties, and they also correspond exactly to the solutio...
Theory and Decision, 2012
The expected utility maximization problem is one of the most useful tools in mathematical finance, decision analysis and economics. Motivated by statistical model selection, via the principle of expected utility maximization, Friedman and Sandow (J Mach Learn Res 4:257–291, 2003a) considered the model performance question from the point of view of an investor who evaluates models based on the performance of
Entropy, 2007
We review a decision theoretic, i.e., utility-based, motivation for entropy and Kullback-Leibler relative entropy, the natural generalizations that follow, and various properties of these generalized quantities. We then consider these generalized quantities in an easily interpreted special case. We show that the resulting quantities share many of the properties of entropy and relative entropy, such as the data processing inequality and the second law of thermodynamics. We formulate an important statistical learning problem, probability estimation, in terms of a generalized relative entropy. The solution of this problem reflects general risk preferences via the utility function; moreover, the solution is optimal in a sense of robust absolute performance.
Proceedings of 2nd International Electronic Conference on Entropy and Its Applications, 2015
The application of entropy in finance can be regarded as an extension of information entropy and probability entropy. It can be an important tool in various financial methods such as the measurement of risk, portfolio selection, option pricing and asset pricing. A typical example in the field of option pricing is the Entropy Pricing Theory (EPT) introduced by Les Gulko [1996]. The Black-Scholes model [1973] builds on the idea of no arbitrage, which implies the existence of universal risk-neutral probabilities but unfortunately does not guarantee their uniqueness. In a second step, the parameterization of these risk-neutral probabilities requires a stochastic-calculus framework; the Black-Scholes framework, for example, is driven by Geometric Brownian Motion (GBM). In option pricing, therefore, both the existence of risk-neutral probabilities and their uniqueness are vital. Shannon entropy can be used to evaluate the entropy of a probability density around particular points, but specific events, for example deviations from the mean or sudden news driving stock returns up (or down), require additional information, and for such cases the concept of entropy must be generalized. When comparing the entropy of two distributions with respect to these two kinds of events, i.e. deviation from the mean and sudden news, Shannon entropy [1964] implicitly assumes a certain trade-off between contributions from the tail and from the main mass of the distribution. It is important to control this trade-off explicitly. Entropy measures that depend on powers of the probability, for example those of Tsallis [1988], Kaniadakis [2001], Ubriaco [2009], Shafee [2007] and Rényi [1961], provide such control. In this article we use entropy measures that depend on powers of the probability.
We propose some entropy maximization problems in order to obtain the risk-neutral densities. We also present the European call and put in this framework.
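As a minimal sketch of the kind of entropy maximization problem described above (in Python, with purely illustrative numbers that are not from the paper): among all densities on a discrete set of terminal prices, the one closest to a subjective prior in Kullback-Leibler divergence, subject to a forward-price constraint, is an exponential tilt of the prior, and the tilt parameter can be found by bisection because the tilted mean is increasing in it.

```python
import numpy as np

def min_kl_density(payoffs, prior, target_mean, lam_lo=-1.0, lam_hi=1.0):
    """Risk-neutral density minimizing KL divergence to `prior` subject to
    E_q[payoffs] = target_mean. The minimizer is an exponential tilt
    q_i proportional to p_i * exp(lam * x_i); lam is found by bisection,
    since the tilted mean is strictly increasing in lam."""
    def tilt(lam):
        z = lam * payoffs
        w = prior * np.exp(z - z.max())   # subtract max for numerical stability
        return w / w.sum()
    for _ in range(200):
        lam = 0.5 * (lam_lo + lam_hi)
        q = tilt(lam)
        if q @ payoffs < target_mean:
            lam_lo = lam
        else:
            lam_hi = lam
    return q

# toy grid of terminal stock prices and a subjective (real-world) prior
S = np.linspace(50.0, 150.0, 201)
p = np.exp(-0.5 * ((S - 105.0) / 15.0) ** 2)
p /= p.sum()

q = min_kl_density(S, p, target_mean=100.0)   # forward price constraint = 100
call_90 = q @ np.maximum(S - 90.0, 0.0)       # undiscounted call value, strike 90
```

The resulting density reprices the forward exactly while staying as close as possible (in relative entropy) to the investor's prior; European calls and puts are then plain expectations under it.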
Stochastics and Quality Control, 2019
Di Crescenzo and Longobardi [Di Crescenzo and Longobardi, On cumulative entropies, J. Statist. Plann. Inference 139 (2009), no. 12, 4072-4087] proposed the cumulative entropy (CE) as an alternative to the differential entropy. They presented an estimator of CE using an empirical approach. In this paper, we consider a risk measure based on CE and compare it with the standard deviation and the Gini mean difference for some distributions. We also make empirical comparisons of these measures using samples from stock markets in member countries of the Organization for Economic Cooperation and Development (OECD).
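A hedged illustration of the empirical approach mentioned above (Python sketch, not the authors' code): the cumulative entropy CE(X) = -∫ F(x) log F(x) dx can be estimated by replacing F with the empirical distribution function, which turns the integral into a sum over gaps between order statistics.

```python
import numpy as np

def empirical_ce(sample):
    """Empirical cumulative entropy: the empirical cdf equals i/n on the gap
    between the i-th and (i+1)-th order statistics, so the plug-in estimator
    is -sum_i (x_(i+1) - x_(i)) * (i/n) * log(i/n)."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = x.size
    u = np.arange(1, n) / n               # empirical cdf value on each gap
    return float(-np.sum(np.diff(x) * u * np.log(u)))

rng = np.random.default_rng(0)
u_sample = rng.uniform(0.0, 1.0, 20_000)
ce_hat = empirical_ce(u_sample)           # theoretical CE of U(0,1) is 1/4
```

For a uniform sample the estimate should be close to the known value 1/4, which gives a quick sanity check before applying the measure to stock-market returns.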
Open Systems & Information Dynamics, 2008
The notion of utility maximising entropy (u-entropy) of a probability density, which was introduced and studied in [37], is extended in two directions. First, the relative u-entropy of two probability measures in arbitrary probability spaces is defined. Then, specialising to discrete probability spaces, we also introduce the absolute u-entropy of a probability measure. Both notions are based on the idea, borrowed from mathematical finance, of maximising the expected utility of the terminal wealth of an investor. Moreover, u-entropy is also relevant in thermodynamics, as it can replace the standard Boltzmann-Shannon entropy in the Second Law. If the utility function is logarithmic or isoelastic (a power function), then the well-known notions of Boltzmann-Shannon and Rényi relative entropy are recovered. We establish the principal properties of relative and discrete u-entropy and discuss the links with several related approaches in the literature.
2016
Uncertainty is one of the most important concepts in financial mathematics applications. In this paper we review some important aspects of the application of entropy-related concepts to option pricing. The Kullback-Leibler information divergence and the informational energy introduced by Onicescu are the main tools investigated in this paper. We highlight a necessary condition that must be verified when obtaining the probability distribution minimising the Kullback-Leibler information divergence. Deriving a probability distribution by optimising the information energy has some pitfalls that are discussed in this paper.
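One such necessary condition is absolute continuity: the candidate distribution can only have finite Kullback-Leibler divergence from the reference if the reference assigns positive mass wherever the candidate does. A small Python sketch (illustrative, not from the paper) makes the check explicit:

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions.
    Finite only when q puts positive mass wherever p does (absolute
    continuity) -- a condition any minimising distribution must respect."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    support = p > 0
    if np.any(q[support] == 0.0):
        return float("inf")
    return float(np.sum(p[support] * np.log(p[support] / q[support])))

uniform = np.full(4, 0.25)
skewed = np.array([0.7, 0.1, 0.1, 0.1])
```

The divergence is zero exactly when the two distributions coincide, strictly positive otherwise, and infinite when the support condition fails.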
Relative entropy optimization (REO) and distortion are tools for transforming probability measures, and they are closely related to the exponential family. In this paper, the authors establish the link between REO and distortion. REO is commonly used for pricing risky financial assets in incomplete markets, while distortion methods are widely used in pricing insurance risks and in risk management. The link between REO and distortion provides some intuition behind distorted risk measures such as value-at-risk. Furthermore, distorted risk measures having desirable properties, such as coherence, are easily generated via REO.
2016
Dedicated to Professor Andrzej Lasota on his 70th birthday
Expected utility maximization problems in mathematical finance lead to a generalization of the classical definition of entropy. It is demonstrated that a necessary and sufficient condition for the second law of thermodynamics to operate is that any one of the generalized entropies should tend to its minimum value of zero.
Journal of Physics: Conference Series, 2012
When uncertainty dominates, understanding stock market volatility is vital. There are a number of reasons for that. On the one hand, substantial changes in the volatility of financial market returns can have significant negative effects on risk-averse investors. In addition, such changes can also impact consumption patterns, corporate capital investment decisions and macroeconomic variables. Arguably, volatility is one of the most important concepts in the whole of finance theory. In the traditional approach this phenomenon has been addressed through the concept of standard deviation (or variance), from which all the famous ARCH-type models (Autoregressive Conditional Heteroskedasticity models) depart. In this context, volatility is often used to describe dispersion from an expected value, price or model. The variability of traded prices from their sample mean is only one example. Although standard deviation is very popular as a measure of uncertainty and risk, since it is simple and easy to calculate, it has long been recognized that it is not fully satisfactory. The main reason is that it is severely affected by extreme values. This suggests that the issue is not closed. Bearing the above in mind, we might conclude that many other questions arise while addressing this subject. One of outstanding importance, from which more sophisticated analysis can be carried out, is how volatility should be evaluated, after all. If the standard deviation has drawbacks, shall we still rely on it? Shall we look for an alternative measure? In searching for one, shall we consider the insights of other domains of knowledge? In this paper we specifically address whether the concept of entropy, originally developed in physics by Clausius in the XIX century, can constitute an effective alternative. Basically, what we try to understand is what the potentialities of entropy are compared to the standard deviation. But why entropy?
The answer lies in the fact that there is already some research in the domain of Econophysics which points out that, as a measure of disorder, distance from equilibrium or even ignorance, entropy might present some advantages. However, another question arises: since there are several measures of entropy, which one shall be used? As a starting point we discuss the potentialities of Shannon entropy and Tsallis entropy. The main difference between them is that Tsallis entropy (like Rényi's) is adequate for anomalous systems, while Shannon entropy has proved optimal for equilibrium systems.
Operations Research, 2006
This paper presents a method to assign utility values when only partial information is available about the decision maker’s preferences. We introduce the notion of a utility density function and a maximum entropy principle for utility assignment. The maximum entropy utility solution embeds a large family of utility functions that includes the most commonly used functional forms. We discuss the implications of maximum entropy utility on the preference behavior of the decision maker and present an application to competitive bidding situations where only previous decisions are observed by each party. We also present minimum cross entropy utility, which incorporates additional knowledge about the shape of the utility function into the maximum entropy formulation, and work through several examples to illustrate the approach.
International Journal of Applied Decision Sciences, 2013
Methodologies related to information theory have been increasingly used in studies in economics and management. In this paper, we use generalised maximum entropy as an alternative to ordinary least squares in the estimation of utility functions. Generalised maximum entropy has some advantages: it does not require restrictive distributional assumptions and can be used with both well- and ill-posed problems, for example when we have small samples, which is the case when estimating utility functions. Using linear, logarithmic and power utility functions, we estimate those functions and confidence intervals and perform hypothesis tests. Results point to the greater accuracy of generalised maximum entropy, showing its efficiency in estimation.
Although the concept of entropy originated in thermodynamics, its concepts and relevant principles, especially the principles of maximum entropy and minimum cross-entropy, have been extensively applied in finance. In this paper, we review the concepts and principles of entropy, as well as their applications in the field of finance, especially in portfolio selection and asset pricing. Furthermore, we review the effects of the applications of entropy and compare them with other traditional and new methods.
2017
We highlight the role of entropy maximization in several fundamental results in financial mathematics. They are the two fund theorem for Markowitz efficient portfolios, the existence and uniqueness of a market portfolio in the capital asset pricing model, the fundamental theorem of asset pricing, the selection of a martingale measure for pricing contingent claims in an incomplete market and the calculation of super/sub-hedging bounds and portfolios. The connection of diverse important results in finance with the method of entropy maximization indicates the significant influence of methodology of physical science in financial research.
Entropy
In this paper we investigate the relationship between the information entropy of the distribution of intraday returns and intraday and daily measures of market risk. Using data on the EUR/JPY exchange rate, we find a negative relationship between entropy and intraday Value-at-Risk, and also between entropy and intraday Expected Shortfall. This relationship is then used to forecast daily Value-at-Risk, using the entropy of the distribution of intraday returns as a predictor.
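The quantities related in the abstract above can be sketched as follows (Python, with simulated returns as a stand-in for the EUR/JPY data, which are not reproduced here): the entropy of the intraday return distribution via a histogram, and the empirical Value-at-Risk and Expected Shortfall from the return quantiles.

```python
import numpy as np

def hist_entropy(returns, bins=50):
    """Shannon entropy (nats) of the histogram of intraday returns."""
    counts, _ = np.histogram(returns, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return float(-np.sum(p * np.log(p)))

def var_es(returns, alpha=0.01):
    """Empirical Value-at-Risk and Expected Shortfall at level alpha,
    reported as positive loss numbers."""
    q = np.quantile(returns, alpha)
    return -q, -returns[returns <= q].mean()

rng = np.random.default_rng(42)
calm = rng.normal(0.0, 0.0005, 5_000)    # simulated low-volatility day
stress = rng.normal(0.0, 0.0050, 5_000)  # simulated high-volatility day

calm_var, calm_es = var_es(calm)
stress_var, stress_es = var_es(stress)
```

With these ingredients, a forecasting exercise of the kind described would regress the next day's VaR on the entropy of today's intraday distribution; the sketch only shows how each quantity is computed.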
Journal of Econometrics, 2002
SSRN Electronic Journal, 2000
Consider any investor who fears ruin facing any set of investments that satisfy no-arbitrage. Before investing, he can purchase information about the state of nature in the form of an information structure. Given his prior, information structure α is more informative than information structure β if whenever he rejects α at some price, he also rejects β at that price.
We investigate entropy as a financial risk measure. Entropy explains the equity premium of securities and portfolios in a simpler way and, at the same time, with higher explanatory power than the beta parameter of the capital asset pricing model. For asset pricing we define the continuous entropy as an alternative measure of risk. Our results show that entropy decreases as a function of the number of securities in a portfolio, in a similar way to the standard deviation, and that efficient portfolios are situated on a hyperbola in the expected return – entropy system. For the empirical investigation we use daily returns of 150 randomly selected securities over a period of 27 years. Our regression results show that entropy has a higher explanatory power for the expected return than the capital asset pricing model beta. Furthermore, we show the time-varying behavior of the beta along with entropy.
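A hedged note on why continuous (differential) entropy can behave like the standard deviation under diversification (a sketch with the textbook Gaussian formula, not the paper's empirical setup): for normal returns, h = ½ ln(2πeσ²) is strictly increasing in σ, and equally weighting k independent identical assets scales σ by 1/√k, so entropy falls by ½ ln k.

```python
import math

def gaussian_diff_entropy(sigma):
    """Differential entropy (nats) of a normal distribution with standard
    deviation sigma: h = 0.5 * ln(2 * pi * e * sigma**2)."""
    return 0.5 * math.log(2.0 * math.pi * math.e * sigma ** 2)

# equally weighting k iid N(0, sigma^2) assets scales sigma by 1/sqrt(k),
# so the entropy of the portfolio return falls by 0.5 * ln(k)
sigma = 0.02
h1 = gaussian_diff_entropy(sigma)                  # single asset
h4 = gaussian_diff_entropy(sigma / math.sqrt(4))   # 4-asset portfolio
```

For non-Gaussian returns the two measures can diverge, which is where entropy's extra explanatory power would have to come from.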
Natural Science, 2014
In this paper, we discuss a framework combining traditional expected utility and weighted entropy (EU-WE), also named the mean contributive value index, which may be conceived as a decision-aiding procedure or as a heuristic device generating compositional scenarios, based on information-theory concepts, namely weighted entropy. New proofs concerning the maximum value of the index and the evaluation of optimal proportions are outlined, with emphasis on the optimal value of the Lagrange multiplier and its meaning. The rationale is a procedure of maximizing the combined value of a system expressed as a mosaic, denoted by characteristic values of the states and their proportions. Other perspectives of application of this EU-WE framework are suggested.
Physica A-statistical Mechanics and Its Applications, 2008
The maximum entropy principle can be used to assign utility values when only partial information is available about the decision maker's preferences. In order to obtain such utility values it is necessary to establish an analogy between probability and utility through the notion of a utility density function. According to some authors, the maximum entropy utility solution embeds a large family of utility functions. In this paper we explore the maximum entropy principle to estimate the utility function of a risk averse decision maker.
2008
In this work, we consider a recently proposed entropy S (called varentropy) defined by a variational relationship dI = beta * (d&lt;x&gt; - &lt;dx&gt;) as a measure of the uncertainty of a random variable x. By definition, varentropy underlies a generalized virtual work principle &lt;dx&gt; = 0 leading to maximum entropy, d(I - beta * &lt;x&gt;) = 0. This paper presents an analytical investigation of this maximizable entropy for several distributions, such as the stretched exponential distribution, the kappa-exponential distribution and the Cauchy distribution.