2014, Natural Science
This paper discusses a framework combining traditional expected utility and weighted entropy (EU-WE), also named the mean contributive value index, which may be conceived as a decision-aiding procedure, or as a heuristic device for generating compositional scenarios, based on information theory concepts, namely weighted entropy. New proofs concerning the maximum value of the index and the evaluation of optimal proportions are outlined, with emphasis on the optimal value of the Lagrange multiplier and its meaning. The rationale is a procedure for maximizing the combined value of a system expressed as a mosaic, characterized by the values of the states and their proportions. Other prospective applications of this EU-WE framework are suggested.
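The weighted entropy underlying the EU-WE index is the Belis-Guiasu form, in which each state's surprisal is scaled by its characteristic value. The sketch below shows only that standard measure; the pairing of `u` (characteristic values) with `p` (proportions of the mosaic) is an illustrative assumption, not the paper's exact index.

```python
import numpy as np

def weighted_entropy(p, u):
    """Belis-Guiasu weighted entropy: H_u(p) = -sum_i u_i * p_i * log(p_i)."""
    p = np.asarray(p, dtype=float)
    u = np.asarray(u, dtype=float)
    mask = p > 0  # convention: 0 * log 0 = 0
    return -np.sum(u[mask] * p[mask] * np.log(p[mask]))

# Illustrative mosaic: three states with characteristic values u and proportions p
u = [1.0, 2.0, 3.0]
p = [0.2, 0.3, 0.5]
print(weighted_entropy(p, u))
```

Unlike Shannon entropy, the weighted form is not maximized by the uniform distribution in general; the optimal proportions shift toward high-value states, which is what makes the Lagrange-multiplier analysis in the paper non-trivial.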
2017
Choosing the best of many possible solutions under conditions of uncertainty is a practical economic task that arises in many situations. The well-known classical approaches to it rest on various assessments of the decision-making situation; however, they often give insufficiently accurate or outright incorrect results, and they fail to satisfy stability requirements, under which only a result that is invariant to the calculation methodology can be considered reliable and true to reality. This article describes an alternative approach to justifying decisions under uncertainty that dispenses with assumptions about the decision-making situation and conforms to the approaches of stability theory. The problem of multi-criteria decision-making under complete uncertainty, in which the alternatives are structured using fuzzy entropy, has been formulated ...
International Journal of Applied Decision Sciences, 2013
Methodologies related to information theory have been increasingly used in studies in economics and management. In this paper, we use generalised maximum entropy as an alternative to ordinary least squares in the estimation of utility functions. Generalised maximum entropy has some advantages: it does not require restrictive assumptions and can be used with both well- and ill-posed problems, for example with small samples, which is typically the case when estimating utility functions. Using linear, logarithmic and power utility functions, we estimate those functions and confidence intervals and perform hypothesis tests. Results point to the greater accuracy of generalised maximum entropy, showing its efficiency in estimation.
Open Systems & Information Dynamics, 2008
The notion of utility maximising entropy (u-entropy) of a probability density, which was introduced and studied in [37], is extended in two directions. First, the relative u-entropy of two probability measures in arbitrary probability spaces is defined. Then, specialising to discrete probability spaces, we also introduce the absolute u-entropy of a probability measure. Both notions are based on the idea, borrowed from mathematical finance, of maximising the expected utility of the terminal wealth of an investor. Moreover, u-entropy is also relevant in thermodynamics, as it can replace the standard Boltzmann-Shannon entropy in the Second Law. If the utility function is logarithmic or isoelastic (a power function), then the well-known notions of Boltzmann-Shannon and Rényi relative entropy are recovered. We establish the principal properties of relative and discrete u-entropy and discuss the links with several related approaches in the literature.
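The two special cases mentioned, logarithmic utility recovering Boltzmann-Shannon (Kullback-Leibler) relative entropy and isoelastic utility recovering Rényi relative entropy, can be checked numerically. The discrete formulas below are the standard textbook definitions, not code from the paper:

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler relative entropy D(p || q) = sum_i p_i * log(p_i / q_i)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

def renyi_divergence(p, q, alpha):
    """Renyi relative entropy of order alpha (alpha != 1)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

p = [0.5, 0.3, 0.2]
q = [1/3, 1/3, 1/3]
# The Renyi divergence converges to the KL divergence as alpha -> 1
print(kl_divergence(p, q), renyi_divergence(p, q, 1.0001))
```

The limit alpha → 1 mirrors the way the isoelastic (power) utility family degenerates to the logarithmic utility, which is why both entropy families arise from the same u-entropy construction.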
Entropy, 2010
Operations Research, 2008
Information measures arise in many disciplines, including forecasting (where scoring rules are used to provide incentives for probability estimation), signal processing (where information gain is measured in physical units of relative entropy), decision analysis (where new information can lead to improved decisions), and finance (where investors optimize portfolios based on their private information and risk preferences). In this paper, we generalize the two most commonly used parametric families of scoring rules and demonstrate their relation to well-known generalized entropies and utility functions, shedding new light on the characteristics of alternative scoring rules as well as duality relationships between utility maximization and entropy minimization. In particular, we show that weighted forms of the pseudospherical and power scoring rules correspond exactly to measures of relative entropy (divergence) with convenient properties, and they also correspond exactly to the solutions of expected utility maximization problems in which a risk-averse decision maker whose utility function belongs to the linear-risk-tolerance family interacts with a risk-neutral betting opponent or a complete market for contingent claims in either a one-period or a two-period setting. When the market is incomplete, the corresponding problems of maximizing linear-risk-tolerance utility with the risk-tolerance coefficient are the duals of the problems of minimizing the pseudospherical or power divergence of order between the decision maker's subjective probability distribution and the set of risk-neutral distributions that support asset prices.
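As a concrete anchor for the two families generalized in the paper, the order-2 members are the familiar spherical and quadratic (Brier-type) scoring rules. The sketch below uses the standard unweighted forms and a numerical propriety check; it is an illustration, not the paper's weighted generalization:

```python
import numpy as np

def spherical_score(p, i):
    """Spherical rule (pseudospherical family, beta = 2): p_i / ||p||_2."""
    p = np.asarray(p, float)
    return p[i] / np.sqrt(np.sum(p**2))

def quadratic_score(p, i):
    """Quadratic / Brier-type rule (power family, beta = 2): 2*p_i - sum_j p_j^2."""
    p = np.asarray(p, float)
    return 2.0 * p[i] - np.sum(p**2)

def expected_score(rule, report, truth):
    """Expected score of a reported distribution under the true distribution."""
    return sum(truth[i] * rule(report, i) for i in range(len(truth)))

# Both rules are (strictly) proper: honest reporting maximizes expected score
truth = np.array([0.6, 0.3, 0.1])
honest = expected_score(quadratic_score, truth, truth)
shaded = expected_score(quadratic_score, np.array([0.5, 0.4, 0.1]), truth)
print(honest >= shaded)  # True
```

Propriety is exactly the incentive property the paper exploits: the expected-score gap between honest and distorted reports is, for these families, a generalized divergence between the two distributions.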
AIP Conference Proceedings, 2004
Recent literature from the last Maximum Entropy workshop introduced an analogy between cumulative probability distributions and normalized utility functions [1]. Based on this analogy, a utility density function can be defined as the derivative of a normalized utility function. A utility density function is non-negative and integrates to unity. These two properties of a utility density function form the basis of a correspondence between utility and probability, which allows the application of many tools from one domain to the other. For example, Laplace's principle of insufficient reason translates to a principle of insufficient preference. The notion of uninformative priors translates to uninformative utility functions about a decision maker's preferences. A natural application of this analogy is a maximum entropy principle for assigning utility values. Maximum entropy utility interprets many of the common utility functions in terms of the preference information needed for their assignment, and helps assign utility values based on partial preference information. This paper reviews maximum entropy utility, provides axiomatic justification for its use, and introduces further results that stem from the duality between probability and utility, such as joint utility density functions, utility inference, and the notion of mutual preference.
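Under this analogy, a normalized utility U on [0, 1] with U(0) = 0 and U(1) = 1 has a density u(x) = U'(x) that is non-negative and integrates to one, and maximizing entropy with no further preference constraints yields the uniform density, i.e. a linear (risk-neutral) utility. A discretized numerical illustration of that claim (grid size and the exponential comparison utility are arbitrary choices):

```python
import numpy as np

def utility_density(U, xs):
    """Utility density as the numerical derivative of a normalized utility."""
    return np.gradient(U(xs), xs)

def entropy(density, xs):
    """Differential entropy of a density on a uniform grid (illustrative)."""
    d = np.clip(density, 1e-12, None)
    return -np.sum(d * np.log(d)) * (xs[1] - xs[0])

xs = np.linspace(0.0, 1.0, 1001)
linear = lambda x: x                                           # uniform density
exponential = lambda x: (1 - np.exp(-3 * x)) / (1 - np.exp(-3))  # risk averse

# The linear utility's uniform density attains the highest entropy on [0, 1]
print(entropy(utility_density(linear, xs), xs) >
      entropy(utility_density(exponential, xs), xs))  # True
```

Adding partial preference information (e.g. a known certainty equivalent) as a constraint would tilt the maximum entropy density away from uniform, which is how the approach recovers the common parametric utility families.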
Entropy, 2018
Entropy is one of many important mathematical tools for measuring uncertain/fuzzy information. As a subclass of neutrosophic sets (NSs), simplified NSs (including single-valued and interval-valued NSs) can describe incomplete, indeterminate, and inconsistent information. Based on the concept of fuzzy exponential entropy for fuzzy sets, this work proposes exponential entropy measures of simplified NSs (named simplified neutrosophic exponential entropy (SNEE) measures), including single-valued and interval-valued neutrosophic exponential entropy measures, and investigates their properties. Then, the proposed exponential entropy measures of simplified NSs are compared with existing related entropy measures of interval-valued NSs through a numerical example to illustrate the rationality and effectiveness of the proposed SNEE measures. Finally, the developed exponential entropy measures for simplified NSs are applied to a multi-attribute decision-making example in an interval-valued NS setting to demonstrate their application. The SNEE measures thus not only enrich the theory of simplified neutrosophic entropy, but also provide a novel way of measuring uncertain information in a simplified NS setting.
The plithogenic set is an extension of the crisp set, fuzzy set, intuitionistic fuzzy set, and neutrosophic set, whose elements are characterized by one or more attributes, and each attribute can assume many values. Each attribute value has a corresponding degree of appurtenance of the element to the set with respect to the given criteria. To obtain better accuracy and a more exact exclusion (partial order), a contradiction or dissimilarity degree is defined between each attribute value and the dominant attribute value. In this paper, entropy measures for plithogenic sets are introduced. The requirements for any function to be an entropy measure of plithogenic sets are outlined in an axiomatic definition of plithogenic entropy, built on the axiomatic requirements of neutrosophic entropy. Several new formulae for the entropy measure of plithogenic sets are also introduced. The newly introduced entropy measures are then applied to a multi-attribute decision making problem related to the selection of locations.
2011
Forest planning is a major issue for the future development of regions, and landscape changes largely reflect land-use/cover patterns driven by economic and option values. Managing to reduce variability may carry high costs in terms of stability or resilience. Decision theory relies on maximizing an average or expected value of a preference pattern expressed by utility functions. In this paper I discuss quantitative indices linked to expected utility concepts that may provide diagnostic tools and become generators of the relative extension of the different habitats that compose an ecomosaic. The indices combine characteristic values and context values into contributive values, defined in a normalized measure space. Scenarios of composition for forest planning in the region of Nisa, Portugal, are discussed and benchmarked against standard measures: mean economic value, related to the recovery costs of forest habitats, and landscape diversity. Situation theory and relevance theory are the axiomatic baselines of the abductive process and heuristic procedures developed here.
2010
Theoretical Economics Letters, 2015
We present and discuss a conceptual decision-making procedure supported by a mathematical device combining expected utility and a generalized information measure: the weighted Gini-Simpson index, linked to the scientific fields of information theory and ecological diversity analysis. After a synthetic review of the theoretical background on those themes, such a device, an EU-WGS framework denoting a real function with positive utility values and domain in the simplex of probabilities, is analytically studied, identifying its range with focus on the maximum point, using a Lagrange multiplier method associated with algorithms, exemplified numerically. This EU-WGS device is shown to be a proper analog of the expected utility and weighted entropy (EU-WE) framework recently published, both being mathematical tools that can be referred to as non-expected utility methods using decision weights, framed within the field of decision theory linked to information theory. This kind of decision modeling procedure can also be interpreted as anchored in Kurt Lewin's concept of utility and may be used to generate scenarios of optimal compositional mixtures applied to generic lotteries associated with prospect theory, financial risk assessment, security quantification and natural resources management. The epistemological method followed in the reasoned choice procedure presented in this paper is neither normative nor descriptive in an empirical sense; rather, it is heuristic and hermeneutical in its conception.
Theory and Decision, 2012
The expected utility maximization problem is one of the most useful tools in mathematical finance, decision analysis and economics. Motivated by statistical model selection, via the principle of expected utility maximization, Friedman and Sandow (J Mach Learn Res 4:257–291, 2003a) considered the model performance question from the point of view of an investor who evaluates models based on the performance of ...
Physica A-statistical Mechanics and Its Applications, 2008
The maximum entropy principle can be used to assign utility values when only partial information is available about the decision maker's preferences. In order to obtain such utility values it is necessary to establish an analogy between probability and utility through the notion of a utility density function. According to some authors, the maximum entropy utility solution embeds a large family of utility functions. In this paper we explore the maximum entropy principle to estimate the utility function of a risk averse decision maker.
Entropy weight method (EWM) is a commonly used weighting method that measures value dispersion in decision-making. The greater the degree of dispersion, the greater the degree of differentiation, and the more information can be derived; accordingly, a higher weight should be given to the index, and vice versa. This study shows that the rationality of the EWM in decision-making is questionable. One example is water-source site selection, with data generated by Monte Carlo simulation. First, too many zero values make the standardization step of the EWM prone to distortion, which in turn leads to an immense index weight despite a low actual degree of differentiation. Second, in multi-index decision-making involving classification, the classification degree can accurately reflect the information content of an index, yet the EWM considers only the numerical discrimination degree of the index and ignores rank discrimination. These two shortcomings indicate that the EWM cannot correctly reflect the importance of index weights, thus resulting in distorted decision-making results.
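The standard EWM computation that this critique targets can be sketched as follows, using min-max normalization and natural-log entropy; the data matrix is invented to reproduce the many-zeros distortion described above:

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method: rows are alternatives, columns are indices."""
    X = np.asarray(X, float)
    # Min-max normalization per index (assumes each column has some spread)
    Z = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
    P = Z / Z.sum(axis=0)                        # proportion of each alternative
    n = X.shape[0]
    logP = np.where(P > 0, np.log(np.clip(P, 1e-12, None)), 0.0)
    E = -(P * logP).sum(axis=0) / np.log(n)      # normalized entropy in [0, 1]
    d = 1.0 - E                                  # degree of differentiation
    return d / d.sum()                           # index weights

# Many zeros in the first index: low real differentiation, yet EWM
# assigns it most of the weight, echoing the critique above
X = np.array([[0.0, 5.0],
              [0.0, 6.0],
              [0.0, 7.0],
              [9.0, 8.0]])
w = entropy_weights(X)
print(w)
```

After normalization the first column becomes [0, 0, 0, 1], whose entropy is zero, so its differentiation degree is maximal even though three of the four alternatives are indistinguishable on it.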
Communications in Statistics - Theory and Methods, 2019
Cross entropy is an important index for determining the divergence between two sets or distributions. Most existing cross entropy measures are proposed in a fuzzy environment and are undefined in some uncertain situations (e.g., Dempster-Shafer theory). This study proposes an extended cross entropy measure of belief values based on belief degrees derived from the available evidence. A new representation of belief functions, the belief set, is thus introduced, and a cross entropy measure between two belief sets is then defined. Furthermore, an application of the cross-entropy measure to multi-criteria decision making (MCDM) with belief-valued information is provided.
Entropy, 2007
We review a decision theoretic, i.e., utility-based, motivation for entropy and Kullback-Leibler relative entropy, the natural generalizations that follow, and various properties of these generalized quantities. We then consider these generalized quantities in an easily interpreted special case. We show that the resulting quantities share many of the properties of entropy and relative entropy, such as the data processing inequality and the second law of thermodynamics. We formulate an important statistical learning problem, probability estimation, in terms of a generalized relative entropy. The solution of this problem reflects general risk preferences via the utility function; moreover, the solution is optimal in a sense of robust absolute performance.
Operations Research, 2006
This paper presents a method to assign utility values when only partial information is available about the decision maker’s preferences. We introduce the notion of a utility density function and a maximum entropy principle for utility assignment. The maximum entropy utility solution embeds a large family of utility functions that includes the most commonly used functional forms. We discuss the implications of maximum entropy utility on the preference behavior of the decision maker and present an application to competitive bidding situations where only previous decisions are observed by each party. We also present minimum cross entropy utility, which incorporates additional knowledge about the shape of the utility function into the maximum entropy formulation, and work through several examples to illustrate the approach.
2003
We introduce an axiomatic approach to the problem of inferring a complete and transitive weak ordering representing the agent's preferences given a set of observed constraints. The axioms characterize a unique inference rule, which amounts to the constrained maximization of a certain formula we derive. The formula can be interpreted as the entropy of the agent's preference ordering, and its unique maximand identifies the simplest rationalization of the observed behavior.
Systems, 2020
The uncertainty, or entropy, of an atom of an ideal gas being in a certain energy state mirrors the way people perceive uncertainty in the making of decisions, uncertainty that is related to unmeasurable subjective probability. It is well established that subjects evaluate risk decisions involving uncertain choices using subjective probability rather than objective, which is usually calculated using empirically derived decision weights, such as those described in Prospect Theory; however, an exact objective-subjective probability relationship can be derived from statistical mechanics and information theory using Kullback-Leibler entropy divergence. The resulting Entropy Decision Risk Model (EDRM) is based upon proximity or nearness to a state and is predictive rather than descriptive. A priori EDRM, without factors or corrections, accurately aligns with the results of prior decision making under uncertainty (DMUU) studies, including Prospect Theory and others. This research is a first step towards the broader effort of quantifying financial, programmatic, and safety risk decisions in fungible terms, which applies proximity (i.e., subjective probability) with power utility to evaluate choice preference of gains, losses, and mixtures of the two in terms of a new parameter referred to as Prospect. To facilitate evaluation of the EDRM against prior studies reported in terms of the percentage of subjects selecting a choice, the Percentage Evaluation Model (PEM) is introduced to convert choice value results into subject response percentages, thereby permitting direct comparison of a utility model for the first time.
The Journal of Defense Modeling and Simulation: Applications, Methodology, Technology, 2019
This article compares the entropy weight scheme to other subjective weighting schemes using various multi-attribute decision making criteria. We apply the entropy weighting scheme to improve the CARVER center of gravity analysis and targeting analysis currently used by Special Operations Forces. We also compare the entropy weighting scheme to other weighting schemes using the ranking of terrorists for targeting. Next, we apply several multi-attribute decision making (MADM) methods using our suggested weighting schemes to obtain the rankings of the alternatives. We compare the results and provide sensitivity analysis to examine the robustness of each MADM analysis. We conclude that, for CARVER and terrorist ranking, a decision methodology that computes weights from the actual data collected may be better than one using subjective weights.