Journal of Multivariate Analysis, 2005
An extension of the Pearson–Kolmogorov–Rényi correlation ratio to L_p-spaces, p > 1, is constructed. It is proved that the correlation does not exceed 2^(2/p−1), and that it can be used as a measure of dependence of a random variable on a sigma-field.
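For orientation, the classical object being extended is the p = 2 correlation ratio of a random variable with respect to a sigma-field; in standard notation (ours, not necessarily the paper's):

```latex
% Classical (p = 2) correlation ratio of X with respect to a sigma-field F;
% the paper extends this construction to L_p, p > 1.
\eta^2(X \mid \mathcal{F})
  = \frac{\operatorname{Var}\bigl(\mathbb{E}[X \mid \mathcal{F}]\bigr)}{\operatorname{Var}(X)},
\qquad 0 \le \eta^2(X \mid \mathcal{F}) \le 1 .
```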
IEEE Transactions on Information Theory, 2019
In this paper, we introduce a new measure of correlation for bipartite quantum states. This measure depends on a parameter α and is defined in terms of vector-valued L_p-norms. The measure is within a constant of the exponential of the α-Rényi mutual information, and reduces to the trace norm (total variation distance) for α = 1. We prove some decoupling-type theorems in terms of this measure of correlation, and present some applications in privacy amplification as well as in bounding random coding exponents. In particular, we establish a bound on the secrecy exponent of the wiretap channel (under the total variation metric) in terms of the α-Rényi mutual information, according to Csiszár's proposal.
Fuzzy Sets and Systems, 2005
In 2004 Fullér and Majlender introduced the notion of covariance between fuzzy numbers by their joint possibility distribution to measure the degree to which they interact. Based on this approach, in this paper we will present the concept of possibilistic correlation representing an average degree of interaction between marginal distributions of a joint possibility distribution as compared to their respective dispersions. Moreover, we will formulate the classical Cauchy-Schwarz inequality in this possibilistic environment and show that the measure of possibilistic correlation satisfies the same property as its probabilistic counterpart. In particular, applying the idea of transforming level sets of possibility distributions into uniform probability distributions, we will point out a fundamental relationship between our proposed possibilistic approach and the classical probabilistic approach to measuring correlation. * The final version of this paper appeared in: C. Carlsson, R. Fullér and P. Majlender, On possibilistic correlation, Fuzzy Sets and Systems, 155(2005) 425-445.
Dependence in the world of uncertainty is a complex concept, and textbooks tend to avoid discussing it. Nevertheless, dependence exists and can be measured. We use the concept of dependence proposed about 50 years ago by the famous Bulgarian mathematician N. Obreshkov, discuss the ways it can be interpreted, establish some additional interesting properties, and point out areas of application. We then apply it to some examples to illustrate how suitable this approach is for studying local dependence between nonnumeric and numeric random variables.
Mathematical Notes, 2006
We study the class of endomorphisms of the cone of correlation functions generated by probability measures. We consider algebraic properties of the corresponding products and of the maps K and K⁻¹, which establish relationships between the properties of functions on the configuration space and the properties of the corresponding operators (matrices with Boolean indices).
Metrika, 2009
Two random variables X and Y are mutually completely dependent (m.c.d.) if there is a measurable bijection f with P(Y = f(X)) = 1. For continuous X and Y, a natural approach to constructing a measure of dependence is via the distance between the copula of X and Y and the independence copula. We show that this approach depends crucially on the choice of the distance function. For example, the L_p-distances suggested by Schweizer and Wolff cannot generate a measure of (mutual complete) dependence, since every copula is the uniform limit of copulas linking m.c.d. variables. Instead, we propose to use a modified Sobolev norm, with respect to which mutual complete dependence cannot approximate any other kind of dependence. This Sobolev norm yields the first nonparametric measure of dependence capturing precisely the two extremes of dependence, i.e., it equals 0 if and only if X and Y are independent, and 1 if and only if X and Y are m.c.d.
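As a numerical aside, the Schweizer–Wolff L1-distance between the empirical copula and the independence copula, the kind of distance the abstract argues cannot capture mutual complete dependence, can be sketched as follows (a rough rank-grid approximation for tie-free samples; the function name and details are ours, not the paper's, and this is not the proposed Sobolev norm):

```python
from math import sqrt  # not used below, kept for downstream experimentation


def empirical_copula_distance(xs, ys):
    """Schweizer-Wolff-style sigma: 12 times the L1 distance between the
    empirical copula of (xs, ys) and the independence copula Pi(u, v) = u*v,
    approximated on the n x n rank grid. Assumes no ties in xs or ys."""
    n = len(xs)
    rx = {v: r for r, v in enumerate(sorted(xs), 1)}  # ranks of the x-sample
    ry = {v: r for r, v in enumerate(sorted(ys), 1)}  # ranks of the y-sample
    total = 0.0
    for i in range(1, n + 1):
        for j in range(1, n + 1):
            # empirical copula at the grid point (i/n, j/n)
            C = sum(1 for a, b in zip(xs, ys) if rx[a] <= i and ry[b] <= j) / n
            total += abs(C - (i / n) * (j / n))
    return 12 * total / (n * n)  # roughly in [0, 1]
```

For perfectly monotone data such as `ys = xs` the value is close to 1; the paper's point is that, conversely, some sequences of m.c.d. copulas drive such L_p-type distances toward 0, so they cannot characterize m.c.d.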
2016
Measuring the strength, or degree, of statistical dependence between two random variables is a common problem in many domains. Pearson's correlation coefficient ρ is an accurate measure of linear dependence. We show that ρ is a normalized, Euclidean-type distance between the joint probability distribution of the two random variables and the distribution obtained by assuming independence while keeping the same marginals. The normalizing constant is the geometric mean of two maximal distances, each taken between the joint distribution under full linear dependence (preserving the respective marginals) and the distribution under independence. Its use is restricted to linear dependence because it is based on Euclidean-type distances, which are generally not metrics, and because the only full dependence considered is linear. We therefore argue that if a suitable distance metric is used, and all possible maximal dependences are considered, then any nonlinear dependence can be measured; one must then, however, define all the full dependences. Using the Hellinger distance, which is a metric, as the distance between probability distributions yields a generalization of ρ for the discrete case.
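The distance-based reading of ρ can be illustrated numerically. The sketch below (our own toy joint pmf, not the paper's construction) computes ρ directly from the gap between a joint pmf and its independent counterpart, together with the Hellinger distance between the two distributions, the metric the abstract suggests as a basis for generalization:

```python
from math import sqrt

# Hypothetical 3x3 joint pmf for X, Y taking values 0, 1, 2 (illustrative only).
P = [[0.20, 0.05, 0.05],
     [0.05, 0.20, 0.05],
     [0.05, 0.05, 0.30]]
vals = [0, 1, 2]

px = [sum(row) for row in P]                               # marginal of X
py = [sum(P[i][j] for i in range(3)) for j in range(3)]    # marginal of Y
Q = [[px[i] * py[j] for j in range(3)] for i in range(3)]  # independent joint pmf

# Pearson's rho from the joint pmf: cov = sum_{x,y} xy (P - Q) = E[XY] - E[X]E[Y].
Ex = sum(v * p for v, p in zip(vals, px))
Ey = sum(v * p for v, p in zip(vals, py))
cov = sum(vals[i] * vals[j] * (P[i][j] - Q[i][j])
          for i in range(3) for j in range(3))
sx = sqrt(sum(v * v * p for v, p in zip(vals, px)) - Ex ** 2)
sy = sqrt(sum(v * v * p for v, p in zip(vals, py)) - Ey ** 2)
rho = cov / (sx * sy)

# Hellinger distance between the joint pmf and its independent counterpart:
# a genuine metric on distributions, zero if and only if X and Y are independent.
hellinger = sqrt(0.5 * sum((sqrt(P[i][j]) - sqrt(Q[i][j])) ** 2
                           for i in range(3) for j in range(3)))
```

Here ρ is driven entirely by the signed deviations P − Q, which is the sense in which it is a (weighted, Euclidean-type) distance from independence; the Hellinger distance measures the same gap but as a proper metric.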
Based on recent progress in research on copula-based dependence measures, we review Rényi's original axioms on symmetric measures and propose a new set of axioms that applies to nonsymmetric measures. We show that nonsymmetric measures can actually better characterize the relationship between a pair of random variables, including both independence and complete dependence. The new measures also satisfy the Data Processing Inequality (DPI) under the ∗-product on copulas, which leads to nice features including the invariance of the dependence measure under bijective transformations of one of the random variables. The issues with symmetric measures are also clarified.
Mathematica Slovaca, 2008
Various authors have studied extensions of Shannon's entropy, but their inferential properties and applications in the applied sciences have not received proper attention from researchers. In the present paper we explore the motivation for, and implications of, using various classes of generalized entropies and conditional entropies. We evaluate the β-class and (α, β)-class entropies for the multivariate normal density function. We also obtain measures of dependence in terms of these classes of generalized entropies.
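The β-class and (α, β)-class entropies themselves are not reproduced here. As a minimal sketch of the general idea, the standard Rényi α-entropy (one well-known generalization of Shannon's entropy) and a Shannon-entropy-based dependence measure, mutual information I(X;Y) = H(X) + H(Y) − H(X,Y), can be computed as follows (toy 2×2 pmf is ours, for illustration):

```python
from math import log2


def renyi_entropy(p, alpha):
    """Standard Rényi entropy H_alpha(p) = log2(sum_i p_i^alpha) / (1 - alpha),
    for alpha > 0, alpha != 1; tends to Shannon entropy as alpha -> 1."""
    assert alpha > 0 and alpha != 1
    return log2(sum(pi ** alpha for pi in p if pi > 0)) / (1 - alpha)


def shannon_entropy(p):
    """Shannon entropy in bits, with the convention 0 * log 0 = 0."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)


# Hypothetical 2x2 joint pmf; here it equals the product of its marginals,
# so the mutual information below is exactly zero.
P = [[0.25, 0.25],
     [0.25, 0.25]]
px = [sum(row) for row in P]
py = [sum(P[i][j] for i in range(2)) for j in range(2)]
flat = [P[i][j] for i in range(2) for j in range(2)]

# Entropy-based measure of dependence: zero iff X and Y are independent.
I = shannon_entropy(px) + shannon_entropy(py) - shannon_entropy(flat)
```

Generalized-entropy dependence measures follow the same pattern, with a β-class or (α, β)-class entropy in place of the Shannon entropies.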