In this paper, we propose a duality theory for semi-infinite linear programming problems under uncertainty in the constraint functions, the objective function, or both, within the framework of robust optimization. We present robust duality by establishing strong duality between the robust counterpart of an uncertain semi-infinite linear program and the optimistic counterpart of its uncertain Lagrangian dual. We show that robust duality holds whenever a robust moment cone is closed and convex. We then show that, in the case of constraint-wise uncertainty, the closed-convex robust moment cone condition is in fact necessary and sufficient for robust duality, in the sense that the robust moment cone is closed and convex if and only if robust duality holds for every linear objective function of the program. In the case of uncertain problems with affinely parameterized data uncertainty, we establish that robust duality is easily satisfied under a Slater-type constraint qualification. As a consequence, we derive robust forms of the Farkas lemma for systems of uncertain semi-infinite linear inequalities.
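The robust counterpart / optimistic dual pair described in this abstract can be written schematically as follows. This is a sketch in standard robust-optimization notation, not taken verbatim from the paper; the symbols U_t (uncertainty sets), λ (finitely supported multipliers), and the inequality direction are assumptions.

```latex
% Uncertain semi-infinite LP: the data (a_t, b_t) range over uncertainty sets U_t.
\begin{aligned}
% Robust counterpart (worst-case primal):
(\mathrm{RC}) \quad & \min_{x \in \mathbb{R}^n} \; c^{\top}x
  \quad \text{s.t.} \quad a_t^{\top}x \ge b_t,
  \;\; \forall (a_t, b_t) \in \mathcal{U}_t,\; \forall t \in T, \\[4pt]
% Optimistic counterpart of the Lagrangian dual: the dual also chooses
% the most favorable realization of the uncertain data.
(\mathrm{ODP}) \quad & \sup_{\lambda \in \mathbb{R}^{(T)}_{+},\; (a_t, b_t) \in \mathcal{U}_t}
  \; \inf_{x \in \mathbb{R}^n} \Big\{ c^{\top}x
  + \sum_{t \in T} \lambda_t \big( b_t - a_t^{\top}x \big) \Big\}.
\end{aligned}
```

Robust duality in the sense of the abstract is then the statement min(RC) = max(ODP), with the dual maximum attained.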
In this paper we examine multi-objective linear programming problems in the face of data uncertainty both in the objective function and in the constraints. First, we derive a formula for the radius of robust feasibility guaranteeing constraint feasibility for all possible uncertainties within a specified uncertainty set under affine data parametrization. We then present a complete characterization of robust weakly efficient solutions that are immunized against rank-one objective matrix data uncertainty. We also provide classes of commonly used constraint data uncertainty sets under which it can be numerically checked whether or not a robust feasible solution of an uncertain multi-objective linear program is a robust weakly efficient solution.
European Journal of Operational Research, Feb 1, 2022
European Journal of Operational Research, Oct 1, 2018
Highlights
• The paper deals with uncertain convex multiobjective problems.
• We consider ball uncertainty affecting all data.
• We define a radius of highly robust weak efficiency certifying its existence.
• We provide bounds, and an exact formula, for this radius.
• We provide simple formulas for convex quadratic and linear multi-objective programs.
• These formulas are applied to two variants of a test problem due to Ben-Tal and Nemirovski.
The radius of robust feasibility of a convex program with uncertain constraints gives a value for the maximal 'size' of an uncertainty set under which robust feasibility can be guaranteed. This paper provides an upper bound for the radius for convex programs with uncertain convex polynomial constraints and exact formulas for convex programs with SOS-convex polynomial constraints (or convex quadratic constraints) under affine data uncertainty. These exact formulas allow the radius to be computed by commonly available software.
Mathematical Methods of Operations Research, Nov 10, 2005
This paper deals with the stability of the intersection of a given set X ⊂ R^n with the solution set, F ⊂ R^n, of a given linear system whose coefficients can be arbitrarily perturbed. In the optimization context, the fixed constraint set X can be the solution set of the (possibly nonlinear) system formed by all the exact constraints (e.g., the sign constraints), a discrete subset of R^n (such as Z^n or {0,1}^n, as happens in integer or Boolean programming), or the intersection of both kinds of sets. Conditions are given for the intersection F ∩ X to remain nonempty (or empty) under sufficiently small perturbations of the data.
Journal of Optimization Theory and Applications, Mar 25, 2021
The radius of robust feasibility provides a numerical value for the largest possible uncertainty set that guarantees robust feasibility of an uncertain linear conic program. This determines when the robust feasible set is nonempty; otherwise the robust counterpart of an uncertain program is not well-defined as a robust optimization problem. In this paper, we address a fundamental question of robust optimization: how can the radius of robust feasibility of uncertain linear conic programs, including linear programs, be computed? We first provide computable lower and upper bounds for the radius of robust feasibility for general uncertain linear conic programs under the commonly used ball uncertainty set. We then provide important classes of linear conic programs where the bounds are calculated by finding the optimal values of related semidefinite linear programs, among them uncertain semidefinite programs, uncertain second-order cone programs, and uncertain support vector machine problems. In the case of an uncertain linear program, the exact formula allows us to calculate the radius by finding the optimal value of an associated second-order cone program.
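The idea of the radius of robust feasibility can be illustrated on a toy one-constraint instance. This is my own illustration, not the paper's conic reformulation: for a single uncertain linear constraint a·x ≤ b with the data pair (a, b) ranging over a Euclidean ball of radius r around nominal values (ā, b̄), the worst-case constraint is ā·x + r·sqrt(x² + 1) ≤ b̄, and the radius of robust feasibility is the largest r for which some x satisfies it. The sketch below finds that radius numerically by bisection on r, with an inner ternary search on the convex worst-case slack.

```python
import math

def worst_case_slack(x, a_bar, b_bar, r):
    # sup over perturbations (da, db) with ||(da, db)|| <= r of (a_bar+da)*x - (b_bar+db)
    # equals a_bar*x + r*sqrt(x^2 + 1) - b_bar; robust feasible at x iff this is <= 0.
    return a_bar * x + r * math.hypot(x, 1.0) - b_bar

def robust_feasible(r, a_bar, b_bar):
    if r < abs(a_bar):
        return True  # the worst-case slack is unbounded below, so some x is feasible
    lo, hi = -1e6, 1e6
    for _ in range(300):  # ternary search: the slack is convex in x
        m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
        if worst_case_slack(m1, a_bar, b_bar, r) < worst_case_slack(m2, a_bar, b_bar, r):
            hi = m2
        else:
            lo = m1
    return worst_case_slack((lo + hi) / 2, a_bar, b_bar, r) <= 1e-9

def radius_of_robust_feasibility(a_bar, b_bar, r_max=10.0):
    # bisection on r: feasible radii form an interval [0, r*]
    lo, hi = 0.0, r_max
    for _ in range(100):
        mid = (lo + hi) / 2
        if robust_feasible(mid, a_bar, b_bar):
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2
```

For ā = b̄ = 1 the computed radius agrees with the closed-form value sqrt(ā² + b̄²) = sqrt(2) that one obtains by minimizing the worst-case slack analytically; real instances of the paper's classes would instead be handled by the associated SOCP/SDP solvers.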
This paper provides characterizations of the weak solutions of optimization problems where a given vector function F, from a decision space X to an objective space Y, is "minimized" on the set of elements x ∈ C (where C ⊂ X is a given nonempty constraint set) satisfying G(x) ≦_S 0_Z, where G is another given vector function from X to a constraint space Z with positive cone S. The three spaces X, Y, and Z are locally convex Hausdorff topological vector spaces, with Y and Z partially ordered by two convex cones K and S, respectively, and enlarged with a greatest and a smallest element. In order to obtain suitable versions of the Farkas lemma yielding optimality conditions expressed in terms of the data, the triplet (F, G, C), we use non-asymptotic representations of the K-epigraph of the conjugate function of F + I_A, where I_A denotes the indicator function of the feasible set A, that is, the function associating the zero vector of Y with any element of A and the greatest element of Y with any element of X \ A.
Springer Undergraduate Texts in Mathematics and Technology, 2019
This textbook on nonlinear optimization focuses on model building, real-world problems, and applications of optimization models to the natural and social sciences. Organized into two parts, this book may be used as a primary text for courses on convex optimization and non-convex optimization. Definitions, proofs, and numerical methods are well illustrated, and all chapters contain compelling exercises. The exercises emphasize fundamental theoretical results on optimality and duality theorems, numerical methods with or without constraints, and derivative-free optimization. Selected solutions are given. Applications of theoretical results and numerical methods are highlighted to help students comprehend the methods and techniques.
Springer Undergraduate Texts in Mathematics and Technology, 2019
Unconstrained Optimization. This chapter studies a collection of optimization problems without functional constraints, that is, problems of the form P: Min f(x) s.t. x ∈ C.
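A minimal sketch of the simplest such problem, with C = R and f smooth and convex, solved by plain gradient descent. This is a generic illustration of the problem class P: Min f(x) s.t. x ∈ C, not code from the textbook; the function and step size are my own choices.

```python
def minimize_1d(grad, x0, step=0.1, iters=500):
    """Plain gradient descent: x_{k+1} = x_k - step * grad(x_k)."""
    x = x0
    for _ in range(iters):
        x = x - step * grad(x)
    return x

# Example: f(x) = (x - 3)^2, so grad f(x) = 2(x - 3); the minimizer is x* = 3.
x_star = minimize_1d(lambda x: 2.0 * (x - 3.0), x0=0.0)
```

With step = 0.1 the iteration is a contraction (x ← 0.8x + 0.6), so it converges geometrically to the unique minimizer.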
We introduce a robust optimization model consisting of a family of perturbation functions giving rise to certain pairs of dual optimization problems in which the dual variable depends on the uncertainty parameter. The interest of our approach is illustrated by some examples, including uncertain conic optimization and infinite optimization via discretization. The main results characterize desirable robust duality relations (such as a robust zero duality gap) by formulas involving the epsilon-minima or the epsilon-subdifferentials of the objective function. The two extreme cases, namely the usual perturbational duality (without uncertainty) and the duality for the supremum of functions (vanishing duality parameter), are analyzed in detail.
In this paper we associate with an infinite family of extended real-valued functions defined on a locally convex space a sum, called the robust sum, which is always well-defined. We also associate with that family of functions a dual pair of problems formed by the unconstrained minimization of its robust sum and the so-called optimistic dual. For such a dual pair, we characterize weak duality, zero duality gap, and strong duality, and their corresponding stable versions, in terms of multifunctions associated with the given family of functions and a given approximation parameter ε ≥ 0 that is related to the ε-subdifferential of the robust sum of the family. We also consider the particular case when all functions of the family are convex, an assumption that allows us to characterize the duality properties in terms of closedness conditions.
Journal of Optimization Theory and Applications, 2018
This paper provides new Farkas-type results characterizing the inclusion A ⊂ B, where A and B are subsets of a locally convex space X. The sets A and B are described here by means of vector functions from X to other locally convex spaces Y (equipped with the partial ordering associated with a given convex cone K ⊂ Y) and Z, respectively. These new Farkas lemmas are obtained via the complete characterization of the K-epigraphs of certain conjugate mappings, which constitute the core of our approach. In contrast with a previous paper by three of the authors [J. Optim. Theory Appl. 173, 357-390 (2017)], the characterizations of A ⊂ B are expressed here in terms of the data.
The paper deals with optimization problems with uncertain constraints and linear perturbations of the objective function, which are associated with given families of perturbation functions whose dual variable depends on the uncertainty parameters. In more detail, the paper provides characterizations of stable strong robust duality and stable robust duality under convexity and closedness assumptions. The paper also revisits the classical Fenchel duality of the sum of two functions by considering a suitable family of perturbation functions.
Journal of Optimization Theory and Applications, 2017
The main purpose of this paper is to provide characterizations of the inclusion of the solution set of a given conic system, posed in a real locally convex topological space, into a variety of subsets of the same space defined by means of vector-valued functions. These Farkas-type results are used to derive characterizations of the weak solutions of vector optimization problems (including multiobjective and scalar ones), vector variational inequalities, and vector equilibrium problems.
Given an arbitrary set T in Euclidean space whose elements are called sites, and a particular site s, the Voronoi cell of s, denoted by V_T(s), consists of all points closer to s than to any other site. The Voronoi mapping of s associates with each set T containing s the Voronoi cell V_T(s) of s w.r.t. T. These Voronoi cells are solution sets of linear inequality systems, so they are closed convex sets. In this paper we study the Voronoi inverse problem, which consists in computing, for a given closed convex set F containing s, the family of sets T containing s whose Voronoi cell at s is F. In more detail, the paper analyzes relationships between the elements of this family and the linear representations of F, provides explicit formulas for maximal and minimal elements of the family, and studies the closure operator that assigns, to each closed set T containing s, the largest element of the family associated with F = V_T(s).
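The fact that Voronoi cells are solution sets of linear inequality systems follows from expanding ||p - s||² ≤ ||p - t||², which gives the halfspace 2(t - s)·p ≤ ||t||² - ||s||² for each site t ≠ s. A small self-contained sketch (my own illustration, for finite site sets in the plane) builds this linear representation and uses it to test cell membership:

```python
def voronoi_halfspaces(sites, s):
    """Linear representation of V_T(s): one halfspace a.p <= b per site t != s,
    obtained from ||p - s||^2 <= ||p - t||^2, i.e. 2(t - s).p <= |t|^2 - |s|^2."""
    halfspaces = []
    for t in sites:
        if t == s:
            continue
        a = tuple(2.0 * (ti - si) for ti, si in zip(t, s))
        b = sum(ti * ti for ti in t) - sum(si * si for si in s)
        halfspaces.append((a, b))
    return halfspaces

def in_cell(p, halfspaces, tol=1e-12):
    """Check whether point p satisfies the whole linear inequality system."""
    return all(sum(ai * pi for ai, pi in zip(a, p)) <= b + tol
               for a, b in halfspaces)

# Sites (0,0), (2,0), (0,2): the cell of (0,0) is {p : p_x <= 1, p_y <= 1}.
hs = voronoi_halfspaces([(0.0, 0.0), (2.0, 0.0), (0.0, 2.0)], (0.0, 0.0))
```

This is exactly the "linear representation of F" viewpoint of the abstract, restricted to finite T; for infinite T one obtains a linear semi-infinite system.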
Revista de la Real Academia de Ciencias Exactas, Físicas y Naturales. Serie A. Matemáticas, 2014
Given a convex optimization problem (P) in a locally convex topological vector space X with an arbitrary number of constraints, we consider three possible dual problems of (P): the usual Lagrangian dual (D), the perturbational dual (Q), and the surrogate dual, the last one recently introduced in a previous paper of the authors [M.A. Goberna, M.A. López, M. Volle, Primal attainment in convex infinite optimization duality, J. Convex Analysis 21, No. 4, 2014]. As shown by simple examples, these dual problems may all be different. This paper provides conditions ensuring dual equality and the existence of dual optimal solutions, that is, inf(P) = max(D), inf(P) = max(Q), and the analogous relation for the surrogate dual, in terms of the so-called closedness regarding a set. Sufficient conditions guaranteeing min(P) = sup(Q) (dual equality and existence of primal optimal solutions) are also provided, for the nominal problems and also for their perturbational relatives. The particular cases of convex semi-infinite optimization problems (in which either the number of constraints or the dimension of X, but not both, is finite) and linear infinite optimization problems are analyzed. Finally, some applications to the feasibility of convex inequality systems are described.
Papers by Miguel Goberna