1999, Proceedings 11th International Conference on Tools with Artificial Intelligence
Filtering techniques are essential to efficiently look for a solution in a constraint network (CN). However, for a long time it has been considered that, to efficiently reduce the search space, the best choice is the limited local consistency achieved by forward checking. More recent works show that maintaining arc consistency (which is a more pruningful local consistency) during search outperforms forward checking on hard and large constraint networks. In this paper, we show that maintaining a local consistency stronger than arc consistency during search can be advantageous. According to a comparison of the local consistencies more pruningful than arc consistency that can be used on large CNs, Max-restricted path consistency (Max-RPC) is one of the most promising local consistencies. We propose a new local consistency, called Max-RPCEn, that is stronger than Max-RPC and that has almost the same cpu time requirements.
Lecture Notes in Computer Science, 2003
A binary constraint network consists of a set of n variables, defined on domains of size at most d, and a set of e binary constraints. The binary constraint satisfaction problem consists in finding a solution for a binary constraint network, that is, an instantiation of all the variables which satisfies all the constraints. A value a in the domain of variable x is inconsistent if there is no solution which assigns a to x. Many filtering techniques have been proposed to filter out inconsistent values from the domains. Most of them are based on enforcing a given kind of local consistency. One of the most important such consistencies is max-restricted path consistency. The fastest algorithm to enforce max-restricted path consistency has a O(end^3) time complexity and a O(end) space complexity. In this paper we present two improved algorithms for the same problem. The first still has a O(end^3) time complexity, but it reduces the space usage to O(ed). The second improves the time complexity to O(end^2.575), and has a O(end^2) space complexity.
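The complexities quoted above count elementary operations on a network with n variables, domains of size at most d, and e binary constraints. A minimal sketch of such a network and of the basic support-check primitive, with illustrative names not taken from the paper:

```python
# Minimal binary constraint network: domains are sets of values, and each
# constraint is stored as a set of allowed (value, value) pairs between two
# variables. All identifiers here are illustrative, not from the paper.

class BinaryCN:
    def __init__(self, domains, constraints):
        self.domains = domains          # {var: set(values)}
        self.constraints = constraints  # {(x, y): set of allowed (a, b) pairs}

    def check(self, x, a, y, b):
        """Elementary constraint check: is the pair (x=a, y=b) allowed?"""
        if (x, y) in self.constraints:
            return (a, b) in self.constraints[(x, y)]
        if (y, x) in self.constraints:
            return (b, a) in self.constraints[(y, x)]
        return True                     # no constraint between x and y

    def has_support(self, x, a, y):
        """Does value a of x have at least one support in the domain of y?"""
        return any(self.check(x, a, y, b) for b in self.domains[y])
```

Bounds such as O(end^3) arise from counting how many times a check of this kind may be executed.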
Principles of Knowledge Representation and Reasoning, 1994
Constraint networks are a simple representation and reasoning framework with diverse applications. In this paper, we present a new property called constraint tightness that can be used for characterizing the difficulty of problems formulated as constraint networks. Specifically, we show that when the constraints are tight they may require less preprocessing in order to guarantee a backtrack-free solution. This suggests, for example, that many instances of crossword puzzles are relatively easy while scheduling problems involving resource constraints are quite hard. Formally, we present a relationship between the tightness or restrictiveness of the constraints, and the level of local consistency sufficient to ensure global consistency, thus ensuring backtrack-freeness. Two definitions of local consistency are employed. The traditional variable-based notion leads to a condition involving the tightness of the constraints, the level of local consistency, and the arity of the constraints, while a new definition of relational consistency leads to a condition expressed in terms of tightness and local-consistency level alone. New algorithms for enforcing relational consistency are introduced and analyzed.
International Journal of Applied Mathematics and Computer Science, 2011
Constraint programming is a powerful software technology for solving many real-life problems. Many of these problems can be modeled as constraint satisfaction problems (CSPs) and can be solved using constraint programming techniques. However, solving a CSP is NP-complete, so filtering techniques to reduce the search space are still necessary. Arc-consistency algorithms are widely used to prune the search space. The concept of arc-consistency is bidirectional, that is, it must be ensured in both directions of the constraint (direct constraint and inverse constraint). Two of the most well-known and frequently used arc-consistency algorithms for filtering CSPs are AC3 and AC4. These algorithms repeatedly carry out revisions and require support checks for identifying and deleting all unsupported values from the domains. Nevertheless, many revisions are ineffective, that is, they cannot delete any value while still consuming many checks and much time.
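As a rough illustration of the revisions and support checks discussed above, here is a compact AC-3 style sketch over set-valued domains; the function and variable names are mine, and the constraint is abstracted as a check predicate:

```python
from collections import deque

def revise(domains, check, x, y):
    """Remove values of x that have no support in y; return True if pruned."""
    removed = {a for a in domains[x]
               if not any(check(x, a, y, b) for b in domains[y])}
    domains[x] -= removed
    return bool(removed)

def ac3(domains, neighbors, check):
    """AC-3: re-revise an arc (x, y) whenever the domain of y may have shrunk."""
    queue = deque((x, y) for x in domains for y in neighbors[x])
    while queue:
        x, y = queue.popleft()
        if revise(domains, check, x, y):
            if not domains[x]:
                return False            # domain wipe-out: inconsistent
            queue.extend((z, x) for z in neighbors[x] if z != y)
    return True
```

AC4, by contrast, builds support counters and support lists in an initialization pass, which makes its later revisions cheaper at the price of a heavier setup.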
2011
Adviser: Berthe Y. Choueiry. Freuder and Elfe [1996] introduced Neighborhood Inverse Consistency (NIC) as a local consistency property defined on the values in the variables' domains of a Constraint Satisfaction Problem (CSP). Debruyne and Bessière [2001] showed that enforcing NIC on binary CSPs is ineffective on sparse graphs and too costly on dense graphs. In this thesis, we propose Relational Neighborhood Inverse Consistency (RNIC), an extension of NIC defined as a local consistency property on the tuples of the relations of a CSP. We characterize RNIC for both binary and non-binary CSPs, and propose an algorithm for enforcing it whose complexity is bounded by the degree of the dual graph on which the algorithm is applied. We propose to reduce the computational cost of our algorithm by reformulating the dual graph of the CSP. We present two reformulation techniques and their combinations, and discuss their effects on the consistency property enforced by the algorithm. We also describe a selection policy for choosing an appropriate reformulation technique, tying together the various components of our approach, which we show outperforms, in a statistically significant manner, other common approaches for solving benchmark problems. Finally, we study the effect of the structure of the dual graph on the ordering of the propagation queue of our algorithm when applied as a preprocessing step to backtrack search and also as a lookahead strategy during search. We conclude, empirically, that the most effective ordering is the one that follows the tree decomposition of the dual graph.
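Since both the enforcement algorithm and the reformulation techniques above operate on the dual graph of the CSP, a minimal sketch of how such a dual graph can be built may help fix ideas; the identifiers and the toy example are mine, not from the thesis:

```python
from itertools import combinations

def dual_graph(scopes):
    """Build the dual graph of a (possibly non-binary) CSP.

    scopes: {constraint_name: set of variables in the constraint's scope}
    Returns an adjacency map {constraint_name: set of neighboring constraints};
    two constraints are neighbors when their scopes share at least one variable."""
    adjacency = {c: set() for c in scopes}
    for c1, c2 in combinations(scopes, 2):
        if scopes[c1] & scopes[c2]:
            adjacency[c1].add(c2)
            adjacency[c2].add(c1)
    return adjacency

# Toy example: three constraints chained by shared variables
print(dual_graph({'c1': {'x', 'y', 'z'}, 'c2': {'z', 'w'}, 'c3': {'w', 'v'}}))
# -> {'c1': {'c2'}, 'c2': {'c1', 'c3'}, 'c3': {'c2'}}
```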
Filtering techniques are essential to efficiently look for a solution in a constraint network (CN). They remove some local inconsistencies and so reduce the search space. However, a given local consistency has to be not too expensive if we want to use it to efficiently prune the search tree during search. Hence, for a long time it has been considered that the best choice is the limited local consistency achieved by forward checking [14, 16]. However, more recent works [17, 4, 15] show that maintaining arc consistency (which is a more pruningful local consistency) during search outperforms forward checking on hard and large constraint networks. It is very likely that maintaining an even more pruningful local consistency may pay off on very hard problems. A comparison of the local consistencies more pruningful than arc consistency that can be used on large CNs has been done in [8]. The conclusion of this work is that Max-restricted path consistency (Max-RPC, [7]) is one of ...
We address the problem of extracting Minimal Unsatisfiable Cores (MUCs) from constraint networks. This computationally hard problem has a practical interest in many application domains such as configuration, planning, diagnosis, etc. Indeed, identifying one or several disjoint MUCs can help circumscribe different sources of inconsistency in order to repair a system. In this paper, we propose an original approach that involves performing successive runs of a complete backtracking search, using constraint weighting, in order to surround an inconsistent part of a network, before identifying all transition constraints belonging to a MUC using a dichotomic process. We show the effectiveness of this approach, both theoretically and experimentally.
1998
Constraint satisfaction is one of the major areas in AI that has important real-life applications. Lee et al. propose E-GENET, a stochastic solver for general constraint solving based on iterative repair. Performance figures show that E-GENET compares favorably against tree-search-based solvers on many hard problems. On the other hand, global constraints have been shown to be very effective in modeling complicated CSPs. They have also substantially improved the efficiency of tree-search-based solvers in solving real-life problems.
1997
There is no need to show the importance of the filtering techniques to solve constraint satisfaction problems, i.e. to find values for problem variables subject to constraints that specify which combinations of values are consistent. They can be used during a preprocessing step to remove once and for all some local inconsistencies, or during the search to efficiently prune the search tree. Recently, in [5], a comparison of the most practicable filtering techniques concludes that restricted path consistency (RPC) is a promising local consistency that requires little additional cpu time compared to arc consistency while removing most of the path inverse inconsistent values. However, the RPC algorithm used for this comparison (presented in [1] and called RPC1 in the following) has a non-optimal worst-case time complexity and bad average time and space complexities. Therefore, we propose RPC2, a new RPC algorithm with O(end^2) worst-case time complexity and requiring less space than RPC1 in practice. The second aim of this paper is to extend RPC to new local consistencies, k-RPC and Max-RPC, and to compare their pruning efficiency with the other practicable local consistencies. Furthermore, we propose and study a Max-RPC algorithm based on AC-6 that we used for this comparison.
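As a point of reference for the consistencies discussed above, the Max-RPC condition can be tested naively as follows: a value a of x must have, on every constraint involving x, a support b such that every common neighbor z holds a value compatible with both a and b. This brute-force sketch is only meant to make the definition concrete (the RPC2 and AC-6-based algorithms in the paper avoid this enumeration); cn and its members are assumed helpers, not the paper's data structures:

```python
def max_rpc_supported(cn, x, a):
    """Naive Max-RPC test for value a of variable x on a binary network cn.

    Assumed interface: cn.domains[x] is a set of values, cn.neighbors[x] is the
    set of variables constrained with x, and cn.check(x, a, y, b) is the
    elementary constraint check (True when no constraint exists)."""
    for y in cn.neighbors[x]:
        # look for a support b of a on c(x, y) that is path-consistent with a:
        # every common neighbor z must contain a value compatible with a and b
        if not any(cn.check(x, a, y, b) and
                   all(any(cn.check(x, a, z, c) and cn.check(y, b, z, c)
                           for c in cn.domains[z])
                       for z in cn.neighbors[x] & cn.neighbors[y])
                   for b in cn.domains[y]):
            return False
    return True
```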
2012
Arc-consistency algorithms are the most commonly used filtering techniques to prune the search space in Constraint Satisfaction Problems (CSPs). 2-consistency is a similar technique that guarantees that any instantiation of a value to a variable can be consistently extended to any second variable. Thus, 2-consistency can be stronger than arc-consistency in binary CSPs. In this work we present a new algorithm to achieve 2-consistency called 2-C4. This algorithm is a reformulation of the AC4 algorithm that is able to reduce unnecessary checking and prune more search space than AC4. The experimental results show that 2-C4 was able to prune more search space than arc-consistency algorithms in non-normalized instances. Furthermore, 2-C4 was more efficient than other 2-consistency algorithms presented in the literature.
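The difference with arc-consistency on non-normalized instances can be made concrete: when several constraints are posted between the same pair of variables, 2-consistency asks for a single value of the second variable satisfying all of them at once, whereas arc-consistency checks each constraint in isolation. A hedged sketch of such a revision step (illustrative only, not the 2-C4 algorithm; all names are mine):

```python
def revise_2c(domains, constraints_between, x, y):
    """2-consistency revision on a possibly non-normalized binary CSP.

    constraints_between(x, y) returns the list of all constraints posted
    between x and y, each as a predicate on (a, b). A value a of x survives
    only if some single b in domains[y] satisfies every one of them at once,
    which is why this can prune more than per-constraint arc-consistency."""
    cs = constraints_between(x, y)
    removed = {a for a in domains[x]
               if not any(all(c(a, b) for c in cs) for b in domains[y])}
    domains[x] -= removed
    return bool(removed)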
1997
A constraint satisfaction problem (CSP) involves a set of variables, a domain of potential values for each variable, and a set of constraints, which specify the acceptable combinations of values. One popular approach is to represent the original problem as a constraint network where nodes represent variables and arcs represent constraints between variables. Node consistency and arc consistency techniques are first applied to prune the domains of variables.
Trends in Applied Intelligent Systems, 2010
Arc-consistency algorithms are widely used to prune the search space of Constraint Satisfaction Problems (CSPs). One of the most well-known arc-consistency algorithms for filtering CSPs is AC3. This algorithm repeatedly carries out revisions and requires support checks for identifying and deleting all unsupported values from the domains. Nevertheless, many revisions are ineffective, that is, they cannot delete any value while requiring many checks and a lot of time. We present AC3-OP, an optimized and reformulated version of AC3 that reduces the number of constraint checks while pruning the same search space on CSPs with arithmetic constraints. On inequality constraints, AC3-OP checks the binary constraints in both directions (full arc-consistency), but it only propagates new constraints in one direction. Thus, it avoids checking redundant constraints that do not filter any value of the variable's domain. The evaluation section shows the improvement of AC3-OP over AC3 on random instances.
2009
Dual Consistency (DC) was introduced by Lecoutre, Cardon and Vion in [10, 11]. DC is a novel way of handling Path Consistency (PC), with a simpler definition, and new efficient algorithms and approximations. Interestingly, the new definition may be extended to non-binary constraint networks (CNs). DC is thus a way to generalize PC to any CN, while keeping the initial non-binary constraints of the CN, and their associated propagators, untouched. DC can also be seen as a simple and efficient way to generate implicit binary constraints automatically. This article presents the implications of this generalization in terms of complexity. Preliminary experimental results show the potential effectiveness of dynamic implicit constraint generation, as well as identifying its weaknesses. Prospective ideas of approximations, whose purpose is to handle these weaknesses in practice, are then proposed. Consistencies are properties of Constraint Networks (CN) that can be used to identify and prun...
Constraints, 2015
Table constraints are important in constraint programming as they are present in many real problems from areas such as configuration and databases. As a result, numerous specialized algorithms that achieve generalized arc consistency (GAC) on table constraints have been proposed. Since these algorithms achieve GAC, they operate on one constraint at a time. In this paper we propose new filtering algorithms for positive table constraints that achieve stronger local consistency properties than GAC by exploiting intersections between constraints. The first algorithm, called maxRPWC+, is a domain filtering algorithm that is based on the local consistency maxRPWC and extends the GAC algorithm of Lecoutre and Szymanek [23]. The second algorithm extends the state-of-the-art STR-based algorithms to stronger relation filtering consistencies, i.e., consistencies that can remove tuples from constraints' relations. Experimental results from benchmark problems demonstrate that the proposed algorithms are quite competitive with standard GAC algorithms like STR2 in some classes of problems with intersecting table constraints, being orders of magnitude faster in some cases. Simple Tabular Reduction (STR) and its refinements maintain the support tables dynamically by removing invalid tuples from them during search. A recent approach holds information about removed values in the propagation queue and utilizes it to speed up support search. Given that GAC is a property defined on individual constraints, algorithms for GAC operate on one constraint at a time, trying to filter infeasible values from the variables of the constraint. A different line of research has investigated stronger consistencies and algorithms to enforce them. Some of them are domain filtering, meaning that they only prune values from the domains of variables, whereas a few other ones are higher-order (or relation filtering), meaning that inconsistent tuples of values (nogoods of size 2 or more) can be identified. In contrast to GAC algorithms, the proposed algorithms that enforce these stronger consistencies are able to consider several constraints simultaneously. For example, pairwise consistency (PWC) is a relation filtering consistency that considers intersections between pairs of constraints. Recently there has been renewed interest in strong domain or relation filtering local consistencies, as new ones have been proposed and/or efficient algorithms for existing ones have been devised. One of the most promising such consistencies is Max Restricted Pairwise Consistency (maxRPWC), which is a local consistency that is based on PWC but can only make value deletions. In practice, strong consistencies are mainly applicable on constraints that are extensionally defined, since intensionally defined constraints usually have specific semantics and are provided with efficient specialized filtering algorithms. However, a significant shortcoming of existing works on maxRPWC and other strong local consistencies is that the proposed algorithms for them are generic. That is, like earlier GAC algorithms such as GAC2001/3.1 [6], they are designed to operate on both intensional and extensional constraints, failing to recognize that strong consistencies are predominantly applicable on extensional constraints and should thus focus on such constraints.
Despite the wealth of research on strong consistencies, they have not been widely adopted by CP solvers. State-of-the-art solvers such as Gecode, Abscon, Choco, Minion, etc. predominantly apply GAC, and lesser forms of consistency such as bounds consistency, when propagating constraints. Regarding table constraints, CP solvers typically offer one or more of the above-mentioned GAC methods for propagation. In this paper we propose filtering algorithms that achieve stronger consistency properties than GAC and have been specifically designed for table constraints. We contribute to both directions of domain and relation filtering methods by extending existing GAC algorithms for table constraints. The proposed methods are a step towards the efficient handling of intersecting table constraints and also provide specializations of strong local consistencies to extensional constraints that can be useful in practice. The first algorithm, called maxRPWC+, extends the GAC algorithm for table constraints of Lecoutre and Szymanek [23] and specializes the generic maxRPWC algorithm maxRPWC1. The proposed domain filtering algorithm incorporates several techniques that help alleviate redundancies (i.e., redundant constraint checks and other operations on data structures) displayed by existing maxRPWC algorithms. We also describe a variant of maxRPWC+ which is more efficient when applied during search due to the lighter use of data structures.
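For readers unfamiliar with STR, the basic tabular-reduction step that the relation filtering extension builds on can be sketched as follows (a generic one-pass illustration, not the authors' algorithm; all identifiers are mine):

```python
def str_pass(scope, table, domains):
    """One Simple Tabular Reduction pass over a positive table constraint.

    scope:   list of variables in the constraint's scope
    table:   list of allowed tuples (same order as scope); shrunk in place
    domains: {var: set(values)}; unsupported values are pruned in place
    Returns False on a domain wipe-out, True otherwise."""
    supported = {x: set() for x in scope}
    valid = []
    for t in table:
        if all(v in domains[x] for x, v in zip(scope, t)):
            valid.append(t)                 # keep only tuples still valid
            for x, v in zip(scope, t):
                supported[x].add(v)         # record the GAC supports they give
    table[:] = valid
    for x in scope:
        domains[x] &= supported[x]          # prune values no valid tuple supports
        if not domains[x]:
            return False
    return True
```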
2012
Local consistency properties and algorithms for enforcing them are central to the success of Constraint Processing. In this paper, we explore how exploiting the structure of the problem affects the performance of an algorithm for enforcing consistency. We propose various strategies for managing the propagation queue of an algorithm for enforcing consistency, and empirically compare their effectiveness for solving CSPs with backtrack search and full lookahead. We focus our investigations on consistency algorithms that operate on the dual graph of a CSP and demonstrate the importance of exploiting a tree decomposition of the dual graph. Further, we note that the benefit of exploiting structure is particularly striking on unsatisfiable instances. We conjecture that our queue-management strategies benefit virtually all other propagation algorithms.
Constraints, 2011
Max Restricted Path Consistency (maxRPC) is a local consistency for binary constraints that enforces a higher order of consistency than arc consistency. Despite the strong pruning that can be achieved, maxRPC is rarely used because existing maxRPC algorithms suffer from overheads and redundancies as they can repeatedly perform many constraint checks without triggering any value deletions. In this paper we propose and evaluate techniques that can boost the performance of maxRPC algorithms by eliminating many of these overheads and redundancies. These include the combined use of two data structures to avoid many redundant constraint checks, and the exploitation of residues to quickly verify the existence of supports. Based on these, we propose a number of closely related maxRPC algorithms. The first one, maxRPC3, has optimal O(end^3) time complexity, displays good performance when used stand-alone, but is expensive to apply during search. The second one, maxRPC3rm, has O(en^2 d^4) time complexity, but a restricted version with O(end^4) complexity can be very efficient when used during search. The other algorithms are simple modifications of maxRPC3rm. All algorithms have O(ed) space complexity when used stand-alone. However, maxRPC3 has O(end) space complexity. This paper is an extended version of [1] that appeared in the proceedings of CP-2010.
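The residues mentioned above are easy to picture: for each (value, constraint) pair, remember the last support found and re-check it before searching the domain again. A minimal sketch of the idea, with hypothetical helper names (not the maxRPC3 data structures):

```python
def find_support_with_residue(residues, domains, check, x, a, y):
    """Return a support b of (x, a) on c(x, y), trying the cached residue first.

    residues: dict keyed by (x, a, y) holding the last support found; a
    still-valid residue lets us skip the search of domains[y] entirely.
    check(x, a, y, b) is the elementary constraint check."""
    b = residues.get((x, a, y))
    if b is not None and b in domains[y] and check(x, a, y, b):
        return b                               # residue still valid, no search
    for b in domains[y]:
        if check(x, a, y, b):
            residues[(x, a, y)] = b            # refresh the residue
            return b
    return None
```

A residue needs no invariant beyond having once been a support, which is what keeps the bookkeeping light compared with last-support schemes.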
Journal of the ACM, 1997
Constraint networks are a simple representation and reasoning framework with diverse applications. In this paper, we identify two new complementary properties on the restrictiveness of the constraints in a network, constraint tightness and constraint looseness, and we show their usefulness for estimating the level of local consistency needed to ensure global consistency, and for estimating the level of local consistency present in a network. In particular, we present a sufficient condition, based on constraint tightness and the level of local consistency, that guarantees that a solution can be found in a backtrack-free manner. The condition can be useful in applications where a knowledge base will be queried over and over and the preprocessing costs can be amortized over many queries. We also present a sufficient condition for local consistency, based on constraint looseness, that is straightforward and inexpensive to determine. The condition can be used to estimate the level of local consistency of a network. This in turn can be used in deciding whether it would be useful to preprocess the network before a backtracking search, and in deciding which local consistency conditions, if any, still need to be enforced if we want to ensure that a solution can be found in a backtrack-free manner. Two definitions of local consistency are employed in characterizing the conditions: the traditional variable-based notion and a recently introduced definition of local consistency called relational consistency.
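As a rough numerical handle on these two properties in the binary case, one can count how many supports each value has in a relation: tightness concerns values with few supports, looseness values with many. The sketch below is purely illustrative and does not reproduce the paper's formal m-tightness and m-looseness definitions:

```python
def support_counts(relation, dom_x, dom_y):
    """Number of supports in dom_y for each value of dom_x, for a binary
    relation given explicitly as a set of allowed (a, b) pairs."""
    return {a: sum((a, b) in relation for b in dom_y) for a in dom_x}

def max_min_supports(relation, dom_x, dom_y):
    """Maximum and minimum number of supports per value: the raw quantities
    behind tightness/looseness style conditions (illustrative only)."""
    counts = support_counts(relation, dom_x, dom_y)
    return max(counts.values()), min(counts.values())
```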
2009
Powerful consistency techniques, such as AC* and FDAC*, have been developed for Weighted Constraint Satisfaction Problems (WCSPs) to reduce the search space, but they are restricted to unary and binary constraints. On the other hand, van Hoeve et al. developed efficient graph-based algorithms for handling soft constraints as classical constraint optimization problems.
2017 IEEE 29th International Conference on Tools with Artificial Intelligence (ICTAI), 2017
Partial singleton (weak) path consistency, or partial ♦-consistency, for a qualitative constraint network, ensures that the process of instantiating any constraint of that network with any of its base relations b and enforcing partial (weak) path consistency, or partial ⋄-consistency, in the updated network, yields a partially ⋄-consistent subnetwork where the respective constraint is still defined by b. This local consistency is essential for helping to decide the satisfiability of challenging qualitative constraint networks and has been shown to play a crucial role in tackling more demanding problems associated with a given qualitative constraint network, such as the problem of minimal labeling. One of the main downsides to using partial ♦-consistency is that it is computationally expensive to enforce in a given qualitative constraint network, as, despite being a local consistency in principle, it retains a global scope of the network at hand. In this paper, we propose a lazy alg...
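To make the underlying ⋄-consistency (algebraic closure) operation concrete, here is a toy closure pass over a qualitative network in the point algebra {<, =, >}; this only illustrates the basic refinement step, not the partial or lazy variants studied in the paper, and all identifiers are mine:

```python
from itertools import product

# Composition table of the point algebra base relations
COMP = {
    ('<', '<'): {'<'},           ('<', '='): {'<'},  ('<', '>'): {'<', '=', '>'},
    ('=', '<'): {'<'},           ('=', '='): {'='},  ('=', '>'): {'>'},
    ('>', '<'): {'<', '=', '>'}, ('>', '='): {'>'},  ('>', '>'): {'>'},
}

def compose(r1, r2):
    """Weak composition of two relations given as sets of base relations."""
    return set().union(*(COMP[(b1, b2)] for b1 in r1 for b2 in r2))

def algebraic_closure(n, C):
    """Naive ⋄-consistency: refine C[i][j] by C[i][k] composed with C[k][j].

    C is an n x n matrix of relations (sets of base relations), assumed
    complete, with C[i][i] = {'='} and converse entries filled in."""
    changed = True
    while changed:
        changed = False
        for i, k, j in product(range(n), repeat=3):
            refined = C[i][j] & compose(C[i][k], C[k][j])
            if refined != C[i][j]:
                C[i][j] = refined
                if not refined:
                    return False        # empty relation: network inconsistent
                changed = True
    return True
```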
1997
Filtering techniques are essential in order to efficiently solve constraint satisfaction problems (CSPs). A blind search often leads to a combinatorial explosion, the algorithm repeatedly finding the same local inconsistencies. Maintaining a local consistency can strongly reduce the search effort, especially on hard and large problems. A good illustration is the good time performance, on such problems, of maintaining arc consistency during search compared to forward checking, which maintains a lower level of local consistency. On the one hand, arc consistency is the most used filtering technique because it cheaply removes some values that cannot belong to any solution. On the other hand, other k-consistencies have important space and time requirements because they can change the set of constraints; they can only be used on very small CSPs. Thus, in this paper, we study and compare the filtering techniques that are more pruningful than arc consistency while leaving unchanged the set of co...
2005
The GAC-Scheme has become a popular general purpose algorithm for solving n-ary constraints, although it may scan an exponential number of supporting tuples. In this paper, we develop a major improvement of this scheme. When searching for a support, our new algorithm is able to skip over a number of tuples exponential in the arity of the constraint by exploiting knowledge about the current domains of the variables. We demonstrate the effectiveness of the method for large table constraints.
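The core idea, skipping whole blocks of supporting tuples that the current domains rule out, can be illustrated on a lexicographically sorted table: once a tuple is invalid at some position, every tuple sharing that invalid prefix can be jumped over in one step. This is a hedged sketch of the principle under the assumption of integer values, not the paper's extension of the GAC-Scheme itself; identifiers are mine:

```python
from bisect import bisect_right

def seek_valid_tuple(table, domains):
    """Find a tuple of a sorted table that is valid w.r.t. the current domains,
    jumping over whole blocks of tuples that share an invalid prefix.

    table:   list of integer tuples, sorted lexicographically
    domains: list of sets; domains[i] is the current domain of the i-th
             variable in the constraint's scope"""
    idx = 0
    while idx < len(table):
        t = table[idx]
        # first position whose value is no longer in the corresponding domain
        bad = next((i for i, v in enumerate(t) if v not in domains[i]), None)
        if bad is None:
            return t                               # valid tuple found
        # every tuple starting with t[:bad+1] is invalid too: skip the block
        # (float('inf') works as an upper sentinel because values are integers)
        upper = t[:bad + 1] + (float('inf'),) * (len(t) - bad - 1)
        idx = bisect_right(table, upper, idx)
    return None
```

Because the table is sorted, each jump is a binary search, so runs of tuples sharing an invalid prefix are skipped without being enumerated one by one.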