PhD Theses
http://cp2013.a4cp.org/theses?combine=&sort_by=field_graduated_value&sort_order=DESC
Analysis, Synthesis and Application of Automaton-Based Constraint Descriptions
http://cp2013.a4cp.org/node/1182
<section class="field field-name-field-author field-type-text field-label-inline clearfix view-mode-teaser"><h2 class="field-label">Author: </h2><div class="field-items"><div class="field-item even">Maria Andreina Francisco Rodriguez</div></div></section><section class="field field-name-field-school field-type-text field-label-above view-mode-teaser"><h2 class="field-label">School: </h2><div class="field-items"><div class="field-item even">Uppsala University</div></div></section><section class="field field-name-field-supervisors field-type-text field-label-above view-mode-teaser"><h2 class="field-label">Supervisors: </h2><div class="field-items"><div class="field-item even">Justin Pearson</div><div class="field-item odd">Pierre Flener </div></div></section><section class="field field-name-body field-type-text-with-summary field-label-above view-mode-teaser"><h2 class="field-label">Abstract: </h2><div class="field-items"><div class="field-item even"><p>Constraint programming (CP) is a technology in which a combinatorial problem is modelled as a conjunction of constraints on variables ranging over given initial domains, and optionally an objective function on the variables. Such a model is given to a general-purpose solver performing systematic search to find constraint-satisfying domain values for the variables, giving an optimal value to the objective function. A constraint predicate (also known as a global constraint) does two things: from the modelling perspective, it allows a modeller to express a commonly occurring combinatorial substructure, for example that a set of variables must take distinct values; from the solving perspective, it comes with a propagation algorithm, called a propagator, which removes some but not necessarily all impossible values from the current domains of its variables when invoked during search.</p>
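<p>To make the propagation idea above concrete, here is a minimal, hypothetical sketch (not from the thesis) of a propagator for the "distinct values" predicate used as the example: whenever a variable's domain shrinks to a single value, that value is removed from every other domain, repeating until a fixed point is reached.</p>

```python
# Illustrative propagator for an "all variables take distinct values"
# predicate. It removes some (not necessarily all) impossible values.
def propagate_alldifferent(domains):
    """domains: dict mapping variable name -> set of candidate values.
    Returns False on a domain wipe-out (no solution possible)."""
    changed = True
    while changed:
        changed = False
        for var, dom in domains.items():
            if len(dom) == 1:
                (val,) = dom  # var is fixed; prune val elsewhere
                for other, odom in domains.items():
                    if other != var and val in odom:
                        odom.discard(val)
                        if not odom:
                            return False  # wipe-out: inconsistent
                        changed = True
    return True

doms = {"x": {1}, "y": {1, 2}, "z": {1, 2, 3}}
ok = propagate_alldifferent(doms)
# fixing x to 1 forces y to 2, which in turn forces z to 3
```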
<p>Although modern CP solvers have many constraint predicates, often a predicate one would like to use is not available. In the past, the choices were either to reformulate the model or to write one's own propagator. In this dissertation, we contribute to the automatic design of propagators for new predicates.</p>
<p>Integer time series are often subject to constraints on the aggregation of the features of all maximal occurrences of some pattern. For example, the minimum width of the peaks may be constrained. Automata allow many constraint predicates for variable sequences, and in particular many time-series predicates, to be described in a high-level way. Our first contribution is an algorithm for generating an automaton-based predicate description from a pattern, a feature, and an aggregator.</p>
<p>It has previously been shown how to decompose an automaton-described constraint on a variable sequence into a conjunction of constraints whose predicates have existing propagators. This conjunction provides the propagation, but it is unknown how to propagate it efficiently. Our second contribution is a tool for deriving, in an off-line process, implied constraints for automaton-induced constraint decompositions to improve propagation. Further, when a constraint predicate functionally determines a result variable that is unchanged under reversal of a variable sequence, we provide as our third contribution an algorithm for deriving an implied constraint between the result variables for a variable sequence, a prefix thereof, and the corresponding suffix.</p>
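<p>As a hedged illustration of the third contribution (our reading of the abstract, not the thesis's derivation algorithm): take max as a functionally determined result that is unchanged under reversal of the sequence; the result for the whole sequence is then linked to the results for any prefix and the corresponding suffix.</p>

```python
# For every split point k, the result over the whole sequence equals the
# combination of the two partial results -- the shape of implied
# constraint relating a sequence, a prefix, and the matching suffix.
seq = [3, 1, 4, 1, 5]
for k in range(1, len(seq)):
    prefix, suffix = seq[:k], seq[k:]
    assert max(seq) == max(max(prefix), max(suffix))
```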
</div></div></section><section class="field field-name-field-graduated field-type-datetime field-label-inline clearfix view-mode-teaser"><h2 class="field-label">Graduated: </h2><div class="field-items"><div class="field-item even"><span class="date-display-single">Friday, December 15, 2017</span></div></div></section><section class="field field-name-field-link-to-full-text field-type-link-field field-label-inline clearfix view-mode-teaser"><h2 class="field-label">Link to full text: </h2><div class="field-items"><div class="field-item even"><a href="http://urn.kb.se/resolve?urn=urn%3Anbn%3Ase%3Auu%3Adiva-332149">DiVA</a></div></div></section>
Wed, 20 Dec 2017 14:06:16 +0000
Making the Most of Structure in Constraint Models
http://cp2013.a4cp.org/node/1185
<section class="field field-name-field-author field-type-text field-label-inline clearfix view-mode-teaser"><h2 class="field-label">Author: </h2><div class="field-items"><div class="field-item even">Kevin Leo</div></div></section><section class="field field-name-field-school field-type-text field-label-above view-mode-teaser"><h2 class="field-label">School: </h2><div class="field-items"><div class="field-item even">Monash University</div></div></section><section class="field field-name-field-supervisors field-type-text field-label-above view-mode-teaser"><h2 class="field-label">Supervisors: </h2><div class="field-items"><div class="field-item even">Guido Tack</div><div class="field-item odd">Maria Garcia de la Banda</div></div></section><section class="field field-name-body field-type-text-with-summary field-label-above view-mode-teaser"><h2 class="field-label">Abstract: </h2><div class="field-items"><div class="field-item even"><p>Combinatorial problems are those where a set of decisions that satisfy a given set of constraints must be selected from a set of possible combinations. Solving these problems is critical in a wide range of areas, such as telecommunications, circuit design and transportation. It is also remarkably hard, which is why many solving techniques exist, each with their own strengths and weaknesses. High-level modelling languages allow combinatorial problems to be specified as high-level models that can then be compiled into different low-level programs, each suitable for a solving technique. While the high-level modelling and solver-independence characteristics of these languages are very useful for modellers, they make it difficult for the compiler to yield programs that can be solved efficiently. Further, changes to a model can have a significant and unexpected impact on the efficiency of the resulting programs. This makes the modelling, compilation, and solving of combinatorial problems a challenging iterative process.</p>
<p>This thesis states that information regarding the implicit and explicit structure present in a model can be very beneficial for this challenging process, as it allows users, compilers and solvers to make better decisions during modelling, compilation, and solving. The main objective of the thesis then, is to effectively take advantage of structure in constraint models to improve the modelling, compilation, and solving process. To achieve this, a new framework is proposed for exploring how structural information can be discovered, shared, and generalised between different stages of the process. In particular, the framework is used as a guide to determine whether the explicit structure of a model can be used to (a) improve the quality of compiled programs; (b) guide the search for and exposure of hidden implicit structure; and (c) help identify parts of a model that are incorrectly formulated.</p>
<p>This is achieved by the following three main contributions of this thesis. First, a method that preserves model structure deeper into the compilation process, thus enabling more efficient programs to be produced. Second, a method for finding implicit structure that can be made explicit in a model, thus making modelling easier. And third, a method that uses preserved model structure to help explain why a model is unsatisfiable, thus helping users produce correct models. These three methods have been implemented and evaluated for the MiniZinc language. The experimental results show them to be effective in practice. Together, these contributions help make high-level modelling a more powerful and attractive approach for solving hard combinatorial problems.</p>
</div></div></section><section class="field field-name-field-graduated field-type-datetime field-label-inline clearfix view-mode-teaser"><h2 class="field-label">Graduated: </h2><div class="field-items"><div class="field-item even"><span class="date-display-single">Wednesday, July 26, 2017</span></div></div></section><section class="field field-name-field-link-to-full-text field-type-link-field field-label-inline clearfix view-mode-teaser"><h2 class="field-label">Link to full text: </h2><div class="field-items"><div class="field-item even"><a href="https://figshare.com/articles/Making_the_Most_of_Structure_in_Constraint_Models/5216329">Making the Most of Structure in Constraint Models</a></div></div></section><section class="field field-name-upload field-type-file field-label-above view-mode-teaser"><h2 class="field-label">PDF of thesis: </h2><div class="field-items"><div class="field-item even"><span class="file"><img class="file-icon" alt="PDF icon" title="application/pdf" src="/modules/file/icons/application-pdf.png" /> <a href="http://cp2013.a4cp.org/sites/default/files/kevin_leo_-_making_the_most_of_structure_in_constraint_models.pdf" type="application/pdf; length=5537699">kevin_leo_-_making_the_most_of_structure_in_constraint_models.pdf</a></span></div></div></section>
Tue, 09 Jan 2018 00:14:58 +0000
Solving Hard Subgraph Problems in Parallel
http://cp2013.a4cp.org/node/1188
<section class="field field-name-field-author field-type-text field-label-inline clearfix view-mode-teaser"><h2 class="field-label">Author: </h2><div class="field-items"><div class="field-item even">Ciaran McCreesh</div></div></section><section class="field field-name-field-school field-type-text field-label-above view-mode-teaser"><h2 class="field-label">School: </h2><div class="field-items"><div class="field-item even">School of Computing Science</div></div></section><section class="field field-name-field-supervisors field-type-text field-label-above view-mode-teaser"><h2 class="field-label">Supervisors: </h2><div class="field-items"><div class="field-item even">Patrick Prosser</div></div></section><section class="field field-name-body field-type-text-with-summary field-label-above view-mode-teaser"><h2 class="field-label">Abstract: </h2><div class="field-items"><div class="field-item even"><p>This thesis improves the state of the art in exact, practical algorithms for finding subgraphs. We study maximum clique, subgraph isomorphism, and maximum common subgraph problems. These are widely applicable: within computing science, subgraph problems arise in document clustering, computer vision, the design of communication protocols, model checking, compiler code generation, malware detection, cryptography, and robotics; beyond, applications occur in biochemistry, electrical engineering, mathematics, law enforcement, fraud detection, fault diagnosis, manufacturing, and sociology. We therefore consider both the ``pure'' forms of these problems, and variants with labels and other domain-specific constraints.</p>
<p>Although subgraph-finding should theoretically be hard, the constraint-based search algorithms we discuss can easily solve real-world instances involving graphs with thousands of vertices, and millions of edges. We therefore ask: is it possible to generate ``really hard'' instances for these problems, and if so, what can we learn? By extending research into combinatorial phase transition phenomena, we develop a better understanding of branching heuristics, as well as highlighting a serious flaw in the design of graph database systems.</p>
<p>This thesis also demonstrates how to exploit two of the kinds of parallelism offered by current computer hardware. Bit parallelism allows us to carry out operations on whole sets of vertices in a single instruction---this is largely routine. Thread parallelism, to make use of the multiple cores offered by all modern processors, is more complex. We suggest three desirable performance characteristics that we would like when introducing thread parallelism: lack of risk (parallel cannot be exponentially slower than sequential), scalability (adding more processing cores cannot make runtimes worse), and reproducibility (the same instance on the same hardware will take roughly the same time every time it is run). We then detail the difficulties in guaranteeing these characteristics when using modern algorithmic techniques.</p>
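<p>A small sketch of the bit parallelism described above (illustrative only, not the thesis's implementation): encoding vertex sets as machine words turns the intersection of a candidate set with a vertex's neighbourhood into a single bitwise AND.</p>

```python
# Toy 4-vertex graph; entry i is a bitmask of the neighbours of vertex i
# (bit j set means i and j are adjacent).
adj = {0: 0b0110, 1: 0b1001, 2: 0b0001, 3: 0b0010}

candidates = 0b1111   # all four vertices are still candidates
candidates &= adj[0]  # one AND keeps only the neighbours of vertex 0
```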
<p>Besides ensuring that parallelism cannot make things worse, we also increase the likelihood of it making things better. We compare randomised work stealing to new tailored strategies, and perform experiments to identify the factors contributing to good speedups. We show that whilst load balancing is difficult, the primary factor influencing the results is the interaction between branching heuristics and parallelism. By using parallelism to explicitly offset the commitment made to weak early branching choices, we obtain parallel subgraph solvers which are substantially and consistently better than the best sequential algorithms.</p>
</div></div></section><section class="field field-name-field-graduated field-type-datetime field-label-inline clearfix view-mode-teaser"><h2 class="field-label">Graduated: </h2><div class="field-items"><div class="field-item even"><span class="date-display-single">Tuesday, July 25, 2017</span></div></div></section><section class="field field-name-field-link-to-full-text field-type-link-field field-label-inline clearfix view-mode-teaser"><h2 class="field-label">Link to full text: </h2><div class="field-items"><div class="field-item even"><a href="http://theses.gla.ac.uk/8322/"> Solving hard subgraph problems in parallel </a></div></div></section><section class="field field-name-upload field-type-file field-label-above view-mode-teaser"><h2 class="field-label">PDF of thesis: </h2><div class="field-items"><div class="field-item even"><span class="file"><img class="file-icon" alt="PDF icon" title="application/pdf" src="/modules/file/icons/application-pdf.png" /> <a href="http://cp2013.a4cp.org/sites/default/files/ciaran_mccreesh_-_solving_hard_subgraph_problems_in_parallel.pdf" type="application/pdf; length=12074454">ciaran_mccreesh_-_solving_hard_subgraph_problems_in_parallel.pdf</a></span></div></div></section>
Tue, 20 Feb 2018 18:10:46 +0000
Cost-based filtering algorithms for a Capacitated Lot Sizing Problem and the Constrained Arborescence Problem
http://cp2013.a4cp.org/node/1184
<section class="field field-name-field-author field-type-text field-label-inline clearfix view-mode-teaser"><h2 class="field-label">Author: </h2><div class="field-items"><div class="field-item even">Houndji Vinasetan Ratheil</div></div></section><section class="field field-name-field-school field-type-text field-label-above view-mode-teaser"><h2 class="field-label">School: </h2><div class="field-items"><div class="field-item even">Université catholique de Louvain (UCL, Belgium) and Université d'Abomey-Calavi (UAC, Benin)</div></div></section><section class="field field-name-field-supervisors field-type-text field-label-above view-mode-teaser"><h2 class="field-label">Supervisors: </h2><div class="field-items"><div class="field-item even">Laurence A. Wolsey</div><div class="field-item odd">Pierre Schaus</div><div class="field-item even">Mahouton Norbert Hounkonnou</div></div></section><section class="field field-name-body field-type-text-with-summary field-label-above view-mode-teaser"><h2 class="field-label">Abstract: </h2><div class="field-items"><div class="field-item even"><p>Constraint Programming (CP) is a paradigm derived from artificial intelligence, operational research, and algorithmics that can be used to solve combinatorial problems. CP solves problems by interleaving search (assign a value to an unassigned variable) and propagation. Constraint propagation aims at removing/filtering inconsistent values from the domains of the variables in order to reduce the search space of the problem. In this thesis, we develop filtering algorithms for two complex combinatorial optimization problems: a Capacitated Lot Sizing Problem (CLSP) and the Constrained Arborescence Problem (CAP). Each of these problems has many variants and practical applications.</p>
<p>The CLSP is the problem of finding an optimal production plan for single or multiple items while satisfying demands of clients and respecting resource restrictions. The CLSP finds important applications in production planning. In this thesis, we introduce a CLSP in CP. In many lot sizing and scheduling problems, in particular when the planning horizon is discrete and finite, there are stocking costs to be minimized. These costs depend on the time spent between the production of an order and its delivery. We focus on developing specialized filtering algorithms to handle the stocking cost part of a class of the CLSP. We propose the global optimization constraint StockingCost when the per-period stocking cost is the same for all orders, and its generalized version, the IDStockingCost constraint (ID stands for Item Dependent).</p>
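<p>A toy reading of the quantity such a constraint reasons about (illustrative only; the thesis's propagator is not shown): an order produced in period p and delivered at its due date d spends d - p periods in stock, each at some per-period cost.</p>

```python
# Hypothetical helper: total stocking cost of a production plan, where
# order i is produced in period p_i and is due in period d_i.
def stocking_cost(production_periods, due_dates, cost_per_period=1):
    return sum(cost_per_period * (d - p)
               for p, d in zip(production_periods, due_dates))

# three orders produced in periods 1, 2, 4 with due dates 3, 3, 5
total = stocking_cost([1, 2, 4], [3, 3, 5])  # (3-1) + (3-2) + (5-4) = 4
```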
<p>In this thesis, we also deal with a well-known problem in graph theory: the Minimum Weight Arborescence (MWA) problem. Consider a weighted directed graph in which we distinguish one vertex r as the root. An MWA rooted at r is a directed spanning tree rooted at r with minimum total weight. We focus on the CAP, which requires one to find an arborescence that satisfies some side constraints and has minimum weight. The CAP has many real-life applications in telecommunication networks, computer networks, transportation problems, scheduling problems, etc. After a sensitivity analysis of the MWA, we introduce the CAP in CP. We propose a dedicated global optimization constraint to handle any variant of the CAP in CP: the MinArborescence constraint. All the proposed filtering algorithms are analyzed theoretically and evaluated experimentally. The different experimental evaluations of these propagators against the state-of-the-art propagators show their respective efficiencies.</p>
</div></div></section><section class="field field-name-field-graduated field-type-datetime field-label-inline clearfix view-mode-teaser"><h2 class="field-label">Graduated: </h2><div class="field-items"><div class="field-item even"><span class="date-display-single">Wednesday, May 31, 2017</span></div></div></section><section class="field field-name-field-link-to-full-text field-type-link-field field-label-inline clearfix view-mode-teaser"><h2 class="field-label">Link to full text: </h2><div class="field-items"><div class="field-item even"><a href="https://dial.uclouvain.be/pr/boreal/fr/object/boreal%3A186333">DIAL: Digital Access to Librairies</a></div></div></section><section class="field field-name-field-also-published-in field-type-link-field field-label-above view-mode-teaser"><h2 class="field-label">Also published in: </h2><div class="field-items"><div class="field-item even"><a href="https://www.researchgate.net/publication/317563428_Cost-based_filtering_algorithms_for_a_Capacitated_Lot_Sizing_Problem_and_the_Constrained_Arborescence_Problem">Researchgate</a></div></div></section><section class="field field-name-upload field-type-file field-label-above view-mode-teaser"><h2 class="field-label">PDF of thesis: </h2><div class="field-items"><div class="field-item even"><span class="file"><img class="file-icon" alt="PDF icon" title="application/pdf" src="/modules/file/icons/application-pdf.png" /> <a href="http://cp2013.a4cp.org/sites/default/files/houndji_vinasetan_ratheil_-_cost-based_filtering_algorithms_for_a_capacitated_lot_sizing_problem_and_the_constrained_arborescence_problem.pdf" type="application/pdf; length=877745">houndji_vinasetan_ratheil_-_cost-based_filtering_algorithms_for_a_capacitated_lot_sizing_problem_and_the_constrained_arborescence_problem.pdf</a></span></div></div></section>
Mon, 08 Jan 2018 22:01:44 +0000
New solving methods for constrained optimization problems
http://cp2013.a4cp.org/node/1181
<section class="field field-name-field-author field-type-text field-label-inline clearfix view-mode-teaser"><h2 class="field-label">Author: </h2><div class="field-items"><div class="field-item even">Amine Lamine</div></div></section><section class="field field-name-field-school field-type-text field-label-above view-mode-teaser"><h2 class="field-label">School: </h2><div class="field-items"><div class="field-item even">National Engineering School of Sfax</div></div></section><section class="field field-name-field-supervisors field-type-text field-label-above view-mode-teaser"><h2 class="field-label">Supervisors: </h2><div class="field-items"><div class="field-item even">Prof. Dr. Brahim Hnich</div><div class="field-item odd">Prof. Dr. Habib Chabchoub</div><div class="field-item even">Dr. Mahdi Khemakhem</div></div></section><section class="field field-name-body field-type-text-with-summary field-label-above view-mode-teaser"><h2 class="field-label">Abstract: </h2><div class="field-items"><div class="field-item even"><p>This thesis focuses on solving constrained optimization problems. Our aim is threefold: (i) we study the relationship between a set of knapsack problems; (ii) we provide a filtering algorithm for the new global constraint <em>mcmdk</em>; and (iii) we design a new strategy for solving constrained optimization problems.</p>
<p>We first study the relationship between a set of knapsack problems involving dimensions, demands, and multiple choice constraints: the MKP, the MDMKP, the MCKP, the MMKP, and the GUBMKP. This study leads us to define the multiple demand multidimensional multiple choice knapsack problem (MDMMKP) as a generalization of these problems. By applying a set of transformations between the integer linear programs of these knapsack extensions, an algorithm for the generalization can serve as a solver for each of them. Results show that the transformed formulations yield reasonable computing times compared with the original ones.</p>
<p>Secondly, we define a new global constraint, denoted mcmdk, which belongs to the family of weighted constraints and is closely related to the knapsack constraint. We model the mcmdk constraint as a conjunction of sum and implication constraints, and we associate with it a dedicated filtering algorithm. Experiments show that propagating the mcmdk constraint via the proposed filtering algorithm is more effective and efficient than propagating it using the straightforward conjunction.</p>
<p>Finally, we present a new strategy for solving Constrained Optimization Problems (COPs) called solve and decompose (S&D for short). The proposed strategy is a systematic iterative depth-first strategy based on problem decomposition. S&D uses a feasible solution of the COP, found by any exact method, to decompose the original problem into a bounded number of considerably smaller subproblems. The number of subproblems is controlled by a parameter p, whereas their size is controlled by a depth limit after which the decomposition process stops. The algorithm is designed (i) to speed up the time to find the optimal solution, through problem decomposition and by visiting promising subproblems first; and (ii) to speed up the proof of optimality, by strengthening the cost-based filtering. Results on two benchmark problems show that S&D may yield improvements of up to three orders of magnitude in runtime compared to branch and bound.</p>
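<p>The flavour of decomposition-based search can be sketched as follows (a toy illustration under our own assumptions, not the thesis's S&D algorithm): one variable's domain is split into roughly p pieces, each yielding a smaller subproblem, and the subproblems share the best objective value found so far, which strengthens pruning in the subproblems visited later.</p>

```python
# Toy COP: maximize objective(x, y) subject to constraint(x, y),
# solved by splitting x's domain into roughly p subproblems.
def solve_subproblem(xs, ys, constraint, objective, best):
    """Exhaustive search over one subproblem, improving `best`."""
    for x in xs:
        for y in ys:
            if constraint(x, y) and objective(x, y) > best:
                best = objective(x, y)
    return best

def decompose_and_solve(xdom, ydom, constraint, objective, p=4):
    best = float("-inf")
    step = max(1, len(xdom) // p)
    for i in range(0, len(xdom), step):  # roughly p subproblems
        # `best` is shared across subproblems, like a strengthened bound
        best = solve_subproblem(xdom[i:i + step], ydom,
                                constraint, objective, best)
    return best

# maximize x + y subject to x * y <= 12, with x, y in 0..10
result = decompose_and_solve(list(range(11)), list(range(11)),
                             lambda x, y: x * y <= 12,
                             lambda x, y: x + y)
```

The optimum here is 11 (for instance x = 10, y = 1); the point of the sketch is only the structure: decompose, then reuse the incumbent bound.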
</div></div></section><section class="field field-name-field-graduated field-type-datetime field-label-inline clearfix view-mode-teaser"><h2 class="field-label">Graduated: </h2><div class="field-items"><div class="field-item even"><span class="date-display-single">Tuesday, November 1, 2016</span></div></div></section><section class="field field-name-field-also-published-in field-type-link-field field-label-above view-mode-teaser"><h2 class="field-label">Also published in: </h2><div class="field-items"><div class="field-item even"><a href="https://link.springer.com/article/10.1007/s10878-015-9892-8">Solving constrained optimization problems by solution-based decomposition search</a></div><div class="field-item odd"><a href="https://pdfs.semanticscholar.org/8ccd/6309ce4932877d8e0b2cfc87aa3e55db11b6.pdf">Knapsack Problems involving dimensions, demands and multiple choice constraints: generalization and transformations between form</a></div></div></section><section class="field field-name-upload field-type-file field-label-above view-mode-teaser"><h2 class="field-label">PDF of thesis: </h2><div class="field-items"><div class="field-item even"><span class="file"><img class="file-icon" alt="PDF icon" title="application/pdf" src="/modules/file/icons/application-pdf.png" /> <a href="http://cp2013.a4cp.org/sites/default/files/amine_lamine_-_dr.pdf" type="application/pdf; length=870698">amine_lamine_-_dr.pdf</a></span></div></div></section>
Sun, 10 Dec 2017 12:14:55 +0000
Efficient algorithms to solve scheduling problems with a variety of optimization criteria
http://cp2013.a4cp.org/node/1168
<section class="field field-name-field-author field-type-text field-label-inline clearfix view-mode-teaser"><h2 class="field-label">Author: </h2><div class="field-items"><div class="field-item even">Hamed Fahimi</div></div></section><section class="field field-name-field-school field-type-text field-label-above view-mode-teaser"><h2 class="field-label">School: </h2><div class="field-items"><div class="field-item even">Université Laval</div></div></section><section class="field field-name-body field-type-text-with-summary field-label-above view-mode-teaser"><h2 class="field-label">Abstract: </h2><div class="field-items"><div class="field-item even"><p>Constraint programming is a powerful methodology for solving large-scale practical scheduling problems. Resource-constrained scheduling deals with the temporal allocation of a variety of tasks to a set of resources, where the tasks consume a certain amount of resource during their execution. Ordinarily, a desired objective function, such as the total length of a feasible schedule, called the makespan, is optimized in scheduling problems. Solving a scheduling problem amounts to finding out when each task starts and which resource executes it. In general, scheduling problems are NP-hard; consequently, there exists no known algorithm that can solve them with a polynomial number of instructions. Nonetheless, there exist special cases of scheduling problems that are not NP-complete and can be solved in polynomial time using dedicated algorithms. We design such algorithms for scheduling problems in a variety of contexts. Filtering techniques have been developed and improved over the past years in constraint-based scheduling. These algorithms are prominent because they can shrink the search tree by excluding domain values that cannot lead to a feasible solution. We propose improvements and present faster filtering algorithms for classical scheduling problems.
Furthermore, we adapt filtering techniques to the case where tasks can be delayed. We also consider distinct properties of industrial scheduling problems and solve more efficiently those scheduling problems whose optimization criterion is not necessarily the makespan. For instance, we present polynomial-time algorithms for the case where the amount of available resource fluctuates over time, or where the cost of executing a task at time t depends on t.</p>
</div></div></section><section class="field field-name-field-graduated field-type-datetime field-label-inline clearfix view-mode-teaser"><h2 class="field-label">Graduated: </h2><div class="field-items"><div class="field-item even"><span class="date-display-single">Friday, September 30, 2016</span></div></div></section><section class="field field-name-field-link-to-full-text field-type-link-field field-label-inline clearfix view-mode-teaser"><h2 class="field-label">Link to full text: </h2><div class="field-items"><div class="field-item even"><a href="http://theses.ulaval.ca/archimede/meta/32946">Efficient algorithms to solve scheduling problems with a variety of optimization criteria</a></div></div></section><section class="field field-name-upload field-type-file field-label-above view-mode-teaser"><h2 class="field-label">PDF of thesis: </h2><div class="field-items"><div class="field-item even"><span class="file"><img class="file-icon" alt="PDF icon" title="application/pdf" src="/modules/file/icons/application-pdf.png" /> <a href="http://cp2013.a4cp.org/sites/default/files/hamed_fahimi_-_efficient_algorithms_to_solve_scheduling_problems_with_a_variety_of_optimization_criteria.pdf" type="application/pdf; length=12962996">hamed_fahimi_-_efficient_algorithms_to_solve_scheduling_problems_with_a_variety_of_optimization_criteria.pdf</a></span></div></div></section>
Mon, 08 May 2017 11:20:16 +0000
Towards Statistical Consistency for Stochastic Constraint Programming
http://cp2013.a4cp.org/node/1151
<section class="field field-name-field-author field-type-text field-label-inline clearfix view-mode-teaser"><h2 class="field-label">Author: </h2><div class="field-items"><div class="field-item even">Imen Zghidi</div></div></section><section class="field field-name-field-school field-type-text field-label-above view-mode-teaser"><h2 class="field-label">School: </h2><div class="field-items"><div class="field-item even">FSEGS</div></div></section><section class="field field-name-field-supervisors field-type-text field-label-above view-mode-teaser"><h2 class="field-label">Supervisors: </h2><div class="field-items"><div class="field-item even">Prof. Dr. Abdelwahab Rebaii</div></div></section><section class="field field-name-body field-type-text-with-summary field-label-above view-mode-teaser"><h2 class="field-label">Abstract: </h2><div class="field-items"><div class="field-item even"><p>In most industrial contexts, decisions are made based on incomplete information, because decision makers cannot be certain of the future behavior of factors that will affect the outcome of the various options under consideration. Stochastic Constraint Satisfaction Problems provide a powerful modeling framework for problems in which one is required to take decisions under uncertainty. In these stochastic problems, the uncertainty is modeled by discrete random variables that capture uncontrollable factors like customer demands, machine processing times, house prices, etc. These random variables can take on a set of possible values, each with an associated probability. They model factors that fall outside the control of the decision maker, who knows only the probability distribution function of these random variables, which can be forecast, for instance, by looking at the past behavior of such factors.
There are also controllable variables, called decision variables, which model the set of possible choices for the decisions to be made. Finally, such problems comprise chance constraints, which express the relationship between random and decision variables that should be satisfied within a given probability threshold -- since finding decisions that will always satisfy the constraints in an uncertain environment is almost impossible.<br />
If the random variables' support set is infinite, the number of scenarios would be infinite. Hence, finding a solution in such cases is impossible in general. In this thesis, within the context of an infinite set of scenarios, we propose a novel notion of statistical consistency. Statistical consistency lifts the notion of consistency of deterministic constraints to infinite chance constraints. The essence of this novel notion of consistency is to be able to make an inference, in the presence of infinite scenarios in an uncertain environment, based on a restricted finite subset of scenarios with a certain confidence level and a threshold error. The confidence level is the probability that characterises the extent to which our inference, based on a subset of scenarios, is correct whereas the threshold error is the error range that we can tolerate while making such an inference. The statistical consistency acknowledges the fact that making a perfect inference in an uncertain environment and with an infinite number of scenarios is impossible. The statistical consistency, thus, with its reliance on a limited number of scenarios, a confidence level, and a threshold error constitutes a valid and an appropriate practical road that one can take in order to tackle infinite chance constraints.<br />
We design two novel approaches based on confidence intervals to enforce statistical consistency, as well as a third novel approach based on hypothesis testing. We analyze the three methods theoretically as well as experimentally. Our empirical evaluation shows the weaknesses and strengths of each method in making a correct inference from a restricted subset of scenarios when enforcing statistical consistency. Overall, while the first two methods make a correct inference in most cases, the third is effective and robust in all cases.</p>
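The interplay between the confidence level, the threshold error, and the finite subset of scenarios can be sketched with a simple confidence-interval check on sampled scenarios. This is an illustrative assumption-laden sketch (a normal-approximation interval with hypothetical function names), not the thesis's actual algorithms:

```python
import math
import random

def check_chance_constraint(sample_scenario, holds, theta, n=10_000, confidence=0.95):
    """Estimate P(constraint holds) from n sampled scenarios and compare it
    to the satisfaction threshold theta via a normal-approximation confidence
    interval. Returns 'consistent', 'inconsistent', or 'undecided'.
    Illustrative only: names and the normal approximation are assumptions."""
    successes = sum(holds(sample_scenario()) for _ in range(n))
    p_hat = successes / n
    z = {0.90: 1.645, 0.95: 1.960, 0.99: 2.576}[confidence]  # two-sided z-values
    half_width = z * math.sqrt(p_hat * (1 - p_hat) / n)      # the threshold error
    if p_hat - half_width >= theta:
        return "consistent"      # threshold met with the given confidence
    if p_hat + half_width < theta:
        return "inconsistent"    # threshold violated with the given confidence
    return "undecided"           # interval straddles theta: sample more scenarios

# Chance constraint "x + d <= 10" with decision x = 4 and demand d uniform on {1..8},
# so the true satisfaction probability is 6/8 = 0.75, below the threshold 0.9:
random.seed(1)
result = check_chance_constraint(
    sample_scenario=lambda: random.randint(1, 8),
    holds=lambda d: 4 + d <= 10,
    theta=0.9,
)
```

An "undecided" outcome is what motivates sampling additional scenarios: the interval shrinks as n grows, at the cost of more work per consistency check.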
</div></div></section><section class="field field-name-field-graduated field-type-datetime field-label-inline clearfix view-mode-teaser"><h2 class="field-label">Graduated: </h2><div class="field-items"><div class="field-item even"><span class="date-display-single">Saturday, May 14, 2016</span></div></div></section><section class="field field-name-upload field-type-file field-label-above view-mode-teaser"><h2 class="field-label">PDF of thesis: </h2><div class="field-items"><div class="field-item even"><span class="file"><img class="file-icon" alt="PDF icon" title="application/pdf" src="/modules/file/icons/application-pdf.png" /> <a href="http://cp2013.a4cp.org/sites/default/files/imen_zghidi_-_towards_statistical_consistency_for_stochastic_constraint_programming.pdf" type="application/pdf; length=1563912">imen_zghidi_-_towards_statistical_consistency_for_stochastic_constraint_programming.pdf</a></span></div></div></section>Thu, 15 Dec 2016 18:27:53 +0000zghidiimen1151 at http://cp2013.a4cp.orgCDF-intervals: a probabilistic interval constraint framework to reason about data with uncertainty
http://cp2013.a4cp.org/node/1180
<section class="field field-name-field-author field-type-text field-label-inline clearfix view-mode-teaser"><h2 class="field-label">Author: </h2><div class="field-items"><div class="field-item even">Aya Saad</div></div></section><section class="field field-name-field-school field-type-text field-label-above view-mode-teaser"><h2 class="field-label">School: </h2><div class="field-items"><div class="field-item even">Fakultät für Ingenieurwissenschaften, Informatik und Psychologie</div></div></section><section class="field field-name-field-supervisors field-type-text field-label-above view-mode-teaser"><h2 class="field-label">Supervisors: </h2><div class="field-items"><div class="field-item even">Frühwirth, Thom</div><div class="field-item odd">Gervet, Carmen</div></div></section><section class="field field-name-body field-type-text-with-summary field-label-above view-mode-teaser"><h2 class="field-label">Abstract: </h2><div class="field-items"><div class="field-item even"><p>We propose a novel framework, the cumulative distribution function (cdf)-intervals, that intuitively describes data coupled with uncertainty without losing any information given in the problem definition. Our proposition constructs two convex algebraic structures: the cdf-intervals and the Probability Box (p-box) cdf-intervals. The two proposed structures are driven by the practical use of reliable approaches in the Constraint Programming (CP) and Operations Research (OR) paradigms. These approaches tackle large-scale constraint optimization (LSCO) problems associated with data uncertainty in a tractable manner. The key idea is to bound the data observed in the problem definition, then perform the computation only on the bounds using interval reasoning techniques. The output solution set of this process satisfies all possible realizations of the data.
Approaches following convex modeling commonly treat data in their interval representation with equal weight, and thus do not reflect any degree of knowledge about the data's whereabouts.<br />
Motivated by bringing more knowledge to the realized solution set, we introduced the cdf-intervals in [Saad, Gervet, and Abdennadher (2010)]. Each bounding point of the cdf-interval algebraic structure is specified by two values: the data value and its cumulative distribution function (cdf) [Saad et al. (2010)]. This structure represents data in two dimensions (2D), yet the probability distribution (the second dimension) is an approximation of the actual distribution. We further extended the cdf-intervals with the notion of a p-box in order to enclose all available information between two cdf distributions [Saad, Gervet, and Fruehwirth (2012b), Saad, Gervet, and Fruehwirth (2012a), Saad, Fruehwirth, and Gervet (2014), and Saad (2014)]. The bounding distributions are chosen to be uniform in order to ease computation over the novel algebraic structure. The probabilities within those bounds are ranked based on stochastic dominance. In this work, we define formal frameworks for constraint reasoning over the cdf-intervals and the p-box cdf-intervals. The modeling and reasoning are constructed within the CP paradigm due to its expressive power. Moreover, we construct a system of global constraints over the two algebraic structures by extending Interval Linear Systems (ILS) with a second dimension (the cdf). We develop a formal Constraint Logic Programming (CLP) language from the newly defined domains and show how the new domains affect the problem variables and the decision process. We implement the new language as a separate solver module in the ECLiPSe constraint programming environment.<br />
The p-box cdf-intervals combine techniques from convex modeling, to take advantage of their tractability, with approaches that reveal quantifiable information from the probabilistic and stochastic world, to take advantage of their expressiveness. We compare the data representation and the reasoning performance of models from the two paradigms and of our novel framework. This comparison is further used to model two real-life applications: the Network Traffic Application problem, used in network design, and the Inventory Management problem of a manufacturing process. The empirical evaluation of our implementation shows that, with minimal overhead, the output solution set realizes a full enclosure of the data along with tighter bounds on its probability distributions. Solutions that appear feasible in the real domain are excluded by the p-box cdf-interval reasoning when they violate the properties of the cdf-domain. The additional knowledge about the data whereabouts gained by our formal and practical framework gives rise to a wide range of future research work.</p>
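The ranking of the bounding distributions by stochastic dominance can be sketched for the uniform case described above. This is a minimal illustration (function names and the pointwise grid check are assumptions), not the thesis's framework:

```python
def uniform_cdf(a, b):
    """cdf of the uniform distribution on [a, b]."""
    def F(x):
        if x <= a:
            return 0.0
        if x >= b:
            return 1.0
        return (x - a) / (b - a)
    return F

def dominates(F, G, points):
    """First-order stochastic dominance checked on a grid of test points:
    F dominates G when F(x) <= G(x) everywhere, i.e. F places its
    probability mass on larger values than G does."""
    return all(F(x) <= G(x) for x in points)

# Two uniform bounding cdfs, as in a p-box with uniform bounds:
grid = [i / 10 for i in range(0, 51)]   # test points in [0, 5]
upper_bound = uniform_cdf(0, 2)         # mass on smaller values: larger cdf
lower_bound = uniform_cdf(2, 4)         # mass on larger values: smaller cdf
```

Any cdf enclosed between the two bounds is then dominated by one bound and dominates the other, which is what makes the pair usable as an envelope.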
</div></div></section><section class="field field-name-field-graduated field-type-datetime field-label-inline clearfix view-mode-teaser"><h2 class="field-label">Graduated: </h2><div class="field-items"><div class="field-item even"><span class="date-display-single">Thursday, April 21, 2016</span></div></div></section><section class="field field-name-field-link-to-full-text field-type-link-field field-label-inline clearfix view-mode-teaser"><h2 class="field-label">Link to full text: </h2><div class="field-items"><div class="field-item even"><a href="https://oparu.uni-ulm.de/xmlui/handle/123456789/4005">CDF-intervals: a probabilistic interval constraint framework to reason about data with uncertainty</a></div></div></section>Fri, 01 Dec 2017 20:56:31 +0000aya.saad1180 at http://cp2013.a4cp.orgOther Things Besides Number: Abstraction, Constraint Propagation, and String Variable Types
http://cp2013.a4cp.org/node/1100
<section class="field field-name-field-author field-type-text field-label-inline clearfix view-mode-teaser"><h2 class="field-label">Author: </h2><div class="field-items"><div class="field-item even">Joseph Scott</div></div></section><section class="field field-name-field-school field-type-text field-label-above view-mode-teaser"><h2 class="field-label">School: </h2><div class="field-items"><div class="field-item even">Uppsala University</div></div></section><section class="field field-name-field-supervisors field-type-text field-label-above view-mode-teaser"><h2 class="field-label">Supervisors: </h2><div class="field-items"><div class="field-item even">Pierre Flener</div><div class="field-item odd">Justin Pearson</div><div class="field-item even">Parosh Abdulla</div></div></section><section class="field field-name-body field-type-text-with-summary field-label-above view-mode-teaser"><h2 class="field-label">Abstract: </h2><div class="field-items"><div class="field-item even"><p>In constraint programming (CP), a combinatorial problem is modeled declaratively as a conjunction of constraints, each of which captures some of the combinatorial substructure of the problem. Constraints are more than a modeling convenience: every constraint is partially implemented by an inference algorithm, called a propagator, that rules out some but not necessarily all infeasible candidate values of one or more unknowns in the scope of the constraint. Interleaving propagation with systematic search leads to a powerful and complete solution method, combining a high degree of re-usability with natural, high-level modeling.</p>
<p>A propagator can be characterized as a sound approximation of a constraint on an abstraction of sets of candidate values; propagators that share an abstraction are similar in the strength of the inference they perform when identifying infeasible candidate values. In this thesis, we consider abstractions of sets of candidate values that may be described by an elegant mathematical formalism, the Galois connection. We develop a theoretical framework from the correspondence between Galois connections and propagators, unifying two disparate views of the abstraction-propagation connection, namely the oft-overlooked distinction between representational and computational over-approximations. Our framework yields compact definitions of propagator strength, even in complicated cases (i.e., involving several types, or unknowns with internal structure); it also yields a method for the principled derivation of propagators from constraint definitions.</p>
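<p>The Galois-connection view of abstraction can be illustrated with the familiar integer-interval abstraction, where the defining condition relates an abstraction map and a concretization map. This is a toy sketch under assumed names; the thesis's framework is considerably more general:</p>

```python
def alpha(values):
    """Abstraction: a non-empty finite set of integers to its bounding interval."""
    return (min(values), max(values))

def gamma(interval):
    """Concretization: an interval back to the set of integers it contains."""
    lo, hi = interval
    return set(range(lo, hi + 1))

def galois_condition(values, interval):
    """The defining property of a Galois connection between the two posets:
    alpha(S) is contained in I  if and only if  S is a subset of gamma(I)."""
    lo, hi = interval
    a_lo, a_hi = alpha(values)
    return (lo <= a_lo and a_hi <= hi) == values.issubset(gamma(interval))
```

<p>For a genuine Galois connection the condition holds for every set and interval, which is what licenses a propagator to reason over the abstract intervals while remaining sound with respect to the concrete sets of candidate values.</p>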
<p>We apply this framework to the extension of an existing CP solver to constraints over strings, that is, words of finite length. We define, via a Galois connection, an over-approximation for bounded-length strings, and demonstrate two different methods for implementing this over-approximation in a CP solver. First, we use the Galois connection to derive a bounded-length string representation as an aggregation of existing scalar types; propagators for this representation are obtained by manual derivation, automated synthesis, or a combination of the two. Then we implement a string variable type, motivating design choices with knowledge gained from the construction of the over-approximation. The resulting CP solver extension not only substantially eases modeling for combinatorial string problems, but also yields significant efficiency improvements over prior CP methods.</p>
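<p>The idea of aggregating scalar domains into a bounded-length string representation can be sketched as a length interval plus per-position character sets, with a simple propagator pruning both. The class and function names are hypothetical illustrations, not the solver's actual interface:</p>

```python
class StringDom:
    """Bounded-length string domain as an aggregation of scalar domains:
    one character set per position plus integer length bounds.
    Illustrative names; not the actual solver representation."""
    def __init__(self, max_len, alphabet):
        self.chars = [set(alphabet) for _ in range(max_len)]
        self.len_lo, self.len_hi = 0, max_len

def propagate_prefix(dom, prefix):
    """Propagator for 'string starts with prefix': prune each prefix
    position's character set and raise the length lower bound.
    Returns False when the domain becomes empty (failure)."""
    if len(prefix) > dom.len_hi:
        return False
    for i, c in enumerate(prefix):
        dom.chars[i] &= {c}
        if not dom.chars[i]:
            return False
    dom.len_lo = max(dom.len_lo, len(prefix))
    return True
```

<p>Because each position and the length are ordinary scalar domains, existing scalar propagators can be reused on the aggregation, which is the appeal of deriving the representation rather than building a string type from scratch.</p>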
</div></div></section><section class="field field-name-field-graduated field-type-datetime field-label-inline clearfix view-mode-teaser"><h2 class="field-label">Graduated: </h2><div class="field-items"><div class="field-item even"><span class="date-display-single">Monday, March 14, 2016</span></div></div></section><section class="field field-name-field-link-to-full-text field-type-link-field field-label-inline clearfix view-mode-teaser"><h2 class="field-label">Link to full text: </h2><div class="field-items"><div class="field-item even"><a href="http://urn.kb.se/resolve?urn=urn%3Anbn%3Ase%3Auu%3Adiva-273311">Uppsala University</a></div></div></section>Wed, 30 Mar 2016 09:07:43 +0000joseph.scott1100 at http://cp2013.a4cp.orgSearch and Coverage Path Planning
http://cp2013.a4cp.org/node/1166
<section class="field field-name-field-author field-type-text field-label-inline clearfix view-mode-teaser"><h2 class="field-label">Author: </h2><div class="field-items"><div class="field-item even">Michael Morin</div></div></section><section class="field field-name-field-school field-type-text field-label-above view-mode-teaser"><h2 class="field-label">School: </h2><div class="field-items"><div class="field-item even">Department of Computer Science and Software Engineering, Université Laval, Québec, QC, Canada</div></div></section><section class="field field-name-body field-type-text-with-summary field-label-above view-mode-teaser"><h2 class="field-label">Abstract: </h2><div class="field-items"><div class="field-item even"><p>We tackle two different and complementary problems: Coverage Path Planning (CPP) and the Optimal Search Path (OSP). The CPP is a major challenge in mobile robotics; the OSP is a classic problem from search theory. We first review both problems, highlighting their differences and similarities from the point of view of search (coverage) operations. Both problems are positioned on the continuum of a priori knowledge about the whereabouts of a search object. We then formalize an extension of the CPP that we call the CPP with imperfect extended detections (CPPIED). We present a novel and powerful heuristic algorithm that uses dynamic programming and a traveling salesman problem (TSP) reduction. We apply the method to underwater minesweeping operations on maps of more than 21,000 cells. We then study a novel Constraint Programming (CP) model for the OSP. We first improve on the classical objective function found in the OSP definition. Our novel objective function, involving a single modification of the operators used to compute the probability of success of a search plan, leads to a stronger filtering of the probability variables of the model. Then, we propose a novel heuristic for the OSP: the Total Detection (TD) heuristic.
Experiments show that our model, along with the proposed heuristic, is competitive with problem-specific branch-and-bound algorithms, supporting the claim that CP is a good technique for solving search theory problems. We finally propose the Markov Transition Constraint (MTC) as a novel modeling tool in CP to simplify the implementation of models based on Markov chains. We prove, both empirically and theoretically, that interval arithmetic is insufficient to filter the probability variables of a single MTC, i.e., to enforce bounds consistency on these variables. Interval arithmetic is the only available tool to filter an MTC when it is decomposed into individual arithmetic constraints. We therefore propose an algorithm based on linear programming that provably enforces bounds consistency. Since linear programming is computationally expensive to use at each node of the search tree of a CP solver, we propose an in-between solution based on fractional knapsack filtering. The use of the MTC global constraint is illustrated on a CP model of the OSP.</p>
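Why plain interval arithmetic under-filters a Markov transition can be seen in a minimal sketch of bound propagation through a transition matrix. This illustrative code (assumed names, not the thesis's constraint implementation) propagates per-state bounds while ignoring the coupling between states:

```python
def interval_transition(p_bounds, T):
    """Interval-arithmetic filtering of p' = p * T for a Markov chain:
    propagate per-state lower/upper bounds on the state distribution
    through the row-stochastic transition matrix T. Each output bound is
    computed independently, ignoring the sum-to-one coupling between the
    states, which is why this alone cannot enforce bounds consistency."""
    n = len(T)
    out = []
    for j in range(n):
        lo = sum(p_bounds[i][0] * T[i][j] for i in range(n))
        hi = sum(p_bounds[i][1] * T[i][j] for i in range(n))
        out.append((lo, min(hi, 1.0)))
    return out

# Two-state chain with bounds on the current distribution:
T = [[0.5, 0.5],
     [0.0, 1.0]]
bounds = interval_transition([(0.2, 0.4), (0.6, 0.8)], T)
# bounds is approximately [(0.1, 0.2), (0.7, 1.0)]
```

In this example, taking the sum-to-one constraint into account (as a linear-programming filter can) would tighten the second state's bounds from [0.7, 1.0] to [0.8, 0.9], since its next probability equals 1 - 0.5 * p0 with p0 in [0.2, 0.4].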
</div></div></section><section class="field field-name-field-graduated field-type-datetime field-label-inline clearfix view-mode-teaser"><h2 class="field-label">Graduated: </h2><div class="field-items"><div class="field-item even"><span class="date-display-single">Thursday, December 31, 2015</span></div></div></section><section class="field field-name-field-link-to-full-text field-type-link-field field-label-inline clearfix view-mode-teaser"><h2 class="field-label">Link to full text: </h2><div class="field-items"><div class="field-item even"><a href="http://www.theses.ulaval.ca/2015/32066/">Search and Coverage Path Planning</a></div></div></section><section class="field field-name-upload field-type-file field-label-above view-mode-teaser"><h2 class="field-label">PDF of thesis: </h2><div class="field-items"><div class="field-item even"><span class="file"><img class="file-icon" alt="PDF icon" title="application/pdf" src="/modules/file/icons/application-pdf.png" /> <a href="http://cp2013.a4cp.org/sites/default/files/michael_morin_-_search_and_coverage_path_planning.pdf" type="application/pdf; length=5777931">michael_morin_-_search_and_coverage_path_planning.pdf</a></span></div></div></section>Thu, 20 Apr 2017 15:36:25 +0000michaelmorin1166 at http://cp2013.a4cp.org