Cut-generating linear programs (CGLPs) play a key role as a separation oracle that produces valid inequalities for the feasible region of optimization problems. When incorporated into branch-and-bound, the cutting planes obtained from CGLPs help tighten relaxations and improve dual bounds. Running CGLPs at nodes of the branch-and-bound tree, however, is computationally cumbersome due to the large number of candidate nodes and the lack of a priori knowledge of which nodes admit useful cutting planes. As a result, CGLPs are often avoided in the default settings of branch-and-cut algorithms despite their potential to improve dual bounds. In this paper, we propose a novel framework based on machine learning to approximate the optimal value of the CGLP, which is the deciding factor in generating cutting planes. Viewing the CGLP as an indicator function of the objective function vector, we show that its optimal value can be approximated through conventional data classification techniques. We provide a systematic procedure to efficiently generate training data sets for the corresponding classification problem based on the structure of the CGLP. We conduct computational experiments with classification methods such as logistic regression, support vector machines, and neural networks. The results suggest that the classification-based approximation of the CGLP achieves a high accuracy rate in significantly less time than modern LP solvers. Our proposed framework can be applied efficiently to a large number of nodes in the branch-and-bound tree to identify the best candidates for running the CGLP, a feature that can be implemented in the preprocessing phase of any branch-and-cut algorithm to improve solution time and bound quality.
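To make the classification view concrete, the following is a minimal sketch, not the authors' implementation: it assumes the CGLP outcome has been reduced to a binary label (1 if solving the CGLP for a given objective vector yields a violated cutting plane, 0 otherwise), uses synthetic stand-in data in place of the paper's systematic data-generation procedure, and picks logistic regression as one of the classifiers named above.

```python
# Sketch of approximating the CGLP outcome with a classifier.
# Assumptions (not from the paper): synthetic objective vectors,
# placeholder labels from a random linear rule standing in for
# labels that would come from solving sample CGLPs offline.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical training set: each row is a CGLP objective vector;
# each label records whether that CGLP produced a violated cut.
X_train = rng.standard_normal((1000, 20))       # objective vectors
w_true = rng.standard_normal(20)                # stand-in decision rule
y_train = (X_train @ w_true > 0).astype(int)    # placeholder labels

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# At a branch-and-bound node, the trained classifier screens the
# node's objective vector cheaply instead of solving the full CGLP.
c_node = rng.standard_normal(20)
if clf.predict(c_node.reshape(1, -1))[0] == 1:
    print("node flagged: run the exact CGLP here")
else:
    print("node skipped: CGLP unlikely to yield a useful cut")
```

In this scheme, only the nodes the classifier flags would incur the cost of solving the exact CGLP, which matches the preprocessing-phase use suggested in the abstract.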
Available at: http://works.bepress.com/danial-davarnia/5/
This is a pre-print of the article Rajabalizadeh, Atefeh, and Danial Davarnia. "Solving Cut-Generating Linear Programs via Machine Learning." (2021). Posted with permission.