Article
Optimizing Parallel Belief Propagation in Junction Trees using Regression
The 19th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (2013)
  • Lu Zheng, Carnegie Mellon University
  • Ole J. Mengshoel, Carnegie Mellon University
Abstract
The junction tree approach, with applications in artificial intelligence, computer vision, machine learning, and statistics, is often used for computing posterior distributions in probabilistic graphical models. One of the key challenges associated with junction trees is computational, and several parallel computing technologies - including many-core processors - have been investigated to meet this challenge. Many-core processors (including GPUs) are now programmable; unfortunately, their complexity makes it hard to manually tune their parameters in order to optimize software performance. In this paper, we investigate a machine learning approach to minimizing the execution time of parallel junction tree algorithms implemented on a GPU. By carefully allocating a GPU's threads to different parallel computing opportunities in a junction tree, and treating this thread allocation problem as a machine learning problem, we find in experiments that regression - specifically support vector regression - can substantially outperform manual optimization.
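The sketch below illustrates the general idea described in the abstract, not the paper's actual implementation: fit a support vector regression model that predicts GPU execution time from features of a junction tree and a candidate thread allocation, then choose the allocation with the lowest predicted time. The feature layout, numbers, and hyperparameters are illustrative assumptions.

```python
# Minimal sketch (assumed features and data, not the authors' code): use SVR to
# predict execution time of a parallel junction tree kernel for a given thread
# allocation, then pick the allocation with the smallest predicted time.
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical training rows: [clique table size, separator size, threads per block].
X_train = np.array([
    [1024,  64,  32],
    [1024,  64, 128],
    [8192, 256,  64],
    [8192, 256, 256],
], dtype=float)
# Measured kernel execution times in milliseconds (made-up values).
y_train = np.array([3.1, 1.9, 12.4, 7.8])

# Support vector regression, as named in the abstract; the RBF kernel and
# hyperparameters here are assumptions.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(X_train, y_train)

# Candidate thread allocations for a new junction tree node; select the one
# with the minimum predicted execution time.
candidates = np.array([[4096, 128, t] for t in (32, 64, 128, 256)], dtype=float)
predicted = model.predict(candidates)
best = candidates[np.argmin(predicted)]
print("predicted times (ms):", predicted)
print("chosen thread allocation:", best)
```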
Keywords
  • belief propagation
  • parallel computing
  • GPUs
  • regression
  • junction trees
  • Bayesian networks
Publication Date
August, 2013
Publisher Statement
@inproceedings{zheng13optimizing,
  author    = {Zheng, L. and Mengshoel, O. J.},
  title     = {Optimizing Parallel Belief Propagation in Junction Trees using Regression},
  booktitle = {Proc. of 19th {ACM SIGKDD} Conference on Knowledge Discovery and Data Mining (KDD-13)},
  address   = {Chicago, IL},
  month     = {August},
  year      = {2013}
}
Citation Information
Lu Zheng and Ole J. Mengshoel. "Optimizing Parallel Belief Propagation in Junction Trees using Regression." The 19th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (2013).
Available at: http://works.bepress.com/ole_mengshoel/41/