Author Topic: I seem to be running into some limits with the HUGIN software. What is the maximum number of nodes and edges that HUGIN can handle?

Offline Anders L Madsen

I seem to be running into some limits with the HUGIN software. What is the maximum number of nodes and edges that HUGIN can handle?

There are no built-in limits on the number of nodes or the number of edges. The amount of memory (RAM) in your computer determines how complex your models can be (except that since Hugin is currently a 32-bit application, at most 4GB can be utilized).
The most important cause for excessive memory usage is a model that is very densely connected. For example, a completely connected network with 30 binary nodes would create a junction tree with a single clique, and the state space of that clique would be of size 2^30. Since each element of that state space has a floating-point value (representing the probability) of type "double" (occupying 8 bytes in the computer memory) associated with it, such a clique would occupy 8GB of memory.
On the other hand, a network that is a tree (that is, there is only one path between any two nodes) will only use memory proportional to the number of nodes and the largest conditional probability table. If your network is a tree (or "almost" a tree), then you should be able to have at least 10000 and maybe even 100000 nodes.
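To make the arithmetic above concrete, here is a small Python sketch (it does not use the HUGIN API) that reproduces the two estimates: a single clique over 30 fully connected binary nodes versus the cliques of a simple chain of binary nodes. The 8 bytes per table entry is the size of a "double" mentioned above; the assumption that a chain yields one small clique per consecutive pair of nodes is a simplification for illustration.

Code:
# Back-of-the-envelope memory estimates for the two cases described above.
# Each clique stores one 8-byte double per configuration of its state space.

BYTES_PER_ENTRY = 8  # size of a C "double"

def clique_bytes(state_counts):
    """Memory used by a single clique whose members have the given state counts."""
    size = 1
    for n in state_counts:
        size *= n
    return size * BYTES_PER_ENTRY

# Completely connected network with 30 binary nodes: the junction tree
# degenerates to a single clique containing all 30 nodes.
dense = clique_bytes([2] * 30)
print(f"dense 30-node clique: {dense / 2**30:.1f} GB")   # 8.0 GB

# A chain (tree) of 10,000 binary nodes: one clique per consecutive pair,
# so memory grows linearly with the number of nodes.
chain = sum(clique_bytes([2, 2]) for _ in range(9999))
print(f"chain of 10,000 nodes: {chain / 1024:.0f} KB")   # ~312 KB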
« Last Edit: April 02, 2007, 12:03:44 by Anders L Madsen »
HUGIN EXPERT A/S

Offline joost

Also referring to the post "Why does entering Run mode seem to take forever and why does it sometimes fail?":
I use networks with a lot of interval nodes, so I get the error message "Hugin ran out of memory" a lot, even with relatively small networks. This problem probably arises when
1. the networks are densely connected, so the cliques are large, and
2. there are many interval nodes in these cliques.

Is there a way to reduce the clique size? That is, are there standard tricks to convert densely connected networks into tree-like networks while keeping the model equivalent?
And if not: does it also help a lot to a) reduce the number of intervals in the network, or b) reduce the number of computed points per interval from 25 to a smaller number?

Offline Anders L Madsen

In your case it seems most likely that Hugin "ran out of memory" due to very large clique state spaces. This happens when one or more cliques contain a large number of nodes or when the nodes in them have many states.

The three main reasons for running out of memory in a compilation are:
1) a densely connected network
2) a node with a large number of parents
3) nodes with many states

The options for reducing the memory requirements of a model are to reduce the density of the network (i.e., try to eliminate loops in the graph), reduce the number of parents of nodes with many parents, and reduce the number of states of nodes that appear in large cliques.
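As a rough illustration of how much the state counts of interval nodes matter, here is a small Python sketch. The clique composition used (five interval nodes sharing one clique, 25 intervals each) is a made-up example, not taken from your model.

Code:
# Illustration of how node state counts drive clique memory (8 bytes per entry).
# The clique composition below is hypothetical, chosen only for illustration.

BYTES_PER_ENTRY = 8

def clique_megabytes(state_counts):
    size = 1
    for n in state_counts:
        size *= n
    return size * BYTES_PER_ENTRY / 2**20

# A clique containing five interval nodes with 25 intervals (states) each:
print(f"5 nodes x 25 states: {clique_megabytes([25] * 5):.1f} MB")   # 74.5 MB

# The same clique after reducing each node to 10 intervals:
print(f"5 nodes x 10 states: {clique_megabytes([10] * 5):.2f} MB")   # 0.76 MB

# Removing one node from the clique (e.g., by breaking a loop) helps even more:
print(f"4 nodes x 10 states: {clique_megabytes([10] * 4):.3f} MB")   # 0.076 MB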

The number of samples per interval does not impact the size of the cliques in the junction tree; it is only used when generating CPTs from mathematical expressions. The number of intervals of a node, on the other hand, determines its number of states and therefore does affect clique sizes.

Initially, you could experiment with different triangulation methods. For instance, the "optimal" triangulation method may produce a better triangulation, resulting in smaller cliques.
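For intuition about why the choice of triangulation method matters, the sketch below simulates node elimination on a small made-up loopy graph and reports the largest clique produced by two different elimination orders. This is only a generic illustration of the underlying idea; it is not how HUGIN's triangulation heuristics are implemented.

Code:
# Generic illustration (not HUGIN's actual algorithms): eliminating nodes of
# the same graph in different orders produces different clique sizes.

import itertools

def max_clique_size(adjacency, order):
    """Simulate node elimination in the given order; return the size of the
    largest clique (eliminated node plus its remaining neighbours) encountered."""
    adj = {v: set(ns) for v, ns in adjacency.items()}
    largest = 0
    for v in order:
        neighbours = adj[v]
        largest = max(largest, len(neighbours) + 1)
        # Connect the neighbours pairwise (the "fill-in" edges), then remove v.
        for a, b in itertools.combinations(neighbours, 2):
            adj[a].add(b)
            adj[b].add(a)
        for n in neighbours:
            adj[n].discard(v)
        del adj[v]
    return largest

# A small made-up loopy graph (an undirected moral graph), just for illustration.
graph = {
    "A": {"B", "C"}, "B": {"A", "D"}, "C": {"A", "D"},
    "D": {"B", "C", "E"}, "E": {"D", "F"}, "F": {"E"},
}

print(max_clique_size(graph, ["D", "A", "B", "C", "E", "F"]))  # 4
print(max_clique_size(graph, ["F", "E", "B", "C", "A", "D"]))  # 3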

HUGIN EXPERT A/S