There can be multiple reasons why the compilation of a model takes a significant amount of time:
- Finding an optimal triangulation of a graph is an NP-hard problem (triangulation is a necessary step of the compilation process). If the "optimal triangulation" rule is used, the triangulation may therefore take a significant amount of time.
- The compilation process constructs a secondary computational structure known as a junction tree. The nodes of the junction tree are cliques of the triangulated graph (i.e., subsets of nodes), and each clique holds a table over its nodes; for this reason you want small cliques. Inference proceeds by message passing between cliques: clique tables are updated using pointwise multiplication, so large cliques require a lot of multiplications, and messages are computed by summing over clique tables. Thus, the size of the largest clique tables is very important for the cost of inference.
- The generation of a table from an expression can be time consuming (see this FAQ).
- Allocation of large amounts of memory can be time consuming.
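As a rough illustration of why clique table size dominates the cost of inference, the following sketch (hypothetical variable names, using NumPy; not HUGIN's actual data structures) builds a clique table whose size is the product of its variables' state counts, updates it by pointwise multiplication, and computes a message by summing out variables:

```python
import numpy as np

# Hypothetical clique over three binary variables A, B, C:
# its table has 2 * 2 * 2 = 8 entries.
state_counts = {"A": 2, "B": 2, "C": 2}
table = np.ones([state_counts[v] for v in ("A", "B", "C")])

# Pointwise multiplication: absorb a potential over (A, B)
# by broadcasting it over the remaining axis C.
potential_ab = np.array([[0.9, 0.1], [0.4, 0.6]])
table *= potential_ab[:, :, np.newaxis]

# Message to a neighbouring clique that shares only A:
# sum out B and C (axes 1 and 2).
message = table.sum(axis=(1, 2))

# The cost scales with the table size: a clique over n binary
# variables needs 2**n multiplications per update.
print(table.size)     # 8
print(message.shape)  # (2,)
```

One more binary variable in the clique doubles the table size, which is why a single large clique can dominate both compilation time and memory.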
Good indicators for a complex compilation are:
- The size of the parent sets and the number of states of the variables, as the table over a node and its parents gives a lower bound on the complexity.
- Expressions with interval nodes, as generating tables from expressions involving interval nodes may require a lot of time (see this FAQ).
- The length of the longest cycles in the graph, as long cycles require fill-ins (additional edges that increase the size of the cliques) in the triangulated graph.
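A minimal sketch of why cycles force fill-ins (a toy node-elimination routine, not HUGIN's triangulation algorithm): eliminating a node connects all of its neighbours, so triangulating a chordless 4-cycle adds one extra edge and produces cliques of size 3 instead of 2:

```python
from itertools import combinations

# Chordless 4-cycle: A-B-C-D-A (adjacency sets, hypothetical names).
graph = {
    "A": {"B", "D"},
    "B": {"A", "C"},
    "C": {"B", "D"},
    "D": {"A", "C"},
}

def eliminate(graph, order):
    """Eliminate nodes in the given order and return the fill-in
    edges added (each elimination connects the node's neighbours)."""
    g = {v: set(nbrs) for v, nbrs in graph.items()}
    fill_ins = []
    for v in order:
        nbrs = g.pop(v)
        for a, b in combinations(nbrs, 2):
            if b not in g[a]:
                fill_ins.append((a, b))
                g[a].add(b)
                g[b].add(a)
        for n in nbrs:
            g[n].discard(v)
    return fill_ins

# Eliminating A first connects B and D: one fill-in edge,
# yielding the triangulated cliques {A, B, D} and {B, C, D}.
fills = eliminate(graph, ["A", "B", "C", "D"])
print(sorted(fills[0]))  # ['B', 'D']
```

A longer chordless cycle needs correspondingly more fill-ins, and the resulting larger cliques are what drive up the clique table sizes discussed above.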