The aggregate operator takes as parameters a numbered chance node (the frequency) and an interval chance node (the severity), and must itself be used in an interval discrete function node.

It is a requirement that the state values of the numbered node form an increasing sequence (0, 1, 2, 3, 4, 5, ...) and that the interval state range of the discrete function node is sufficient to cover zero as well as the minimum and maximum products of the frequency and severity state values.

(If you get "insufficient state range" messages when compiling a network that uses a discrete function node with the aggregate operator, it is most likely the intervals of the discrete function node that need to be revisited to make sure this last requirement is met.)
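To see what the operator computes, here is a sketch outside HUGIN (plain Python/NumPy, not HUGIN API code; the function name and grid are illustrative): the aggregate of frequency N and severity X is the compound distribution of S = X_1 + ... + X_N, built by repeated convolution. It also shows why the state range must cover zero (the N = 0 case) up to the maximum product of frequency and severity values.

```python
import numpy as np

def aggregate(freq_probs, sev_probs, n_points):
    """freq_probs[n] = P(N = n); sev_probs[k] = P(X = k) on a unit grid.
    Returns P(S = k) for k = 0 .. n_points - 1."""
    total = np.zeros(n_points)
    conv = np.zeros(n_points)
    conv[0] = 1.0                      # distribution of the empty sum: S = 0
    for n, p_n in enumerate(freq_probs):
        total += p_n * conv            # contribution of exactly n claims
        conv = np.convolve(conv, sev_probs)[:n_points]  # add one more claim
    return total

# Frequency states 0, 1, 2; severity mass on values 0..3.
freq = [0.5, 0.3, 0.2]
sev = np.array([0.1, 0.4, 0.3, 0.2])
# Max possible sum is 2 * 3 = 6, so 8 grid points cover the full range
# (an undersized grid here is the analogue of "insufficient state range").
dist = aggregate(freq, sev, n_points=8)
print(dist.round(4))
```

Note that the result has positive mass at zero whenever P(N = 0) > 0, which is why the interval range of the discrete function node must include zero.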

There is also a short description in the C API manual on page 74 (look for h_operator_aggregate).

Here are two example network files:

aggregate1.oobn
http://download.hugin.com/webdocs/forum_example/aggregate/aggregate1.oobn

aggregate2.oobn
http://download.hugin.com/webdocs/forum_example/aggregate/aggregate2.oobn

In aggregate1 the interval range is finite; in aggregate2 it goes to infinity.

If you are familiar with the old convolution dialog, the aggregate operator can produce the same results.

The aggregate2 example meets the requirements of the old convolution dialog (interval range from 0 to infinity). One can then use convolution (select frequency + severity, then Network -> Analysis -> Convolution) and compute the total impact distribution, which should produce the same result as the distribution the aggregate operator produces in the discrete function node.

But where the convolution dialog only lets us inspect the numbers, the aggregate operator allows us to use them for further computations in the network through the child nodes of the discrete function node.

Finally, as with normal function nodes, belief updating can only go forward through discrete function nodes. This means that evidence entered on the descendants of a discrete function node has no impact on the beliefs of its parents.