Some other features in Bayesian inference


Lazy Propagation uses a secondary structure called the junction tree to perform inference.

In [1]:
import pyagrum as gum
import pyagrum.lib.notebook as gnb

bn = gum.loadBN("res/alarm.dsl")
gnb.showJunctionTreeMap(bn);
[Figure: junction tree map of the alarm network]

This junction tree can also be transformed to answer different probabilistic queries.

In [2]:
bn = gum.fastBN("A->B->C->D;A->E->D;F->B;C->H")
ie = gum.LazyPropagation(bn)
bn
Out[2]:
[Figure: the Bayesian network with arcs A->B, B->C, C->D, A->E, E->D, F->B, C->H]

Evidence impact

evidenceImpact(target, evs) lets the user analyze the impact of a set of variables on a target: it computes the conditional probability of the target for every instantiation of the variables in evs.

In [3]:
ie.evidenceImpact("B", ["A", "H"])
Out[3]:
             ||  B                |
A     |H     ||0        |1        |
------|------||---------|---------|
0     |0     || 0.0841  | 0.9159  |
0     |1     || 0.1104  | 0.8896  |
1     |0     || 0.5675  | 0.4325  |
1     |1     || 0.6396  | 0.3604  |

Evidence impact is able to find the minimal subset of the given variables that effectively conditions the analyzed variable: variables that are d-separated from the target are pruned from the result.

In [4]:
ie.evidenceImpact("E", ["A", "F", "B", "D"])  # {A,D,B} d-separates E and F
Out[4]:
                    ||  E                |
A     |B     |D     ||0        |1        |
------|------|------||---------|---------|
0     |0     |0     || 0.5582  | 0.4418  |
0     |0     |1     || 0.2303  | 0.7697  |
0     |1     |0     || 0.5830  | 0.4170  |
0     |1     |1     || 0.1880  | 0.8120  |
1     |0     |0     || 0.8034  | 0.1966  |
1     |0     |1     || 0.4918  | 0.5082  |
1     |1     |0     || 0.8189  | 0.1811  |
1     |1     |1     || 0.4282  | 0.5718  |
In [5]:
ie.evidenceImpact("E", ["A", "B", "C", "D", "F"])  # {A,C,D} d-separates E and {B,F}
Out[5]:
                    ||  E                |
C     |A     |D     ||0        |1        |
------|------|------||---------|---------|
0     |0     |0     || 0.6455  | 0.3545  |
0     |0     |1     || 0.0600  | 0.9400  |
0     |1     |0     || 0.8548  | 0.1452  |
0     |1     |1     || 0.1711  | 0.8289  |
1     |0     |0     || 0.4814  | 0.5186  |
1     |0     |1     || 0.3389  | 0.6611  |
1     |1     |0     || 0.7502  | 0.2498  |
1     |1     |1     || 0.6238  | 0.3762  |

Evidence Joint Impact

evidenceJointImpact generalizes evidenceImpact to a joint target: it computes the conditional probability of a set of target variables for every instantiation of the observed variables, again pruning the d-separated ones.

In [6]:
ie.evidenceJointImpact(["A", "F"], ["B", "C", "D", "E", "H"])  # {B,E} d-separates [A,F] and [C,D,H]
Out[6]:
                    ||  A                |
E     |B     |F     ||0        |1        |
------|------|------||---------|---------|
0     |0     |0     || 0.0974  | 0.7268  |
0     |0     |1     || 0.1018  | 0.0740  |
0     |1     |0     || 0.7291  | 0.1911  |
0     |1     |1     || 0.0514  | 0.0283  |
1     |0     |0     || 0.2180  | 0.5029  |
1     |0     |1     || 0.2279  | 0.0512  |
1     |1     |0     || 0.8594  | 0.0696  |
1     |1     |1     || 0.0606  | 0.0103  |