skactiveml.pool.average_kl_divergence#

skactiveml.pool.average_kl_divergence(probas, eps=1e-07)[source]#

Calculates the average Kullback-Leibler (KL) divergence for measuring the level of disagreement in QueryByCommittee.
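As a sketch of the disagreement measure from [1] (the notation below is introduced here for illustration and is not taken from the library's documentation): given k = n_estimators class distributions P_m(y | x_n) for a sample x_n, the consensus distribution is the committee mean, and the score is the mean KL divergence of each member from that consensus,

P_{\mathrm{avg}}(y \mid x_n) = \frac{1}{k} \sum_{m=1}^{k} P_m(y \mid x_n),
\qquad
\mathrm{scores}_n = \frac{1}{k} \sum_{m=1}^{k} \mathrm{KL}\bigl(P_m(y \mid x_n) \,\|\, P_{\mathrm{avg}}(y \mid x_n)\bigr).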

Parameters
probas : array-like of shape (n_estimators, n_samples, n_classes)

The probability estimates of all estimators, samples, and classes.

eps : float > 0, optional (default=1e-7)

Minimum probability threshold used when computing log-probabilities, to avoid taking the logarithm of zero.

Returns
scores : np.ndarray of shape (n_samples,)

The average Kullback-Leibler (KL) divergence for each sample; higher values indicate stronger disagreement among the committee members.

References

[1] A. K. McCallum and K. Nigam. Employing EM and Pool-Based Active Learning for Text Classification. In Int. Conf. Mach. Learn., pages 359–367, 1998.
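
Examples

A minimal usage sketch; the probability values below are illustrative only.

>>> import numpy as np
>>> from skactiveml.pool import average_kl_divergence
>>> # Class probabilities of 3 committee members for 2 samples and 2 classes,
>>> # i.e. shape (n_estimators, n_samples, n_classes).
>>> probas = np.array(
...     [
...         [[0.9, 0.1], [0.5, 0.5]],
...         [[0.8, 0.2], [0.4, 0.6]],
...         [[0.7, 0.3], [0.6, 0.4]],
...     ]
... )
>>> scores = average_kl_divergence(probas)
>>> scores.shape  # one disagreement score per sample
(2,)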