skactiveml.pool.average_kl_divergence

skactiveml.pool.average_kl_divergence(probas, eps=1e-07)

Calculates the average Kullback-Leibler (KL) divergence for measuring the level of disagreement in QueryByCommittee [1].
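For intuition, the following is a minimal NumPy sketch of this disagreement measure: each committee member's class distribution is compared against the consensus (mean) distribution of the committee, and the per-member KL divergences are averaged. It is an illustration of the computation, not necessarily the library's exact implementation; in particular, treating eps as a lower clipping bound before taking logarithms is an assumption.

import numpy as np

def average_kl_divergence_sketch(probas, eps=1e-7):
    # probas has shape (n_estimators, n_samples, n_classes).
    probas = np.clip(probas, eps, None)   # assumption: eps acts as a lower bound before taking logs
    consensus = probas.mean(axis=0)       # consensus distribution, shape (n_samples, n_classes)
    # KL(P_member || P_consensus), summed over classes -> shape (n_estimators, n_samples)
    kl = np.sum(probas * np.log(probas / consensus), axis=2)
    return kl.mean(axis=0)                # average over committee members, shape (n_samples,)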

Parameters
probas : array-like of shape (n_estimators, n_samples, n_classes)

The probability estimates of all estimators, samples, and classes.

eps : float > 0, optional (default=1e-7)

Minimum probability used when computing log-probabilities, to avoid taking the logarithm of zero.

Returns
scores : np.ndarray of shape (n_samples,)

The average Kullback-Leibler (KL) divergence per sample.
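A minimal usage sketch, assuming the import path given by the signature above; the probability values are made up for illustration:

import numpy as np
from skactiveml.pool import average_kl_divergence

# Probability estimates of 3 committee members for 4 samples and 2 classes.
probas = np.array([
    [[0.9, 0.1], [0.5, 0.5], [0.8, 0.2], [0.3, 0.7]],
    [[0.8, 0.2], [0.4, 0.6], [0.7, 0.3], [0.4, 0.6]],
    [[0.7, 0.3], [0.6, 0.4], [0.9, 0.1], [0.2, 0.8]],
])

scores = average_kl_divergence(probas)
print(scores.shape)  # (4,) -- one disagreement score per sample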

References

[1] A. McCallum and K. Nigam. Employing EM in pool-based active learning for text classification. In International Conference on Machine Learning, pages 359-367, 1998.