Strategy Overview#

This is an overview of all implemented active learning strategies, which are often divided into three main categories based on the utilities they compute for sample selection:

  1. Informativeness-based strategies primarily select samples for which the model is most uncertain (e.g., via information-theoretic measures).

  2. Representativeness-based strategies select samples that capture the overall data distribution (e.g., via clustering or density estimation).

  3. Hybrid strategies combine both criteria to select samples that are informative and representative.

Furthermore, we distinguish between regression and classification as supervised learning tasks, where labels can be provided by a single annotator or by multiple annotators. The tags attached to each query strategy below reflect these distinctions.
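To make the distinction between these categories concrete, the following minimal sketch shows a typical pool-based active learning cycle with an informativeness-based strategy. It assumes the scikit-activeml package layout (modules skactiveml.pool, skactiveml.classifier, and skactiveml.utils), which the class names listed below appear to stem from; module paths and parameter names may differ in your installed version.

```python
import numpy as np
from sklearn.datasets import make_classification

# Assumed imports following the scikit-activeml package layout.
from skactiveml.classifier import ParzenWindowClassifier
from skactiveml.pool import UncertaintySampling
from skactiveml.utils import MISSING_LABEL

# Toy classification data; all labels start out as missing (unlabeled pool).
X, y_true = make_classification(n_samples=200, n_features=5, random_state=0)
y = np.full(shape=y_true.shape, fill_value=MISSING_LABEL)

# Informativeness-based strategy: query the sample with maximal entropy.
clf = ParzenWindowClassifier(classes=np.unique(y_true), random_state=0)
qs = UncertaintySampling(method="entropy", random_state=0)

for _ in range(20):  # label 20 samples, one per active learning cycle
    query_idx = qs.query(X=X, y=y, clf=clf, batch_size=1)
    y[query_idx] = y_true[query_idx]  # simulate the annotator

clf.fit(X, y)
```

Representativeness-based and hybrid strategies are used in the same loop; they mainly differ in how their utilities are computed and in which additional arguments their query method expects.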

Pool#

Baseline#

| Method | Base Class | Tags | Reference |
|---|---|---|---|
| Random Sampling | RandomSampling | pool regression classification single-annotator | |

Hybrid#

| Method | Base Class | Tags | Reference |
|---|---|---|---|
| Batch Active Learning by Diverse Gradient Embedding (BADGE) | Badge | pool classification single-annotator | Ash et al. [1] |
| Clustering Uncertainty-weighted Embeddings (CLUE) | Clue | pool classification single-annotator | Prabhu et al. [2] |
| Contrastive Active Learning (CAL) | ContrastiveAL | pool classification single-annotator | Margatina et al. [3] |
| Dropout Query (DropQuery) | DropQuery | pool classification single-annotator | Gupte et al. [4] |
| Fast Active Learning by Contrastive UNcertainty (FALCUN) | Falcun | pool classification single-annotator | Gilhuber et al. [5] |
| Density-Diversity-Distribution-Distance Sampling (4DS) | FourDs | pool classification single-annotator | Reitmaier and Sick [6] |
| Batch Density-Diversity-Distribution-Distance Sampling (Batch4DS) | FourDs | pool classification single-annotator | Reitmaier and Sick [6] |
| Multi-class Probabilistic Active Learning (McPAL) | ProbabilisticAL | pool classification single-annotator | Kottke et al. [7] |
| Querying Informative and Representative Examples (QUIRE) | Quire | pool classification single-annotator | Huang et al. [8], Huang et al. [9] |
| Regression Tree Based Active Learning (RT-AL) with Random Selection | RegressionTreeBasedAL | pool regression single-annotator | Jose et al. [10] |
| Regression Tree Based Active Learning (RT-AL) with Diversity Selection | RegressionTreeBasedAL | pool regression single-annotator | Jose et al. [10] |
| Regression Tree Based Active Learning (RT-AL) with Representativity Selection | RegressionTreeBasedAL | pool regression single-annotator | Jose et al. [10] |
| Density-weighted Uncertainty Sampling | UncertaintySampling | pool classification single-annotator | Tang et al. [11] |
| Dual Strategy for Active Learning | UncertaintySampling | pool classification single-annotator | Donmez et al. [12] |

Informativeness#

| Method | Base Class | Tags | Reference |
|---|---|---|---|
| Batch Bayesian Active Learning by Disagreement (BatchBALD) | BatchBALD | pool classification single-annotator | Houlsby et al. [13], Kirsch et al. [14] |
| Active Learning with Cost Embedding (ALCE) | CostEmbeddingAL | pool classification single-annotator | Huang and Lin [15] |
| Epistemic Uncertainty Sampling (EpisUS) | EpistemicUncertaintySampling | pool classification single-annotator | Nguyen et al. [16] |
| Expected Model Change | ExpectedModelChangeMaximization | pool regression single-annotator | Cai et al. [17] |
| Expected Model Output Change | ExpectedModelOutputChange | pool regression single-annotator | Käding et al. [18] |
| Expected Model Variance Reduction | ExpectedModelVarianceReduction | pool regression single-annotator | Cohn et al. [19] |
| Bayesian Active Learning by Disagreement (BALD) | GreedyBALD | pool classification single-annotator | Houlsby et al. [13] |
| Regression-based Kullback-Leibler Divergence Maximization | KLDivergenceMaximization | pool regression single-annotator | Elreedy et al. [20] |
| Monte-Carlo EER with Log-Loss | MonteCarloEER | pool classification single-annotator | Roy and McCallum [21] |
| Monte-Carlo EER with Misclassification-Loss | MonteCarloEER | pool classification single-annotator | Roy and McCallum [21] |
| Query-by-Committee (QBC) with Kullback-Leibler Divergence | QueryByCommittee | pool classification single-annotator | Seung et al. [22], McCallum and Nigamy [23] |
| Query-by-Committee (QBC) with Vote Entropy | QueryByCommittee | pool classification single-annotator | Seung et al. [22], Engelson and Dagan [24] |
| Query-by-Committee (QBC) with Variation Ratios | QueryByCommittee | pool classification single-annotator | Seung et al. [22], Beluch et al. [25] |
| Query-by-Committee (QBC) with Empirical Variance | QueryByCommittee | pool regression single-annotator | Seung et al. [22], Burbidge et al. [26] |
| Uncertainty Sampling with Margin | UncertaintySampling | pool classification single-annotator | Settles [27] |
| Uncertainty Sampling with Least-Confidence | UncertaintySampling | pool classification single-annotator | Settles [27] |
| Uncertainty Sampling with Entropy | UncertaintySampling | pool classification single-annotator | Settles [27] |
| Expected Average Precision | UncertaintySampling | pool classification single-annotator | Wang et al. [28] |
| Value of Information on Unlabeled Samples | ValueOfInformationEER | pool classification single-annotator | Joshi et al. [29] |
| Value of Information on Labeled Samples | ValueOfInformationEER | pool classification single-annotator | Margineantu [30] |
| Value of Information (VOI) | ValueOfInformationEER | pool classification single-annotator | Kapoor et al. [31] |

Representativeness#

| Method | Base Class | Tags | Reference |
|---|---|---|---|
| Core Set | CoreSet | pool regression classification single-annotator | Sener and Savarese [32] |
| Discriminative Active Learning (DAL) | DiscriminativeAL | pool classification regression single-annotator | Gissin and Shalev-Shwartz [33] |
| Greedy Sampling on the Target Space (GSy) | GreedySamplingTarget | pool regression single-annotator | Wu et al. [34] |
| Improved Greedy Sampling (GSi) | GreedySamplingTarget | pool regression single-annotator | Wu et al. [34] |
| Greedy Sampling on the Feature Space (GSx) | GreedySamplingX | pool regression classification single-annotator | Wu et al. [34] |
| Probability Coverage (ProbCover) | ProbCover | pool classification single-annotator | Yehuda et al. [35] |
| Typical Clustering (TypiClust) | TypiClust | pool regression classification single-annotator | Hacohen et al. [36] |

Wrapper#

| Method | Base Class | Tags | Reference |
|---|---|---|---|
| Parallel Utility Estimation Wrapper | ParallelUtilityEstimationWrapper | pool regression classification single-annotator | |
| Sub-sampling Wrapper | SubSamplingWrapper | pool regression classification single-annotator | |
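These wrappers do not select samples themselves; they wrap another pool-based strategy, either estimating utilities in parallel or restricting the candidate set to a random subsample of a large pool. The following is a rough, hypothetical sketch of the latter case: the import path and the constructor arguments (in particular max_candidates) are assumptions and should be checked against the API reference of the installed version.

```python
from skactiveml.pool import UncertaintySampling
# Assumed import path and constructor; verify against the installed version.
from skactiveml.pool.utils import SubSamplingWrapper

base_qs = UncertaintySampling(method="margin_sampling", random_state=0)

# Hypothetical parameters: evaluate utilities only on a random subset of
# at most 1000 candidates per query to speed up selection on large pools.
qs = SubSamplingWrapper(
    query_strategy=base_qs,
    max_candidates=1000,
    random_state=0,
)

# The wrapper is then used exactly like the wrapped strategy, e.g.:
# query_idx = qs.query(X=X, y=y, clf=clf, batch_size=1)
```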

Stream#

Baseline#

| Method | Base Class | Tags | Reference |
|---|---|---|---|
| Periodic Sampling | PeriodicSampling | stream classification single-annotator | |
| Stream Random Sampling | StreamRandomSampling | stream classification single-annotator | |

Hybrid#

| Method | Base Class | Tags | Reference |
|---|---|---|---|
| Cognitive Dual-Query Strategy with Fixed-Uncertainty | CognitiveDualQueryStrategyFixUn | stream classification single-annotator | Liu et al. [37] |
| Cognitive Dual-Query Strategy with Random Sampling | CognitiveDualQueryStrategyRan | stream classification single-annotator | Liu et al. [37] |
| Cognitive Dual-Query Strategy with Randomized-Variable-Uncertainty | CognitiveDualQueryStrategyRanVarUn | stream classification single-annotator | Liu et al. [37] |
| Cognitive Dual-Query Strategy with Variable-Uncertainty | CognitiveDualQueryStrategyVarUn | stream classification single-annotator | Liu et al. [37] |
| Split | Split | stream classification single-annotator | Žliobaitė et al. [38] |
| Density Based Active Learning for Data Streams | StreamDensityBasedAL | stream classification single-annotator | Ienco et al. [39] |
| Probabilistic Active Learning in Datastreams | StreamProbabilisticAL | stream classification single-annotator | Kottke et al. [40] |

Informativeness#

| Method | Base Class | Tags | Reference |
|---|---|---|---|
| Fixed-Uncertainty | FixedUncertainty | stream classification single-annotator | Žliobaitė et al. [38] |
| Randomized-Variable-Uncertainty | RandomVariableUncertainty | stream classification single-annotator | Žliobaitė et al. [38] |
| Variable-Uncertainty | VariableUncertainty | stream classification single-annotator | Žliobaitė et al. [38] |

References#

[1] Jordan T. Ash, Chicheng Zhang, Akshay Krishnamurthy, John Langford, and Alekh Agarwal. Deep Batch Active Learning by Diverse, Uncertain Gradient Lower Bounds. In Int. Conf. Learn. Represent. 2020.

[2] Viraj Prabhu, Arjun Chandrasekaran, Kate Saenko, and Judy Hoffman. Active domain adaptation via clustering uncertainty-weighted embeddings. In IEEE/CVF Int. Conf. Comput. Vis., 8505–8514. 2021.

[3] Katerina Margatina, Giorgos Vernikos, Loïc Barrault, and Nikolaos Aletras. Active Learning by Acquiring Contrastive Examples. In Conf. Empir. Methods Nat. Lang. Process., 650–663. 2021.

[4] Sanket Rajan Gupte, Josiah Aklilu, Jeffrey J Nirschl, and Serena Yeung-Levy. Revisiting Active Learning in the Era of Vision Foundation Models. Trans. Mach. Learn. Res., 2024.

[5] Sandra Gilhuber, Anna Beer, Yunpu Ma, and Thomas Seidl. FALCUN: A Simple and Efficient Deep Active Learning Strategy. In Jt. Eur. Conf. Mach. Learn. Knowl. Discov. Databases, 421–439. 2024.

[6] Tobias Reitmaier and Bernhard Sick. Let us know your decision: Pool-based active training of a generative classifier with the selection strategy 4DS. Inf. Sci., 230:106–131, 2013.

[7] Daniel Kottke, Georg Krempl, Dominik Lang, Johannes Teschner, and Myra Spiliopoulou. Multi-class Probabilistic Active Learning. In Eur. Conf. Artif. Intell., 586–594. 2016.

[8] Sheng-Jun Huang, Rong Jin, and Zhi-Hua Zhou. Active Learning by Querying Informative and Representative Examples. In Adv. Neural Inf. Process. Syst. 2010.

[9] Sheng-Jun Huang, Rong Jin, and Zhi-Hua Zhou. Active Learning by Querying Informative and Representative Examples. IEEE Trans. Pattern Anal. Mach. Intell., 36(10):1936–1949, 2014.

[10] Ashna Jose, João Paulo Almeida de Mendonça, Emilie Devijver, Noël Jakse, Valérie Monbet, and Roberta Poloni. Regression Tree-based Active Learning. Data Min. Knowl. Discov., pages 420–460, 2023.

[11] Min Tang, Xiaoqiang Luo, and Salim Roukos. Active Learning for Statistical Natural Language Parsing. In Annu. Meet. Assoc. Comput. Linguist., 120–127. 2002.

[12] Pinar Donmez, Jaime G Carbonell, and Paul N Bennett. Dual Strategy Active Learning. In Eur. Conf. Mach. Learn., 116–127. 2007.

[13] Neil Houlsby, Ferenc Huszár, Zoubin Ghahramani, and Máté Lengyel. Bayesian Active Learning for Classification and Preference Learning. arXiv:1112.5745, 2011.

[14] Andreas Kirsch, Joost Van Amersfoort, and Yarin Gal. BatchBALD: Efficient and Diverse Batch Acquisition for Deep Bayesian Active Learning. In Adv. Neural Inf. Process. Syst. 2019.

[15] Kuan-hao Huang and Hsuan-tien Lin. A Novel Uncertainty Sampling Algorithm for Cost-Sensitive Multiclass Active Learning. In IEEE Int. Conf. Data Min., 925–930. 2016.

[16] Vu-Linh Nguyen, Sébastien Destercke, and Eyke Hüllermeier. Epistemic Uncertainty Sampling. In Int. Conf. Discov. Sci., 72–86. 2019.

[17] Wenbin Cai, Ya Zhang, and Jun Zhou. Maximizing Expected Model Change for Active Learning in Regression. In IEEE Int. Conf. Data Min., 51–60. 2013.

[18] Christoph Käding, Erik Rodner, Alexander Freytag, Oliver Mothes, Björn Barz, and Joachim Denzler. Active Learning for Regression Tasks with Expected Model Output Changes. In Br. Mach. Vis. Conf. 2018.

[19] David A Cohn, Zoubin Ghahramani, and Michael I Jordan. Active Learning with Statistical Models. J. Artif. Intell. Res., 4:129–145, 1996.

[20] Dina Elreedy, Amir F. Atiya, and Samir I. Shaheen. A Novel Active Learning Regression Framework for Balancing the Exploration-Exploitation Trade-Off. Entropy, 21(7):651, 2019.

[21] Nicholas Roy and Andrew McCallum. Toward Optimal Active Learning through Monte Carlo Estimation of Error Reduction. In Int. Conf. Mach. Learn., 441–448. 2001.

[22] H Sebastian Seung, Manfred Opper, and Haim Sompolinsky. Query by Committee. In Annu. Workshop Comput. Learn. Theory, 287–294. 1992.

[23] Andrew Kachites McCallum and Kamal Nigamy. Employing EM and Pool-Based Active Learning for Text Classification. In Int. Conf. Mach. Learn., 359–367. 1998.

[24] Sean P Engelson and Ido Dagan. Minimizing Manual Annotation Cost in Supervised Training from Corpora. In Annu. Meet. Assoc. Comput. Linguist., 319–326. 1996.

[25] William H Beluch, Tim Genewein, Andreas Nürnberger, and Jan M Köhler. The Power of Ensembles for Active Learning in Image Classification. In IEEE Conf. Comput. Vis. Pattern Recognit., 9368–9377. 2018.

[26] Robert Burbidge, Jem J Rowland, and Ross D King. Active Learning for Regression Based on Query by Committee. In Intell. Data Eng. Autom. Learn., 209–218. 2007.

[27] Burr Settles. Active Learning Literature Survey. Technical Report 1648, University of Wisconsin, Department of Computer Science, 2009.

[28] Hanmo Wang, Xiaojun Chang, Lei Shi, Yi Yang, and Yi-Dong Shen. Uncertainty Sampling for Action Recognition via Maximizing Expected Average Precision. In Int. Jt. Conf. Artif. Intell., 964–970. 2018.

[29] Ajay J Joshi, Fatih Porikli, and Nikolaos Papanikolopoulos. Multi-class Active Learning for Image Classification. In IEEE/CVF Conf. Comput. Vis. Pattern Recognit., 2372–2379. 2009.

[30] Dragos D Margineantu. Active Cost-Sensitive Learning. In Int. Jt. Conf. Artif. Intell., 1622–1623. 2005.

[31] Ashish Kapoor, Eric Horvitz, and Sumit Basu. Selective Supervision: Guiding Supervised Learning with Decision-Theoretic Active Learning. In Int. Jt. Conf. Artif. Intell., 877–882. 2007.

[32] Ozan Sener and Silvio Savarese. Active Learning for Convolutional Neural Networks: A Core-Set Approach. In Int. Conf. Learn. Represent. 2018.

[33] Daniel Gissin and Shai Shalev-Shwartz. Discriminative Active Learning. arXiv:1907.06347, 2019.

[34] Dongrui Wu, Chin-Teng Lin, and Jian Huang. Active Learning for Regression using Greedy Sampling. Inf. Sci., 474:90–105, 2019.

[35] Ofer Yehuda, Avihu Dekel, Guy Hacohen, and Daphna Weinshall. Active Learning Through a Covering Lens. In Adv. Neural Inf. Process. Syst. 2022.

[36] Guy Hacohen, Avihu Dekel, and Daphna Weinshall. Active Learning on a Budget: Opposite Strategies Suit High and Low Budgets. In Int. Conf. Mach. Learn., 8175–8195. 2022.

[37] Sanmin Liu, Shan Xue, Jia Wu, Chuan Zhou, Jian Yang, Zhao Li, and Jie Cao. Online Active Learning for Drifting Data Streams. IEEE Trans. Neural Netw. Learn. Syst., 34(1):186–200, 2023.

[38] Indrė Žliobaitė, Albert Bifet, Bernhard Pfahringer, and Geoffrey Holmes. Active Learning With Drifting Streaming Data. IEEE Trans. Neural Netw. Learn. Syst., 25(1):27–39, 2014.

[39] Dino Ienco, Indrė Žliobaitė, and Bernhard Pfahringer. High density-focused uncertainty sampling for active learning over evolving stream data. In Int. Workshop Big Data Streams Heterog. Source Min. Algorithms Syst. Program. Models Appl., 133–148. 2014.

[40] Daniel Kottke, Georg Krempl, and Myra Spiliopoulou. Probabilistic Active Learning in Datastreams. In Adv. Intell. Data Anal., 145–157. 2015.