Strategy Overview
This is an overview of all implemented active learning strategies, which are commonly divided into three main categories based on the utilities they compute for sample selection:

- Informativeness-based strategies mostly select samples for which the model is most uncertain (e.g., via information-theoretic measures).
- Representativeness-based strategies select samples that capture the overall data distribution (e.g., via clustering or density estimation).
- Hybrid strategies combine both criteria to select samples that are both informative and representative.

Furthermore, we distinguish between regression and classification as supervised learning tasks, where labels can be provided by a single annotator or by multiple annotators. You can use the checkboxes below to filter the query strategies based on these distinctions.
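To make the three categories concrete, the following minimal sketch is illustrative only: it uses plain scikit-learn rather than any strategy listed below, the nearest-labeled-distance is only a coverage-style proxy for representativeness, and the product combination is a hypothetical hybrid rule.

```python
# Minimal sketch of the three utility types on a pool of unlabeled candidates.
# Assumption: the product combination below is a hypothetical hybrid rule,
# not one of the strategies listed in the tables.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import pairwise_distances

X, y = make_classification(n_samples=200, n_features=5, random_state=0)

# Start with a small labeled set containing both classes; the rest is the pool.
labeled = np.concatenate([np.flatnonzero(y == c)[:5] for c in np.unique(y)])
unlabeled = np.setdiff1d(np.arange(len(X)), labeled)

clf = LogisticRegression().fit(X[labeled], y[labeled])

# Informativeness: entropy of the predicted class distribution per candidate.
proba = clf.predict_proba(X[unlabeled])
informativeness = -np.sum(proba * np.log(proba + 1e-12), axis=1)

# Representativeness (coverage flavor): distance to the nearest labeled sample,
# i.e., how poorly a candidate's region is covered by the labeled data so far.
representativeness = pairwise_distances(X[unlabeled], X[labeled]).min(axis=1)

# Hybrid: combine both criteria (here simply multiplied) and query the maximizer.
utility = informativeness * representativeness
query_idx = unlabeled[np.argmax(utility)]
print(f"Next sample to annotate: index {query_idx}")
```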
Pool

Baseline

| Method | Base Class | Tags | Reference |
|---|---|---|---|
| | | pool regression classification single-annotator | |
Hybrid

| Method | Base Class | Tags | Reference |
|---|---|---|---|
| | | pool classification single-annotator | Ash et al. [1] |
| | | pool classification single-annotator | Prabhu et al. [2] |
| | | pool classification single-annotator | Margatina et al. [3] |
| | | pool classification single-annotator | Gupte et al. [4] |
| | | pool classification single-annotator | Gilhuber et al. [5] |
| | | pool classification single-annotator | Reitmaier and Sick [6] |
| Batch Density-Diversity-Distribution-Distance Sampling (Batch4DS) | | pool classification single-annotator | Reitmaier and Sick [6] |
| | | pool classification single-annotator | Kottke et al. [7] |
| | | pool classification single-annotator | |
| Regression Tree Based Active Learning (RT-AL) with Random Selection | | pool regression single-annotator | Jose et al. [10] |
| Regression Tree Based Active Learning (RT-AL) with Diversity Selection | | pool regression single-annotator | Jose et al. [10] |
| Regression Tree Based Active Learning (RT-AL) with Representativity Selection | | pool regression single-annotator | Jose et al. [10] |
| | | pool classification single-annotator | Tang et al. [11] |
| | | pool classification single-annotator | Donmez et al. [12] |
Informativeness

| Method | Base Class | Tags | Reference |
|---|---|---|---|
| | | pool classification single-annotator | |
| | | pool classification single-annotator | Huang and Lin [15] |
| | | pool classification single-annotator | Nguyen et al. [16] |
| | | pool regression single-annotator | Cai et al. [17] |
| | | pool regression single-annotator | Käding et al. [18] |
| | | pool regression single-annotator | Cohn et al. [19] |
| | | pool classification single-annotator | Houlsby et al. [13] |
| | | pool regression single-annotator | Elreedy et al. [20] |
| | | pool classification single-annotator | Roy and McCallum [21] |
| | | pool classification single-annotator | Roy and McCallum [21] |
| | | pool classification single-annotator | |
| | | pool classification single-annotator | |
| | | pool classification single-annotator | |
| | | pool regression single-annotator | |
| | | pool classification single-annotator | Settles [27] |
| | | pool classification single-annotator | Settles [27] |
| | | pool classification single-annotator | Settles [27] |
| | | pool classification single-annotator | Wang et al. [28] |
| | | pool classification single-annotator | Joshi et al. [29] |
| | | pool classification single-annotator | Margineantu [30] |
| | | pool classification single-annotator | Kapoor et al. [31] |
Representativeness

| Method | Base Class | Tags | Reference |
|---|---|---|---|
| | | pool regression classification single-annotator | Sener and Savarese [32] |
| | | pool classification regression single-annotator | Gissin and Shalev-Shwartz [33] |
| | | pool regression single-annotator | Wu et al. [34] |
| | | pool regression single-annotator | Wu et al. [34] |
| | | pool regression classification single-annotator | Wu et al. [34] |
| | | pool classification single-annotator | Yehuda et al. [35] |
| | | pool regression classification single-annotator | Hacohen et al. [36] |
Wrapper

| Method | Base Class | Tags | Reference |
|---|---|---|---|
| | | pool regression classification single-annotator | |
| | | pool regression classification single-annotator | |
Stream

Baseline

| Method | Base Class | Tags | Reference |
|---|---|---|---|
| | | stream classification single-annotator | |
| | | stream classification single-annotator | |
Hybrid

| Method | Base Class | Tags | Reference |
|---|---|---|---|
| | | stream classification single-annotator | Liu et al. [37] |
| | | stream classification single-annotator | Liu et al. [37] |
| Cognitive Dual-Query Strategy with Randomized-Variable-Uncertainty | | stream classification single-annotator | Liu et al. [37] |
| | | stream classification single-annotator | Liu et al. [37] |
| | | stream classification single-annotator | Žliobaitė et al. [38] |
| | | stream classification single-annotator | Ienco et al. [39] |
| | | stream classification single-annotator | Kottke et al. [40] |
Informativeness

| Method | Base Class | Tags | Reference |
|---|---|---|---|
| | | stream classification single-annotator | Žliobaitė et al. [38] |
| | | stream classification single-annotator | Žliobaitė et al. [38] |
| | | stream classification single-annotator | Žliobaitė et al. [38] |
References
- 1
Jordan T. Ash, Chicheng Zhang, Akshay Krishnamurthy, John Langford, and Alekh Agarwal. Deep Batch Active Learning by Diverse, Uncertain Gradient Lower Bounds. In Int. Conf. Learn. Represent. 2020.
- 2
Viraj Prabhu, Arjun Chandrasekaran, Kate Saenko, and Judy Hoffman. Active domain adaptation via clustering uncertainty-weighted embeddings. In IEEE/CVF Int. Conf. Comput. Vis., 8505–8514. 2021.
- 3
Katerina Margatina, Giorgos Vernikos, Loïc Barrault, and Nikolaos Aletras. Active Learning by Acquiring Contrastive Examples. In Conf. Empir. Methods Nat. Lang. Process., 650–663. 2021.
- 4
Sanket Rajan Gupte, Josiah Aklilu, Jeffrey J Nirschl, and Serena Yeung-Levy. Revisiting Active Learning in the Era of Vision Foundation Models. Trans. Mach. Learn. Res., 2024.
- 5
Sandra Gilhuber, Anna Beer, Yunpu Ma, and Thomas Seidl. FALCUN: A Simple and Efficient Deep Active Learning Strategy. In Jt. Eur. Conf. Mach. Learn. Knowl. Discov. Databases, 421–439. 2024.
- 6
Tobias Reitmaier and Bernhard Sick. Let us know your decision: Pool-based active training of a generative classifier with the selection strategy 4DS. Inf. Sci., 230:106–131, 2013.
- 7
Daniel Kottke, Georg Krempl, Dominik Lang, Johannes Teschner, and Myra Spiliopoulou. Multi-class Probabilistic Active Learning. In Eur. Conf. Artif. Intell., 586–594. 2016.
- 8
Sheng-Jun Huang, Rong Jin, and Zhi-Hua Zhou. Active Learning by Querying Informative and Representative Examples. In Adv. Neural Inf. Process. Syst. 2010.
- 9
Sheng-Jun Huang, Rong Jin, and Zhi-Hua Zhou. Active Learning by Querying Informative and Representative Examples. IEEE Trans. Pattern Anal. Mach. Intell., 36(10):1936–1949, 2014.
- 10
Ashna Jose, João Paulo Almeida de Mendonça, Emilie Devijver, Noël Jakse, Valérie Monbet, and Roberta Poloni. Regression Tree-based Active Learning. Data Min. Knowl. Discov., pages 420–460, 2023.
- 11
Min Tang, Xiaoqiang Luo, and Salim Roukos. Active Learning for Statistical Natural Language Parsing. In Annu. Meet. Assoc. Comput. Linguist., 120–127. 2002.
- 12
Pinar Donmez, Jaime G Carbonell, and Paul N Bennett. Dual Strategy Active Learning. In Eur. Conf. Mach. Learn., 116–127. 2007.
- 13
Neil Houlsby, Ferenc Huszár, Zoubin Ghahramani, and Máté Lengyel. Bayesian Active Learning for Classification and Preference Learning. arXiv:1112.5745, 2011.
- 14
Andreas Kirsch, Joost Van Amersfoort, and Yarin Gal. BatchBALD: Efficient and Diverse Batch Acquisition for Deep Bayesian Active Learning. In Adv. Neural Inf. Process. Syst. 2019.
- 15
Kuan-hao Huang and Hsuan-tien Lin. A Novel Uncertainty Sampling Algorithm for Cost-Sensitive Multiclass Active Learning. In IEEE Int. Conf. Data Min., 925–930. 2016.
- 16
Vu-Linh Nguyen, Sébastien Destercke, and Eyke Hüllermeier. Epistemic Uncertainty Sampling. In Int. Conf. Discov. Sci., 72–86. 2019.
- 17
Wenbin Cai, Ya Zhang, and Jun Zhou. Maximizing Expected Model Change for Active Learning in Regression. In IEEE Int. Conf. Data Min., 51–60. 2013.
- 18
Christoph Käding, Erik Rodner, Alexander Freytag, Oliver Mothes, Björn Barz, and Joachim Denzler. Active Learning for Regression Tasks with Expected Model Output Changes. In Br. Mach. Vis. Conf. 2018.
- 19
David A Cohn, Zoubin Ghahramani, and Michael I Jordan. Active Learning with Statistical Models. J. Artif. Intell. Res., 4:129–145, 1996.
- 20
Dina Elreedy, Amir F. Atiya, and Samir I. Shaheen. A Novel Active Learning Regression Framework for Balancing the Exploration-Exploitation Trade-Off. Entropy, 21(7):651, 2019.
- 21
Nicholas Roy and Andrew McCallum. Toward Optimal Active Learning through Monte Carlo Estimation of Error Reduction. In Int. Conf. Mach. Learn., 441–448. 2001.
- 22
H Sebastian Seung, Manfred Opper, and Haim Sompolinsky. Query by Committee. In Annu. Workshop Comput. Learn. Theory, 287–294. 1992.
- 23
Andrew Kachites McCallum and Kamal Nigam. Employing EM and Pool-Based Active Learning for Text Classification. In Int. Conf. Mach. Learn., 359–367. 1998.
- 24
Sean P Engelson and Ido Dagan. Minimizing Manual Annotation Cost in Supervised Training from Corpora. In Annu. Meet. Assoc. Comput. Linguist., 319–326. 1996.
- 25
William H Beluch, Tim Genewein, Andreas Nürnberger, and Jan M Köhler. The Power of Ensembles for Active Learning in Image Classification. In IEEE Conf. Comput. Vis. Pattern Recognit., 9368–9377. 2018.
- 26
Robert Burbidge, Jem J Rowland, and Ross D King. Active Learning for Regression Based on Query by Committee. In Intell. Data Eng. Autom. Learn., 209–218. 2007.
- 27
Burr Settles. Active learning literature survey. Technical Report 1648, University of Wisconsin, Department of Computer Science, 2009.
- 28
Hanmo Wang, Xiaojun Chang, Lei Shi, Yi Yang, and Yi-Dong Shen. Uncertainty Sampling for Action Recognition via Maximizing Expected Average Precision. In Int. Jt. Conf. Artif. Intell., 964–970. 2018.
- 29
Ajay J Joshi, Fatih Porikli, and Nikolaos Papanikolopoulos. Multi-class Active Learning for Image Classification. In IEEE/CVF Conf. Comput. Vis. Pattern Recognit., 2372–2379. 2009.
- 30
Dragos D Margineantu. Active Cost-Sensitive Learning. In Int. Jt. Conf. Artif. Intell., 1622–1623. 2005.
- 31
Ashish Kapoor, Eric Horvitz, and Sumit Basu. Selective Supervision: Guiding Supervised Learning with Decision-Theoretic Active Learning. In Int. Jt. Conf. Artif. Intell., 877–882. 2007.
- 32
Ozan Sener and Silvio Savarese. Active Learning for Convolutional Neural Networks: A Core-Set Approach. In Int. Conf. Learn. Represent. 2018.
- 33
Daniel Gissin and Shai Shalev-Shwartz. Discriminative Active Learning. arXiv:1907.06347, 2019.
- 34
Dongrui Wu, Chin-Teng Lin, and Jian Huang. Active Learning for Regression using Greedy Sampling. Inf. Sci., 474:90–105, 2019.
- 35
Ofer Yehuda, Avihu Dekel, Guy Hacohen, and Daphna Weinshall. Active Learning Through a Covering Lens. In Adv. Neural Inf. Process. Syst. 2022.
- 36
Guy Hacohen, Avihu Dekel, and Daphna Weinshall. Active Learning on a Budget: Opposite Strategies Suit High and Low Budgets. In Int. Conf. Mach. Learn., 8175–8195. 2022.
- 37
Sanmin Liu, Shan Xue, Jia Wu, Chuan Zhou, Jian Yang, Zhao Li, and Jie Cao. Online Active Learning for Drifting Data Streams. IEEE Trans. Neural Netw. Learn. Syst., 34(1):186–200, 2023.
- 38
Indrė Žliobaitė, Albert Bifet, Bernhard Pfahringer, and Geoffrey Holmes. Active Learning With Drifting Streaming Data. IEEE Trans. Neural Netw. Learn. Syst., 25(1):27–39, 2014.
- 39
Dino Ienco, Indrė Žliobaitė, and Bernhard Pfahringer. High density-focused uncertainty sampling for active learning over evolving stream data. In Int. Workshop Big Data Streams Heterog. Source Min. Algorithms Syst. Program. Models Appl., 133–148. 2014.
- 40
Daniel Kottke, Georg Krempl, and Myra Spiliopoulou. Probabilistic Active Learning in Datastreams. In Adv. Intell. Data Anal., 145–157. 2015.