Efficient Prototype Selection via Multi-Armed Bandits

In this work, we propose a multi-armed bandit based framework for identifying
a compact set of informative data instances (i.e., the prototypes) that best
represents a given target set. Prototypical examples of a given dataset offer
interpretable insights into the underlying data distribution and assist in
example-based reasoning, thereby informing a wide range of human decision-making
tasks. A key challenge arises in the large-scale setting, where similarity
comparisons must be performed for nearly all pairs of data points. We propose to
overcome this limitation by employing stochastic greedy search over the space of
prototypical examples and a multi-armed bandit approach to reduce the number of
similarity comparisons. We analyze the total number of similarity comparisons
needed by our approach and provide an upper bound that is independent of the
size of the target set.
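The sketch below is not the paper's exact algorithm; it is a simplified Python
illustration of the two ideas named above: stochastic greedy selection of
prototypes, with marginal gains estimated on a random subsample of the target
set (a bandit-style shortcut that avoids comparing every candidate against
every target point). The RBF kernel, the sample sizes, and the coverage-style
objective are assumptions made for this sketch.

# Illustrative sketch only, under the assumptions stated above.
import numpy as np

def rbf_similarity(x, Y, gamma=1.0):
    """Similarity of one candidate point x to each row of Y (assumed RBF kernel)."""
    return np.exp(-gamma * np.sum((Y - x) ** 2, axis=1))

def stochastic_greedy_prototypes(candidates, target, k, subsample=64, seed=0):
    """Pick k prototypes from `candidates` that cover `target` well.

    Coverage of a target point is its best similarity to any chosen prototype;
    the objective is average coverage over the target set. Instead of scoring
    every candidate on the full target set, each greedy round scores a random
    subset of candidates on a random subsample of target points, which is
    where the savings in similarity comparisons come from.
    """
    rng = np.random.default_rng(seed)
    n = len(candidates)
    chosen = []
    best_cover = np.zeros(len(target))  # current coverage of each target point

    for _ in range(k):
        # Stochastic greedy: only a random fraction of candidates is scored.
        cand_idx = rng.choice(n, size=min(n, 4 * subsample), replace=False)
        # Subsampled (bandit-style) estimate of each candidate's marginal gain.
        tgt_idx = rng.choice(len(target), size=min(len(target), subsample),
                             replace=False)
        best_gain, best_c = -np.inf, None
        for c in cand_idx:
            sim = rbf_similarity(candidates[c], target[tgt_idx])
            gain = np.mean(np.maximum(sim - best_cover[tgt_idx], 0.0))
            if gain > best_gain:
                best_gain, best_c = gain, c
        chosen.append(best_c)
        # Update coverage on the full target set for the selected prototype.
        best_cover = np.maximum(best_cover,
                                rbf_similarity(candidates[best_c], target))
    return candidates[chosen], chosen

# Example usage on synthetic data.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    target = rng.normal(size=(2000, 5))
    candidates = rng.normal(size=(5000, 5))
    protos, idx = stochastic_greedy_prototypes(candidates, target, k=10)
    print("selected prototype indices:", idx)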