Section 5.1 of 杨健's paper "KPCA Plus LDA: A Complete Kernel Fisher Discriminant Framework for Feature Extraction and Recognition" uses this classifier, as does "Why can LDA be performed in PCA transformed space". It is the same as the nearest-centroid classifier in Section IV of 叶杰平's paper "Generalized Linear Discriminant Analysis: A Unified Framework and Efficient Model Selection" (and what 汪增福 calls the mean-sample method). It is defined as follows (taken from http://homepages.inf.ed.ac.uk/rbf/HIPR2/classify.htm):

Suppose that each training class is represented by a prototype (or mean) vector:

m_j = (1/N_j) Σ_{x ∈ ω_j} x,   j = 1, 2, ..., W

where N_j is the number of training pattern vectors from class ω_j. In the example classification problem given above, the two prototypes are m_needle and m_bolt, as shown in Figure 2.

Figure 2 Feature space: + sewing needles, o bolts, * class mean
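The prototype computation above can be sketched in NumPy; the 2-D needle/bolt feature vectors below are illustrative assumptions, not data from the page:

```python
import numpy as np

# Toy 2-D feature vectors for the two classes in Figure 2
# (the numbers are illustrative assumptions, not data from the page)
needles = np.array([[1.0, 5.0], [1.5, 5.5], [0.8, 4.9]])  # class omega_1
bolts   = np.array([[4.0, 1.0], [4.5, 1.2], [3.8, 0.9]])  # class omega_2

# Prototype m_j = (1/N_j) * sum of the N_j training vectors in class omega_j
m_needle = needles.mean(axis=0)
m_bolt = bolts.mean(axis=0)
```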

Based on this, we can assign any given pattern x to the class of its closest prototype by determining its proximity to each m_j. If Euclidean distance is our measure of proximity, then the distance to the prototype is given by

D_j(x) = ||x − m_j||,   j = 1, 2, ..., W

It is not difficult to show (expand ||x − m_j||² = ||x||² − 2xᵀm_j + m_jᵀm_j and drop the ||x||² term, which is the same for every class) that this is equivalent to computing

d_j(x) = xᵀm_j − (1/2) m_jᵀm_j

and assigning x to class ω_j if d_j(x) yields the largest value.
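The equivalence of the two decision rules can be checked numerically; the prototypes and test pattern below are toy values chosen for illustration:

```python
import numpy as np

# Class prototypes m_j and a test pattern x (toy values, assumed)
prototypes = np.array([[1.1, 5.1],   # m_1
                       [4.1, 1.0]])  # m_2
x = np.array([3.5, 1.5])

# Rule 1: D_j(x) = ||x - m_j||, assign x to the class with the SMALLEST D_j
D = np.linalg.norm(x - prototypes, axis=1)

# Rule 2: d_j(x) = x^T m_j - (1/2) m_j^T m_j, assign to the LARGEST d_j
d = prototypes @ x - 0.5 * np.sum(prototypes**2, axis=1)

# Both rules pick the same class, since ||x||^2 is constant across j
assert np.argmin(D) == np.argmax(d)
```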
Clearly, the minimum distance classifier is computationally cheaper than the nearest neighbor classifier (NN): for any test sample, the former only computes distances to the few class means, whereas NN must compute the distance to every training sample. Section 5.2 of 杨健's KPCA Plus LDA paper says as much: "A minimum distance classifier is employed for computational efficiency."
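The cost difference can be made concrete with a small sketch: per test pattern, NN evaluates N distances while the minimum distance classifier evaluates only W. The synthetic data below (sizes, cluster spacing, random seed) is an assumption for illustration:

```python
import numpy as np

# Illustrative setup: N training samples in W well-separated classes
N, W, dim = 600, 3, 2
rng = np.random.default_rng(0)
y_train = rng.integers(0, W, size=N)
X_train = rng.normal(size=(N, dim)) + 5.0 * y_train[:, None]

x = np.array([5.2, 4.8])  # one test pattern, near class 1's cluster

# Nearest neighbor: N distance computations per test pattern
nn_label = int(y_train[np.argmin(np.linalg.norm(X_train - x, axis=1))])

# Minimum distance (nearest centroid): only W distance computations
means = np.stack([X_train[y_train == j].mean(axis=0) for j in range(W)])
mdc_label = int(np.argmin(np.linalg.norm(means - x, axis=1)))
```

With well-separated clusters, both classifiers agree on the label; the centroid rule just gets there with N/W times fewer distance evaluations.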

Other reference:
Mar 24, 2012 gmail attachment: Section 7.3 of the lecture notes gives an English description of the minimum distance classifier.