Learning from non-iid data: Fast rates for the one-vs-all multiclass plug-in classifiers (Conference Paper)

Dinh, V; Ho, LST; Cuong, NV et al. (2015). Learning from non-iid data: Fast rates for the one-vs-all multiclass plug-in classifiers. Theory and Applications of Models of Computation (TAMC 2015), Lecture Notes in Computer Science, 9076, 375-387. DOI: 10.1007/978-3-319-17142-5_32

cited authors

  • Dinh, V; Ho, LST; Cuong, NV; Nguyen, D; Nguyen, BT

abstract

  • We prove new fast learning rates for the one-vs-all multiclass plug-in classifiers trained either from exponentially strongly mixing data or from data generated by a converging drifting distribution. These are two typical scenarios where training data are not iid. The learning rates are obtained under a multiclass version of Tsybakov’s margin assumption, a type of low-noise assumption, and do not depend on the number of classes. Our results are general and include a previous result for binary-class plug-in classifiers with iid data as a special case. In contrast to previous works for least-squares SVMs under the binary-class setting, our results retain the optimal learning rate in the iid case.
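A one-vs-all plug-in classifier estimates, for each class c, the conditional probability eta_c(x) = P(Y = c | X = x) with a separate regression estimator, then predicts the class whose estimate is largest. The sketch below illustrates this rule with a k-nearest-neighbour estimator on synthetic data; the choice of k-NN, the toy data, and all function names are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def knn_eta(train_X, train_y_binary, x, k=5):
    """Estimate P(Y = c | X = x) by averaging the binary class-c
    indicator labels of the k nearest training points (illustrative choice)."""
    dists = np.linalg.norm(train_X - x, axis=1)
    nearest = np.argsort(dists)[:k]
    return train_y_binary[nearest].mean()

def ova_plugin_predict(train_X, train_y, x, n_classes, k=5):
    """One-vs-all plug-in rule: fit one binary probability estimate per class,
    then predict the class with the largest estimated eta_c(x)."""
    etas = [knn_eta(train_X, (train_y == c).astype(float), x, k)
            for c in range(n_classes)]
    return int(np.argmax(etas))

# Toy data: three well-separated clusters, one per class.
rng = np.random.default_rng(0)
centers = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0]])
X = np.vstack([c + rng.normal(scale=0.5, size=(50, 2)) for c in centers])
y = np.repeat(np.arange(3), 50)

print(ova_plugin_predict(X, y, np.array([5.0, 0.2]), n_classes=3))
```

Under the paper's margin assumption, the noise level near the decision boundary (where the largest and second-largest eta_c(x) are close) is controlled, which is what permits the fast rates.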

publication date

  • January 1, 2015

published in

  • Theory and Applications of Models of Computation (TAMC 2015), Lecture Notes in Computer Science

Digital Object Identifier (DOI)

  • 10.1007/978-3-319-17142-5_32

start page

  • 375

end page

  • 387

volume

  • 9076