In a standard framework of learning a geometric concept from examples, examples are classified into two types: examples contained in the concept (positive examples) and those not contained in the concept (negative examples). However, there exist cases where examples are classified into k (≥ 2) classes. For example,
clustering a concept space by the Voronoi diagram generated by k points is a very
common tool in image understanding and many other areas. We call such a space
a k-label space. The typical case consisting of positive and negative examples
corresponds to the 2-label space in this setting.
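The Voronoi-based labeling described above can be sketched as follows: each point receives the label of its nearest generating site, so k sites partition the space into k classes. This is a minimal illustration only; the function name and data are not from the paper.

```python
import math

def voronoi_label(point, sites):
    """Return the index of the nearest site, i.e. the index of the
    Voronoi cell (label) that the point falls into."""
    return min(range(len(sites)),
               key=lambda i: math.dist(point, sites[i]))

# Three sites partition the plane into a 3-label space.
sites = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
print(voronoi_label((0.1, 0.1), sites))  # 0: nearest to the first site
print(voronoi_label((0.9, 0.2), sites))  # 1: nearest to the second site
print(voronoi_label((0.2, 0.9), sites))  # 2: nearest to the third site
```

With k = 2 sites this reduces to the familiar positive/negative dichotomy, matching the 2-label case mentioned above.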
In this paper, we first extend the ε-approximation for the 2-label space originally considered by Vapnik and Chervonenkis [11] (see also [1, 5]) to that for the k-label space. Next, the sample complexity for the generalized ε-approximation is analyzed. The generalized ε-approximation is then applied to the randomized algorithm for the assignment problem by Tokuyama and Nakano [10] to obtain tighter bounds. By combining the generalized ε-approximation with the capacity of a k-label space induced by Voronoi diagrams, bounds for learning noisy data for such a k-label space may be obtained.
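For reference, the classical 2-label notion being generalized can be stated as follows (this is the standard definition from the range-space literature, not the paper's extended one): given a set system $(X, \mathcal{R})$ and a finite $A \subseteq X$, a subset $N \subseteq A$ is an $\varepsilon$-approximation for $A$ if

```latex
\left| \frac{|R \cap N|}{|N|} - \frac{|R \cap A|}{|A|} \right| \le \varepsilon
\quad \text{for every } R \in \mathcal{R}.
```

Intuitively, $N$ estimates the relative weight of every range to within an additive error of $\varepsilon$; the k-label extension must control such errors simultaneously across all k classes.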