Learning One-Variable Pattern Languages
in Linear Average Time
Rüdiger Reischuk
Institut für Theoretische Informatik
Med. Universität zu Lübeck
23560 Lübeck, Germany
Thomas Zeugmann
Department of Informatics
Kyushu University
Fukuoka 812-81, Japan
September 1997
A new algorithm for learning one-variable pattern languages is
proposed and analyzed with respect to its average-case behavior.
We consider the total learning time, which accounts for all
operations performed until the algorithm has converged to a correct hypothesis.
It is shown that, for almost all meaningful distributions
specifying how the pattern variable is replaced by a string to generate
random samples of the target pattern language, the algorithm converges
within an expected constant number of rounds with an expected total learning time
that is linear in the pattern length.
Thus, the algorithm is average-case optimal in a strong sense.
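To fix the setting, a one-variable pattern is a string over a constant alphabet together with a single variable x; a sample of the pattern language is obtained by substituting a nonempty string for every occurrence of x. The following Python sketch (an illustration of the concept only, not the learning algorithm analyzed in this paper; the function names are hypothetical) shows sample generation and a naive membership test that exploits the fact that all occurrences of x must receive the same substitution:

```python
def generate(pattern, s):
    """Generate a sample: substitute the nonempty string s
    for every occurrence of the variable 'x' in pattern."""
    assert len(s) > 0, "pattern-language substitutions are nonempty"
    return pattern.replace('x', s)

def matches(pattern, w):
    """Naive membership test for a one-variable pattern:
    the substitution length is forced by |w|, so recover the
    candidate substitution from the first occurrence of 'x'
    and verify it regenerates w."""
    k = pattern.count('x')          # number of variable occurrences
    const = len(pattern) - k        # number of constant symbols
    if k == 0:
        return w == pattern         # no variable: language is a singleton
    rem = len(w) - const            # symbols contributed by substitutions
    if rem <= 0 or rem % k != 0:    # substitution must be nonempty and
        return False                # all occurrences have equal length
    m = rem // k
    i = pattern.index('x')          # position of the first occurrence
    s = w[i:i + m]                  # candidate substitution
    return generate(pattern, s) == w
```

For example, the pattern 0xx1 with substitution ab yields the sample 0abab1, and `matches("0xx1", "0abab1")` succeeds while `matches("0xx1", "0ab1")` fails, since no single substitution is consistent with both occurrences of x.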