Bias Windowing for Relational Learning
Frédéric Koriche
A central issue in relational learning is the choice of an appropriate bias for representing hypotheses. Bias must be weak enough to avoid underfitting and strong enough to avoid intractability and overfitting. In order to circumvent this dilemma, we propose a framework inspired by windowing techniques. A bias window is a restricted subclass of the relational space determined by some constraint parameters. The idea is thus to learn a theory in a small window first, and only switch to larger windows if the result in the small window is unsatisfactory. To this end, our framework integrates three key components: a logical notion of window space, an approximation algorithm that explores this space, and a model selection principle that monitors the learning progress. Experiments on relational datasets show that, after a short period of underfitting, windowing converges to hypotheses that are both compact and effective.
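To make the general idea concrete, the following is a minimal illustrative sketch of the windowing loop described above, written in Python. The helpers learn_in_window, accuracy, and enlarge_window, as well as the stopping threshold, are hypothetical placeholders introduced here for illustration only; they do not reproduce the paper's actual window space, approximation algorithm, or model selection principle.

    # Illustrative sketch only: the helper functions and the stopping
    # criterion are hypothetical, not the paper's actual method.
    def bias_windowing(examples, initial_window, enlarge_window,
                       learn_in_window, accuracy, target=0.95, max_rounds=10):
        """Learn in a small bias window first; enlarge only if unsatisfactory."""
        window = initial_window
        best_theory, best_score = None, float("-inf")
        for _ in range(max_rounds):
            theory = learn_in_window(examples, window)   # search the restricted subclass
            score = accuracy(theory, examples)           # monitor learning progress
            if score > best_score:
                best_theory, best_score = theory, score
            if score >= target:                          # result satisfactory: stop
                break
            window = enlarge_window(window)              # otherwise weaken the bias
        return best_theory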
Keywords: Machine Learning
Citation: Frédéric Koriche: Bias Windowing for Relational Learning. In R. López de Mántaras and L. Saitta (eds.): ECAI 2004, Proceedings of the 16th European Conference on Artificial Intelligence, IOS Press, Amsterdam, 2004, pp. 495-499.