Sample compression, learnability, and the Vapnik-Chervonenkis dimension
Learning boxes in high dimension
Learning monotone term decision lists
Learning matrix functions over rings
Learning from incomplete boundary queries using split graphs and hypergraphs
Generalization of the PAC-model for learning with partial information
Monotonic and dual-monotonic probabilistic language learning of indexed families with high probability
Closedness properties in team learning of recursive functions
Structural measures for games and process control in the branch learning model
Learning under persistent drift
Randomized hypotheses and minimum disagreement hypotheses for learning with noise
Learning when to trust which experts
On learning branching programs and small depth circuits
Learning nearly monotone k-term DNF
Optimal attribute-efficient learning of disjunction, parity, and threshold functions
Learning pattern languages using queries
On fast and simple algorithms for finding maximal subarrays and applications in learning theory
A minimax lower bound for empirical quantizer design
Vapnik-Chervonenkis dimension of recurrent neural networks
Linear algebraic proofs of VC-dimension based inequalities
A result relating convex n-widths to covering numbers with some applications to neural networks
Confidence estimates of classification accuracy on new examples
Learning formulae from elementary facts
Control structures in hypothesis spaces: The influence on learning
Ordinal mind change complexity of language identification
Robust learning with infinite additional information