Among other topics, The Informational Complexity of Learning: Perspectives on Neural Networks and Generative Grammar brings together two important but very different learning problems within the same analytical framework. The first is the problem of learning functional mappings using neural networks; the second is learning natural language grammars in the principles and parameters tradition of Chomsky.
These two learning problems appear very different. Neural networks are real-valued, infinite-dimensional, continuous mappings, whereas grammars are boolean-valued, finite-dimensional, discrete (symbolic) mappings. Furthermore, the research communities that work in the two areas almost never overlap.
The book's objective is to bridge this gap. It uses the formal techniques developed in statistical learning theory and theoretical computer science over the last decade to analyze both kinds of learning problems. By asking the same question of both problems, namely how much information it takes to learn, it highlights their similarities and differences. Specific results include model selection in neural networks, active learning, language learning, and evolutionary models of language change.
The Informational Complexity of Learning: Perspectives on Neural Networks and Generative Grammar is a thoroughly interdisciplinary work. Anyone interested in the interaction of computer science and cognitive science should enjoy the book. Researchers in artificial intelligence, neural networks, linguistics, theoretical computer science, and statistics will find it particularly relevant.
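To give a concrete flavour of the question the book asks (the sketch below is illustrative and is not taken from the book), statistical learning theory typically measures the information needed to learn as sample complexity. For a finite hypothesis class, such as the 2^n grammars generated by n binary parameters in a principles-and-parameters setting, a classical PAC bound says roughly (1/epsilon)(ln|H| + ln(1/delta)) labelled examples suffice; for real-valued networks the analogous role is played by capacity measures such as the VC dimension. A minimal Python sketch of that finite-class bound, with hypothetical parameter values:

import math

def pac_sample_bound(hypothesis_count: int, epsilon: float, delta: float) -> int:
    """Classical PAC bound for a finite hypothesis class in the realizable
    setting: m >= (1/epsilon) * (ln|H| + ln(1/delta)) labelled examples
    suffice so that, with probability at least 1 - delta, any hypothesis
    consistent with the sample has true error at most epsilon.
    (Illustrative only; not a derivation from the book.)"""
    return math.ceil((math.log(hypothesis_count) + math.log(1.0 / delta)) / epsilon)

# A principles-and-parameters grammar space with n binary parameters
# contains 2**n grammars, so this bound grows only linearly in n.
n_parameters = 30
print(pac_sample_bound(2 ** n_parameters, epsilon=0.05, delta=0.01))  # 508

Under these assumed values the bound is modest, illustrating how a compact parameterization keeps the sample cost of learning low.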
"Sinopsis" puede pertenecer a otra edición de este libro.
Bookseller: Lucky's Textbooks, Dallas, TX, United States of America
Condition: New. Item ref. no.: ABLIING23Feb2416190185007
Quantity available: more than 20
Bookseller: Ria Christie Collections, Uxbridge, United Kingdom
Condition: New. In English. Item ref. no.: ria9780792380818_new
Quantity available: more than 20
Bookseller: BuchWeltWeit Ludwig Meier e.K., Bergisch Gladbach, Germany
Book. Condition: New. This item is printed on demand, so delivery takes 3-4 days longer. New stock. 248 pp. English. Item ref. no.: 9780792380818
Quantity available: 2
Bookseller: Books Puddle, New York, NY, United States of America
Condition: New. pp. 252. Item ref. no.: 263104613
Quantity available: 4
Bookseller: moluna, Greven, Germany
Hardcover. Condition: New. This is a print-on-demand item and will be printed for you after your order. Item ref. no.: 5970659
Quantity available: more than 20
Bookseller: Majestic Books, Hounslow, United Kingdom
Condition: New. Print on demand. pp. 252. 52: B&W 6.14 x 9.21 in or 234 x 156 mm (Royal 8vo), case laminate on white w/gloss lamination. Item ref. no.: 5824698
Quantity available: 4
Bookseller: Biblios, Frankfurt am Main, HESSE, Germany
Condition: New. Print on demand. pp. 252. Item ref. no.: 183104623
Quantity available: 4
Bookseller: THE SAINT BOOKSTORE, Southport, United Kingdom
Hardback. Condition: New. This item is printed on demand. New copy - usually dispatched within 5-9 working days. Item ref. no.: C9780792380818
Quantity available: more than 20
Bookseller: preigu, Osnabrück, Germany
Book. Condition: New. The Informational Complexity of Learning | Perspectives on Neural Networks and Generative Grammar | Partha Niyogi | Book | xxiii | English | 1997 | Springer US | EAN 9780792380818 | Responsible person for the EU: Springer Verlag GmbH, Tiergartenstr. 17, 69121 Heidelberg, juergen[dot]hartmann[at]springer[dot]com | Seller: preigu. Print on demand. Item ref. no.: 102590243
Quantity available: 5
Bookseller: buchversandmimpf2000, Emtmannsberg, BAYE, Germany
Book. Condition: New. New stock. Springer Verlag GmbH, Tiergartenstr. 17, 69121 Heidelberg. 252 pp. English. Item ref. no.: 9780792380818
Quantity available: 2