Spiders, bots, and aggregators are all so-called intelligent agents: programs that execute tasks on the Web without the intervention of a human being. Spiders go out on the Web, identify multiple sites with information on a chosen topic, and retrieve that information. Bots find information within a single site by cataloging and retrieving it. Aggregators gather data from multiple sites, such as credit card, bank account, and investment account data, and consolidate it on one page. As the Web grows more complex, there will be more and more applications for intelligent agents, and Java is expected to be one of the principal languages used to build them.

About the Author:
Jeff Heaton is an author, college instructor, programmer, and Internet entrepreneur. He has worked with many languages, including C++, Java, and Visual Basic. He coauthored Sams Teach Yourself Visual C++ 6.0 Professional Reference Edition and has written for Java Developer's Journal, Windows Developer's Journal, and C/C++ Users Journal. He teaches Java programming at St. Louis Community College and has served as a consulting programmer for Anheuser-Busch, MasterCard, and Boeing, among others.
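The spider behavior the blurb describes, downloading a page and collecting the links a crawler would visit next, can be sketched in a few lines of Java. This is a minimal illustration, not code from the book: the class name `MiniSpider` is invented here, and it uses the modern `java.net.http` client rather than the 2002-era networking APIs the book itself would have covered.

```java
import java.io.IOException;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Minimal single-page "spider": fetch one page and collect the absolute
// links it contains, which a real spider would then queue and visit in turn.
public class MiniSpider {
    // Naive href extractor; a production spider would use a real HTML parser.
    private static final Pattern HREF =
        Pattern.compile("href=[\"'](http[^\"']+)[\"']", Pattern.CASE_INSENSITIVE);

    // Pull every absolute http(s) link out of a chunk of HTML.
    public static List<String> extractLinks(String html) {
        List<String> links = new ArrayList<>();
        Matcher m = HREF.matcher(html);
        while (m.find()) {
            links.add(m.group(1));
        }
        return links;
    }

    // Download the page body as a string.
    public static String fetch(String url) throws IOException, InterruptedException {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(URI.create(url)).GET().build();
        return client.send(request, HttpResponse.BodyHandlers.ofString()).body();
    }

    public static void main(String[] args) throws Exception {
        String html = fetch(args.length > 0 ? args[0] : "https://example.com");
        extractLinks(html).forEach(System.out::println);
    }
}
```

A full spider would add a visited-set and a work queue around `extractLinks`, plus politeness features such as robots.txt checks and rate limiting.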
Sybex, 2002. Paperback. Condition: New.