Description
A remarkably complete collection of works documenting the history of the theory of the communication of information: what 'information' actually is, and what theoretical restrictions govern the accurate transmission of information from source to receiver. Note: the numbers in brackets correspond to the titles listed in the accompanying PDF, accessible via the link below the images.

The first group of works details the development and proof of what is now called the 'Nyquist-Shannon sampling theorem'. When an analog signal (e.g., voice or music) is to be converted to a digital signal consisting of binary zeros and ones ('bits'), the theorem states that sampling at a rate of at least twice the highest frequency present in the signal captures it completely, making it possible to reconstruct the original signal exactly (a modern statement of the theorem is given after this description). This theorem laid the foundation for many advances in telecommunications. The first evidence for the sampling theorem was found experimentally by Miner in 1903 [8]. It was formally proposed by Nyquist in 1924 [9, 10] and by Küpfmüller in 1928 [8], but first proved by Nyquist [12] and later by Küpfmüller's student Raabe [8]. In 1941, Bennett [15] referred to Raabe's work and generalized it. A result equivalent to the sampling theorem had, however, been proved by Whittaker as early as 1915 [8, 14] in the context of interpolation theory. Finally, in 1948 Shannon [8, 19] published a proof of both the sampling theorem and the interpolation formula as one part of his broader development of information theory.

The term 'information', as a precise concept susceptible of measurement, was coined by Hartley in 1928 [11]. "Hartley distinguished between meaning and information. The latter he defined as the number of possible messages independent of whether they are meaningful. He used this definition of information to give a logarithmic law for the transmission of information in discrete messages. Hartley had arrived at many of the most important ideas of the mathematical theory of communication: the difference between information and meaning, information as a physical quantity, the logarithmic rule for transmission of information, and the concept of noise as an impediment in the transmission of information" (Origins of Cyberspace 316).

In the following year, the physicist Szilard established the connection between information and the thermodynamic quantity 'entropy'. "Szilard described a theoretical model that served both as a heat engine and an information engine, establishing the relationship between information (manipulation and transmission of bits) and thermodynamics (manipulation and transfer of energy and entropy). He was one of the first to show that 'Nature seems to talk in terms of information'" (Seife, Decoding the Universe, 2007, p. 77).

Another physicist, Gabor, pointed out the relation between the sampling theorem and the uncertainty principle in quantum mechanics [16]: "Signals do not have arbitrarily precise time and frequency localization. It doesn't matter how you compute a spectrum, if you want time information, you must pay for it with frequency information. Specifically, the product of time uncertainty and frequency uncertainty must be at least 1/4π."

In 1942 Wiener issued a classified memorandum (published in 1949 [23]) which combined ideas from statistics and time-series analysis and used Gauss's method of shaping the characteristic of a detector to allow for the maximal recognition of signals in the presence of noise. This method came to be known as the 'Wiener filter'.
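For the reader's convenience, here is a modern textbook statement of the sampling theorem described above, together with the Whittaker-Shannon interpolation formula that reconstructs the signal from its samples; the notation is the standard one and is not taken from the items themselves:

```latex
% Sampling theorem: a signal x(t) containing no frequencies above B hertz
% is completely determined by samples taken at a rate f_s >= 2B.
% Reconstruction (Whittaker-Shannon interpolation formula):
x(t) = \sum_{n=-\infty}^{\infty} x\!\left(\tfrac{n}{f_s}\right)
       \operatorname{sinc}\!\left(f_s t - n\right),
\qquad \operatorname{sinc}(u) = \frac{\sin \pi u}{\pi u}.
```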
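Hartley's logarithmic law, in the notation that later became standard: a message of n symbols, each drawn from an alphabet of s possible symbols, carries an amount of information

```latex
% Hartley's 1928 measure: information grows linearly in message length
% and logarithmically in alphabet size; with a base-2 logarithm the unit
% is what Shannon later named the bit.
H = n \log s
```

So, for example, a single toss of a fair coin (n = 1, s = 2, base-2 logarithm) carries exactly one bit.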
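Gabor's uncertainty relation quoted above, written out explicitly, with Δt and Δf understood as the root-mean-square widths of the signal in time and in frequency:

```latex
% Gabor (1946): the time-bandwidth uncertainty product, the
% communication-theory analogue of Heisenberg's uncertainty principle.
\Delta t \, \Delta f \ge \frac{1}{4\pi}
```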
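And a minimal sketch of Wiener filtering in practice, using SciPy's implementation rather than Wiener's original formulation; the test signal, noise level, and window size below are arbitrary choices made for the demonstration:

```python
# Denoising a signal with a Wiener filter (illustrative sketch only).
import numpy as np
from scipy.signal import wiener

rng = np.random.default_rng(seed=0)
t = np.linspace(0.0, 1.0, 2000)
clean = np.sin(2 * np.pi * 5 * t)                  # a 5 Hz tone
noisy = clean + 0.4 * rng.standard_normal(t.size)  # additive Gaussian noise

# wiener() estimates the local mean and variance in a sliding window and
# attenuates the signal toward the local mean where noise dominates.
denoised = wiener(noisy, mysize=31)

print("MSE before filtering:", np.mean((noisy - clean) ** 2))
print("MSE after filtering: ", np.mean((denoised - clean) ** 2))
```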
In his Mathematical Theory of Communication (1948) [19], Shannon notes: "Communication theory is heavily indebted to Wiener for much of its basic philosophy and theory. His classic NDRC report 'The Interpolation, Extrapolation, and Smoothing of Stationary Time Series', to appear soon in book form, contains the first clear-cut formulation of communication theory as a statistical problem, the study of operations on time series." Many of the developments in…