It is commonly assumed that computers process information. But what is information? In a technical, important, but nevertheless rather narrow sense, Shannon's information theory gives a first answer to this question. This theory focuses on measuring the information content of a message. Essentially, this measure is the reduction of uncertainty obtained by receiving a message; the uncertainty of a situation of ignorance is in turn measured by entropy. This theory has had an immense impact on the technology of information storage, data compression, and information transmission and coding, and it is still a very active domain of research.

Shannon's theory has also attracted much interest in a more philosophical look at information, although it was readily remarked that it is only a syntactic theory of information and neglects semantic issues. Several attempts have been made in philosophy to give information theory a semantic flavor, but still mostly based on, or at least linked to, Shannon's theory. Approaches to semantic information theory also very often make use of formal logic. Thereby, information is linked to reasoning, deduction and inference, as well as to decision making.

Further, entropy and related measures were soon found to have important connotations with regard to statistical inference. Surely, statistical data and observations represent information: information about unknown, hidden parameters. Thus a whole branch of statistics developed around concepts of Shannon's information theory or measures derived from them. Some measures appropriate for statistics, like Fisher's information, were also proposed.
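The core idea described above — entropy measuring the uncertainty of a situation of ignorance, and information being the reduction of that uncertainty on receiving a message — can be sketched in a few lines. This is an illustrative example, not taken from the book; the probability values are invented for demonstration.

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum p_i * log2(p_i), measured in bits.

    Zero-probability outcomes contribute nothing, by the usual
    convention 0 * log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: its entropy is exactly 1 bit.
prior = [0.5, 0.5]

# Suppose a message makes one outcome much more likely (hypothetical
# posterior). The remaining uncertainty is then lower.
posterior = [0.9, 0.1]

# The information content of the message is the reduction in entropy.
info_gain = entropy(prior) - entropy(posterior)
```

Here `info_gain` is positive, reflecting that the message reduced the receiver's uncertainty; a message that left the distribution unchanged would carry zero information in this sense.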
- Illustrator: Illustrations
- Format: Paperback
- ISBN: 9783642006586
- Language: English
- Number of pages: 269
- Publication date: 2009-04-22
- Publisher: Springer-Verlag Berlin and Heidelberg GmbH & Co. K