Information Theory

Essay  •  December 15, 2010  •  Research Paper

1. Introduction

Information theory is the mathematical theory of data communication and storage, generally considered to have been founded in 1948 by Claude E. Shannon. The central paradigm of classical information theory is the engineering problem of the transmission of information over a noisy channel. The main result of this theory is Shannon's noisy-channel coding theorem, which states that reliable communication is possible over unreliable channels: it is possible to surround a noisy channel with appropriate encoding and decoding systems, such that messages can be communicated at any rate less than (but arbitrarily close to) the channel capacity with an arbitrarily small probability of error.
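The idea of wrapping a noisy channel in an encoder and decoder can be made concrete with the simplest (and least rate-efficient) error-correcting code: a 3x repetition code over a binary symmetric channel. This is an illustrative sketch, not part of the original essay; the function names and the 10% flip probability are assumptions chosen for the example.

```python
import random

def encode(bits, n=3):
    # Encoder: repeat each bit n times before it enters the channel.
    return [b for bit in bits for b in [bit] * n]

def noisy_channel(bits, flip_prob=0.1, seed=0):
    # Binary symmetric channel: each bit flips independently with flip_prob.
    rng = random.Random(seed)
    return [b ^ (rng.random() < flip_prob) for b in bits]

def decode(bits, n=3):
    # Decoder: majority vote over each group of n repeated bits.
    return [int(sum(bits[i:i + n]) > n // 2) for i in range(0, len(bits), n)]

message = [1, 0, 1, 1, 0, 0, 1, 0]
received = noisy_channel(encode(message))
print(decode(received))
```

A repetition code pays for reliability with rate (here, one third of capacity); Shannon's theorem says far better trade-offs exist for any rate below capacity.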

Information theory in the 1950s was sometimes classified as a branch of the then-voguish field called "cybernetics", which included many aspects of potential machine representation of the world. It is a broad and deep mathematical theory, with equally broad and deep applications, chief among them coding theory.

Coding theory is concerned with finding explicit methods, called codes, for increasing the efficiency and fidelity of data communication over a noisy channel, up to near the limit that Shannon proved is the best achievable. These codes can be roughly subdivided into data compression codes and error-correction codes. It took many years to find the good codes whose existence Shannon proved.

A third class of codes are cryptographic ciphers; concepts from coding theory and information theory are widely used in cryptography and cryptanalysis (see the article on the deciban for an interesting historical application).

Information theory is also used in intelligence, gambling, statistics, and even music composition.

2. Redundancy

Redundancy in information theory is the number of bits used to transmit a message minus the number of bits of actual information in the message. Data compression is a way to eliminate such redundancy, while checksums are a way of adding redundancy.
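As an arithmetic illustration (a sketch added here, not part of the essay): if every symbol of a message is sent with a fixed 8-bit code, the redundancy per symbol is 8 bits minus the bits of actual information per symbol, which can be measured by the Shannon entropy introduced in the next section. The function name is hypothetical.

```python
import math
from collections import Counter

def redundancy(message, bits_per_symbol):
    # Shannon entropy of the empirical symbol distribution, in bits/symbol.
    counts = Counter(message)
    total = len(message)
    entropy = -sum((c / total) * math.log2(c / total) for c in counts.values())
    # Redundancy: bits actually used minus bits of information carried.
    return bits_per_symbol - entropy

# A fixed 8-bit code is very wasteful for a message dominated by one symbol:
print(round(redundancy("aaaaaaab", 8), 3))
```

A fair coin encoded with 1 bit per toss, by contrast, has zero redundancy: every transmitted bit is information.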

3. Entropy

Entropy is a concept in thermodynamics (see thermodynamic entropy), statistical mechanics and information theory. The concepts of information and entropy have deep links with one another, although it took many years for the development of the theories of statistical mechanics and information theory to make this apparent. This article is about information entropy, the information-theoretic formulation of entropy.

The basic concept of entropy in information theory has to do with how much randomness there is in a signal or random event. An alternative way to look at this is to ask how much information is carried by the signal.

As an example, consider some English text, encoded as a string of letters, spaces and punctuation (so our signal is a string of characters). Since some characters are not very likely (e.g. 'z') while others are very common (e.g. 'e'), the string of characters is not really as random as it might be. On the other hand, since we cannot predict what the next character will be, it does have some 'randomness'. Entropy is a measure of this randomness, suggested by Claude E. Shannon in his 1948 paper A Mathematical Theory of Communication.

Shannon offers a definition of entropy which satisfies the assumptions that:

* The measure should be continuous - i.e. changing the value of one of the probabilities by a very small amount should only change the entropy by a small amount.
* If all the outcomes (letters in the example above) are equally likely, then increasing the number of letters should always increase the entropy.
* We should be able to make the choice (in our example, of a letter) in two steps, in which case the entropy of the final result should be a weighted sum of the entropies of the two steps.
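Shannon showed that, up to the choice of logarithm base, the only measure satisfying these assumptions is H = -Σ p·log2(p), summed over the probabilities p of the possible outcomes. A minimal sketch (added here for illustration; the function name is not from the essay) computing this for the empirical character distribution of a string:

```python
import math
from collections import Counter

def shannon_entropy(text):
    # H = -sum(p * log2(p)) over the empirical character distribution.
    counts = Counter(text)
    total = len(text)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A uniform distribution over 4 symbols gives exactly log2(4) = 2 bits/char:
print(shannon_entropy("abcd"))  # → 2.0
# A skewed, English-like distribution carries fewer bits per character:
print(shannon_entropy("entropy measures the randomness of a signal"))
```

This matches the character example above: because 'e' is common and 'z' rare, English text has lower entropy per character than a uniformly random string over the same alphabet.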

4. Source coding (data compression)

In computer science, data compression or source coding is the process of encoding information using fewer bits (or other information-bearing units) than a more obvious representation would use, through the use of specific encoding schemes. For example, this article could be encoded with fewer bits if we accept the convention that the word "compression" is encoded as "comp". One popular instance of compression that many computer users are familiar with is the ZIP file format, which, as well as providing compression, acts as an archiver, storing many files in a single output file.
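The effect is easy to demonstrate with Python's standard zlib module, which implements DEFLATE, the same algorithm family used inside ZIP files. This sketch (not from the essay) compresses a highly redundant string, where a general-purpose compressor performs especially well:

```python
import zlib

# Highly repetitive text carries little information per byte,
# so a compressor can represent it in far fewer bits.
text = ("compression " * 100).encode()
packed = zlib.compress(text)
print(len(text), len(packed))

# Source coding is lossless: the original is recovered exactly.
assert zlib.decompress(packed) == text
```

The less redundant the input (e.g. already-compressed or random data), the less any lossless compressor can shrink it; entropy sets the lower bound.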

As is the case with ...
