** (how do fashion pictures and their captions work to _distribute_ information, significance, and meaning?)

I have presented my ideas of what information can and cannot be, discussed information and communication systems, and provided different examples of how information can exist.

http://www.lucent.com/minds/infotheory

Information theory provides the means to precisely describe how information can be quantified and measured using words and mathematical symbols. Information can be considered abstractly, and dealt with in a purely mathematical sort of way, but information theory can also be applied to real-life communication problems. In fact, Claude Shannon founded information theory while he was working at Bell Labs, trying to find a way to clear up noisy telephone communications. (http://www.skypoint.com/~gimonca/shannon.html)

***************************

" In information theory, the "content" of the information is irrelevant. Information theory turns all information into quantities, into the on-or-off bits that flip through our computers--not to mention our phones, TV sets, microwave ovens, musical greeting cards, or anything else with a chip in it. As Shannon puts it: 'These semantic aspects of communication are irrelevant to the engineering problem. The significant aspect is that the actual message is one selected from a set of possible messages. The system must be designed to operate for each possible selection, not just the one which will actually be chosen, since this is unknown at the time of design.'"

http://www.skypoint.com/~gimonca/shannon.html

So what is information? Information is what you don't know. If the receiver already had the information, you couldn't say that a communication had taken place. We're used to thinking about "information" as facts, data, evidence--but in information theory, information is uncertainty. You have to get a message that you don't know the content of. Information theory talks about the messages you could possibly get, or the number of messages you could choose between. Or if you want to generalize even more, information theory talks about the statistical properties of a message, averaged out over the whole message--without any regard to what the message says.

In information theory, the more possible messages you could be receiving, the more uncertainty you have about what actual message you're getting. If you had to track the motion of each atom in the piston of an engine, you could just say they're all moving in the same direction. If you had to track the motion of each atom in the hot exhaust, you would be stuck with lots more information. More information means more entropy.
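The piston-versus-exhaust contrast above can be sketched numerically. This is my own minimal illustration, not code from Shannon's paper: it computes Shannon entropy in bits for two made-up distributions, one where every atom's motion is the same (one possible state, no uncertainty) and one where the motion is spread evenly over eight states.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# "Piston" case: every atom moving the same direction -- one state, no uncertainty.
ordered = [1.0]

# "Exhaust" case: motion spread over eight equally likely states.
disordered = [1 / 8] * 8

print(entropy(ordered))     # 0.0 bits
print(entropy(disordered))  # 3.0 bits
```

The more equally likely possibilities there are, the higher the entropy, which is exactly the sense in which more information means more uncertainty.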

Like Boltzmann around the turn of the century, Shannon took the broad observations people had had about information exchange and put them on a solid mathematical, statistical footing. Shannon's information theory, rather than eliminating noise, allows us to live with it--to keep it out of our hair, and possibly even to make it useful. You can use the entropy of thermodynamics to find out what percentage of your available energy can be turned into useful work; you can use the entropy of information theory to find out how much of a channel can be used to transmit useful information--the signal.

******************
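The "how much of a channel is usable" idea has a standard textbook form for the simplest noisy channel, the binary symmetric channel, where each bit is flipped with probability p and the capacity is C = 1 - H(p). The sketch below is mine, not taken from Shannon's paper, though the result appears there:

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel that flips each
    bit with probability p: C = 1 - H(p) bits per channel use."""
    return 1 - binary_entropy(p)

print(bsc_capacity(0.0))   # 1.0 -- noiseless: every bit is signal
print(bsc_capacity(0.11))  # about 0.5 -- roughly half the channel is lost to noise
print(bsc_capacity(0.5))   # 0.0 -- pure noise: nothing gets through
```

Noise doesn't have to be eliminated; you just can't push more than C bits of signal per use through the channel, which is the "living with noise" that the paragraph above describes.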

If you want to know more about information theory, you can find a PDF version of Shannon's paper "A Mathematical Theory of Communication" at:

http://cm.bell-labs.com/cm/ms/what/shannonday/shannon1948.pdf

At this point, the math in Shannon's paper is somewhat mystifying to me: I can get a feel for it in places, and recognize what an equation might stand for and how it operates, but I am in no way fluent in the language of mathematics. Since my own understanding of information theory could certainly be better, I will not discuss it much here, other than to say that the ideas I present in this page should complement or be complemented by information theory, not contradict it. Ideas I have presented that may seem to contradict information theory may apply at a different scale, or in a different context, and may need to be described, measured, and judged using different tools. My purpose here is to use what knowledge I have, reason, and a smattering of (hopefully not misguided) intuition to open a discussion of significance and meaning.

Here again is the definition of information (not my definition) provided at the beginning of this paper.

INFORMATION

1) that which reduces uncertainty. (Claude Shannon); 2) that which changes us. (Gregory Bateson)

Literally that which forms within, but more adequately: the equivalent of or the capacity of something to perform organizational work, the difference between two forms of organization or between two states of uncertainty before and after a message has been received, but also the degree to which one variable of a system depends on or is constrained by (see constraint) another. E.g., DNA carries genetic information inasmuch as it organizes or controls the orderly growth of a living organism. A message carries information inasmuch as it conveys something not already known. The answer to a question carries information to the extent it reduces the questioner's uncertainty. A telephone line carries information only when the signals sent correlate with those received. Since information is linked to certain changes, differences or dependencies, it is desirable to refer to them and distinguish between information stored, information carried, information transmitted, information required, etc. Pure and unqualified information is an unwarranted abstraction. Information theory measures the quantities of all of these kinds of information in terms of bits. The larger the uncertainty removed by a message, the stronger the correlation between the input and output of a communication channel, and the more detailed particular instructions are, the more information is transmitted. (Krippendorf)
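The "uncertainty removed" part of the definition above can be made concrete with a toy calculation of my own (not from Krippendorf): with N equally likely possibilities, a message that narrows them down delivers the difference in log2 counts, measured in bits.

```python
import math

def bits_removed(n_before, n_after):
    """Bits of information in a message that narrows n_before
    equally likely possibilities down to n_after."""
    return math.log2(n_before) - math.log2(n_after)

# An answer that narrows 8 equally likely possibilities to 1 carries 3 bits.
print(bits_removed(8, 1))  # 3.0

# A vaguer answer that only narrows 8 down to 4 carries just 1 bit.
print(bits_removed(8, 4))  # 1.0

# A "message" the receiver already knew removes nothing: 0 bits.
print(bits_removed(8, 8))  # 0.0
```

This is the same point made earlier in the page: a message the receiver could already predict carries no information at all.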

Information is the meaning of the representation of a fact (or of a message) for the receiver. (Hornung)

http://pespmc1.vub.ac.be/ASC/Information.html

next page | "significance"
