180 Notes
Day 4
1/19/90

1. One-minute paper summaries
Points
6 dimensions of comm systems (# objects, message capacity, message volume, network structure, type of message, triggering mechanism)
systematic models of comm may be useful in practice
Qualitative shift in complexity from small group to large group (challenge)
Analytical models apply in general, NOT to specific cases or individuals

Questions
How to overcome multilingual context?
Topics (speech, project)?
Can we generalize culture?
Is technology breaking down barriers or forming new ones? Influencing personal relationships? Equity? Ethnocentrism?
What are the commercial (business) applications?
Internet implications? E.g., small world? Citation maps? Collab filtering?
Is the spread of jokes like the spread of rumors?
Are Pool's models culture/society dependent? (not supposed to be)
What is entropy?

2. Speech schedule
Feb. 11 (note: CHANGE from Feb 9 date on syllabus): Anna Cleveland, Miles Efron, Camile Dudney, and Christine Raftelis
Feb 23: Susan Huffman, Christine Ferris, Mitake Holloman, & ?

3. Project ideas

4. Discuss Weaver/Shannon paper

Three levels of communication problem
 Accuracy of transmission (technical problem)
 Degree of meaning (semantic problem)
 Effect of transmission (effectiveness problem)

Shannon's basic model: info source → [message] → transmitter → signal → (noise source) → received signal → receiver → [message] → destination
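The model above can be sketched as a tiny simulation (a minimal sketch — the function names, the 8-bit character encoding, and the 2% flip rate are illustrative assumptions, not from Shannon):

```python
import random

random.seed(0)

def transmitter(message):
    """Encode each character of the message as 8 bits (the signal)."""
    return [int(b) for ch in message for b in format(ord(ch), "08b")]

def noise(signal, flip_prob=0.02):
    """Noise source: randomly flips bits in the signal on the channel."""
    return [b ^ (random.random() < flip_prob) for b in signal]

def receiver(received):
    """Decode the received signal back into a message for the destination."""
    chars = [received[i:i + 8] for i in range(0, len(received), 8)]
    return "".join(chr(int("".join(map(str, c)), 2)) for c in chars)

sent = "attack at dawn"
arrived = receiver(noise(transmitter(sent)))  # may differ from sent because of noise
```

Note the technical problem (level A) is exactly whether `arrived` equals `sent`; nothing in the pipeline knows what the message means.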

Do not confuse information with meaning!!
Shannon assumes communication initiated by the source SELECTING a desired message from a set of possible messages
Then, information is the amount of uncertainty in the SOURCE (not the message). “Information is a measure of one’s freedom of choice when one selects a message.” (p. 9)
Information does not apply to the individual messages but rather to the situation as a whole (p9)

A search grammar providing 32 commands (or 32 icons in a graphic language) implies 5 bits of information (log2 32 = 5) in this grammar. This assumes the 32 commands are independent and equally likely at a given time, and that exactly one will be selected. This works fine for a simple, one-unit message (a battlefield command, an executive decision, etc.), but for human communication, conditional probability comes into play, since the number of possible selections available once one is made may vary (this leads to coding theory), complicating the technical subproblems (unit size, channel capacity, noise effects, etc.). In the case of more than one unit of communication (a continuous message), the overall situation (information) depends on what has already been selected (conditional probability). This is the amount of entropy (randomness) in the situation.
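The arithmetic above can be checked directly (a sketch; the example probabilities in the last line are made up to show the effect of unequal likelihoods):

```python
import math

def bits_for_choices(n):
    """Bits needed to single out one of n equally likely, independent choices."""
    return math.log2(n)

def entropy(probs):
    """Shannon entropy H = -sum(p * log2 p), in bits: the average uncertainty
    (freedom of choice) in the source, not in any one message."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

bits_for_choices(32)                     # 5.0 — the 32-command grammar
entropy([1/32] * 32)                     # 5.0 — equal likelihood maximizes entropy
entropy([0.5, 0.25, 0.125, 0.125])       # 1.75 — unequal likelihoods lower it
```

When some selections are far more likely than others (as in natural language, where choices are conditioned on what came before), the entropy, and hence the information per selection, drops below the equally-likely maximum.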

Channel capacity (in theory)= info/time; in practice, channel carries signals/symbols that transmitter maps from source possibilities.  Coding compacts more info into the signals/symbols the channel actually carries.
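A toy illustration of coding compacting info into channel symbols (the symbols, probabilities, and codewords are hypothetical; the variable-length code is a standard prefix-free code of the kind coding theory produces):

```python
# Four source symbols with unequal probabilities of being selected.
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

fixed = {"a": "00", "b": "01", "c": "10", "d": "11"}       # 2 bits per symbol
variable = {"a": "0", "b": "10", "c": "110", "d": "111"}   # prefix-free code

def avg_bits(code, probs):
    """Expected channel bits per source symbol under the given code."""
    return sum(p * len(code[s]) for s, p in probs.items())

avg_bits(fixed, probs)     # 2.0 bits/symbol
avg_bits(variable, probs)  # 1.75 bits/symbol — matches this source's entropy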

Weaver relates level A to B & C by suggesting a semantic receiver (we might consider this as mental models) with associated coding theories and semantic noise. So a message may get through the channel, but we may have another set of "selections" (interpretations) possible in the receiver.

5. Assignment
Read Tannen (G3)

6. The one-minute paper
What was the big point you learned in class today?
What is the main, unanswered question you leave class with today?