The reading for this module is Chapter 7, Information Processing
Grades
Discussion Boards
1 Revolutions and Metaphors
2 Donders and Processing stages
3 Information Theory
4 Hick-Hyman Law
Cognitive psychology often uses metaphors as an aid for explanation
Metaphorical Explanation
The structure and function of one thing are used to roughly describe another thing
Big metaphors in cognition roughly track technological revolutions
Cognition is like a factory assembly line
Cognition is like a telephone network
Cognition is like a computer
2 Donders and Processing stages
Using measures of time to make inferences about cognitive processes
Do nerves transmit signals infinitely fast?
Or, with a measurable speed?
Helmholtz measured nerve conduction speeds in the sciatic nerve of a frog
Range of 24.6 - 38.4 meters per second
F. C. Donders, a Dutch ophthalmologist
Used mental chronometry to measure mental processing times
First used in astronomy
Referred to individual human error in timing the recording of star observations
Time taken for light to hit the eye, then be transduced and conducted along nerves to produce a response
The time associated with unique stages of mental processing could be measured by systematically asking people to complete tasks of increasing complexity…
Donders measured reaction times in increasingly complex tasks
Simple reaction time
Go-No Go
Choice reaction time
Simplest reaction time task
Participants wait for ANY stimulus
And, respond as quickly as possible when the stimulus occurs
Measures “physiological” reaction time
Participants wait for a specific stimulus (GO)
Respond only to the GO stimulus
Withhold response to other stimuli
Requires stimulus identification
There are multiple possible stimuli
Respond to each with a unique response
Requires stimulus identification and response selection
Estimating identification time: Go-No Go RT minus simple RT
Estimating response selection time: choice RT minus Go-No Go RT (see the sketch below)
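A minimal sketch of Donders' subtractive logic in Python; the reaction times are hypothetical values chosen for illustration, not Donders' data:

```python
# Donders' subtractive method with hypothetical mean reaction times (ms).
simple_rt = 200.0    # simple task: detection only
go_no_go_rt = 280.0  # Go-No Go task: detection + identification
choice_rt = 350.0    # choice task: detection + identification + response selection

# Subtracting one task from the next isolates the time for each added stage.
identification_time = go_no_go_rt - simple_rt       # 80.0 ms
response_selection_time = choice_rt - go_no_go_rt   # 70.0 ms

print(f"Identification time: {identification_time} ms")
print(f"Response selection time: {response_selection_time} ms")
```

Note the hidden assumption: the subtraction only works if the stages run one after the other, like stations on an assembly line.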
What happens if two processing stages can occur in parallel?
Interest in reaction time studies was revived in the 1950s
And, the metaphor for explaining cognitive processes shifted from an assembly line to a telecommunications network
PRP stands for the Psychological Refractory Period (Welford, 1952)
Theoretical debate about the PRP effect reflects the metaphorical shift
Responding to a first stimulus can sometimes delay a response to a second stimulus, especially if the stimuli are presented quickly, one after the other
Welford described a few candidate theories, including his single-channel theory:
In its bare essentials this theory assumes, firstly, a number of sensory input mechanisms each capable of receiving data and storing it for a limited period so that, for example, a short series of signals can be received as a unit. Secondly, it assumes a number of effector mechanisms containing both central and peripheral elements and capable of carrying out a series of actions such as the pressing and release of a key or a series of taps (Vince, 1949) as a single unit. Thirdly, between these two it postulates a single-channel decision mechanism. This is regarded as being of limited capacity in the sense that it takes a finite time to process information and can thus only deal with a limited amount of information in a given time
3 Information Theory
Claude Shannon, the “Father of Information Theory”
A Mathematical Theory of Communication (1948)
Founded digital circuit theory (1937)
Information theory was NOT originally developed as a theory in psychology
It is a set of mathematical formalisms to describe communication systems
Measurement tools from information theory became popular in Cognitive Psychology beginning around 1950
An information channel has three parts:
A sender, a channel, and a receiver
Two questions:
How much information was sent?
How much was received?
Toy telephone (two cans and a wire)
Real telephone networks
What about people talking to each other?
Channel capacity is an important concept from information theory:
The amount of information that can be transmitted and received through a communication channel
Shannon proposed to measure information in terms of entropy, or the amount of uncertainty in a system of messages:
\(H(X) = -\sum_{i=1}^{n} P(x_i) \log_2 P(x_i)\)
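For example, a fair coin has two equally likely messages, so \(H = -\left(\frac{1}{2}\log_2\frac{1}{2} + \frac{1}{2}\log_2\frac{1}{2}\right) = 1\) bit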
Capacity limitations for information channels could be measured and assigned a value
Content of signals could be analyzed in terms of how much information they carry
Amount of information lost during transmission could be quantified
Led to improvements in telecommunications systems and technology
Shannon’s formula defines information in terms of the predictability of a sequence of messages.
More predictable sequences = Low information
Less predictable sequences = High information
A book that contained the letter A repeatedly
A book that you read, enjoyed, and found meaningful
A book that contained completely random sequences of letters
In Shannon's terms, the random book carries the most information and the all-A book the least; the measure tracks predictability, not meaning
https://www.crumplab.com/cognition/textbook/information-processing.html#computing-h
Shannon’s formula uses a base 2 logarithm which produces a number in the unit of bits.
Bits count the binary (yes/no) distinctions needed to identify one message in a set: n equally likely messages require \(\log_2 n\) bits
Let’s review H, Bits, and Information together by examining a simple communication system involving four possible messages:
A B C D
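A minimal sketch in Python; the entropy_bits helper and the example probability values are illustrative assumptions, not from the reading:

```python
import math

def entropy_bits(probs):
    # Shannon entropy: H(X) = -sum of P(x_i) * log2(P(x_i)),
    # skipping zero-probability messages (their contribution is 0).
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Four equally likely messages A, B, C, D: H = log2(4) = 2 bits
print(entropy_bits([0.25, 0.25, 0.25, 0.25]))  # 2.0

# Same four messages, but A is much more predictable: H falls below 2 bits
print(entropy_bits([0.7, 0.1, 0.1, 0.1]))  # ~1.36
```

Making the messages more predictable lowers H, even though the number of possible messages stays at four.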
4 Hick-Hyman Law
Ideas from information theory were imported into cognitive psychology around the 1950s
A big idea was to describe limitations in “information processing” abilities
E.g., how much “information” can you process?
A promising early demonstration suggesting that choice reaction time performance may be fundamentally governed by the amount of information in a set of choice stimuli.
Specifically, choice reaction time increases as a linear function of the information in the set of choice stimuli.
Participants are presented with one stimulus at a time from a set of stimuli
Participants identify each stimulus with a unique response as fast as possible
Prior research had shown that choice reaction time increases with set-size
e.g., the average time to respond to any stimulus increases as the number of alternatives in the set increases
Why does mean reaction time go up when the number of possible choices goes up?
Perhaps people were responding to the amount of information in the choice set, and not simply the number of alternatives.
If people were responding to the amount of information, then they should be sensitive to the predictability of individual alternatives.
Choice stimuli were usually presented randomly to participants.
This meant the number of alternatives and the amount of information in a set was confounded.
Need to independently manipulate the number of alternatives and the amount of information
Hyman, R. (1953). Stimulus information as a determinant of reaction time. Journal of Experimental Psychology, 45(3), 188–196. https://doi.org/cq3kjd
A choice-reaction time task with 8 different conditions, corresponding to the number of alternatives in a set, from 1 to 8.
In each condition, all stimuli were presented randomly.
Thus, the amount of information ranged from 0 to 3 bits (bits for 1 to 8 alternatives are: 0, 1, 1.58, 2, 2.32, 2.58, 2.81, and 3).
Reaction time increased linearly with the number of alternatives…
BUT, in this design the number of alternatives was completely confounded with the amount of bits.
Hyman therefore also varied the probabilities and sequential dependencies of the alternatives, changing the information in a set without changing its size; reaction time still tracked the bits.
Choice reaction time increases linearly as a function of the information (bits) in the stimulus set (a numerical sketch follows below)
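A minimal sketch of the law in Python; the reaction times, slope, and intercept are fabricated for illustration, not Hyman's data:

```python
import math

# Hick-Hyman law: mean choice RT = a + b * H,
# where H = log2(n) bits for n equally likely alternatives.
set_sizes = [1, 2, 4, 8]
bits = [math.log2(n) for n in set_sizes]  # 0, 1, 2, 3
rts = [220.0, 370.0, 520.0, 670.0]        # hypothetical mean RTs in ms

# Least-squares fit of RT on bits (exact here, since the toy RTs are linear).
n = len(bits)
mean_h, mean_rt = sum(bits) / n, sum(rts) / n
b = (sum((h - mean_h) * (rt - mean_rt) for h, rt in zip(bits, rts))
     / sum((h - mean_h) ** 2 for h in bits))
a = mean_rt - b * mean_h
print(f"RT = {a:.0f} + {b:.0f} * H (ms)")  # RT = 220 + 150 * H
```

On this reading, each added bit of stimulus information costs a constant amount of reaction time (here, a hypothetical 150 ms per bit).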
Behaviorists looked for lawful regularities connecting a stimulus with a subsequent response
The Hick-Hyman law showed a case where responses to a stimulus did not depend only on the stimulus that was presented!
People were not just responding to the stimulus in front of them; their performance was also determined by the relative predictability of other stimuli that were not presented.
Information theory was not an explanation; it was another way to describe the experimental manipulation
Further research showed some violations of the Hick-Hyman law (practiced subjects, non-linear trends)
Why does choice reaction time mostly increase linearly as a function of information?
Hick’s match to template hypothesis…
Hick’s binary logic test hypothesis…
Kornblum’s priming explanations
Complete the quiz for this learning module on Blackboard, and/or the writing assignments by the due date.
The next learning module is Memory I