Information processing theory is a cognitive theory that uses computer processing as a metaphor for the workings of the human brain. Initially proposed by George A. Miller and other American psychologists in the 1950s, the theory describes how people focus on information and encode it into their memories.
Key Takeaways: Information Processing Model
- Information processing theory is a cornerstone of cognitive psychology that uses computers as a metaphor for the way the human mind works.
- It was initially proposed in the mid-1950s by American psychologists, including George Miller, to explain how people process information into memory.
- The most important theory in information processing is the stage theory originated by Atkinson and Shiffrin, which specifies a sequence of three stages information goes through to become encoded into long-term memory: sensory memory, short-term or working memory, and long-term memory.
Origins of Information Processing Theory
During the first half of the twentieth century, American psychology was dominated by behaviorism. Behaviorists studied only behaviors that could be directly observed, which made the inner workings of the mind seem like an unknowable “black box.” Around the 1950s, however, computers came into existence, giving psychologists a metaphor for how the human mind functions. The metaphor helped psychologists explain the different processes the brain engages in, including attention and perception, which could be compared to inputting information into a computer, and memory, which could be compared to a computer’s storage space.
This was referred to as the information processing approach, and it remains fundamental to cognitive psychology today. Information processing is especially interested in how people select, store, and retrieve memories. In 1956, psychologist George A. Miller developed the theory and also contributed the idea that people can hold only a limited number of pieces of information in short-term memory. Miller specified this number as seven plus or minus two (five to nine chunks of information), but more recently other scholars have suggested the number may be smaller.
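Miller's limit applies to chunks, not raw items: grouping elements into familiar patterns reduces how many things short-term memory must hold. As a rough illustration (my own toy code, not from the literature), a greedy grouper can show how twelve digits collapse into three familiar chunks:

```python
# Toy illustration of Miller's chunking: familiar patterns (here, well-known
# years) act as single chunks, so 12 digits become only 3 items to remember.

def count_chunks(items, known_chunks):
    """Greedily group a sequence into the largest known chunks."""
    chunks = []
    i = 0
    while i < len(items):
        for size in range(len(items) - i, 0, -1):
            candidate = items[i:i + size]
            if candidate in known_chunks or size == 1:
                chunks.append(candidate)
                i += size
                break
    return chunks

digits = "191419452001"                # 12 digits: well beyond 7 +/- 2
years = {"1914", "1945", "2001"}       # familiar patterns act as single chunks
print(count_chunks(digits, years))     # ['1914', '1945', '2001'] -- 3 chunks
```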
The information processing framework has continued to develop and broaden over the years. Below are four models that are especially important to the approach:
Atkinson and Shiffrin’s Stage Theory
In 1968, Atkinson and Shiffrin developed the stage theory model. The model was later modified by other researchers but the basic outline of stage theory continues to be a cornerstone of information processing theory. The model concerns how information is stored in memory and presents a sequence of three stages, as follows:
Sensory Memory — Sensory memory takes in whatever we perceive through our senses. This kind of memory is exceedingly brief, lasting only up to about three seconds. For something to enter sensory memory, the individual has to pay attention to it. We can’t attend to every piece of information in the environment, so sensory memory filters out what seems irrelevant and passes along only what seems important to the next stage, short-term memory. The information most likely to reach the next stage is either interesting or familiar.
Short-Term Memory/Working Memory — Once information reaches short-term memory, which is also called working memory, it is filtered further. Once again, this kind of memory doesn’t last long, only about 15 to 20 seconds. However, if information is repeated, which is referred to as maintenance rehearsal, it can be stored for up to 20 minutes. As observed by Miller, working memory’s capacity is limited, so it can only process a certain number of pieces of information at a time. Exactly how many pieces is debated, although many still cite Miller’s figure of five to nine.
Several factors affect what and how much information will be processed in working memory. Cognitive load capacity varies from person to person and from moment to moment based on an individual’s cognitive abilities, the amount of information being processed, and one’s ability to focus and pay attention. Also, information that is familiar and has often been repeated doesn’t require as much cognitive capacity and, therefore, will be easier to process. For example, riding a bike or driving a car takes minimal cognitive load if you’ve performed these tasks numerous times. Finally, people pay more attention to information they believe is important, so that information is more likely to be processed. For example, if a student is preparing for a test, they are more likely to attend to information that will be on the test and forget about information they don’t believe they will be asked about.
Long-Term Memory — Although short-term memory has a limited capacity, the capacity of long-term memory is thought to be limitless. Several different types of information are encoded and organized in long-term memory: declarative information, which is information that can be discussed such as facts, concepts, and ideas (semantic memory) and personal experiences (episodic memory); procedural information, which is information about how to do something like drive a car or brush your teeth; and imagery, which are mental pictures.
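The three stages above form a pipeline: attention filters sensory input into working memory, and rehearsal encodes items into long-term memory. A minimal sketch (my own illustration, not a psychological simulation; the capacity figure comes from Miller) makes the flow concrete:

```python
# A toy pipeline for Atkinson and Shiffrin's stage theory: sensory memory ->
# working memory (via attention) -> long-term memory (via rehearsal).

WORKING_MEMORY_CAPACITY = 7  # Miller's 7 +/- 2 chunks

def attend(sensory_input, is_interesting):
    """Sensory memory: pass along only what attention selects."""
    return [item for item in sensory_input if is_interesting(item)]

def rehearse(working_memory, rehearsed):
    """Working memory: only rehearsed items are encoded into long-term memory."""
    working_memory = working_memory[:WORKING_MEMORY_CAPACITY]  # limited capacity
    return [item for item in working_memory if item in rehearsed]

sensory_input = ["exam date", "hallway noise", "formula", "poster color"]
wm = attend(sensory_input, lambda x: x in {"exam date", "formula"})
ltm = rehearse(wm, rehearsed={"formula"})
print(wm)   # ['exam date', 'formula']
print(ltm)  # ['formula']
```

The point of the sketch is the filtering at each hand-off: most of what reaches the senses never makes it past attention, and most of what reaches working memory is never rehearsed into long-term storage.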
Craik and Lockhart’s Levels of Processing Model
Although Atkinson and Shiffrin’s stage theory is still highly influential and is the basic outline on which many later models are built, its sequential nature over-simplified how memories are stored. As a result, additional models were created to expand upon it. The first of these was created by Craik and Lockhart in 1972. Their levels of processing theory states that the ability to access information in long-term memory is affected by how much it was elaborated upon. Elaboration is the process of making information meaningful so it is more likely to be remembered.
People process information with different levels of elaboration that make the information more or less likely to be retrieved later. Craik and Lockhart specified a continuum of elaboration that starts with perception, continues through attention and labeling, and ends at meaning. Regardless of the level of elaboration, all information is likely to be stored in long-term memory, but higher levels of elaboration make it more likely that the information will be able to be retrieved. In other words, we can recall far less information than we’ve actually stored in long-term memory.
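The continuum can be pictured as an ordering on stored items: everything is stored, but deeper-processed items come back first. This is my own toy illustration with made-up ordinal depth values, not an empirical model:

```python
# Toy ordering for Craik and Lockhart's continuum: perception < attention <
# labeling < meaning. All items stay in long-term memory; retrieval simply
# favors the more deeply elaborated ones.

DEPTH = {"perception": 1, "attention": 2, "labeling": 3, "meaning": 4}

def retrieval_order(stored):
    """Return items most-retrievable first, by depth of processing."""
    return sorted(stored, key=lambda item: DEPTH[stored[item]], reverse=True)

stored = {
    "saw the word": "perception",
    "noticed its font": "attention",
    "named the category": "labeling",
    "related it to own life": "meaning",
}
print(retrieval_order(stored))
# ['related it to own life', 'named the category', 'noticed its font', 'saw the word']
```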
Parallel-Distributed Processing Model and Connectionist Model
The parallel-distributed processing model and connectionist model contrast with the linear three-step process specified by the stage theory. The parallel-distributed processing model was a precursor to connectionism that proposed that information is processed by multiple parts of the memory system at the same time.
This was extended by Rumelhart and McClelland’s connectionist model in 1986, which said that information is stored in various locations throughout the brain that are connected through a network. Information that has more connections will be easier for an individual to retrieve.
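The connectionist claim that better-connected memories are easier to retrieve can be sketched with a trivially simple network (my own illustration, far simpler than a real connectionist model), where retrieval strength is just the number of links a concept has:

```python
# Toy sketch of the connectionist idea: a concept linked to many stored
# experiences is easier to recall than an isolated one.

network = {
    "bicycle": ["wheels", "childhood", "exercise", "commute"],
    "isotope": ["chemistry"],
}

def retrieval_strength(concept):
    """More connections -> easier retrieval (here, just the link count)."""
    return len(network.get(concept, []))

# "bicycle" connects to more stored experiences, so it is easier to recall.
print(retrieval_strength("bicycle") > retrieval_strength("isotope"))  # True
```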
Critiques of Information Processing Theory
While the information processing theory’s use of a computer as a metaphor for the human mind has proven potent, it is also limited. Computers aren’t influenced by things like emotions or motivations in their ability to learn and remember information, but these things can have a powerful impact on people. In addition, while computers tend to process things sequentially, evidence shows humans are capable of parallel processing.