Brainwork. David A. Sousa

in this area are not suggesting that the emotional brain entirely co-opts the decision-making process when working memory is overloaded. Rather, they suggest that just a few salient facts and feelings are processed over time below the level of consciousness—in unconscious thought—while the individual is engaged in unrelated conscious activities. Eventually, this unconscious process renders a decision that we recall and act on.

      Exactly what happens in the brain during working memory overload has been the interest of researcher Angelika Dimoka at Temple University’s Center for Neural Decision Making.6 She has studied the effects of too much information by working with bidders who are involved in complex marketing frenzies called combinatorial auctions. These are bidding wars for numerous items that people can buy alone or bundled, such as landing slots at a busy metropolitan airport. The vast number of variables that the bidders need to consider eventually leads to information overload. Dimoka used a brain imaging technique known as functional magnetic resonance imaging (fMRI) to measure brain activity in the prefrontal cortex.

      As the bidders received more and more information, Dimoka noticed that activity in the prefrontal cortex decreased quickly (see fig. 1.2, page 8). Working memory was getting full and rebelling. The bidders began to make dumb mistakes and bad choices because the prefrontal cortex essentially abandoned its role as the reasoned decision maker. Furthermore, without the prefrontal cortex exerting its control over the limbic system, emotions began to run rampant, causing a rise in the bidders’ anxiety and frustration. This combined effect resulted in many bad decisions or no decision at all. Apparently, if a little knowledge is a dangerous thing, too much knowledge can be paralyzing.

      One curious characteristic of working memory is the way it assigns importance to incoming information. In any learning situation, we tend to remember items presented at the beginning and the end much better than the items that came in the middle. The opening and closing of a presentation stay with us longer than the material in between. Researchers call this the primacy-recency effect. You probably experienced this effect earlier when you tried to remember that ten-digit number. Chances are high that you remembered the first several digits (9, 2, 3, 7) and last several (3, 0, 2) but had difficulty remembering those in the middle (5, 4, 6).

      Figure 1.3 illustrates how the degree of remembering varies throughout a learning situation. At the beginning, working memory has the capacity to process new information, so it commands our attention (the first peak in the figure). However, as the number of new items approaches the capacity limit, anything else coming into working memory is likely to be lost or remembered only partially (the dip in the figure). As the presentation concludes, working memory sorts the information and once again pays attention, this time to the final items (the second peak).

      Because of this effect, we are likely to give more importance to the most recent information we receive, while giving little weight to what came earlier. In this day and age, when information arrives often and fast, we frequently mistake immediacy for quality.

      Every instant, the human brain does an enormous amount of information processing as signals race across neurons to keep our minds alert and our bodies alive. Some experts claim that there are as many as a quadrillion (1,000,000,000,000,000) instructions zooming around the brain every second. Granted, many of these signals are handling internal information, such as body temperature, blood pressure, heart rate, movement, and other such functional data. But even if this estimate is only partially accurate, why doesn’t such processing power keep us from being overwhelmed by external information? Actually, the problem is not processing capacity so much as it is attention span.

      As we noted earlier, working memory has a limited capacity. The human brain tries to focus on a small number of items to determine whether they should be stored or rejected. As more items enter the system, attention shifts among them and focus diminishes. In effect, we lose our ability to concentrate on single items long enough to determine their importance. Items blur into a vague mass of unknown importance, and the brain responds with frustration and anxiety.

      British marketing analyst Gary Giddings offers a simple mathematical expression for this phenomenon.7 He says that the amount of total attention available (A) is equal to the number of items in an information source (s) multiplied by the amount of attention needed to examine each item (a). Thus, A = a × s. Let’s take a closer look at the import of this expression. Total attention available, or A, can act as either a constant or a variable, depending on the situation. At work, we subconsciously set the attention time for each item based on our previous experience handling similar problems and on an estimate of how much time we can devote to the task before something else comes along. As a result, most people have a fairly constant attention span (A) when dealing with information at work. Consequently, if A is constant, then as the number of information items (s) increases, the amount of time spent on each (a) has to decrease. Giddings wisely avoids putting numbers into his equation, because attention resources and allocations are not that precise.
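      To see the trade-off concretely, here is a minimal worked example with purely illustrative numbers (Giddings supplies none). If A is held constant, the expression can be rearranged to give the attention available per item:

\[
A = a \times s \quad\Longrightarrow\quad a = \frac{A}{s}
\]

\[
\text{Assuming } A = 60 \text{ minutes: } \quad s = 10 \text{ items} \Rightarrow a = 6 \text{ minutes per item}; \qquad s = 60 \text{ items} \Rightarrow a = 1 \text{ minute per item.}
\]

      Six times as many items leaves each item with one-sixth of the attention, which is exactly the dilution the expression predicts.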

      The total attention available, however, can vary dramatically when the situation changes. For example, the time we are able to attend to a task may be much longer when we are dealing with information related to home activities, such as interacting with a spouse or children, or when involved in a hobby. When I was a superintendent of schools, I often had to struggle to concentrate for just a few minutes on what seemed to be a silly problem that someone should have solved at a lower level. (I had lots of these on some days.) Yet, I could go home that same day and spend hours concentrating on an article about new discoveries in brain research. My attention span increased when the situation changed to something of greater interest to me. This example also explains why most of us are apt to respond to the ring of a cell phone even though we are trying to complete a work-related project. Oh, who could that be? How important is it? Interest perks up, and some of the attention resources devoted to the work project are diverted to musing about—and perhaps answering—the phone call. We might justify this action by saying that we are simply multitasking, but as we shall see in the next chapter, that explanation just doesn’t cut it.

      You know the drill. You want to get some information for a presentation you are giving to the senior vice presidents. As part of your presentation, you want to show your competitors’ sales numbers from the last quarter. You decide to search the Internet or an online database: Hmm, which of the 200,000 sites should I search? Which of the 150 press reports on these companies should I read? Oh, great! There are conflicting sales data from different sources. Which ones should I trust?

      Information overload describes the feeling you get when you are inundated with more information, arriving faster, than you can use appropriately. It is often associated with a sense of being overwhelmed and a loss of control. It is not a new phenomenon. In the Bible we find, “Of making many books there is no end; and much study is a weariness of the flesh.”8 The eighteenth-century French author Voltaire noted, “The multitude of books is making us ignorant.” Only a few decades ago, the major sources of information were radio, television, printed media,
