The takeaway from our discussion of social engineering is that while insiders may not intend to be malicious, they can be exploited by malicious outsiders who obtain insider-level access, both physically and technically. This has to be a critical consideration for any UIL mitigation strategy.
User Error
User error is a common thread running through most of the other categories in this chapter. Some errors are more consequential than others. Some errors cause loss of life, as is sometimes the case in medical procedures and diagnoses. Other errors cause large losses of money. Many errors cause inconvenience. Some errors have no consequence at all.
In the other categories within this chapter, we assume error to be induced in part by other factors, such as confusing interfaces. In this section, we focus primarily on error due to carelessness or accidents. Users are human beings, and the fact is that they sometimes just make mistakes.
Many people who are overworked, underpaid, or otherwise not treated well are less motivated to avoid making errors. Other people may be distracted for a variety of reasons, such as personal issues, medical conditions, lack of sleep, drug use, or similar issues. Sometimes, even good people can feel apathetic, overwhelmed, or pressured. All of this is to be expected. While you hope carelessness is not the norm, its consequences are more dire in some situations than in others. Ideally, organizational culture should foster greater attentiveness in situations where people have to perform at higher levels. Even in less critical situations, you can try to prevent carelessness as much as possible.
We classify accidents as being different from carelessness because even the best people in the best situations will occasionally make an error. For example, many users have accidentally deleted a message instead of forwarding it because they clicked the wrong button on an interface, particularly when they encountered an unexpected lag in cursor response.
Sometimes there can be multiple supposedly legitimate actions to take, and a person makes an error in determining which action is correct. Everyone has made a legitimate mistake while driving. In accounting, it is assumed that even the most attentive accountant will occasionally make a mistake.
Whether it occurs through a legitimate accident or carelessness, you must assume that users will make an error. You need to proactively plan for such errors and have audit procedures, warning systems, redundancies, and so on, to ensure that potential errors are mitigated before a loss can be initiated.
NOTE One of the most common examples of preventing accidental errors is providing a confirmation message that has to be acknowledged prior to the permanent deletion of an email message.
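To make the note concrete, here is a minimal sketch in Python of such a confirmation gate. It is purely illustrative: the permanently_delete function is a hypothetical stand-in for whatever irreversible operation a real mail client performs, not any particular product's API.

    def permanently_delete(message_id):
        # Hypothetical stand-in for a mail client's irreversible delete operation.
        print(f"Message {message_id} permanently deleted.")

    def confirm(prompt):
        # Require an explicit acknowledgment before a destructive action proceeds.
        answer = input(f"{prompt} Type 'yes' to confirm: ")
        return answer.strip().lower() == "yes"

    def delete_message(message_id):
        # The confirmation gate is the safeguard: a single accidental click or
        # keystroke cannot trigger the permanent deletion on its own.
        if confirm(f"Permanently delete message {message_id}?"):
            permanently_delete(message_id)
        else:
            print("Deletion cancelled.")

    delete_message("12345")

The point is not the code itself but the design choice: the destructive path requires a second, deliberate act, so an accidental error is caught before it becomes a loss.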
Inadequate Training
One fundamental assumption people make about awareness training is that a properly trained user will not make mistakes. The reality is that even with the best training, a user will make fewer mistakes, not zero mistakes.
Many people take for granted that common sense will help prevent a lot of mistakes. That might be an overly optimistic assumption. Either way, there can be no common sense without common knowledge. It is critical to ensure that all users are grounded in common knowledge. Training attempts to establish and strengthen this common knowledge.
However, training frequently falls short. Some training provides an adequate amount of knowledge but is short on practical experience. Knowledge without application is short-lived. A random piece of information, unless it is reinforced, rapidly dissipates from memory and is quickly forgotten. We explore this further when we discuss the concept of the forgetting curve in Chapter 5, “The Problem with Awareness Efforts.”
Proper training should ensure that users understand what their responsibilities are and how to perform them. Ideally, training also impresses upon users the need to be attentive in the performance of their duties. This requires accuracy and completeness in the training, as well as motivation.
Some training is grossly inadequate, inaccurate, or irrelevant. In 2018 and 2019, two Boeing 737 MAX airplanes crashed. There were multiple causes of these incidents, including technology implementation and user error (as will be discussed further in the upcoming “Technology Implementation” section). However, training requirements were also insufficient and left pilots not knowing how to handle the malfunctioning equipment. While that is an extreme example, failed training plagues all organizations to varying degrees.
Everyone has experience with inadequate training and can relate to the fact that such training results in loss. Fortunately, training can be strengthened to make it more effective. Chapter 15, “Creating Effective Awareness Programs,” addresses the improvement of training.
Technology Implementation
As we talk about UIL, it is important to consider the factors contributing to those losses. Everyone has experience with difficult-to-use systems that inevitably contribute to loss of some type. Some systems invite typographical errors that lead people to transfer the wrong amount of money. Some navigational systems lead drivers to the wrong destination or send them the wrong way down a one-way street.
Some user interfaces contribute to users making security-related mistakes. For example, one of the most common security-related losses results from autocomplete in the Recipients field of an email. People frequently choose the wrong recipient from among similar names. In one case, after one of our employees left our organization to work for a client and solicited bids from a variety of vendors, we received a proposal from a competitor. The competitor apparently used an old email to send the proposal to our former employee. In another case, a client issued an email with a request for proposals to various organizations, including us and our competitors. One competitor mistakenly clicked Reply All, and all potential bidders received a copy of its proposal.
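One way interfaces can counter this class of error is to check recipients before a message leaves the organization. The following minimal Python sketch is an illustration only; the trusted domain and the send step are hypothetical placeholders, not a description of any real mail product.

    TRUSTED_DOMAINS = {"example.com"}  # hypothetical organization domain

    def external_recipients(recipients):
        # Return any addresses whose domain falls outside the trusted set.
        return [r for r in recipients if r.split("@")[-1].lower() not in TRUSTED_DOMAINS]

    def send_with_warning(recipients, body):
        outside = external_recipients(recipients)
        if outside:
            # Require explicit acknowledgment so an autocomplete slip is
            # caught before the message is delivered outside the organization.
            answer = input(f"Warning: external recipients {outside}. Send anyway? ")
            if answer.strip().lower() != "yes":
                print("Send cancelled.")
                return
        print(f"Sending to {recipients}")  # stand-in for the actual send

    send_with_warning(["colleague@example.com", "former.employee@client-firm.com"], "Proposal attached.")

A check like this does not eliminate misaddressing, but it inserts a deliberate pause at exactly the point where the autocomplete errors described above occur.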
These are minor issues compared to some more serious losses. For example, as mentioned earlier, two Boeing 737 MAX airplanes crashed in 2018 and 2019. In both cases, Boeing initially attributed the crashes to pilot error, but it appeared that malfunctions of a mechanical device, an angle-of-attack (AOA) sensor, caused the computer system to force the plane to descend rapidly. There are two AOA sensors on each plane, and computer systems in similar Boeing 737 models would let the pilot know that there was a discrepancy between the readings of the two sensors. In the 737 MAX airplanes, the warning for the discrepancy was removed and made an optional feature, and pilots were not properly informed about the differing functionality. (See “Boeing Waited Until After Lion Air Crash to Tell Southwest Safety Alert Was Turned Off on 737 Max,” CNBC, www.cnbc.com/2019/04/28/boeing-didnt-tell-southwest-that-safety-feature-on-737-max-was-turned-off-wsj.html.) Clearly, even if there was some pilot error involved, it was enabled by the technological implementation of the system.
There are many aspects of technological implementation that contribute to UIL. The following sections examine design and maintenance, user enablement, shadow IT, and user interfaces.
Design and Maintenance
There are a wide variety of decisions made in the implementation of technology. These design decisions drive the interactions and capabilities provided to end users. Although it is easy to blame end users when they commit an act that leads to damage, if the design of the system leads them to commit the harmful action, it is hard to attribute the blame solely to the end user. Such was the case with attempts to blame the Lion Air and Ethiopian Airlines pilots of the doomed Boeing 737 MAX airplanes.