You CAN Stop Stupid - Ira Winkler

for completeness in the annual audit.

      That ideal world represents the embodiment of a system. A good example of this is McDonald's. Generally, McDonald's expects to hire minimally qualified people to deliver a consistent product anywhere in the world. This involves specifying a process and using technology to implement that process consistently. Although people may be involved in performing a function, such as cooking and food preparation, technology now drives those processes. A person might put the hamburgers on a grill, but the grill is automated to cook them for a specific time at a given temperature. The same is true for french fries. Even the amount of ketchup that goes on a hamburger is controlled by a device. Robots control drink preparation. McDonald's is now deploying kiosks that can potentially eliminate cashiers. Although a fast-food restaurant might not seem technology-related, the entire restaurant has become a system, driven by governance that is implemented almost completely through technology.

      We described in the book's introduction how the scuba and loss prevention industries treat mitigating loss as a comprehensive strategy. When organizations fail to do this, they end up implementing random tactics that are neither cohesive nor mutually supporting. For example, if you believe that users create loss because of an awareness failing and that the solution is simply better awareness, you are focusing on a single countermeasure. This approach will fail.

      NOTE Implementing the strategy across the entire business at all levels doesn't mean that every user needs to actively know and apply the depth and the breadth of the entire strategy. (The fry cook doesn't need to know how the accounting department works, and vice versa.) The team that implements the strategy coordinates its efforts in a way that informs, directs, and empowers every user to accomplish the strategy in whichever ways are most relevant for their role.

      In an ideal world, you will always look at any user-involved process, determine what damage the user can initiate, and remove the opportunity to cause that damage to the extent possible. If the opportunity for damage cannot be completely removed, you then specify for the user how to make the right decisions and take the appropriate actions to manage the possibility of damage. Finally, you must assume that some user will inevitably act in a way that leads to damage, so you plan how to detect the damaging actions and mitigate the resulting loss as quickly as possible.
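
      To make that layering concrete, the following is a minimal sketch in Python of the three-step posture described above: remove the opportunity for damage, guide the decisions that remain, and detect and mitigate whatever still goes wrong. Every name in it is hypothetical and purely illustrative; the book itself prescribes no code.

      def handle_user_action(action, allowed, risky, audit_log):
          # Step 1: Remove the opportunity. Actions the user should
          # never be able to take are blocked outright.
          if action not in allowed:
              return "blocked"
          # Step 2: Guide the decision. Permitted but potentially
          # damaging actions require explicit confirmation.
          if action in risky:
              return "confirm required"
          # Step 3: Assume damage is still possible. Record the action
          # so damaging behavior can be detected and any resulting
          # loss mitigated quickly.
          audit_log.append(action)
          return "allowed"

      log = []
      allowed = {"read_record", "delete_record"}
      risky = {"delete_record"}
      print(handle_user_action("read_record", allowed, risky, log))    # allowed
      print(handle_user_action("delete_record", allowed, risky, log))  # confirm required
      print(handle_user_action("drop_database", allowed, risky, log))  # blocked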

      At a minimum, when you come across a situation where a user creates damage, you should no longer think, “Just another stupid user.” You should immediately consider why the user was in a position to create damage and why the organization wasn't more effective in preventing it.

      Users inevitably make mistakes. That is a given. At the same time, within an environment that supports good user behavior, users behave reasonably well. The same weakest link who creates security problems and damages systems can also be an effective countermeasure that proactively detects, reports, and stops attacks.

      While the previous statements are paradoxically true, the reality is that users are inconsistent. They are not computers that you can expect to perform the same function consistently from one occurrence to the next. More important, not all users are alike. There is a continuum across which you can expect a range of user behaviors.

      It is a business fact that users are part of the system. Some users might be data entry workers, accountants, factory workers, help desk responders, team members performing functions in a complex process, or other types of employees. Other users might be outside the organization, such as customers on the Internet or vendors performing data entry. Whatever the case, any person who accesses the system must be considered a part of the system.

      It is especially critical to note that the technology and security teams rarely have any control over the hiring of users. Depending upon the environment, the end users might not be employees at all, but customers and vendors over whom the organization has relatively little control. The technology and security teams have to account for every possible end user of any ability.

      Given the limited control that technology and security teams have over users, it is not uncommon for some of these professionals to think of users as the weakest link in the system. However, doing so is one of the biggest cop-outs in security, if not in technology management as a whole.

      Users are not a “necessary evil.” They are not an annoyance to be endured when they have questions. Looking down upon users ignores the fact that they are a critical part of the system that security and technology teams are responsible for. In some cases, they might be the reason that these teams have a job in the first place.

      It is your job to ensure that you proactively address any expected areas of loss in the system, including users. Users can only be your weakest link if you fail to mitigate expected user-related issues such as user error and malfeasance.

      Perhaps one of the most notable examples is the trouble with the B-17 bomber. Clearly, a pilot is a critical part of flying an airplane, not just a “user” in the most limited sense of the term. When the B-17 underwent its first test flights in 1935, it was the most complex airplane of its time. The test pilots chosen were among the top pilots in the country. Yet these top test pilots crashed the plane because they failed to disengage a locking mechanism on the flight controls.

      It was determined that the pilots were overwhelmed by the complexity and made a simple mistake. As the pilots were a critical part of the system, removing them was not an option. They were highly experienced and trained professionals, so the problem was not poor training. The government could have sent the pilots for additional training, but retraining top pilots in the basics of how to fly the plane would not have been an efficient approach. Instead, they recognized that the problem was the overwhelming complexity of the airplane itself.

      Users can be both a blessing and a curse. For the