Trust in Computer Systems and the Cloud. Mike Bursell


of these approaches have led to developments that are important and relevant to our field of study, the most obvious being interest in using blockchains as the basis of crypto-currencies, providing an alternative to fiat currencies, and research into self-sovereign identity (SSI). This approach rejects state, national, regional, or commercial organisations as the appropriate repositories for, and owners of, personal information held about individuals, such as their health or financial data, and seeks to provide means for individuals to control this data and how it is collected, used, and changed. The mechanics of handling different types of data and their various usages are still under debate, and the trust issues are also still being studied. Other movements that we could associate with these approaches include the copyleft movement, which attempts to undermine the controls put in place to support copyright, and the open source movement,33 a subject of discussion later in the book.

      Having spent some time considering the questions associated with trusting institutions of various types, we need to look at issues around trusting individuals. In what may seem like a strange move, we are going to start by asking whether we can even trust ourselves.

      Trusting Ourselves

      Wikipedia provides many examples of cognitive bias, and they may seem irrelevant to our quest. However, as we consider risk—which, as discussed in Chapter 1, is what we are trying to manage when we build trust relationships to other entities—there needs to be a better understanding of our own cognitive biases and those of the people around us. We like to believe that we and they make decisions and recommendations rationally, but the study of cognitive bias provides ample evidence that:

       We generally do not.

       Even if we do, we should not expect those to whom we present them to consider them entirely rationally.

      There are opportunities for abuse here: techniques beloved of advertisers and the media to manipulate our thinking to their ends, which we could use to our advantage to try to manipulate others. One example is the framing effect. If you do not want your management to fund a new anti-virus product because you have other ideas for the currently earmarked funding, you might say:

       “Our current product is 80% effective!”

      Whereas if you do want them to fund it, you might say:

       “Our current product is 20% ineffective!”
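      The two statements describe exactly the same measurement: each framing is simply the complement of the other. A minimal sketch (the effectiveness figure and the function name are illustrative, not from the book) makes that explicit:

```python
def framings(effectiveness: float) -> tuple[str, str]:
    """Present the same measurement framed positively and negatively."""
    positive = f"Our current product is {effectiveness:.0%} effective!"
    negative = f"Our current product is {1 - effectiveness:.0%} ineffective!"
    return positive, negative

pos, neg = framings(0.80)
print(pos)  # Our current product is 80% effective!
print(neg)  # Our current product is 20% ineffective!
```

      Nothing about the underlying data changes between the two outputs; only the presentation does, which is precisely what makes the framing effect so useful for manipulation.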

      Three further examples of cognitive bias serve to show how risk calculations may be manipulated, either by presentation or just by changing the thought processes of those making calculations:

       Irrational Escalation, or the Fallacy of Sunk Costs This is the tendency for people to keep throwing money or resources at a project, vendor, or product when it is clear that it is no longer worth it, with the rationale that to stop spending money (or resources) now would waste what has already been spent—despite the fact that those resources have already been consumed. This often comes over as misplaced pride or people not wishing to let go of a pet project because they have become attached to it, but it is really dangerous for security. If something clearly is not effective, it should be thrown out, rather than good money being sent after bad.

       Normalcy Bias This is the refusal to address a risk because the event associated with it has never happened before. It is an interesting one when considering security and risk, for the simple reason that so many products and vendors are predicated on exactly that: protecting organisations from events that have so far not occurred. The appropriate response is to perform a thorough risk analysis and then put measures in place to deal with those risks that are truly high priority, not those that may not happen or that do not seem likely at first glance.
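      One common way to make such a risk analysis concrete, sketched here with an invented risk register (the events, likelihoods, and costs are hypothetical, and this scoring is a standard expected-loss heuristic rather than a method the book prescribes), is to score each risk as likelihood times impact and rank accordingly, so that priorities come from the analysis rather than from whether an event has happened before:

```python
# Hypothetical risk register: (event, estimated annual likelihood, impact cost).
risks = [
    ("phishing credential theft", 0.60, 50_000),
    ("novel zero-day in our VPN", 0.02, 500_000),
    ("laptop theft", 0.30, 10_000),
]

# Expected annual loss = likelihood * impact; rank highest first.
ranked = sorted(risks, key=lambda r: r[1] * r[2], reverse=True)
for event, likelihood, impact in ranked:
    print(f"{event}: expected annual loss {likelihood * impact:,.0f}")
```

      Note that the never-before-seen event can still rank highly if its expected loss warrants it; the point is that the ranking is driven by the calculation, not by a gut feeling that "it has never happened, so it never will".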

       Observer-Expectancy Effect This is when people who are looking for a particular result find it because they have (consciously or unconsciously) misused the data. It is common in situations such as those where there is a belief that a particular attack or threat is likely, and the data available (log files, for instance) are used in a way that confirms this expectation.
