Trust in Computer Systems and the Cloud. Mike Bursell


when I cannot necessarily discover what behaviour is happening, because if I cannot, then I have no way to know whether it is in my best interests or not. One might even expect that if behaviours are in my best interests, they would be disclosed to me as part of the description of the actions about which I am deciding to accept assurances. When I have significant concerns that there are behaviours explicitly against my interests, the situation becomes more serious. A large-scale example of this is the trust relationship that governments need to have with critical national infrastructure. The exact definition of critical national infrastructure (often capitalised or abbreviated to CNI) varies between experts and countries, but it is the collection of core hardware, software, and services that are key to keeping citizens safe and key elements of society functioning. A list might include the following:

       Power generation

       Water and sewerage

       Basic transport networks

       Emergency services

       Healthcare

       Location services (e.g., GPS)

       Telecommunications

       Internet access

      For the purposes of many governments, the final two have become so intertwined that they can hardly be separated. What is noteworthy about telecommunications and core Internet capabilities is the small number of suppliers across the world. One of those is Huawei, which is based in the People's Republic of China. The government of the United States, whose relationship with the Chinese state and government can be characterised as a rivalry, if not outright enmity, takes the view that given the nature of the ownership of Huawei, and its base in China, the telecommunications equipment that it manufactures and provides cannot be trusted.

      This is a strong stance to take, and the concerns that are expressed are well-defined. The US government asserts that there is a real risk that a provider of telecommunications equipment and associated software who is based within China may come under enough pressure from the Chinese government to include hidden features that could affect the confidentiality, integrity, or availability of services that are part of the United States' critical national infrastructure. If this were the case, it would allow communications that could be critical to the United States to be eavesdropped on or even tampered with by the Chinese government or those acting for it. The suggestion that the Chinese government would ever exert pressure to insert such capabilities, typically known as back doors, is strongly disputed by the Chinese government and by Huawei itself. However, to frame these concerns within our definition of trust relationships, and from the point of view of the US government, there is insufficient assurance that the actions to be taken by such pieces of equipment are as expected; the US government has therefore taken the view that no trust relationship should be formed with equipment that might be supplied by Huawei.

      Many applications, when running, will perform actions that are not core to the functioning of the program itself, which we might call side effects. At the API level, there is a more formal use of this phrase, where actions are performed on data or variables that are not "local" to the function or operator being called. The general case, where non-core actions are performed, gives us real cause for concern. Two typical examples, which are recurring problems for IT security, will serve to illustrate the problem.

      Similar problems can occur with backup files. These differ from log files in that they are not intended for consumption by anything other than the application that may need to recover in the event of a problem, but the files need to contain enough data and information for the application to recover all of the state that it needs to continue operation. There is definitely a possible cost here to the user of an application if these files are accessed by unauthorised parties, but at least in this case, there is a possible benefit, too: the application can continue to be used. The question is whether this benefit outweighs the possible cost and, more specifically, whether the trustor even has the ability to make a choice as to whether backup files are stored or has enough information to make an informed choice as to whether they should be. While backup files on a local system are typically accessible to a user—though not always, nor always advertised—the likelihood of this being the case for remote or multi-user systems is significantly reduced. It would be good—that is, in my interests—if I, as trustor, were given the option to back up my own data in this case and insist that any backups generated by the trustee be anonymised or have any critical data removed. But even if I can insist on this, the chances that I can realistically enforce it are very low.

      This is, in fact, another example of security economics: the backups are put in place not for my benefit but for the benefit of the entity operating the application or service I am using. Even if I have visibility into the actions they are performing, I have little or no chance or opportunity to influence them in my favour. Sometimes, despite
