2.4.1 Access Control
The aim of access control is to grant access only to those who are authorized to be in a building or workplace. The deadbolt lock, together with its matching metal key, was the gold standard of access control for many years, but modern enterprises want more. Yes, you want to control who passes through your doors, but you also want to monitor and manage that access. Keys have now passed the baton to computer-based electronic access control systems that give authorized users fast, convenient access while prohibiting access to unauthorized persons.
Today, we carry access cards or ID badges to enter secure places instead of keys. Access control systems may also be used to restrict access to workstations, file rooms containing sensitive information, printers, and entry portals. In larger buildings, access through the external door is typically managed by the landlord or managing agency, while access to the internal office door is controlled by the tenant.
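Modern electronic access control systems typically pair authentication (checking who the user is) with authorization (checking what the user may do). The sketch below illustrates only the authorization half as a role-based check; the user names, roles, and permissions are hypothetical and chosen purely for illustration.

```python
# Minimal role-based access control sketch; users, roles, and permissions are illustrative.
ROLE_PERMISSIONS = {
    "clinician": {"read_record", "update_record"},
    "admin": {"read_record", "update_record", "delete_record"},
}
USER_ROLES = {"alice": "clinician", "bob": "admin"}

def is_authorized(user: str, permission: str) -> bool:
    """True if the user's role grants the requested permission."""
    role = USER_ROLES.get(user)
    return role is not None and permission in ROLE_PERMISSIONS.get(role, set())

print(is_authorized("alice", "read_record"))    # True
print(is_authorized("alice", "delete_record"))  # False
```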
Frequency of access: How often the data is accessed. Access control is a fundamental component of data security that dictates who is allowed to access and use company information and resources. Through authentication and authorization, access control policies make sure users are who they say they are and that they have appropriate access to company data.

Figure 2.2 Data classification in cloud computing.
Frequency of update: How often the data is updated or replicated; is it low, medium, or high?
Visibility and accessibility: The ability of one entity to “see” (i.e., have direct access to) another.
A related concept is lexical scope: the part of the source code in which a name binding can refer to its entity.
Retention: Data retention, or record retention, is exactly what it sounds like—the practice of storing and managing data and records for a designated period of time. There are many reasons why a business might need to retain data: to maintain accurate financial records; to abide by local, state, and federal laws; to comply with industry regulations; to ensure that information is easily accessible for eDiscovery and litigation purposes; and so on. To fulfill these and other business requirements, it is imperative that every organization develops and implements data retention policies.
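As a simple illustration of how such a policy might be enforced, the sketch below checks whether a record has outlived its retention period. The data classes and retention periods are assumptions made for the example; real periods come from policy and regulation.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention periods per data class.
RETENTION_PERIODS = {
    "financial": timedelta(days=7 * 365),
    "operational": timedelta(days=365),
}

def is_expired(data_class: str, created_at: datetime, now: datetime | None = None) -> bool:
    """True if the record is older than its class's retention period and may be disposed of."""
    now = now or datetime.now(timezone.utc)
    period = RETENTION_PERIODS.get(data_class)
    return period is not None and now - created_at > period

print(is_expired("operational", datetime(2020, 1, 1, tzinfo=timezone.utc)))  # True
```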
2.4.2 Content
These properties relate to the quality and modification of the data content. Data content has many properties, which can be classified into the following:
Accuracy: Data accuracy can be classified as low or high. Some data elements require high content precision and accuracy, while for others lower accuracy is acceptable.
Reliability/Validity: Reliability and validity are concepts used to assess the quality of measurement. They indicate how well a method, technique, or test measures what it is intended to measure.
Data resolution: The level of detail or granularity at which data is captured and stored. Higher-resolution data carries more detail but requires more storage and processing.
Auditability: A data audit refers to auditing data to assess its quality or utility for a specific purpose. Unlike auditing finances, auditing data involves looking at key metrics other than quantity to draw conclusions about the properties of a data set.
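By way of illustration, the sketch below computes two such metrics, completeness and distinct-value counts, for each column of a dataset using pandas; the column names and values are invented for the example.

```python
import pandas as pd

def audit_metrics(df: pd.DataFrame) -> dict:
    """Per-column quality metrics: share of non-missing values and number of distinct values."""
    return {
        column: {
            "completeness": float(df[column].notna().mean()),
            "distinct_values": int(df[column].nunique()),
        }
        for column in df.columns
    }

# Hypothetical example data
readings = pd.DataFrame({"patient_id": [1, 2, 2, None], "heart_rate": [72, 80, None, 64]})
print(audit_metrics(readings))
```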
2.4.3 Storage
Storage-related properties determine how data must be stored and protected. They can be described using the following criteria:
Storage encryption: The use of encryption for data both in transit and on storage media. Data is encrypted as it is written to storage devices such as individual hard disks, tape drives, or the libraries and arrays that contain them (see the sketch following this list).
Communication encryption: Data leaving the system is at risk of leakage or eavesdropping, so sensitive data must be encrypted while it is being communicated.
Integrity: Data integrity is a critical issue and is typically handled with hash algorithms such as MD5 and SHA. The required level of integrity protection also depends on the security level of the data element (also illustrated in the sketch following this list).
Access control policy: It aims to ensure that, with the appropriate access controls in place, the right information is accessible to the right people at the right time and that access to information, in all its forms, is appropriately managed and periodically audited.
Backup and recovery plan: A backup plan is required for disaster recovery. Data must be tied to a baseline backup scheme, and separate standards apply for authenticating user data, as required by the data quality standards for each classification level.
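The sketch below combines two of the ideas above: encrypting data before it reaches the storage medium and keeping a hash digest for integrity checking. It assumes the Python `cryptography` package is available; the file name and record contents are invented, SHA-256 is used rather than MD5 because MD5 is no longer considered collision-resistant, and key handling is deliberately simplified.

```python
import hashlib
from cryptography.fernet import Fernet

# In practice the key would live in a key management service, not in the program.
key = Fernet.generate_key()
fernet = Fernet(key)

record = b"patient vitals: 72 bpm, 36.6 C"
digest = hashlib.sha256(record).hexdigest()   # stored alongside the data for integrity checks

with open("record.enc", "wb") as f:
    f.write(fernet.encrypt(record))           # only ciphertext reaches the storage device

with open("record.enc", "rb") as f:
    restored = fernet.decrypt(f.read())

# Verify integrity: the digest of the restored data must match the stored digest.
print("integrity ok" if hashlib.sha256(restored).hexdigest() == digest else "data was altered")
```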
2.4.4 Soft Computing Techniques for Data Classification
Soft computing techniques are a collection of methodologies that:
• Exploit the tolerance for imperfection and uncertainty.
• Provide capability to handle real-life ambiguous situations.
• Try to achieve robustness against imperfection.
One of the most popular soft computing-based classification techniques is fuzzy classification. Fuzzy classes can represent transitional areas better than hard classification because class membership is not binary; one item can belong to several classes at once. In fuzzy set-based systems, membership values of data items range between 0 and 1, where 1 indicates full membership and 0 indicates no membership. Figure 2.3 shows a block diagram of the fuzzy classification technique.
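A minimal sketch of this idea is shown below, using triangular membership functions over a single numeric attribute; the class names and breakpoints are illustrative assumptions, not taken from the chapter.

```python
def triangular(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function: 0 outside [a, c], rising to 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_classify(score: float) -> dict:
    """Membership of a normalized score in each (hypothetical) sensitivity class."""
    return {
        "low":    triangular(score, -0.5, 0.0, 0.5),
        "medium": triangular(score,  0.0, 0.5, 1.0),
        "high":   triangular(score,  0.5, 1.0, 1.5),
    }

# One value can belong to several classes with partial membership between 0 and 1.
print({name: round(m, 2) for name, m in fuzzy_classify(0.4).items()})
# {'low': 0.2, 'medium': 0.8, 'high': 0.0}
```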
This section explains the various layers of the analysis framework. The analytical framework is divided into a user interface layer and a processing layer. The user interface layer is responsible for taking input from the user; the processing layer is responsible for classification and comparison; and the data access layer is responsible for connecting the application to the database that stores the data. Figure 2.4 shows the system architecture and the interaction between the various components. Each layer is implemented as a class that provides the layer's interface and data processing.
Figure 2.3 Fuzzy classification block diagram.
Figure 2.4 illustrates that the analytical framework consists of two layers: the first layer provides the user interface that allows users to select the desired dataset and algorithms, and the second layer provides the processing component that runs the selected algorithm.
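To make the layering concrete, the sketch below wires three such layers together; the class and method names are assumptions chosen for illustration, not the chapter's actual implementation.

```python
# Illustrative wiring of the framework's layers; all names are hypothetical.

class DataAccessLayer:
    """Connects the application to storage for persisting results."""
    def save(self, result: dict) -> None:
        print("saving result:", result)

class ProcessingLayer:
    """Runs the selected classification algorithm and prepares results for comparison."""
    def classify(self, dataset: list, algorithm: str) -> dict:
        return {"algorithm": algorithm, "records": len(dataset)}

class UserInterfaceLayer:
    """Takes the user's dataset and algorithm selection and forwards them for processing."""
    def __init__(self, processing: ProcessingLayer, storage: DataAccessLayer):
        self.processing = processing
        self.storage = storage

    def run(self, dataset: list, algorithm: str) -> None:
        result = self.processing.classify(dataset, algorithm)
        self.storage.save(result)

UserInterfaceLayer(ProcessingLayer(), DataAccessLayer()).run([1, 2, 3], "fuzzy")
```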