Linux Bible. Christopher Negus

At its most basic level, creating and managing cgroups is generally not a job for new Linux system administrators. It can involve editing configuration files to create your own cgroups (/etc/cgconfig.conf) or to set up limits for particular users or groups (/etc/cgrules.conf). Or you can use the cgcreate command to create cgroups, which results in those groups being added to the /sys/fs/cgroup hierarchy. Setting up cgroups can be tricky and, if done improperly, can make your system unbootable.
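      To make the mechanics a little more concrete, here is a minimal sketch of creating a cgroup from the command line. It assumes a cgroups v1 layout (as on RHEL 7) and that the libcgroup-tools package, which provides cgcreate and related utilities, is installed; the group name testgroup is just an example:

   # cgcreate -g memory,cpu:/testgroup      # create testgroup under the memory and cpu controllers
   # ls /sys/fs/cgroup/memory/testgroup     # the group appears as a directory of control files
   # cgdelete -g memory,cpu:/testgroup      # remove the group again when you are done experimenting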

      The reason I bring up the concept of cgroups here is to help you understand some of the underlying features in Linux that can be used to limit and monitor resource usage. In the future, you will probably run into these features through controllers that manage your cloud infrastructure. You will be able to set rules like “Allow the Marketing department's virtual machines to consume up to 40 percent of the available memory” or “Pin the database application to a particular CPU and memory set.”
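      As a rough sketch of what such rules look like at the cgroup level (again assuming cgroups v1 and the libcgroup tools, with made-up group names and values, and with the groups already created by cgcreate), you could cap memory for one group and pin another to specific CPUs:

   # cgset -r memory.limit_in_bytes=1G mktgroup    # cap processes in mktgroup at 1GB of memory
   # cgset -r cpuset.cpus=0-1 dbgroup              # allow tasks in dbgroup to run only on CPUs 0 and 1
   # cgset -r cpuset.mems=0 dbgroup                # cpuset groups also need a memory node assigned
   # cgexec -g cpuset:dbgroup sleep 600            # start a (trivial example) process inside dbgroup

      In practice, cloud management tools apply this kind of policy for you; the point is that cgroups are the Linux feature doing the work underneath.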

      Knowing how Linux can limit and contain the resource usage by the set of processes assigned to a task will ultimately help you manage your computing resources better. If you are interested in learning more about cgroups, you can refer to the following:

       Red Hat Enterprise Linux Resource Management and Linux Containers Guide: https://access.redhat.com/documentation/en-us/red_hat_enterprise_linux/7/html-single/resource_management_guide/index

       Kernel documentation on cgroups: Refer to files in the /usr/share/doc/kernel-doc-*/Documentation/cgroups directory after installing the kernel-doc package.

      Even on a Linux system where there isn't much activity, typically dozens or even hundreds of processes are running in the background. Using the tools described in this chapter, you can view and manage the processes running on your system.

      In the next chapter, you learn how to combine commands and programming functions into files that can be run as shell scripts.

      Use these exercises to test your knowledge of viewing running processes and then changing them by killing them or adjusting their processor priority (nice value). These tasks assume that you are running a Fedora or Red Hat Enterprise Linux system (although some tasks work on other Linux systems as well). If you are stuck, solutions to the tasks are shown in Appendix B (although in Linux, you can often use multiple ways to complete a task). A sketch of possible command lines for several of these tasks appears after the list.

      1 List all processes running on your system, showing a full set of columns. Pipe that output to the less command so that you can page through the list of processes.

      2 List all processes running on the system and sort those processes by the name of the user running each process.

      3 List all processes running on the system, and display the following columns of information: process ID, username, group name, virtual memory size, resident memory size, and the command.

      4 Run the top command to view processes running on your system. Go back and forth between sorting by CPU usage and memory consumption.

      5 Start the gedit process from your desktop. Make sure that you run it as the user you are logged in as. Use the System Monitor window to kill that process.

      6 Run the gedit process again. This time, using the kill command, send a signal to the gedit process that causes it to pause (stop). Try typing some text into the gedit window and make sure that no text appears yet.

      7 Use the killall command to tell the gedit command that you paused in the previous exercise to continue working. Make sure that the text you typed while gedit was paused now appears in the window.

      8 Install the xeyes command (in Fedora, it is in the xorg-x11-apps package). Run the xeyes command about 20 times in the background so that 20 xeyes windows appear on the screen. Move the mouse around and watch the eyes watch your mouse pointer. When you have had enough fun, kill all xeyes processes in one command using killall.

      9 As a regular user, run the gedit command so that it starts with a nice value of 5.

      10 Using the renice command, change the nice value of the gedit command you just started to 7. Use any command you like to verify that the current nice value for the gedit command is now set to 7.
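      In case you want a nudge before turning to Appendix B, here is one possible set of command lines for several of the tasks above (each can be done in other ways as well):

   $ ps -ef | less                                # Task 1: full listing of all processes, paged with less
   $ ps -ef --sort=user | less                    # Task 2: sort the listing by user name
   $ ps -eo pid,user,group,vsz,rss,comm | less    # Task 3: select specific output columns
   $ kill -STOP $(pidof gedit)                    # Task 6: pause (stop) the gedit process
   $ killall -CONT gedit                          # Task 7: tell gedit to continue working
   $ nice -n 5 gedit &                            # Task 9: start gedit with a nice value of 5
   $ renice -n 7 $(pidof gedit)                   # Task 10: change the running gedit to a nice value of 7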

      IN THIS CHAPTER

       Working with shell scripts

       Doing arithmetic in shell scripts

       Running loops and cases in shell scripts

       Creating simple shell scripts

      You'd never get any work done if you typed every command that needs to be run on your Linux system when it starts. Likewise, you could work more efficiently if you grouped together sets of commands that you run all the time. Shell scripts can handle these tasks.

      A shell script is a group of commands, functions, variables, or just about anything else you can use from a shell. These items are typed into a plain-text file. That file can then be run as a command. Linux systems have traditionally used system initialization shell scripts during system startup to run commands needed to get services going. You can create your own shell scripts to automate the tasks that you need to do regularly.
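      For example, here is a minimal sketch of what a shell script might look like; the filename myscript and its contents are just examples. You put the commands in a plain-text file, make the file executable, and then run it like any other command:

   $ cat myscript
   #!/bin/bash
   # A trivial script: greet the user, then show the current directory and date
   echo "Hello, $USER"
   pwd
   date
   $ chmod 755 myscript
   $ ./myscript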

      For decades, building shell scripts was the primary skill needed to join together sets of tasks in UNIX and Linux systems. As demands for configuring Linux systems grew beyond single-system setups to complex, automated cluster configurations, more structured methods have arisen. These methods include Ansible playbooks and Kubernetes YAML files, described later in cloud-related chapters. That said, writing shell scripts is still the best next step from running individual commands to building repeatable tasks in Linux systems.

      This chapter provides a rudimentary overview of the inner workings of shell scripts and how they can be used. You learn how simple scripts can be harnessed to a scheduling facility (such as cron or at) to simplify administrative tasks, or simply run on demand as needed.
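      As a small illustration of that idea, assume that you have saved a script as /usr/local/bin/myscript (a hypothetical path). A line in your personal crontab could run it every night, or the at command could queue a single run:

   $ crontab -e
   0 1 * * *  /usr/local/bin/myscript

      The crontab entry above runs the script at 1:00 a.m. every day. For a one-time run, at reads the command from its prompt and ends when you press Ctrl+D:

   $ at 11pm
   at> /usr/local/bin/myscript
   at> <EOT>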

      Have you ever had a task that you needed to do over and over that took lots of typing on the command line? Do you ever think to yourself, “Wow, I wish I could just type one command to do all this”? Maybe a shell script is what you're after.
