Advanced Bash-Scripting Guide
<[email protected]>

Version 6.6    27 Nov 2012

Revision History
  Revision 6.4    'VORTEXBERRY' release
  Revision 6.5    'TUNGSTENBERRY' release
  Revision 6.6    'YTTERBIUMBERRY' release
This tutorial assumes no previous knowledge of scripting or programming, but progresses rapidly toward an intermediate/advanced level of instruction . . . all the while sneaking in little nuggets of UNIX wisdom and lore. It serves as a textbook, a manual for self-study, and as a reference and source of knowledge on shell scripting techniques. The exercises and heavily-commented examples invite active reader participation, under the premise that the only way to really learn scripting is to write scripts. This book is suitable for classroom use as a general introduction to programming concepts.
Dedication
For Anita, the source of all the magic
Table of Contents

Chapter 1. Shell Programming!
Chapter 2. Starting Off With a Sha-Bang
    2.1. Invoking the script
    2.2. Preliminary Exercises

Part 2. Basics
Chapter 3. Special Characters
Chapter 4. Introduction to Variables and Parameters
    4.1. Variable Substitution
    4.2. Variable Assignment
    4.3. Bash Variables Are Untyped
    4.4. Special Variable Types
Chapter 5. Quoting
    5.1. Quoting Variables
    5.2. Escaping
Chapter 6. Exit and Exit Status
Chapter 7. Tests
    7.1. Test Constructs
    7.2. File test operators
    7.3. Other Comparison Operators
    7.4. Nested if/then Condition Tests
    7.5. Testing Your Knowledge of Tests
Chapter 8. Operations and Related Topics
    8.1. Operators
    8.2. Numerical Constants
    8.3. The Double-Parentheses Construct
    8.4. Operator Precedence

Part 3. Beyond the Basics
Chapter 9. Another Look at Variables
    9.1. Internal Variables
    9.2. Typing variables: declare or typeset
        9.2.1. Another use for declare
    9.3. $RANDOM: generate random integer
Chapter 10. Manipulating Variables
    10.1. Manipulating Strings
        10.1.1. Manipulating strings using awk
        10.1.2. Further Reference
    10.2. Parameter Substitution
Chapter 11. Loops and Branches
    11.1. Loops
    11.2. Nested Loops
    11.3. Loop Control
    11.4. Testing and Branching
Chapter 12. Command Substitution
Chapter 13. Arithmetic Expansion
Chapter 14. Recess Time

Part 4. Commands
Chapter 15. Internal Commands and Builtins
    15.1. Job Control Commands
Chapter 16. External Filters, Programs and Commands
    16.1. Basic Commands
    16.2. Complex Commands
    16.3. Time / Date Commands
    16.4. Text Processing Commands
    16.5. File and Archiving Commands
    16.6. Communications Commands
    16.7. Terminal Control Commands
    16.8. Math Commands
    16.9. Miscellaneous Commands
Chapter 17. System and Administrative Commands
    17.1. Analyzing a System Script

Part 5. Advanced Topics
Chapter 18. Regular Expressions
    18.1. A Brief Introduction to Regular Expressions
    18.2. Globbing
Chapter 19. Here Documents
    19.1. Here Strings
Chapter 20. I/O Redirection
    20.1. Using exec
    20.2. Redirecting Code Blocks
    20.3. Applications
Chapter 21. Subshells
Chapter 22. Restricted Shells
Chapter 23. Process Substitution
Chapter 24. Functions
    24.1. Complex Functions and Function Complexities
    24.2. Local Variables
        24.2.1. Local variables and recursion
    24.3. Recursion Without Local Variables
Chapter 25. Aliases
Chapter 26. List Constructs
Chapter 27. Arrays
Chapter 28. Indirect References
Chapter 29. /dev and /proc
    29.1. /dev
    29.2. /proc
Chapter 30. Network Programming
Chapter 31. Of Zeros and Nulls
Chapter 32. Debugging
Chapter 33. Options
Chapter 34. Gotchas
Chapter 35. Scripting With Style
    35.1. Unofficial Shell Scripting Stylesheet
Chapter 36. Miscellany
    36.1. Interactive and non-interactive shells and scripts
    36.2. Shell Wrappers
    36.3. Tests and Comparisons: Alternatives
    36.4. Recursion: a script calling itself
    36.5. "Colorizing" Scripts
    36.6. Optimizations
    36.7. Assorted Tips
        36.7.1. Ideas for more powerful scripts
        36.7.2. Widgets
    36.8. Security Issues
        36.8.1. Infected Shell Scripts
        36.8.2. Hiding Shell Script Source
        36.8.3. Writing Secure Shell Scripts
    36.9. Portability Issues
        36.9.1. A Test Suite
    36.10. Shell Scripting Under Windows
Chapter 37. Bash, versions 2, 3, and 4
    37.1. Bash, version 2
    37.2. Bash, version 3
        37.2.1. Bash, version 3.1
        37.2.2. Bash, version 3.2
    37.3. Bash, version 4
        37.3.1. Bash, version 4.1
        37.3.2. Bash, version 4.2
Chapter 38. Endnotes
    38.1. Author's Note
    38.2. About the Author
    38.3. Where to Go For Help
    38.4. Tools Used to Produce This Book
        38.4.1. Hardware
        38.4.2. Software and Printware
    38.5. Credits
    38.6. Disclaimer

Bibliography

Appendix A. Contributed Scripts
Appendix B. Reference Cards
Appendix C. A Sed and Awk Micro-Primer
    C.1. Sed
    C.2. Awk
Appendix D. Parsing and Managing Pathnames
Appendix E. Exit Codes With Special Meanings
Appendix F. A Detailed Introduction to I/O and I/O Redirection
Appendix G. Command-Line Options
    G.1. Standard Command-Line Options
    G.2. Bash Command-Line Options
Appendix H. Important Files
Appendix I. Important System Directories
Appendix J. An Introduction to Programmable Completion
Appendix K. Localization
Appendix L. History Commands
Appendix M. Sample .bashrc and .bash_profile Files
Appendix N. Converting DOS Batch Files to Shell Scripts
Appendix O. Exercises
    O.1. Analyzing Scripts
    O.2. Writing Scripts
Appendix P. Revision History
Appendix Q. Download and Mirror Sites
Appendix R. To Do List
Appendix S. Copyright
Appendix T. ASCII Table

Index
Notes
Chapter 1. Shell Programming!

In the early days of personal computing, the BASIC language enabled anyone reasonably computer proficient to write programs on an early generation of microcomputers. Decades later, the Bash scripting language enables anyone with a rudimentary knowledge of Linux or UNIX to do the same on modern machines. We now have miniaturized single-board computers with amazing capabilities, such as the Raspberry Pi. Bash scripting provides a way to explore the capabilities of these fascinating devices.
A shell script is a quick-and-dirty method of prototyping a complex application. Getting even a limited subset of the functionality to work in a script is often a useful first stage in project development. In this way, the structure of the application can be tested and tinkered with, and the major pitfalls found before proceeding to the final coding in C, C++, Java, Perl, or Python. Shell scripting hearkens back to the classic UNIX philosophy of breaking complex projects into simpler subtasks, of chaining together components and utilities. Many consider this a better, or at least more esthetically pleasing approach to problem solving than using one of the new generation of high-powered all-in-one languages, such as Perl, which attempt to be all things to all people, but at the cost of forcing you to alter your thinking processes to fit the tool. According to Herbert Mayer, "a useful language needs arrays, pointers, and a generic mechanism for building data structures." By these criteria, shell scripting falls somewhat short of being "useful." Or, perhaps not. . . .
When not to use shell scripts:

  - Resource-intensive tasks, especially where speed is a factor (sorting, hashing, recursion [2] ...)
  - Procedures involving heavy-duty math operations, especially floating point arithmetic, arbitrary precision calculations, or complex numbers (use C++ or FORTRAN instead)
  - Cross-platform portability required (use C or Java instead)
  - Complex applications, where structured programming is a necessity (type-checking of variables, function prototypes, etc.)
  - Mission-critical applications upon which you are betting the future of the company
  - Situations where security is important, where you need to guarantee the integrity of your system and protect against intrusion, cracking, and vandalism
  - Project consists of subcomponents with interlocking dependencies
  - Extensive file operations required (Bash is limited to serial file access, and that only in a particularly clumsy and inefficient line-by-line fashion.)
  - Need native support for multi-dimensional arrays
  - Need data structures, such as linked lists or trees
  - Need to generate / manipulate graphics or GUIs
  - Need direct access to system hardware or external peripherals
  - Need port or socket I/O
  - Need to use libraries or interface with legacy code
  - Proprietary, closed-source applications (Shell scripts put the source code right out in the open for all the world to see.)

If any of the above applies, consider a more powerful scripting language -- perhaps Perl, Tcl, Python, Ruby -- or possibly a compiled language such as C, C++, or Java. Even then, prototyping the application as a shell script might still be a useful development step.

We will be using Bash, an acronym [3] for "Bourne-Again shell" and a pun on Stephen Bourne's now classic Bourne shell. Bash has become a de facto standard for shell scripting on most flavors of UNIX. Most of the principles this book covers apply equally well to scripting with other shells, such as the Korn Shell, from which Bash derives some of its features, [4] and the C Shell and its variants. (Note that C Shell programming is not recommended due to certain inherent problems, as pointed out in an October, 1993 Usenet post by Tom Christiansen.)

What follows is a tutorial on shell scripting. It relies heavily on examples to illustrate various features of the shell. The example scripts work -- they've been tested, insofar as possible -- and some of them are even useful in real life. The reader can play with the actual working code of the examples in the source archive (scriptname.sh or scriptname.bash), [5] give them execute permission (chmod u+rx scriptname), then run them to see what happens. Should the source archive not be available, then cut-and-paste from the HTML or pdf rendered versions. Be aware that some of the scripts presented here introduce features before they are explained, and this may require the reader to temporarily skip ahead for enlightenment. Unless otherwise noted, the author of this book wrote the example scripts that follow.

Chapter 2. Starting Off With a Sha-Bang

    His countenance was bold and bashed not.
    --Edmund Spenser
In the simplest case, a script is nothing more than a list of system commands stored in a file. At the very least, this saves the effort of retyping that particular sequence of commands each time it is invoked.

Example 2-1. cleanup: A script to clean up log files in /var/log

# Cleanup
# Run as root, of course.

cd /var/log
cat /dev/null > messages
cat /dev/null > wtmp
echo "Log files cleaned up."

There is nothing unusual here, only a set of commands that could just as easily have been invoked one by one from the command-line on the console or in a terminal window. The advantages of placing the commands in a script go far beyond not having to retype them time and again. The script becomes a program -- a tool -- and it can easily be modified or customized for a particular application.
echo "Logs cleaned up." exit # The right and proper method of "exiting" from a script. # A bare "exit" (no parameter) returns the exit status #+ of the preceding command.
Now that's beginning to look like a real script. But we can go even farther . . .
Example 2-3. cleanup: An enhanced and generalized version of above scripts.

#!/bin/bash
# Cleanup, version 3

LOG_DIR=/var/log
ROOT_UID=0     # Only users with $UID 0 have root privileges.
LINES=50       # Default number of lines saved.
E_XCD=86       # Can't change directory?
E_NOTROOT=87   # Non-root exit error.
# Run as root, of course.

if [ "$UID" -ne "$ROOT_UID" ]
then
  echo "Must be root to run this script."
  exit $E_NOTROOT
fi

if [ -n "$1" ]
# Test whether command-line argument is present (non-empty).
then
  lines=$1
else
  lines=$LINES   # Default, if not specified on command-line.
fi
#  Stephane Chazelas suggests the following,
#+ as a better way of checking command-line arguments,
#+ but this is still a bit advanced for this stage of the tutorial.
#
#    E_WRONGARGS=85  # Non-numerical argument (bad argument format).
#
#    case "$1" in
#    ""      ) lines=50;;
#    *[!0-9]*) echo "Usage: `basename $0` lines-to-cleanup";
#              exit $E_WRONGARGS;;
#    *       ) lines=$1;;
#    esac
#
#* Skip ahead to "Loops" chapter to decipher all this.
then echo "Can't change to $LOG_DIR." exit $E_XCD fi # Doublecheck if in right directory before messing with log file. # Far more efficient is: # # cd /var/log || { # echo "Cannot change to necessary directory." >&2 # exit $E_XCD;
tail -n $lines messages > mesg.temp   # Save last section of message log file.
mv mesg.temp messages                 # Rename it as system log file.
#  cat /dev/null > messages
#* No longer needed, as the above method is safer.

cat /dev/null > wtmp   #  ': > wtmp' and '> wtmp' have the same effect.

echo "Log files cleaned up."
#  Note that there are other log files in /var/log not affected
#+ by this script.

exit 0
#  A zero return value from the script upon exit indicates success
#+ to the shell.
Since you may not wish to wipe out the entire system log, this version of the script keeps the last section of the message log intact. You will constantly discover ways of fine-tuning previously written scripts for increased effectiveness. *** The sha-bang ( #!) [6] at the head of a script tells your system that this file is a set of commands to be fed to the command interpreter indicated. The #! is actually a two-byte [7] magic number, a special marker that designates a file type, or in this case an executable shell script (type man magic for more details on this fascinating topic). Immediately following the sha-bang is a path name. This is the path to the program that interprets the commands in the script, whether it be a shell, a programming language, or a utility. This command interpreter then executes the commands in the script, starting at the top (the line following the sha-bang line), and ignoring comments. [8]
#!/bin/sh
#!/bin/bash
#!/usr/bin/perl
#!/usr/bin/tcl
#!/bin/sed -f
#!/bin/awk -f
Each of the above script header lines calls a different command interpreter, be it /bin/sh, the default shell (bash in a Linux system) or otherwise. [9] Using #!/bin/sh, the default Bourne shell in most commercial variants of UNIX, makes the script portable to non-Linux machines, though you sacrifice Bash-specific features. The script will, however, conform to the POSIX [10] sh standard.

Note that the path given at the "sha-bang" must be correct, otherwise an error message -- usually "Command not found." -- will be the only result of running the script. [11]

#! can be omitted if the script consists only of a set of generic system commands, using no internal shell directives. The second example, above, requires the initial #!, since the variable assignment line, lines=50, uses a shell-specific construct. [12] Note again that #!/bin/sh invokes the default shell interpreter, which defaults to /bin/bash on a Linux machine.

This tutorial encourages a modular approach to constructing a script. Make note of and collect "boilerplate" code snippets that might be useful in future scripts. Eventually you will build quite an extensive library of nifty routines. As an example, the following script prolog tests whether the script has been invoked with the correct number of parameters.
E_WRONG_ARGS=85
script_parameters="-a -h -m -z"
#                  -a = all, -h = help, etc.

if [ $# -ne $Number_of_expected_args ]
then
  echo "Usage: `basename $0` $script_parameters"
  # `basename $0` is the script's filename.
  exit $E_WRONG_ARGS
fi
Many times, you will write a script that carries out one particular task. The first script in this chapter is an example. Later, it might occur to you to generalize the script to do other, similar tasks. Replacing the literal ("hard-wired") constants by variables is a step in that direction, as is replacing repetitive code blocks by functions.
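A minimal, hypothetical sketch of that idea -- the file name, line count, and function name here are invented purely for illustration:

# Hard-wired version:
tail -n 50 /var/log/messages

# Generalized version: constants become variables,
#+ and the repeated operation becomes a function.
lines=50
logfile=/var/log/messages

show_log_tail ()
{
  tail -n "$lines" "$logfile"
}

show_log_tail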
Part 2. Basics
Table of Contents

3. Special Characters
4. Introduction to Variables and Parameters
    4.1. Variable Substitution
    4.2. Variable Assignment
    4.3. Bash Variables Are Untyped
    4.4. Special Variable Types
5. Quoting
    5.1. Quoting Variables
    5.2. Escaping
6. Exit and Exit Status
7. Tests
    7.1. Test Constructs
    7.2. File test operators
    7.3. Other Comparison Operators
    7.4. Nested if/then Condition Tests
    7.5. Testing Your Knowledge of Tests
8. Operations and Related Topics
    8.1. Operators
    8.2. Numerical Constants
    8.3. The Double-Parentheses Construct
    8.4. Operator Precedence
Chapter 3. Special Characters

What makes a character special? If it has a meaning beyond its literal meaning, a meta-meaning, then we refer to it as a special character.

# Comments. Lines beginning with a # (with the exception of #!) are comments and will not be executed. Comments may also occur following the end of a command.

A command may not follow a comment on the same line. There is no method of terminating the comment, in order for "live code" to begin on the same line. Use a new line for the next command.

Of course, a quoted or an escaped # in an echo statement does not begin a comment. Likewise, a # appears in certain parameter-substitution constructs and in numerical constant expressions.
echo "The # here does not begin a comment."
echo 'The # here does not begin a comment.'
echo The \# here does not begin a comment.
echo The # here begins a comment.

echo ${PATH#*:}       # Parameter substitution, not a comment.
echo $(( 2#101011 ))  # Base conversion, not a comment.
The standard quoting and escape characters (" ' \) escape the #. Certain pattern matching operations also use the #. ; Command separator [semicolon]. Permits putting two or more commands on the same line.
echo hello; echo there
Note that the ";" sometimes needs to be escaped. ;; Terminator in a case option [double semicolon].
case "$variable" in abc) echo "\$variable = abc" ;; xyz) echo "\$variable = xyz" ;; esac
;;&, ;& Terminators in a case option (version 4+ of Bash). . "dot" command [period]. Equivalent to source (see Example 15-22). This is a bash builtin. . "dot", as a component of a filename. When working with filenames, a leading dot is the prefix of a "hidden" file, a file that an ls will not normally show.
bash$ touch .hidden-file
bash$ ls -l
total 10
 -rw-r--r--    1 bozo      4034 Jul 18 22:04 data1.addressbook
 -rw-r--r--    1 bozo      4602 May 25 13:58 data1.addressbook.bak
 -rw-r--r--    1 bozo       877 Dec 17  2000 employment.addressbook

bash$ ls -al
total 14
 drwxrwxr-x    2 bozo  bozo      1024 Aug 29 20:54 ./
 drwx------   52 bozo  bozo      3072 Aug 29 20:51 ../
 -rw-r--r--    1 bozo  bozo         0 Aug 29 20:54 .hidden-file
 -rw-r--r--    1 bozo  bozo      4034 Jul 18 22:04 data1.addressbook
 -rw-r--r--    1 bozo  bozo      4602 May 25 13:58 data1.addressbook.bak
 -rw-r--r--    1 bozo  bozo       877 Dec 17  2000 employment.addressbook
When considering directory names, a single dot represents the current working directory, and two dots denote the parent directory.
bash$ pwd
/home/bozo/projects

bash$ cd .
bash$ pwd
/home/bozo/projects

bash$ cd ..
bash$ pwd
/home/bozo/
The dot often appears as the destination (directory) of a file movement command, in this context meaning current directory.
Copy all the "junk" files to $PWD. . "dot" character match. When matching characters, as part of a regular expression, a "dot" matches a single character. " partial quoting [double quote]. "STRING" preserves (from interpretation) most of the special characters within STRING. See Chapter 5. ' full quoting [single quote]. 'STRING' preserves all special characters within STRING. This is a stronger form of quoting than "STRING". See Chapter 5. , comma operator. The comma operator [16] links together a series of arithmetic operations. All are evaluated, but only the last one is returned.
let "t2 = ((a = 9, 15 / 3))" # Set "a = 9" and "t2 = 15 / 3"
,, , Lowercase conversion in parameter substitution (added in version 4 of Bash).

\ escape [backslash]. A quoting mechanism for single characters. \X escapes the character X. This has the effect of "quoting" X, equivalent to 'X'. The \ may be used to quote " and ', so they are expressed literally. See Chapter 5 for an in-depth explanation of escaped characters.

/ Filename path separator [forward slash]. Separates the components of a filename (as in /home/bozo/projects/Makefile). This is also the division arithmetic operator.

` command substitution. The `command` construct makes available the output of command for assignment to a variable. This is also known as backquotes or backticks.
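Two quick illustrations of the constructs just mentioned (the variable names are arbitrary; the lowercase conversion requires Bash version 4 or later):

shout="HELLO THERE"
echo ${shout,,}          # hello there   (lowercase conversion of all characters)
echo ${shout,}           # hELLO THERE   (first character only)

timestamp=`date`         # Command substitution with backticks.
echo "Generated on $timestamp"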
: null command [colon]. This is the shell equivalent of a "NOP" (no op, a do-nothing operation). It may be considered a synonym for the shell builtin true. The ":" command is itself a Bash builtin, and its exit status is true (0).
:
echo $?   # 0
Endless loop:
while :
do
   operation-1
   operation-2
   ...
   operation-n
done

# Same as:
#    while true
#    do
#      ...
#    done
Provide a placeholder where a binary operation is expected, see Example 8-2 and default parameters.
: ${username=`whoami`}
# ${username=`whoami`}   Gives an error without the leading :
#                        unless "username" is a command or builtin...

: ${1?"Usage: $0 ARGUMENT"}     # From "usage-message.sh" example script.
Provide a placeholder where a command is expected in a here document. See Example 19-10. Evaluate string of variables using parameter substitution (as in Example 10-7).
: ${HOSTNAME?} ${USER?} ${MAIL?} # Prints error message #+ if one or more of essential environmental variables not set.
Variable expansion / substring replacement. In combination with the > redirection operator, truncates a file to zero length, without changing its permissions. If the file did not previously exist, creates it.
: > data.xxx # File "data.xxx" now empty.
# Same effect as   cat /dev/null >data.xxx
# However, this does not fork a new process, since ":" is a builtin.
In combination with the >> redirection operator, has no effect on a pre-existing target file (: >> target_file). If the file did not previously exist, creates it. This applies to regular files, not pipes, symlinks, and certain special files.

May be used to begin a comment line, although this is not recommended. Using # for a comment turns off error checking for the remainder of that line, so almost anything may appear in a comment. However, this is not the case with :.
: This is a comment that generates an error, ( if [ $x -eq 3] ).
The ":" serves as a field separator, in /etc/passwd, and in the $PATH variable.
bash$ echo $PATH /usr/local/bin:/bin:/usr/bin:/usr/X11R6/bin:/sbin:/usr/sbin:/usr/games
! reverse (or negate) the sense of a test or exit status [bang]. The ! operator inverts the exit status of the command to which it is applied (see Example 6-2). It also inverts the meaning of a test operator. This can, for example, change the sense of equal ( = ) to not-equal ( != ). The ! operator is a Bash keyword. In a different context, the ! also appears in indirect variable references. In yet another context, from the command line, the ! invokes the Bash history mechanism (see Appendix L). Note that within a script, the history mechanism is disabled. * wild card [asterisk]. The * character serves as a "wild card" for filename expansion in globbing. By itself, it matches every filename in a given directory.
bash$ echo *
abs-book.sgml add-drive.sh agram.sh alias.sh
The * also represents any number (or zero) characters in a regular expression.

* arithmetic operator. In the context of arithmetic operations, the * denotes multiplication.

** A double asterisk can represent the exponentiation operator or extended file-match globbing.

? test operator. Within certain expressions, the ? indicates a test for a condition.
In a double-parentheses construct, the ? can serve as an element of a C-style trinary operator. [17] condition?result-if-true:result-if-false
(( var0 = var1<98?9:21 ))
#                ^ ^

# if [ "$var1" -lt 98 ]
# then
#   var0=9
# else
#   var0=21
# fi
In a parameter substitution expression, the ? tests whether a variable has been set. ? wild card. The ? character serves as a single-character "wild card" for filename expansion in globbing, as well as representing one character in an extended regular expression. $ Variable substitution (contents of a variable).
var1=5
var2=23skidoo

echo $var1   # 5
echo $var2   # 23skidoo
A $ prefixing a variable name indicates the value the variable holds.

$ end-of-line. In a regular expression, a "$" addresses the end of a line of text.

${} Parameter substitution.

$' ... ' Quoted string expansion. This construct expands single or multiple escaped octal or hex values into ASCII [18] or Unicode characters.

$*, $@ positional parameters.

$? exit status variable. The $? variable holds the exit status of a command, a function, or of the script itself.

$$ process ID variable. The $$ variable holds the process ID [19] of the script in which it appears.
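A few quick illustrations of these expansions and variables (the failing ls call is merely a convenient way to produce a nonzero exit status):

echo $'\x41\x42\x43'     # ABC -- hex escapes expanded by $'...'

ls /nonexistent 2>/dev/null
echo $?                  # Nonzero, because the previous command failed.

echo $$                  # Process ID of the current script (or interactive shell).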
() command group. A listing of commands within parentheses starts a subshell. Variables inside parentheses, within the subshell, are not visible to the rest of the script. The parent process, the script, cannot read variables created in the child process, the subshell.
a=123
( a=321; )

echo "a = $a"   # a = 123
# "a" within parentheses acts like a local variable.
array initialization.
Array=(element1 element2 element3)
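Continuing with the array just initialized, a few access expressions (these are standard Bash array forms):

echo ${Array[0]}      # element1   (array indexing starts at zero)
echo "${Array[@]}"    # element1 element2 element3   (all elements)
echo ${#Array[@]}     # 3   (number of elements)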
{xxx,yyy,zzz,...} Brace expansion.

cat {file1,file2,file3} > combined_file
# Concatenates the files file1, file2, and file3 into combined_file.

cp file22.{txt,backup}
# Copies "file22.txt" to "file22.backup"
A command may act upon a comma-separated list of file specs within braces. [20] Filename expansion (globbing) applies to the file specs between the braces. No spaces allowed within the braces unless the spaces are quoted or escaped.

echo {file1,file2}\ :{\ A," B",' C'}
file1 : A file1 : B file1 : C file2 : A file2 : B file2 : C

{a..z} Extended Brace expansion.
echo {a..z}    # a b c d e f g h i j k l m n o p q r s t u v w x y z
               # Echoes characters between a and z.

echo {0..3}    # 0 1 2 3
               # Echoes characters between 0 and 3.
base64_charset=( {A..Z} {a..z} {0..9} + / = )
#  Initializing an array, using extended brace expansion.
#  From vladz's "base64.sh" example script.
The {a..z} extended brace expansion construction is a feature introduced in version 3 of Bash.

{} Block of code [curly brackets]. Also referred to as an inline group, this construct, in effect, creates an anonymous function (a function without a name). However, unlike in a "standard" function, the variables inside a code block remain visible to the remainder of the script.
bash$ { local a; a=123; }
bash: local: can only be used in a function

a=123
{ a=321; }
echo "a = $a"   # a = 321   (Hence, not a local variable.)
The code block enclosed in braces may have I/O redirected to and from it.
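A minimal sketch of that kind of redirection -- /etc/fstab is chosen only as a convenient readable file, and this is essentially the pattern the commented questions below refer to:

#!/bin/bash
# Reading lines from a file into variables, via a redirected code block.

File=/etc/fstab       # Any convenient text file will do.

{
  read line1
  read line2
} < $File

echo "First line in $File is:"
echo "$line1"
echo
echo "Second line in $File is:"
echo "$line2"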
exit 0

# Now, how do you parse the separate fields of each line?
# Hint: use awk, or . . .
# . . . Hans-Joerg Diers suggests using the "set" Bash builtin.
Unlike a command group within (parentheses), as above, a code block enclosed by {braces} will not normally launch a subshell. [21] {} placeholder for text. Used after xargs -i (replace strings option). The {} double curly brackets are a placeholder for output text.
ls . | xargs -i -t cp ./{} $1
#            ^^          ^^
# From "ex42.sh" (copydir.sh) example.
Definition: A pathname is a filename that includes the complete path. As an example, /home/bozo/Notes/Thursday/schedule.txt. This is sometimes referred to as the absolute path.

The ";" ends the -exec option of a find command sequence. It needs to be escaped to protect it from interpretation by the shell.

[ ] test. Test expression between [ ]. Note that [ is part of the shell builtin test (and a synonym for it), not a link to the external command /usr/bin/test.

[[ ]]
test. Test expression between [[ ]]. More flexible than the single-bracket [ ] test, this is a shell keyword. See the discussion on the [[ ... ]] construct.

[ ] array element. In the context of an array, brackets set off the numbering of each element of that array.
Array[1]=slot_1
echo ${Array[1]}
[] range of characters. As part of a regular expression, brackets delineate a range of characters to match. $[ ... ] integer expansion. Evaluate integer expression between $[ ].
a=3
b=7

echo $[$a+$b]   # 10
echo $[$a*$b]   # 21
Note that this usage is deprecated, and has been replaced by the (( ... )) construct. (( )) integer expansion. Expand and evaluate integer expression between (( )). See the discussion on the (( ... )) construct. > &> >& >> < <> redirection. scriptname >filename redirects the output of scriptname to file filename. Overwrite filename if it already exists.
command &>filename redirects both the stdout and the stderr of command to filename. This is useful for suppressing output when testing for a condition. For example, let us test whether a certain command exists.
bash$ type bogus_command &>/dev/null
bash$ echo $?
1
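In a script, the same test might look like this (bogus_command is, of course, assumed not to exist):

if type bogus_command &>/dev/null
then
  echo "bogus_command is available."
else
  echo "bogus_command is not available."   # This is the branch taken.
fi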
command >&2 redirects stdout of command to stderr. scriptname >>filename appends the output of scriptname to file filename. If filename does not already exist, it is created.
[i]<>filename opens file filename for reading and writing, and assigns file descriptor i to it. If filename does not exist, it is created.

process substitution: >(command) and <(command).

In a different context, the "<" and ">" characters act as string comparison operators. In yet another context, they act as integer comparison operators. See also Example 16-9.

<< redirection used in a here document.

<<< redirection used in a here string.

<, > ASCII comparison.
veg1=carrots
veg2=tomatoes

if [[ "$veg1" < "$veg2" ]]
then
  echo "Although $veg1 precede $veg2 in the dictionary,"
  echo -n "this does not necessarily imply anything "
  echo "about my culinary preferences."
else
  echo "What kind of dictionary are you using, anyhow?"
fi
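By contrast, inside double parentheses the same symbols compare integers -- a minimal sketch:

a=5
b=10

if (( a < b ))     # Arithmetic, not lexicographic, comparison.
then
  echo "$a is numerically less than $b."
fi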
\<, \> word boundary in a regular expression.

bash$ grep '\<the\>' textfile

|
pipe. Passes the output (stdout) of a previous command to the input (stdin) of the next one, or to the shell. This is a method of chaining commands together.
echo ls -l | sh # Passes the output of "echo ls -l" to the shell, #+ with the same result as a simple "ls -l".
cat *.lst | sort | uniq # Merges and sorts all ".lst" files, then deletes duplicate lines.
A pipe, as a classic method of interprocess communication, sends the stdout of one process to the stdin of another. In a typical case, a command, such as cat or echo, pipes a stream of data to a filter, a command that transforms its input for processing. [22] cat $filename1 $filename2 | grep $search_word For an interesting note on the complexity of using UNIX pipes, see the UNIX FAQ, Part 3. The output of a command or commands may be piped to a script.
#!/bin/bash
# uppercase.sh : Changes input to uppercase.

tr 'a-z' 'A-Z'
#  Letter ranges must be quoted
#+ to prevent filename generation from single-letter filenames.

exit 0
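Assuming the script above has been saved as uppercase.sh and given execute permission, it can sit in the middle of a pipe:

ls -l | ./uppercase.sh
# The "ls -l" listing is echoed back in uppercase.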
The stdout of each process in a pipe must be read as the stdin of the next. If this is not the case, the data stream will block, and the pipe will not behave as expected.
cat file1 file2 | ls -l | sort # The output from "cat file1 file2" disappears.
A pipe runs as a child process, and therefore cannot alter script variables.
variable="initial_value" echo "new_value" | read variable echo "variable = $variable" # variable = initial_value
If one of the commands in the pipe aborts, this prematurely terminates execution of the pipe. Called a broken pipe, this condition sends a SIGPIPE signal.

>| force redirection (even if the noclobber option is set). This will forcibly overwrite an existing file.

|| OR logical operator. In a test construct, the || operator causes a return of 0 (success) if either of the linked test conditions is true.

&
Run job in background. A command followed by an & will run in the background.
bash$ sleep 10 &
[1] 850

[1]+  Done                    sleep 10
Within a script, commands and even loops may run in the background.
#!/bin/bash
# background-loop.sh

for i in 1 2 3 4 5 6 7 8 9 10            # First loop.
do
  echo -n "$i "
done & # Run this loop in background.
       # Will sometimes execute after second loop.

echo   # This 'echo' sometimes will not display.

for i in 11 12 13 14 15 16 17 18 19 20   # Second loop.
do
  echo -n "$i "
done

echo   # This 'echo' sometimes will not display.

# ======================================================

# The expected output from the script:
# 1 2 3 4 5 6 7 8 9 10
# 11 12 13 14 15 16 17 18 19 20

# Sometimes, though, you get:
# 11 12 13 14 15 16 17 18 19 20
# 1 2 3 4 5 6 7 8 9 10 bozo $
# (The second 'echo' doesn't execute. Why?)

# Occasionally also:
# 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20
# (The first 'echo' doesn't execute. Why?)

# Very rarely something like:
# 11 12 13 1 2 3 4 5 6 7 8 9 10 14 15 16 17 18 19 20
# The foreground loop preempts the background one.

exit 0

#  Nasimuddin Ansari suggests adding    sleep 1
#+ after the    echo -n "$i"    in lines 6 and 14,
#+ for some real fun.
A command run in the background within a script may cause the script to hang, waiting for a keystroke. Fortunately, there is a remedy for this.

&&
AND logical operator. In a test construct, the && operator causes a return of 0 (success) only if both the linked test conditions are true.

- option, prefix. Option flag for a command or filter. Prefix for an operator. Prefix for a default parameter in parameter substitution.

COMMAND -[Option1][Option2][...]

ls -al

sort -dfu $filename
if [ $file1 -ot $file2 ]
then #      ^
  echo "File $file1 is older than $file2."
fi

if [ "$a" -eq "$b" ]
then #    ^
  echo "$a is equal to $b."
fi

if [ "$c" -eq 24 -a "$d" -eq 47 ]
then #    ^              ^
  echo "$c equals 24 and $d equals 47."
fi
param2=${param1:-$DEFAULTVAL}
#               ^
-- The double-dash -- prefixes long (verbatim) options to commands.

sort --ignore-leading-blanks

Used with a Bash builtin, it means the end of options to that particular command. This provides a handy means of removing files whose names begin with a dash.
bash$ ls -l
-rw-r--r-- 1 bozo bozo 0 Nov 25 12:29 -badname
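For instance, a file named -badname (as in the listing above) can be removed by telling rm that the options have ended, or by giving an explicit path -- a quick sketch:

rm -- -badname     # The "--" marks the end of options.
rm ./-badname      # An explicit path keeps the leading dash
                   #+ from being read as an option flag.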
The double-dash is also used in conjunction with set.

set -- $variable (as in Example 15-18)

- redirection from/to stdin or stdout [dash].
As expected, cat - echoes stdin, in this case keyboarded user input, to stdout. But, does I/O redirection using - have real-world applications?
(cd /source/directory && tar cf - . ) | (cd /dest/directory && tar xpvf -)
# Move entire file tree from one directory to another
# [courtesy Alan Cox <[email protected]>, with a minor change]

# 1) cd /source/directory
#    Source directory, where the files to be moved are.
# 2) &&
#    "And-list": if the 'cd' operation successful, then execute the next command.
# 3) tar cf - .
#    The 'c' option 'tar' archiving command creates a new archive,
#    the 'f' (file) option, followed by '-' designates the target file as stdout,
#    and do it in current directory tree ('.').
# 4) |
#    Piped to ...
# 5) ( ... )
#    a subshell
# 6) cd /dest/directory
#    Change to the destination directory.
# 7) &&
#    "And-list", as above
# 8) tar xpvf -
#    Unarchive ('x'), preserve ownership and file permissions ('p'),
#    and send verbose messages to stdout ('v'),
#    reading data from stdin ('f' followed by '-').
#
#    Note that 'x' is a command, and 'p', 'v', 'f' are options.
#
# Whew!

# More elegant than, but equivalent to:
#   cd source/directory
#   tar cf - . | (cd ../dest/directory; tar xpvf -)
#
# Also having same effect:
#   cp -a /source/directory/* /dest/directory
# Or:
#   cp -a /source/directory/* /source/directory/.[^.]* /dest/directory
#   If there are hidden files in /source/directory.


bunzip2 -c linux-2.6.16.tar.bz2 | tar xvf -
# --uncompress tar file--      | --then pass it to "tar"--
# If "tar" has not been patched to handle "bunzip2",
#+ this needs to be done in two discrete steps, using a pipe.
# The purpose of the exercise is to unarchive "bzipped" kernel source.
Note that in this context the "-" is not itself a Bash operator, but rather an option recognized by certain UNIX utilities that write to stdout, such as tar, cat, etc.
Where a filename is expected, - redirects output to stdout (sometimes seen with tar cf), or accepts input from stdin, rather than from a file. This is a method of using a file-oriented utility as a filter in a pipe.
bash$ file
Usage: file [-bciknvzL] [-f namefile] [-m magicfiles] file...
By itself on the command-line, file fails with an error message. Add a "-" for a more useful result. This causes the shell to await user input.
bash$ file -
abc
standard input:  ASCII text
Now the command accepts input from stdin and analyzes it. The "-" can be used to pipe stdout to other commands. This permits such stunts as prepending lines to a file, or using diff to compare a file with a section of another: grep Linux file1 | diff file2 -
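A quick sketch of the "prepending" stunt (the file names are invented for the example):

echo "This line goes first." | cat - original.txt > prepended.txt
#  "cat -" reads the piped line from stdin,
#+ then appends the contents of original.txt after it.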
Finally, a real-world example using - with tar.

#!/bin/bash
#  Backs up all files in current directory
#+ modified within last 24 hours in a tarball.

BACKUPFILE=backup-$(date +%m-%d-%Y)
archive=${1:-$BACKUPFILE}
#  If no backup-archive filename specified on command-line,
#+ it will default to "backup-MM-DD-YYYY.tar.gz."

tar cvf - `find . -mtime -1 -type f -print` > $archive.tar
gzip $archive.tar
echo "Directory $PWD backed up in archive file \"$archive.tar.gz\"."

#  Stephane Chazelas points out that the above code will fail
#+ if there are too many files found
#+ or if any filenames contain blank characters.

#  He suggests the following alternatives:
#  -------------------------------------------------------------------
#  find . -mtime -1 -type f -print0 | xargs -0 tar rvf "$archive.tar"
#  using the GNU version of "find".

#  find . -mtime -1 -type f -exec tar rvf "$archive.tar" '{}' \;
#  portable to other UNIX flavors, but much slower.
#  -------------------------------------------------------------------

exit 0
Filenames beginning with "-" may cause problems when coupled with the "-" redirection operator. A script should check for this and add an appropriate prefix to such filenames, for example ./-FILENAME, $PWD/-FILENAME, or $PATHNAME/-FILENAME. If the value of a variable begins with a -, this may likewise create problems.
var="-n" echo $var # Has the effect of "echo -n", and outputs nothing.
- previous working directory. A cd - command changes to the previous working directory. This uses the $OLDPWD environmental variable.

Do not confuse the "-" used in this sense with the "-" redirection operator just discussed. The interpretation of the "-" depends on the context in which it appears.

- Minus. Minus sign in an arithmetic operation.

= Equals. Assignment operator.
a=28
echo $a   # 28
In a different context, the "=" is a string comparison operator. + Plus. Addition arithmetic operator. In a different context, the + is a Regular Expression operator. + Option. Option flag for a command or filter. Certain commands and builtins use the + to enable certain options and the - to disable them. In parameter substitution, the + prefixes an alternate value that a variable expands to. % modulo. Modulo (remainder of a division) arithmetic operation.
let "z = 5 % 3" echo $z # 2
In a different context, the % is a pattern matching operator.

~ home directory [tilde]. This corresponds to the $HOME internal variable. ~bozo is bozo's home directory, and ls ~bozo lists the contents of it. ~/ is the current user's home directory, and ls ~/ lists the contents of it.
bash$ echo ~bozo
/home/bozo

bash$ echo ~
/home/bozo

bash$ echo ~/
/home/bozo/

bash$ echo ~:
/home/bozo:

bash$ echo ~nonexistent-user
~nonexistent-user
~+ current working directory. This corresponds to the $PWD internal variable.

~- previous working directory. This corresponds to the $OLDPWD internal variable.

=~ regular expression match. This operator was introduced with version 3 of Bash.

^ beginning-of-line. In a regular expression, a "^" addresses the beginning of a line of text.

^, ^^ Uppercase conversion in parameter substitution (added in version 4 of Bash).

Control Characters change the behavior of the terminal or text display. A control character is a CONTROL + key combination (pressed simultaneously). A control character may also be written in octal or hexadecimal notation, following an escape. Control characters are not normally useful inside a script.

Ctl-A Moves cursor to beginning of line of text (on the command-line).

Ctl-B Backspace (nondestructive).

Ctl-C Break. Terminate a foreground job.

Ctl-D Log out from a shell (similar to exit).
EOF (end-of-file). This also terminates input from stdin. When typing text on the console or in an xterm window, Ctl-D erases the character under the cursor. When there are no characters present, Ctl-D logs out of the session, as expected. In an xterm window, this has the effect of closing the window.

Ctl-E Moves cursor to end of line of text (on the command-line).

Ctl-F Moves cursor forward one character position (on the command-line).

Ctl-G BEL. On some old-time teletype terminals, this would actually ring a bell. In an xterm it might beep.

Ctl-H Rubout (destructive backspace). Erases characters the cursor backs over while backspacing.
#!/bin/bash
# Embedding Ctl-H in a string.

a="^H^H"                  # Two Ctl-H's -- backspaces
                          # ctl-V ctl-H, using vi/vim
echo "abcdef"             # abcdef
echo
echo -n "abcdef$a "       # abcd f
#  Space at end   ^          ^  Backspaces twice.
echo
echo -n "abcdef$a"        # abcdef
#  No space at end           ^  Doesn't backspace (why?).
                          # Results may not be quite as expected.
echo; echo

# Constantin Hagemeier suggests trying:
# a=$'\010\010'
# a=$'\b\b'
# a=$'\x08\x08'
# But, this does not change the results.
########################################
# Now, try this.

rubout="^H^H^H^H^H"       # 5 x Ctl-H.

echo -n "12345678"
sleep 2
echo -n "$rubout"
sleep 2
Ctl-J Newline (line feed). In a script, may also be expressed in octal notation -- '\012' or in hexadecimal -- '\x0a'. Ctl-K Vertical tab. When typing text on the console or in an xterm window, Ctl-K erases from the character under the cursor to end of line. Within a script, Ctl-K may behave differently, as in Lee Maschmeyer's example, below. Ctl-L Formfeed (clear the terminal screen). In a terminal, this has the same effect as the clear command. When sent to a printer, a Ctl-L causes an advance to end of the paper sheet. Ctl-M Carriage return.
#!/bin/bash # Thank you, Lee Maschmeyer, for this example. read -n 1 -s -p \ $'Control-M leaves cursor at beginning of this line. Press Enter. \x0d' # Of course, '0d' is the hex equivalent of Control-M. echo >&2 # The '-s' makes anything typed silent, #+ so it is necessary to go to new line explicitly. read -n 1 -s -p $'Control-J leaves cursor on next line. \x0a' # '0a' is the hex equivalent of Control-J, linefeed. echo >&2 ### read -n 1 -s -p $'And Control-K\x0bgoes straight down.' echo >&2 # Control-K is vertical tab. # A better example of the effect of a vertical tab is: var=$'\x0aThis is the bottom line\x0bThis is the top line\x0a' echo "$var" # This works the same way as the above example. However: echo "$var" | col # This causes the right end of the line to be higher than the left end. # It also explains why we started and ended with a line feed -#+ to avoid a garbled screen. # As Lee Maschmeyer explains: # -------------------------# In the [first vertical tab example] . . . the vertical tab #+ makes the printing go straight down without a carriage return. # This is true only on devices, such as the Linux console, #+ that can't go "backward." # The real purpose of VT is to go straight UP, not down. # It can be used to print superscripts on a printer. # The col utility can be used to emulate the proper behavior of VT.
Ctl-N Erases a line of text recalled from history buffer [23] (on the command-line). Ctl-O Issues a newline (on the command-line). Ctl-P Recalls last command from history buffer (on the command-line). Ctl-Q Resume (XON). This resumes stdin in a terminal. Ctl-R Backwards search for text in history buffer (on the command-line). Ctl-S Suspend (XOFF). This freezes stdin in a terminal. (Use Ctl-Q to restore input.) Ctl-T Reverses the position of the character the cursor is on with the previous character (on the command-line). Ctl-U Erase a line of input, from the cursor backward to beginning of line. In some settings, Ctl-U erases the entire line of input, regardless of cursor position. Ctl-V When inputting text, Ctl-V permits inserting control characters. For example, the following two are equivalent:
echo -e '\x0a'
echo <Ctl-V><Ctl-J>
Ctl-V is primarily useful from within a text editor. Ctl-W When typing text on the console or in an xterm window, Ctl-W erases from the character under the cursor backwards to the first instance of whitespace. In some settings, Ctl-W erases backwards to first non-alphanumeric character. Ctl-X In certain word processing programs, cuts highlighted text and copies it to the clipboard. Ctl-Y Pastes back text previously erased (with Ctl-U or Ctl-W). Ctl-Z Pauses a foreground job. Substitute operation in certain word processing applications. EOF (end-of-file) character in the MSDOS filesystem. Whitespace functions as a separator between commands and/or variables. Whitespace consists of either spaces, tabs, blank lines, or any combination thereof. [24] In some contexts, such as variable assignment, whitespace is not permitted, and results in a syntax error. Blank lines have no effect on the action of a script, and are therefore useful for visually separating functional sections. $IFS, the special variable separating fields of input to certain commands. It defaults to whitespace.
Definition: A field is a discrete chunk of data expressed as a string of consecutive characters. Separating each field from adjacent fields is either whitespace or some other designated character (often determined by the $IFS). In some contexts, a field may be called a record. To preserve whitespace within a string or in a variable, use quoting. UNIX filters can target and operate on whitespace using the POSIX character class [:space:].
The only times a variable appears "naked" -- without the $ prefix -- is when declared or assigned, when unset, when exported, in an arithmetic expression within double parentheses (( ... )), or in the special case of a variable representing a signal (see Example 32-5). Assignment may be with an = (as in var1=27), in a read statement, and at the head of a loop (for var2 in 1 2 3). Enclosing a referenced value in double quotes (" ... ") does not interfere with variable substitution. This is called partial quoting, sometimes referred to as "weak quoting." Using single quotes (' ... ') causes the variable name to be used literally, and no substitution will take place. This is full quoting, sometimes referred to as 'strong quoting.' See Chapter 5 for a detailed discussion. Note that $variable is actually a simplified form of ${variable}. In contexts where the $variable syntax causes an error, the longer form may work (see Section 10.2, below).
echo hello    # hello
# Not a variable reference, just the string "hello" ...

echo $hello   # 375
#    ^          This *is* a variable reference.
echo ${hello} # 375
#               Likewise a variable reference, as above.

# Quoting . . .
echo "$hello"    # 375
echo "${hello}"  # 375

echo

hello="A B  C   D"
echo $hello      # A B C D
echo "$hello"    # A B  C   D
# As we see, echo $hello and echo "$hello" give different results.
# =======================================
# Quoting a variable preserves whitespace.
# =======================================

echo

echo '$hello'    # $hello
#    ^      ^
#  Variable referencing disabled (escaped) by single quotes,
#+ which causes the "$" to be interpreted literally.

# Notice the effect of different types of quoting.
hello=    # Setting it to a null value.
echo "\$hello (null value) = $hello"      # $hello (null value) =
#  Note that setting a variable to a null value is not the same as
#+ unsetting it, although the end result is the same (see below).

# --------------------------------------------------------------

#  It is permissible to set multiple variables on the same line,
#+ if separated by white space.
#  Caution, this may reduce legibility, and may not be portable.

var1=21  var2=22  var3=$V3
echo
echo "var1=$var1   var2=$var2   var3=$var3"
An uninitialized variable has a "null" value -- no assigned value at all (not zero!).
if [ -z "$unassigned" ] then echo "\$unassigned is NULL." fi # $unassigned is NULL.
Using a variable before assigning a value to it may cause problems. It is nevertheless possible to perform arithmetic operations on an uninitialized variable.
echo "$uninitialized" let "uninitialized += 5" echo "$uninitialized" # (blank line) # Add 5 to it. # 5
exit 0
Variable assignment using the $(...) mechanism (a newer method than backquotes). This is likewise a form of command substitution.
# From /etc/rc.d/rc.local R=$(cat /etc/redhat-release) arch=$(uname -m)
b=${a/23/BB}          # Substitute "BB" for "23".
                      # This transforms $b into a string.
echo "b = $b"         # b = BB35
declare -i b          # Declaring it an integer doesn't help.
echo "b = $b"         # b = BB35

let "b += 1"
echo "b = $b"         # b = 1

echo

c=BB34
echo "c = $c"         # c = BB34
# What about null variables?
e=''                  # ... Or  e=""  ... Or  e=
echo "e = $e"         # e =
let "e += 1"          # Arithmetic operations allowed on a null variable?
echo "e = $e"         # e = 1
echo                  # Null variable transformed into an integer.

# What about undeclared variables?
echo "f = $f"         # f =
let "f += 1"          # Arithmetic operations allowed?
echo "f = $f"         # f = 1
echo                  # Undeclared variable transformed into an integer.

#
# However ...
let "f /= $undecl_var"   # Divide by zero?
#   let: f /= : syntax error: operand expected (error token is " ")
# Syntax error! Variable $undecl_var is not set to zero here!
#
# But still ...
let "f /= 0"
#   let: f /= 0: division by 0 (error token is "0")
# Expected behavior.
#  Bash (usually) sets the "integer value" of null to zero
#+ when performing an arithmetic operation.
#  But, don't try this at home, folks!
#  It's undocumented and probably non-portable behavior.
# Conclusion: Variables in Bash are untyped, #+ with all attendant consequences. exit $?
Untyped variables are both a blessing and a curse. They permit more flexibility in scripting and make it easier to grind out lines of code (and give you enough rope to hang yourself!). However, they likewise permit subtle errors to creep in and encourage sloppy programming habits. To lighten the burden of keeping track of variable types in a script, Bash does permit declaring variables.
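As a quick, minimal sketch of that declaration facility (not one of the numbered examples), declare -i asks Bash to give a variable the integer attribute:

#!/bin/bash
# Sketch: 'declare -i' limits a variable to integer values.

declare -i number     # "number" now carries the integer attribute.
number=42
echo "$number"        # 42
number=string_value   # Not a valid integer ...
echo "$number"        # 0   (Bash stores the arithmetic value, which is zero.)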
Every time a shell starts, it creates shell variables that correspond to its own environmental variables. Updating or adding new environmental variables causes the shell to update its environment, and all the shell's child processes (the commands it executes) inherit this environment. The space allotted to the environment is limited. Creating too many environmental variables or ones that use up excessive space may cause problems.
bash$ eval "`seq 10000 | sed -e 's/.*/export var&=ZZZZZZZZZZZZZZ/'`"

bash$ du
bash: /usr/bin/du: Argument list too long
Note: this "error" has been fixed, as of kernel version 2.6.23. (Thank you, Stphane Chazelas for the clarification, and for providing the above example.) If a script sets environmental variables, they need to be "exported," that is, reported to the environment local to the script. This is the function of the export command.
A script can export variables only to child processes, that is, only to commands or processes which that particular script initiates. A script invoked from the command-line cannot export variables back to the command-line environment. Child processes cannot export variables back to the parent processes that spawned them. Definition: A child process is a subprocess launched by another process, its parent. Positional parameters Arguments passed to the script from the command line [26] : $0, $1, $2, $3 . . . $0 is the name of the script itself, $1 is the first argument, $2 the second, $3 the third, and so forth. [27] After $9, the arguments must be enclosed in brackets, for example, ${10}, ${11}, ${12}. The special variables $* and $@ denote all the positional parameters.
if [ -n "${10}" ] # Parameters > $9 must be enclosed in {brackets}. then echo "Parameter #10 is ${10}" fi echo "-----------------------------------" echo "All the command-line parameters are: "$*"" if [ $# -lt "$MINPARAMS" ] then echo echo "This script needs at least $MINPARAMS command-line arguments!" fi echo exit 0
Bracket notation for positional parameters leads to a fairly simple way of referencing the last argument passed to a script on the command-line. This also requires indirect referencing.
args=$#           # Number of args passed.
lastarg=${!args}  # Note: This is an *indirect reference* to $args ...

# Or:       lastarg=${!#}             (Thanks, Chris Monson.)
# This is an *indirect reference* to the $# variable.
# Note that lastarg=${!$#} doesn't work.
Some scripts can perform different operations, depending on which name they are invoked with. For this to work, the script needs to check $0, the name it was invoked by. [28] There must also exist symbolic links to all the alternate names of the script. See Example 16-2.
If a script expects a command-line parameter but is invoked without one, this may cause a null variable assignment, generally an undesirable result. One way to prevent this is to append an extra character to both sides of the assignment statement using the expected positional parameter.
#  However, as Fabian Kreutz points out,
#+ the above method may have unexpected side-effects.
#  A better method is parameter substitution:
#+         ${1:-$DefaultVal}
#  See the "Parameter Substitution" section
#+ in the "Variables Revisited" chapter.
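A minimal sketch of that parameter-substitution idiom (the variable names below are only illustrative):

#!/bin/bash
DefaultVal=generic-value
arg=${1:-$DefaultVal}   # Use $1 if present, otherwise fall back to $DefaultVal.
echo "Working with: $arg"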
---
E_NOARGS=75
if [ -z "$1" ] then echo "Usage: `basename $0` [domain-name]" exit $E_NOARGS fi # Check script case `basename "wh" "wh-ripe" "wh-apnic" "wh-cw" * name and call proper server. $0` in # Or: case ${0##*/} in ) whois [email protected];; ) whois [email protected];; ) whois [email protected];; ) whois [email protected];; ) echo "Usage: `basename $0` [domain-name]";;
---
The shift command reassigns the positional parameters, in effect shifting them to the left one notch. $1 <--- $2, $2 <--- $3, $3 <--- $4, etc. The old $1 disappears, but $0 (the script name) does not change. If you use a large number of positional parameters to a script, shift lets you access those past 10, although {bracket} notation also permits this.
until [ -z "$1" ]  # Until all parameters used up . . .
do
  echo -n "$1 "
  shift
done

echo               # Extra linefeed.
# But, what happens to the "used-up" parameters?
echo "$2"
#  Nothing echoes!
#  When $2 shifts into $1 (and there is no $3 to shift into $2)
#+ then $2 remains empty.
#  So, it is not a parameter *copy*, but a *move*.

exit

#  See also the echo-params.sh script for a "shiftless"
#+ alternative method of stepping through the positional params.
The shift command can take a numerical parameter indicating how many positions to shift.
#!/bin/bash
# shift-past.sh

shift 3            # Shift 3 positions.
# n=3; shift $n    # Has the same effect.

echo "$1"

exit 0

# ======================== #
$ sh shift-past.sh 1 2 3 4 5
4

#  However, as Eleni Fragkiadaki points out,
#+ attempting a 'shift' past the number of
#+ positional parameters ($#) returns an exit status of 1,
#+ and the positional parameters themselves do not change.
#  This means possibly getting stuck in an endless loop. . . .
#  For example:
#      until [ -z "$1" ]
#      do
#         echo -n "$1 "
#         shift 20    #  If less than 20 pos params,
#      done           #+ then loop never ends!
#
#  When in doubt, add a sanity check. . . .
#           shift 20 || break
#                    ^^^^^^^^
The shift command works in a similar fashion on parameters passed to a function. See Example 36-16.
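For instance, a function can step through its own arguments with shift, much as a script does. A brief sketch (not one of the numbered examples):

#!/bin/bash

list_args ()
{
  while [ $# -gt 0 ]
  do
    echo "Function argument: $1"
    shift            # Shifts the *function's* parameters, not the script's.
  done
}

list_args alpha beta gamma
echo "Script's \$1 is still: $1"   # Unchanged by the shifts inside the function.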
Chapter 5. Quoting
Quoting means just that, bracketing a string in quotes. This has the effect of protecting special characters in the string from reinterpretation or expansion by the shell or shell script. (A character is "special" if it has an interpretation other than its literal meaning. For example, the asterisk * represents a wild card character in globbing and Regular Expressions).
bash$ ls -l [Vv]*
-rw-rw-r--   1 bozo bozo      324 Apr  2 15:05 VIEWDATA.BAT
-rw-rw-r--   1 bozo bozo      507 May  4 14:25 vartrace.sh
-rw-rw-r--   1 bozo bozo      539 Apr 14 17:11 viewdata.sh
In everyday speech or writing, when we "quote" a phrase, we set it apart and give it special meaning. In a Bash script, when we quote a string, we set it apart and protect its literal meaning. Certain programs and utilities reinterpret or expand special characters in a quoted string. An important use of quoting is protecting a command-line parameter from the shell, but still letting the calling program expand it.
bash$ grep '[Ff]irst' *.txt file1.txt:This is the first line of file1.txt. file2.txt:This is the First line of file2.txt.
Note that the unquoted grep [Ff]irst *.txt works under the Bash shell. [29] Quoting can also suppress echo's "appetite" for newlines.
bash$ echo $(ls -l) total 8 -rw-rw-r-- 1 bo bo 13 Aug 21 12:57 t.sh -rw-rw-r-- 1 bo bo 78 Aug 21 12:57 u.sh
bash$ echo "$(ls -l)" total 8 -rw-rw-r-- 1 bo bo 13 Aug 21 12:57 t.sh -rw-rw-r-- 1 bo bo 78 Aug 21 12:57 u.sh
Use double quotes to prevent word splitting. [31] An argument enclosed in double quotes presents itself as a single word, even if it contains whitespace separators.
variable2=""
# Empty.
COMMAND $variable2 $variable2 $variable2
# Executes COMMAND with no arguments.

COMMAND "$variable2" "$variable2" "$variable2"
# Executes COMMAND with 3 empty arguments.

COMMAND "$variable2 $variable2 $variable2"
# Executes COMMAND with 1 argument (2 spaces).

# Thanks, Stéphane Chazelas.
Enclosing the arguments to an echo statement in double quotes is necessary only when word splitting or preservation of whitespace is an issue. Example 5-1. Echoing Weird Variables
#!/bin/bash
# weirdvars.sh: Echoing weird variables.

echo

var="'(]\\{}\$\""
echo $var        # '(]\{}$"
echo "$var"      # '(]\{}$"

echo

IFS='\'
echo $var
echo "$var"
# ************************************************************ #
# As the first example above shows, nesting quotes is permitted.

echo "$(echo '"')"           # "
#    ^           ^
# Or, as Chris Hiestand points out ...

if [[ "$(du "$My_File1")" -gt "$(du "$My_File2")" ]]
then
  ...
fi
# ************************************************************ #
Single quotes (' ') operate similarly to double quotes, but do not permit referencing variables, since the special meaning of $ is turned off. Within single quotes, every special character except ' gets interpreted literally. Consider single quotes ("full quoting") to be a stricter method of quoting than double quotes ("partial quoting"). Since even the escape character (\) gets a literal interpretation within single quotes, trying to enclose a single quote within single quotes will not yield the expected result.
echo "Why can't I write 's between single quotes" echo # The roundabout method. echo 'Why can'\''t I write '"'"'s between single quotes' # |-------| |----------| |-----------------------| # Three single-quoted strings, with escaped and quoted single quotes between. # This example courtesy of Stphane Chazelas.
5.2. Escaping
Escaping is a method of quoting single characters. The escape (\) preceding a character tells the shell to interpret that character literally. With certain commands and utilities, such as echo and sed, escaping a character may have the opposite effect - it can toggle on a special meaning for that character.
Special meanings of certain escaped characters used with echo and sed \n means newline \r means return \t means tab \v means vertical tab \b means backspace \a means alert (beep or flash) \0xx translates to the octal ASCII equivalent of 0xx, where xx is a string of digits
The $' ... ' quoted string-expansion construct is a mechanism that uses escaped octal or hex values to assign ASCII characters to variables, e.g., quote=$'\042'. Example 5-2. Escaped Characters
#!/bin/bash
# escaped.sh: escaped characters

#############################################################
### First, let's show some basic escaped-character usage. ###
#############################################################

# Escaping a newline.
# ------------------

echo ""

echo "This will print
as two lines."
# This will print
# as two lines.

echo "This will print \
as one line."
# This will print as one line.

echo; echo

echo "============="
echo "\v\v\v\v" # Prints \v\v\v\v literally. # Use the -e option with 'echo' to print escaped characters. echo "=============" echo "VERTICAL TABS" echo -e "\v\v\v\v" # Prints 4 vertical tabs. echo "=============="
# The $'\X' construct makes the -e option unnecessary.

echo; echo "NEWLINE and (maybe) BEEP"
echo $'\n'           # Newline.
echo $'\a'           # Alert (beep).
                     # May only flash, not beep, depending on terminal.

# We have seen $'\nnn' string expansion, and now . . .

# =================================================================== #
# Version 2 of Bash introduced the $'\nnn' string expansion construct.
# =================================================================== #

echo "Introducing the \$\' ... \' string-expansion construct . . . "
echo ". . . featuring more quotation marks."

echo $'\t \042 \t'   # Quote (") framed by tabs.
# Note that '\nnn' is an octal value.

# It also works with hexadecimal values, in an $'\xhhh' construct.
echo $'\t \x22 \t'   # Quote (") framed by tabs.
# Thank you, Greg Keraunen, for pointing this out.
# Earlier Bash versions allowed '\x022'.

echo
# Assigning ASCII characters to a variable.
# ----------------------------------------
quote=$'\042'        # " assigned to a variable.
echo "$quote Quoted string $quote and this lies outside the quotes."

echo

# Concatenating ASCII chars in a variable.
triple_underline=$'\137\137\137'   # 137 is octal ASCII code for '_'.
echo "$triple_underline UNDERLINE $triple_underline"

echo

ABC=$'\101\102\103\010'            # 101, 102, 103 are octal A, B, C.
echo $ABC

echo

escape=$'\033'                     # 033 is octal for escape.
echo "\"escape\" echoes as $escape"
#                                    no visible output.

echo

exit 0
See also Example 37-1. \" gives the quote its literal meaning
echo "Hello" echo "\"Hello\" ... he said." # Hello # "Hello" ... he said.
\$ gives the dollar sign its literal meaning (variable name following \$ will not be referenced)
echo "\$variable01" echo "The book cost \$7.98." # $variable01 # The book cost $7.98.
# Whereas . . .
The behavior of \ depends on whether it is escaped, strong-quoted, weak-quoted, or appearing within command substitution or a here document.
                      #  Simple escaping and quoting
echo \z               #  z
echo \\z              # \z
echo '\z'             # \z
echo '\\z'            # \\z
echo "\z"             # \z
echo "\\z"            # \z

                      #  Command substitution
echo `echo \z`        #  z
echo `echo \\z`       #  z
echo `echo \\\z`      # \z
echo `echo \\\\z`     # \z
echo `echo \\\\\\z`   # \z
echo `echo \\\\\\\z`  # \\z
echo `echo "\z"`      # \z
echo `echo "\\z"`     # \z

                      # Here document
cat <<EOF              
\z                      
EOF                   # \z

cat <<EOF              
\\z                     
EOF                   # \z
Elements of a string assigned to a variable may be escaped, but the escape character alone may not be assigned to a variable.
variable=\
echo "$variable"
# Will not work - gives an error message:
# test.sh: : command not found
#  A "naked" escape cannot safely be assigned to a variable.
#
#  What actually happens here is that the "\" escapes the newline and
#+ the effect is        variable=echo "$variable"
#+                      invalid variable assignment

variable=\
23skidoo
echo "$variable"        # 23skidoo
                        #  This works, since the second line
                        #+ is a valid variable assignment.

variable=\ 
#         \^    escape followed by space
echo "$variable"        # space

variable=\\
echo "$variable"        # \

variable=\\\
echo "$variable"
# Will not work - gives an error message:
# test.sh: \: command not found
#
#  First escape escapes second one, but the third one is left "naked",
#+ with same result as first instance, above.

variable=\\\\
echo "$variable"        # \\
The escape also provides a means of writing a multi-line command. Normally, each separate line constitutes a different command, but an escape at the end of a line escapes the newline character, and the command sequence continues on to the next line.
(cd /source/directory && tar cf - . ) | \
(cd /dest/directory && tar xpvf -)
# Repeating Alan Cox's directory tree copy command,
# but split into two lines for increased legibility.

# As an alternative:
tar cf - -C /source/directory . |
tar xpvf - -C /dest/directory
# See note below.
# (Thanks, Stéphane Chazelas.)
If a script line ends with a |, a pipe character, then a \, an escape, is not strictly necessary. It is, however, good programming practice to always escape the end of a line of code that continues to the following line.
echo "foo bar" #foo #bar echo
Likewise, functions within a script and the script itself return an exit status. The last command executed in the function or script determines the exit status. Within a script, an exit nnn command may be used to deliver an nnn exit status to the shell (nnn must be an integer in the 0 - 255 range). When a script ends with an exit that has no parameter, the exit status of the script is the exit status of the last command executed in the script (previous to the exit).
#!/bin/bash COMMAND_1 . . . COMMAND_LAST # Will exit with status of last command. exit
The equivalent of a bare exit is exit $? or even just omitting the exit.
#!/bin/bash COMMAND_1 . . . COMMAND_LAST # Will exit with status of last command. exit $? #!/bin/bash COMMAND1 . . . COMMAND_LAST
$? reads the exit status of the last command executed. After a function returns, $? gives the exit status of the last command executed in the function. This is Bash's way of giving functions a "return value." [32] Following the execution of a pipe, a $? gives the exit status of the last command executed. After a script terminates, a $? from the command-line gives the exit status of the script, that is, the last command executed in the script, which is, by convention, 0 on success or an integer in the range 1 - 255 on error.
# By convention, an 'exit 0' indicates success, #+ while a non-zero exit value means an error or anomalous condition. # See the "Exit Codes With Special Meanings" appendix.
$? is especially useful for testing the result of a command in a script (see Example 16-35 and Example 16-20). The !, the logical not qualifier, reverses the outcome of a test or command, and this affects its exit status. Example 6-2. Negating a condition using !
true # The "true" builtin. echo "exit status of \"true\" = $?"
# 0
! true echo "exit status of \"! true\" = $?" # 1 # Note that the "!" needs a space between it and the command. # !true leads to a "command not found" error # # The '!' operator prefixing a command invokes the Bash history mechanism. true !true # No error this time, but no negation either. # It just repeats the previous command (true).
# =========================================================== # # Preceding a _pipe_ with ! inverts the exit status returned. ls | bogus_command # bash: bogus_command: command not found echo $? # 127
Certain exit status codes have reserved meanings and should not be user-specified in a script.
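For instance, the shell itself reports 127 for "command not found," so returning that value for an ordinary script error invites confusion. A small sketch (the user-chosen code here is arbitrary):

#!/bin/bash

nonexistent_command 2>/dev/null
echo $?          # 127  -- reserved by the shell for "command not found."

E_GENERIC=85     # An arbitrary user-chosen code, outside the reserved values.
exit $E_GENERIC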
Chapter 7. Tests
Every reasonably complete programming language can test for a condition, then act according to the result of the test. Bash has the test command, various bracket and parenthesis operators, and the if/then construct.
(( 200 || 11 ))                  # Logical OR
echo $?                          # 0     ***
# ...
let "num = (( 200 || 11 ))"
echo $num                        # 1
let "num = (( 200 || 11 ))"
echo $?                          # 0     ***

(( 200 | 11 ))                   # Bitwise OR
echo $?                          # 0     ***
# ...
let "num = (( 200 | 11 ))"
echo $num                        # 203
let "num = (( 200 | 11 ))"
echo $?                          # 0     ***

# The "let" construct returns the same exit status
#+ as the double-parentheses arithmetic expansion.
Again, note that the exit status of an arithmetic expression is not an error value.
var=-2 && (( var+=2 )) echo $?
# 1
var=-2 && (( var+=2 )) && echo $var # Will not echo $var!
An if can test any command, not just conditions enclosed within brackets.
if cmp a b &> /dev/null  # Suppress output.
then
  echo "Files a and b are identical."
else
  echo "Files a and b differ."
fi

# The very useful "if-grep" construct:
# -----------------------------------
if grep -q Bash file
then
  echo "File contains at least one occurrence of Bash."
fi

word=Linux
letter_sequence=inu
if echo "$word" | grep -q "$letter_sequence"
# The "-q" option to grep suppresses output.
then
  echo "$letter_sequence found in $word"
else
  echo "$letter_sequence not found in $word"
fi
echo "Testing \"-n \$xyz\"" if [ -n "$xyz" ] then echo "Null variable is true." else echo "Null variable is false." fi # Null variable is false.
echo
# When is "false" true? echo "Testing \"false\"" if [ "false" ] # It seems that "false" is just a string ... then echo "\"false\" is true." #+ and it tests true. else echo "\"false\" is false." fi # "false" is true. echo echo "Testing \"\$false\"" # Again, uninitialized variable. if [ "$false" ] then echo "\"\$false\" is true." else echo "\"\$false\" is false." fi # "$false" is false. # Now, we get the expected result. # What would happen if we tested the uninitialized variable "$true"?
echo exit 0
if [ condition-true ]
then
  command 1
  command 2
  ...
else  # Or else ...
      # Adds default code block executing if original condition tests false.
  command 3
  command 4
  ...
fi
When if and then are on same line in a condition test, a semicolon must terminate the if statement. Both if and then are keywords. Keywords (or commands) begin statements, and before a new statement on the same line begins, the old one must terminate.
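A short sketch of both layouts (the tested file name is arbitrary):

if [ -x "$filename" ]; then    # Semicolon needed: 'then' follows on the same line.
  echo "$filename is executable."
fi

if [ -x "$filename" ]          # No semicolon needed: 'then' is on its own line.
then
  echo "$filename is executable."
fi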
Else if and elif elif elif is a contraction for else if. The effect is to nest an inner if/then construct within an outer one.
if [ condition1 ]
then
  command1
  command2
  command3
elif [ condition2 ]    # Same as else if
then
  command4
  command5
else
  default-command
fi
The if test condition-true construct is the exact equivalent of if [ condition-true ]. As it happens, the left bracket, [ , is a token [33] which invokes the test command. The closing right bracket, ] , in an if/test should not therefore be strictly necessary, however newer versions of Bash require it.
The test command is a Bash builtin which tests file types and compares strings. Therefore, in a Bash script, test does not call the external /usr/bin/test binary, which is part of the sh-utils package. Likewise, [ does not call /usr/bin/[, which is linked to /usr/bin/test.
bash$ type test
test is a shell builtin
bash$ type '['
[ is a shell builtin
bash$ type '[['
[[ is a shell keyword
bash$ type ']]'
]] is a shell keyword
bash$ type ']'
bash: type: ]: not found
If, for some reason, you wish to use /usr/bin/test in a Bash script, then specify it by full pathname. Example 7-2. Equivalence of test, /usr/bin/test, [ ], and /usr/bin/[
#!/bin/bash

echo

if test -z "$1"
then
  echo "No command-line arguments."
else
  echo "First command-line argument is $1."
fi
if /usr/bin/[ -z "$1" ] # Again, functionally identical to above. # if /usr/bin/[ -z "$1" # Works, but gives an error message. # # Note: # This has been fixed in Bash, version 3.x. then echo "No command-line arguments." else echo "First command-line argument is $1." fi echo exit 0
The [[ ]] construct is the more versatile Bash version of [ ]. This is the extended test command, adopted from ksh88. *** No filename expansion or word splitting takes place between [[ and ]], but there is parameter expansion and command substitution.
file=/etc/passwd

if [[ -e $file ]]
then
  echo "Password file exists."
fi
Using the [[ ... ]] test construct, rather than [ ... ] can prevent many logic errors in scripts. For example, the &&, ||, <, and > operators work within a [[ ]] test, despite giving an error within a [ ] construct.
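A brief sketch of that difference (the values here are arbitrary):

a=5
b=10

if [[ "$a" -lt 7 && "$b" -gt 7 ]]    # Works: && is legal inside [[ ... ]].
then
  echo "Both conditions hold."
fi

# if [ "$a" -lt 7 && "$b" -gt 7 ]    # Error: [ ... ] chokes on the "&&".
if [ "$a" -lt 7 -a "$b" -gt 7 ]      # Single brackets need -a instead.
then
  echo "Both conditions hold."
fi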
Arithmetic evaluation of octal / hexadecimal constants takes place automatically within a [[ ... ]] construct.
# [[ Octal and hexadecimal evaluation ]]
# Thank you, Moritz Gronbach, for pointing this out.

decimal=15
octal=017      # = 15 (decimal)
hex=0x0f       # = 15 (decimal)

if [ "$decimal" -eq "$octal" ]
then
  echo "$decimal equals $octal"
else
  echo "$decimal is not equal to $octal"    # 15 is not equal to 017
fi      # Doesn't evaluate within [ single brackets ]!

if [[ "$decimal" -eq "$octal" ]]
then
  echo "$decimal equals $octal"             # 15 equals 017
else
  echo "$decimal is not equal to $octal"
fi      # Evaluates within [[ double brackets ]]!

if [[ "$decimal" -eq "$hex" ]]
then
  echo "$decimal equals $hex"               # 15 equals 0x0f
else
  echo "$decimal is not equal to $hex"
fi      # [[ $hexadecimal ]] also evaluates!
Following an if, neither the test command nor the test brackets ( [ ] or [[ ]] ) are strictly necessary.
dir=/home/bozo

if cd "$dir" 2>/dev/null; then   # "2>/dev/null" hides error message.
  echo "Now in $dir."
else
  echo "Can't change to $dir."
fi
The "if COMMAND" construct returns the exit status of COMMAND. Similarly, a condition within test brackets may stand alone without an if, when used in combination with a list construct.
var1=20
var2=22
[ "$var1" -ne "$var2" ] && echo "$var1 is not equal to $var2"

home=/home/bozo
[ -d "$home" ] || echo "$home directory does not exist."
The (( )) construct expands and evaluates an arithmetic expression. If the expression evaluates as zero, it returns an exit status of 1, or "false". A non-zero expression returns an exit status of 0, or "true". This is in marked contrast to using the test and [ ] constructs previously discussed.
(( 1 / 0 )) 2>/dev/null
#           ^^^^^^^^^^^
echo "Exit status of \"(( 1 / 0 ))\" is $?."   # 1

# What effect does the "2>/dev/null" have?
# What would happen if it were removed?
# Try removing it, then rerunning the script.

# ======================================= #

# (( ... )) also useful in an if-then test.

var1=5
var2=4

if (( var1 > var2 ))
then #^      ^      Note: Not $var1, $var2. Why?
  echo "$var1 is greater than $var2"
fi   # 5 is greater than 4

exit 0
device1="/dev/ttyS1" # PCMCIA modem card. if [ -c "$device1" ] then echo "$device1 is a character device." fi # /dev/ttyS1 is a character device.
-p file is a pipe
function show_input_type()
{
   [ -p /dev/fd/0 ] && echo PIPE || echo STDIN
}

show_input_type "Input"           # STDIN
echo "Input" | show_input_type    # PIPE

# This example courtesy of Carl Anderson.
-L file is a symbolic link -S file is a socket -t file (descriptor) is associated with a terminal device This test option may be used to check whether the stdin [ -t 0 ] or stdout [ -t 1 ] in a given script is a terminal. -r file has read permission (for the user running the test) -w file has write permission (for the user running the test) -x file has execute permission (for the user running the test) -g set-group-id (sgid) flag set on file or directory If a directory has the sgid flag set, then a file created within that directory belongs to the group that owns the directory, not necessarily to the group of the user who created the file. This may be useful for a directory shared by a workgroup. -u set-user-id (suid) flag set on file A binary owned by root with set-user-id flag set runs with root privileges, even when an ordinary user invokes it. [35] This is useful for executables (such as pppd and cdrecord) that need to access system hardware. Lacking the suid flag, these binaries could not be invoked by a non-root user.
-rwsr-xr-t 1 root 178236 Oct 2 2000 /usr/sbin/pppd
A file with the suid flag set shows an s in its permissions. -k sticky bit set Commonly known as the sticky bit, the save-text-mode flag is a special type of file permission. If a file has this flag set, that file will be kept in cache memory, for quicker access. [36] If set on a directory, it restricts write permission. Setting the sticky bit adds a t to the permissions on the file or directory listing.
drwxrwxrwt 7 root 1024 May 19 21:26 tmp/
If a user does not own a directory that has the sticky bit set, but has write permission in that directory, she can only delete those files that she owns in it. This keeps users from inadvertently overwriting or deleting each other's files in a publicly accessible directory, such as /tmp. (The owner of the directory or root can, of course, delete or rename files there.) -O you are owner of file -G group-id of file same as yours -N file modified since it was last read f1 -nt f2 file f1 is newer than f2 f1 -ot f2 file f1 is older than f2 f1 -ef f2 files f1 and f2 are hard links to the same file ! "not" -- reverses the sense of the tests above (returns true if condition absent).
# If no args are passed to the script set directories-to-search #+ to current directory. Otherwise set the directories-to-search #+ to the args passed. ###################### [ $# -eq 0 ] && directorys=`pwd` || directorys=$@
#  Set up the function linkchk to check the directory it is passed
#+ for files that are links and don't exist, then print them quoted.
#  If one of the elements in the directory is a subdirectory then
#+ send that subdirectory to the linkchk function.
linkchk () {
    for element in $1/*; do
      [ -h "$element" -a ! -e "$element" ] && echo \"$element\"
      [ -d "$element" ] && linkchk $element
      # Of course, '-h' tests for symbolic link, '-d' for directory.
    done
}

#  Send each arg that was passed to the script to the linkchk() function
#+ if it is a valid directory. If not, then print the error message
#+ and usage info.
Example 31-1, Example 11-7, Example 11-3, Example 31-3, and Example A-1 also illustrate uses of the file test operators.
Advanced Bash-Scripting Guide (("$a" <= "$b")) > is greater than (within double parentheses) (("$a" > "$b")) >= is greater than or equal to (within double parentheses) (("$a" >= "$b")) string comparison = is equal to if [ "$a" = "$b" ] Note the whitespace framing the =. if [ "$a"="$b" ] is not equivalent to the above. == is equal to if [ "$a" == "$b" ] This is a synonym for =. The == comparison operator behaves differently within a double-brackets test than within single brackets.
[[ $a == z* ]]    # True if $a starts with a "z" (pattern matching).
[[ $a == "z*" ]]  # True if $a is equal to z* (literal matching).

[ $a == z* ]      # File globbing and word splitting take place.
[ "$a" == "z*" ]  # True if $a is equal to z* (literal matching).

# Thanks, Stéphane Chazelas
!= is not equal to if [ "$a" != "$b" ] This operator uses pattern matching within a [[ ... ]] construct. < is less than, in ASCII alphabetical order if [[ "$a" < "$b" ]] if [ "$a" \< "$b" ]
Advanced Bash-Scripting Guide Note that the "<" needs to be escaped within a [ ] construct. > is greater than, in ASCII alphabetical order if [[ "$a" > "$b" ]] if [ "$a" \> "$b" ] Note that the ">" needs to be escaped within a [ ] construct. See Example 27-11 for an application of this comparison operator. -z string is null, that is, has zero length
String='' # Zero-length ("null") string variable.
if [ -z "$String" ] then echo "\$String is null." else echo "\$String is NOT null." fi # $String is null.
-n string is not null. The -n test requires that the string be quoted within the test brackets. Using an unquoted string with ! -z, or even just the unquoted string alone within test brackets (see Example 7-6) normally works, however, this is an unsafe practice. Always quote a tested string. [37]
# If a string has not been initialized, it has no defined value.
# This state is called "null" (not the same as zero!).

if [ -n $string1 ]    # string1 has not been declared or initialized.
then
  echo "String \"string1\" is not null."
else
  echo "String \"string1\" is null."
fi                    # Wrong result.
# Shows $string1 as not null, although it was not initialized.

echo

# Let's try it again.

if [ -n "$string1" ]  # This time, $string1 is quoted.
then
  echo "String \"string1\" is not null."
else
  echo "String \"string1\" is null."
fi                    # Quote strings within test brackets!

echo

if [ $string1 ]       # This time, $string1 stands naked.
then
  echo "String \"string1\" is not null."
else
  echo "String \"string1\" is null."
fi                    # This works fine.
# The [ ... ] test operator alone detects whether the string is null.
# However it is good practice to quote it (if [ "$string1" ]).
#
# As Stephane Chazelas points out,
#    if [ $string1 ]    has one argument, "]"
#    if [ "$string1" ]  has two arguments, the empty "$string1" and "]"
echo
string1=initialized

if [ $string1 ]       # Again, $string1 stands unquoted.
then
  echo "String \"string1\" is not null."
else
  echo "String \"string1\" is null."
fi                    # Again, gives correct result.
# Still, it is better to quote it ("$string1"), because . . .
string1="a = b" if [ $string1 ] # Again, $string1 stands unquoted. then echo "String \"string1\" is not null." else echo "String \"string1\" is null." fi # Not quoting "$string1" now gives wrong result! exit 0 # Thank you, also, Florian Wisser, for the "heads-up".
compound comparison -a logical and exp1 -a exp2 returns true if both exp1 and exp2 are true. -o logical or exp1 -o exp2 returns true if either exp1 or exp2 is true. These are similar to the Bash comparison operators && and ||, used within double brackets.
[[ condition1 && condition2 ]]
The -o and -a operators work with the test command or occur within single test brackets.
if [ "$expr1" -a "$expr2" ] then echo "Both expr1 and expr2 are true." else echo "Either expr1 or expr2 is false." fi
Refer to Example 8-3, Example 27-17, and Example A-29 to see compound comparison operators in action.
Example 37-4 and Example 17-11 demonstrate nested if/then condition tests.
Explain the test constructs in the above snippet, then examine an updated version of the file, /etc/X11/xinit/xinitrc, and analyze the if/then test constructs there. You may need to refer ahead to the discussions of grep, sed, and regular expressions.
Do not confuse the "=" assignment operator with the = test operator.
# = as a test operator
if [ "$string1" = "$string2" ] then command fi # if [ "X$string1" = "X$string2" ] is safer, #+ to prevent an error message should one of the variables be empty. # (The prepended "X" characters cancel out.)
This operator finds use in, among other things, generating numbers within a specific range (see Example 9-11 and Example 9-15) and formatting program output (see Example 27-16 and Example A-6). It can even be used to generate prime numbers, (see Example A-15). Modulo turns up surprisingly often in numerical recipes.
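For instance, modulo is what folds an arbitrary number into a fixed range, as in this small sketch (the range chosen is arbitrary):

#!/bin/bash
# Mapping $RANDOM into the range 1 - 6, dice-style.

RANGE=6
let "roll = $RANDOM % $RANGE + 1"   # Remainder is 0 - 5, so add 1.
echo "You rolled a $roll."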
# ------------------------------------------------------
# Argument check
ARGS=2
E_BADARGS=85

if [ $# -ne "$ARGS" ]
then
  echo "Usage: `basename $0` first-number second-number"
  exit $E_BADARGS
fi
# ------------------------------------------------------
gcd ()
{
  dividend=$1             #  Arbitrary assignment.
  divisor=$2              #! It doesn't matter which of the two is larger.
                          #  Why not?

  remainder=1             #  If an uninitialized variable is used inside
                          #+ test brackets, an error message results.

  until [ "$remainder" -eq 0 ]
  do    #  ^^^^^^^^^^  Must be previously initialized!
    let "remainder = $dividend % $divisor"
    dividend=$divisor     # Now repeat with 2 smallest numbers.
    divisor=$remainder
  done                    # Euclid's algorithm

}                         # Last $dividend is the gcd.
gcd $1 $2
# Exercises : # --------# 1) Check command-line arguments to make sure they are integers, #+ and exit the script with an appropriate error message if not. # 2) Rewrite the gcd () function to use local variables. exit 0
+= plus-equal (increment variable by a constant) [38] let "var += 5" results in var being incremented by 5. -= minus-equal (decrement variable by a constant) *= times-equal (multiply variable by a constant) let "var *= 4" results in var being multiplied by 4. /= slash-equal (divide variable by a constant) %= mod-equal (remainder of dividing variable by a constant) Arithmetic operators often occur in an expr or let expression.
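A quick sketch of these operator-assignment forms inside let expressions (the starting value is arbitrary):

n=10
let "n += 5"    # n = 15
let "n -= 3"    # n = 12
let "n *= 2"    # n = 24
let "n /= 5"    # n = 4   (integer division)
let "n %= 3"    # n = 1
echo "n = $n"   # n = 1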
: $((n = $n + 1))
#  ":" necessary because otherwise Bash attempts
#+ to interpret "$((n = $n + 1))" as a command.
echo -n "$n "

(( n = n + 1 ))
#  A simpler alternative to the method above.
#  Thanks, David Lombard, for pointing this out.
echo -n "$n "

n=$(($n + 1))
echo -n "$n "

: $[ n = $n + 1 ]
#  ":" necessary because otherwise Bash attempts
#+ to interpret "$[ n = $n + 1 ]" as a command.
#  Works even if "n" was initialized as a string.
# (( ++n ))  also works.
Integer variables in older versions of Bash were signed long (32-bit) integers, in the range of -2147483648 to 2147483647. An operation that took a variable outside these limits gave an erroneous result.
echo $BASH_VERSION   # 1.14

a=2147483646
echo "a = $a"        # a = 2147483646
let "a+=1"           # Increment "a".
echo "a = $a"        # a = 2147483647
let "a+=1"           # Increment "a" again, past the limit.
echo "a = $a"        # a = -2147483648
                     #  ERROR: out of range,
                     #+ and the leftmost bit, the sign bit,
                     #+ has been set, making the result negative.
Bash does not understand floating point arithmetic. It treats numbers containing a decimal point as strings.
a=1.5

let "b = $a + 1.3"  # Error.
# t2.sh: let: b = 1.5 + 1.3: syntax error in expression
#                            (error token is ".5 + 1.3")

echo "b = $b"       # b=1
Use bc in scripts that need floating point calculations or math library functions.
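A minimal sketch of handing a floating-point calculation to bc from a script:

a=1.5
b=$(echo "scale=2; $a + 1.3" | bc)   # "scale" sets the number of decimal places.
echo "b = $b"                        # b = 2.8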
bitwise operators. The bitwise operators seldom make an appearance in shell scripts. Their chief use seems to be manipulating and testing values read from ports or sockets. "Bit flipping" is more relevant to compiled languages, such as C and C++, which provide direct access to system hardware. However, see vladz's ingenious use of bitwise operators in his base64.sh (Example A-54) script. bitwise operators << bitwise left shift (multiplies by 2 for each shift position) <<= left-shift-equal let "var <<= 2" results in var left-shifted 2 bits (multiplied by 4) >> bitwise right shift (divides by 2 for each shift position) >>= right-shift-equal (inverse of <<=) & bitwise AND &= bitwise AND-equal | bitwise OR |= bitwise OR-equal ~ bitwise NOT ^ bitwise XOR ^= bitwise XOR-equal logical (boolean) operators ! NOT
if [ ! -f $FILENAME ] then ...
&& AND
if [ $condition1 ] && [ $condition2 ] # Same as: if [ $condition1 -a $condition2 ] # Returns true if both condition1 and condition2 hold true... if [[ $condition1 && $condition2 ]] # Also works. # Note that && operator not permitted inside brackets #+ of [ ... ] construct.
&& may also be used, depending on context, in an and list to concatenate commands. Chapter 8. Operations and Related Topics 76
Bash tests the exit status of each statement linked with a logical operator. Example 8-3. Compound Condition Tests Using && and ||
#!/bin/bash

a=24
b=47

if [ "$a" -eq 24 ] && [ "$b" -eq 47 ]
then
  echo "Test #1 succeeds."
else
  echo "Test #1 fails."
fi

# ERROR:   if [ "$a" -eq 24 && "$b" -eq 47 ]
#+         attempts to execute  ' [ "$a" -eq 24 '
#+         and fails to find matching ']'.
#
#  Note:  if [[ $a -eq 24 && $b -eq 24 ]]  works.
#  The double-bracket if-test is more flexible
#+ than the single-bracket version.
#  (The "&&" has a different meaning in line 17 than in line 6.)
#  Thanks, Stephane Chazelas, for pointing this out.
if [ "$a" -eq 98 ] || [ "$b" -eq 47 ] then echo "Test #2 succeeds." else echo "Test #2 fails." fi
# The -a and -o options provide #+ an alternative compound condition test. # Thanks to Patrick Callahan for pointing this out.
if [ "$a" -eq 24 -a "$b" -eq 47 ] then echo "Test #3 succeeds." else echo "Test #3 fails." fi
a=rhino
b=crocodile
if [ "$a" = rhino ] && [ "$b" = crocodile ]
then
  echo "Test #5 succeeds."
else
  echo "Test #5 fails."
fi

exit 0
miscellaneous operators , Comma operator The comma operator chains together two or more arithmetic operations. All the operations are evaluated (with possible side effects). [39]
let "t1 = ((5 + 3, 7 - 1, 15 - 4))" echo "t1 = $t1" ^^^^^^ # t1 = 11 # Here t1 is set to the result of the last operation. Why? let "t2 = ((a = 9, 15 / 3))" echo "t2 = $t2 a = $a" # Set "a" and calculate "t2". # t2 = 5 a = 9
The comma operator finds use mainly in for loops. See Example 11-12.
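A small sketch of the comma inside a C-style for loop (the loop bounds are arbitrary):

for (( i=0, j=10; i<3; i++, j-- ))   # Two variables initialized and updated per pass.
do
  echo "i = $i, j = $j"
done
# i = 0, j = 10
# i = 1, j = 9
# i = 2, j = 8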
# Decimal: the default
let "dec = 32"
echo "decimal number = $dec"             # 32
# Nothing out of the ordinary here.
# Octal: numbers preceded by '0' (zero)
let "oct = 032"
echo "octal number = $oct"               # 26
# Expresses result in decimal.
# --------- ------ -- -------

# Hexadecimal: numbers preceded by '0x' or '0X'
let "hex = 0x32"
echo "hexadecimal number = $hex"         # 50

echo $((0x9abc))                         # 39612
#      ^^     ^^   double-parentheses arithmetic expansion/evaluation
# Expresses result in decimal.
# Other bases: BASE#NUMBER # BASE between 2 and 64. # NUMBER must use symbols within the BASE range, see below.
let "bin = 2#111100111001101" echo "binary number = $bin" let "b32 = 32#77" echo "base-32 number = $b32"
# 31181
# 231
let "b64 = 64#@_" echo "base-64 number = $b64" # 4031 # This notation only works for a limited range (2 - 64) of ASCII characters. # 10 digits + 26 lowercase characters + 26 uppercase characters + @ + _
echo echo $((36#zz)) $((2#10101010)) $((16#AF16)) $((53#1aA)) # 1295 170 44822 3375
#  Important note:
#  --------------
#  Using a digit out of range of the specified base notation
#+ gives an error message.

let "bad_oct = 081"
# (Partial) error message output:
#  bad_oct = 081: value too great for base (error token is "081")
#  Octal numbers use only digits in the range 0 - 7.

exit $?   # Exit value = 1 (error)
echo

(( a = 23 ))              #  Setting a value, C-style,
                          #+ with spaces on both sides of the "=".
echo "a (initial value) = $a"   # 23

(( a++ ))                 #  Post-increment 'a', C-style.
echo "a (after a++) = $a"       # 24

(( a-- ))                 #  Post-decrement 'a', C-style.
echo "a (after a--) = $a"       # 23

(( ++a ))                 #  Pre-increment 'a', C-style.
echo "a (after ++a) = $a"       # 24

(( --a ))                 #  Pre-decrement 'a', C-style.
echo "a (after --a) = $a"       # 23

echo

########################################################
#  Note that, as in C, pre- and post-decrement operators
#+ have different side-effects.

n=1; let --n && echo "True" || echo "False"  # False
n=1; let n-- && echo "True" || echo "False"  # True

#  Thanks, Jeroen Domburg.
########################################################

echo

(( t = a<45?7:11 ))       # C-style trinary operator.
#       ^  ^ ^
echo "If a < 45, then t = 7, else t = 11."   # a = 23
echo "t = $t "                               # t = 7
echo
Table 8-1. Operator Precedence

Operator                      Meaning                             Comments
                                                                  HIGHEST PRECEDENCE
var++ var--                   post-increment, post-decrement      C-style operators
++var --var                   pre-increment, pre-decrement
! ~                           negation                            logical / bitwise, inverts sense of following operator
**                            exponentiation                      arithmetic operation
* / %                         multiplication, division, modulo    arithmetic operation
+ -                           addition, subtraction               arithmetic operation
<< >>                         left, right shift                   bitwise
-z -n                         unary comparison                    string is/is-not null
-e -f -t -x, etc.             unary comparison                    file-test
< -lt > -gt <= -le >= -ge     compound comparison                 string and integer
-nt -ot -ef                   compound comparison                 file-test
== -eq != -ne                 equality / inequality               test operators, string and integer
&                             AND                                 bitwise
^                             XOR                                 exclusive OR, bitwise
|                             OR                                  bitwise
&& -a                         AND                                 logical, compound comparison
|| -o                         OR                                  logical, compound comparison
?:                            trinary operator                    C-style
=                             assignment                          (do not confuse with equality test)
*= /= %= += -= <<= >>= &=     combination assignment              times-equal, divide-equal, mod-equal, etc.
,                             comma                               links a sequence of operations
                                                                  LOWEST PRECEDENCE
In practice, all you really need to remember is the following: The "My Dear Aunt Sally" mantra (multiply, divide, add, subtract) for the familiar arithmetic operations. The compound logical operators, &&, ||, -a, and -o have low precedence. The order of evaluation of equal-precedence operators is usually left-to-right. Now, let's utilize our knowledge of operator precedence to analyze a couple of lines from the /etc/init.d/functions file, as found in the Fedora Core Linux distro.
while [ -n "$remaining" -a "$retry" -gt 0 ]; do # This looks rather daunting at first glance.
# Separate the conditions:
while [ -n "$remaining" -a "$retry" -gt 0 ]; do
#      --condition 1--  ^^  --condition 2--

#  If variable "$remaining" is not zero length
#+      AND (-a)
#+ variable "$retry" is greater-than zero
#+ then
#+      the [ expression-within-condition-brackets ] returns success (0)
#+ and the while-loop executes an iteration.
#  ==============================================================
#  Evaluate "condition 1" and "condition 2" ***before***
#+ ANDing them. Why? Because the AND (-a) has a lower precedence
#+ than the -n and -gt operators,
#+ and therefore gets evaluated *last*.
# Again, separate the conditions:
if [ -f /etc/sysconfig/i18n -a -z "${NOLOCALE:-}" ] ; then
#   --condition 1---------  ^^  --condition 2-----

#  If file "/etc/sysconfig/i18n" exists
#+      AND (-a)
#+ variable $NOLOCALE is zero length
#+ then
To avoid confusion or error in a complex sequence of test operators, break up the sequence into bracketed sections.
if [ "$v1" -gt "$v2" -o "$v1" -lt "$v2" # Unclear what's going on here... -a -e "$filename" ]
if [[ "$v1" -gt "$v2" ]] || [[ "$v1" -lt "$v2" ]] && [[ -e "$filename" ]] # Much better -- the condition tests are grouped in logical sections.
$BASH_ENV An environmental variable pointing to a Bash startup file to be read when a script is invoked $BASH_SUBSHELL A variable indicating the subshell level. This is a new addition to Bash, version 3. See Example 21-1 for usage. $BASHPID Process ID of the current instance of Bash. This is not the same as the $$ variable, but it often gives the same result.
bash4$ echo $$ 11015
But ...
#!/bin/bash4

echo "\$\$ outside of subshell = $$"                           # 9602
echo "\$BASH_SUBSHELL outside of subshell = $BASH_SUBSHELL"    # 0
echo "\$BASHPID outside of subshell = $BASHPID"                # 9602

echo

( echo "\$\$ inside of subshell = $$"                          # 9602
  echo "\$BASH_SUBSHELL inside of subshell = $BASH_SUBSHELL"   # 1
  echo "\$BASHPID inside of subshell = $BASHPID" )             # 9603

# Note that $$ returns PID of parent process.
$BASH_VERSINFO[n] A 6-element array containing version information about the installed release of Bash. This is similar to $BASH_VERSION, below, but a bit more detailed.
Checking $BASH_VERSION is a good method of determining which shell is running. $SHELL does not necessarily give the correct answer. $CDPATH A colon-separated list of search paths available to the cd command, similar in function to the $PATH variable for binaries. The $CDPATH variable may be set in the local ~/.bashrc file.
bash$ cd bash-doc bash: cd: bash-doc: No such file or directory
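With $CDPATH set, the same cd can succeed; when the target is found through $CDPATH, cd prints the full path it changed to. The directory names below are only illustrative:

bash$ CDPATH=/usr/share/doc
bash$ cd bash-doc
/usr/share/doc/bash-doc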
$DIRSTACK The top value in the directory stack [41] (affected by pushd and popd) This builtin variable corresponds to the dirs command, however dirs shows the entire contents of the directory stack. $EDITOR The default editor invoked by a script, usually vi or emacs. $EUID "effective" user ID number Identification number of whatever identity the current user has assumed, perhaps by means of su. The $EUID is not necessarily the same as the $UID.
See also Example A-50. $GLOBIGNORE A list of filename patterns to be excluded from matching in globbing. $GROUPS Groups current user belongs to This is a listing (array) of the group id numbers for current user, as recorded in /etc/passwd and /etc/group.
root# echo $GROUPS 0
$HOME Home directory of the user, usually /home/username (see Example 10-7) $HOSTNAME The hostname command assigns the system host name at bootup in an init script. However, the gethostname() function sets the Bash internal variable $HOSTNAME. See also Example 10-7. $HOSTTYPE host type Like $MACHTYPE, identifies the system hardware.
bash$ echo $HOSTTYPE i686
$IFS internal field separator This variable determines how Bash recognizes fields, or word boundaries, when it interprets character strings.
$IFS defaults to whitespace (space, tab, and newline), but may be changed, for example, to parse a comma-separated data file. Note that $* uses the first character held in $IFS. See Example 5-1.
bash$ echo "$IFS" | cat -vte ^I$ $ (Show whitespace: here a single space, ^I [horizontal tab], and newline, and display "$" at end-of-line.)
bash$ bash -c 'set w x y z; IFS=":-;"; echo "$*"' w:x:y:z (Read commands from string and assign any arguments to pos params.)
$IFS does not handle whitespace the same as it does other characters. Example 9-1. $IFS and whitespace
#!/bin/bash # ifs.sh
var1="a+b+c" var2="d-e-f" var3="g,h,i" IFS=+ # The plus sign will be interpreted as a separator. echo $var1 # a b c echo $var2 # d-e-f echo $var3 # g,h,i echo IFS="-" # The plus sign reverts to default interpretation. # The minus sign will be interpreted as a separator. echo $var1 # a+b+c echo $var2 # d e f echo $var3 # g,h,i echo IFS="," # The comma will be interpreted as a separator. # The minus sign reverts to default interpretation. echo $var1 # a+b+c echo $var2 # d-e-f echo $var3 # g h i echo
# However ...
# $IFS treats whitespace differently than other characters.

output_args_one_per_line()
{
  for arg
  do
    echo "[$arg]"
  done #  ^    ^   Embed within brackets, for your viewing pleasure.
}

echo; echo "IFS=\" \""
echo "-----"

IFS=" "
var=" a  b c   "
#    ^ ^^ ^  ^^^
output_args_one_per_line $var
# [a]
# [b]
# [c]
echo; echo "IFS=:" echo "-----" IFS=: var=":a::b:c:::" # ^ ^^ ^^^ output_args_one_per_line $var # [] # [a] # [] # [b] # [c] # [] # []
...
# Note "empty" brackets. # The same thing happens with the "FS" field separator in awk.
echo exit
(Many thanks, Stéphane Chazelas, for clarification and above examples.) See also Example 16-41, Example 11-7, and Example 19-14 for instructive examples of using $IFS. $IGNOREEOF Ignore EOF: how many end-of-files (control-D) the shell will ignore before logging out. $LC_COLLATE
Often set in the .bashrc or /etc/profile files, this variable controls collation order in filename expansion and pattern matching. If mishandled, LC_COLLATE can cause unexpected results in filename globbing. As of version 2.05 of Bash, filename globbing no longer distinguishes between lowercase and uppercase letters in a character range between brackets. For example, ls [A-M]* would match both File1.txt and file1.txt. To revert to the customary behavior of bracket matching, set LC_COLLATE to C by an export LC_COLLATE=C in /etc/profile and/or ~/.bashrc. $LC_CTYPE This internal variable controls character interpretation in globbing and pattern matching. $LINENO This variable is the line number of the shell script in which this variable appears. It has significance only within the script in which it appears, and is chiefly useful for debugging purposes.
# *** BEGIN DEBUG BLOCK *** last_cmd_arg=$_ # Save it. echo "At line number $LINENO, variable \"v1\" = $v1" echo "Last command argument processed = $last_cmd_arg" # *** END DEBUG BLOCK ***
$OLDPWD Old working directory ("OLD-Print-Working-Directory", previous directory you were in). $OSTYPE operating system type
bash$ echo $OSTYPE linux
$PATH Path to binaries, usually /usr/bin/, /usr/X11R6/bin/, /usr/local/bin, etc. When given a command, the shell automatically does a hash table search on the directories listed in the path for the executable. The path is stored in the environmental variable, $PATH, a list of directories, separated by colons. Normally, the system stores the $PATH definition in /etc/profile and/or ~/.bashrc (see Appendix H).
bash$ echo $PATH /bin:/usr/bin:/usr/local/bin:/usr/X11R6/bin:/sbin:/usr/sbin
PATH=${PATH}:/opt/bin appends the /opt/bin directory to the current path. In a script, it may be expedient to temporarily add a directory to the path in this way. When the script exits, this restores the original $PATH (a child process, such as a script, may not change the environment of the parent process, the shell).
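A minimal sketch of this temporary modification (the /opt/bin directory is only an example, as above):

#!/bin/bash
# path-append.sh: Add a directory to $PATH for the duration of the script.

echo "Old \$PATH = $PATH"

PATH=${PATH}:/opt/bin    # Takes effect only in this child process.
echo "New \$PATH = $PATH"

exit 0
# Back in the parent shell, "echo $PATH" shows the original, unchanged value.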
The current "working directory", ./, is usually omitted from the $PATH as a security measure. $PIPESTATUS Array variable holding exit status(es) of last executed foreground pipe.
bash$ echo $PIPESTATUS 0 bash$ ls -al | bogus_command bash: bogus_command: command not found bash$ echo ${PIPESTATUS[1]} 127 bash$ ls -al | bogus_command bash: bogus_command: command not found bash$ echo $? 127
The members of the $PIPESTATUS array hold the exit status of each respective command executed in a pipe. $PIPESTATUS[0] holds the exit status of the first command in the pipe, $PIPESTATUS[1] the exit status of the second command, and so on. The $PIPESTATUS variable may contain an erroneous 0 value in a login shell (in releases prior to 3.0 of Bash).
tcsh% bash bash$ who | grep nobody | sort bash$ echo ${PIPESTATUS[*]} 0
The above lines contained in a script would produce the expected 0 1 0 output. Thank you, Wayne Pollock for pointing this out and supplying the above example. The $PIPESTATUS variable gives unexpected results in some contexts.
bash$ echo $BASH_VERSION 3.00.14(1)-release bash$ $ ls | bogus_command | wc bash: bogus_command: command not found 0 0 0 bash$ echo ${PIPESTATUS[@]} 141 127 0
Chet Ramey attributes the above output to the behavior of ls. If ls writes to a pipe whose output is not read, then SIGPIPE kills it, and its exit status is 141. Otherwise its exit status is 0, as expected. This likewise is the case for tr. $PIPESTATUS is a "volatile" variable. It needs to be captured immediately after the pipe in question, before any other command intervenes.
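One common defensive idiom, therefore, is to copy the whole array into another variable in the very next statement; a minimal sketch (bogus_command intentionally does not exist, as in the examples above):

#!/bin/bash
# Capture $PIPESTATUS before anything else can overwrite it.

ls -al | bogus_command | wc
saved_status=( "${PIPESTATUS[@]}" )   # Copy the entire array immediately.

echo "Exit statuses of the pipe  = ${saved_status[@]}"
echo "Second command in the pipe = ${saved_status[1]}"   # 127 (command not found)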
The pipefail option may be useful in cases where $PIPESTATUS does not give the desired information. $PPID The $PPID of a process is the process ID (pid) of its parent process. [42] Compare this with the pidof command. $PROMPT_COMMAND A variable holding a command to be executed just before the primary prompt, $PS1, is displayed. $PS1 This is the main prompt, seen at the command-line. $PS2 The secondary prompt, seen when additional input is expected. It displays as ">". $PS3 The tertiary prompt, displayed in a select loop (see Example 11-29). $PS4 The quaternary prompt, shown at the beginning of each line of output when invoking a script with the -x option. It displays as "+". $PWD Working directory (directory you are in at the time) This is the analog to the pwd builtin command.
#!/bin/bash E_WRONG_DIRECTORY=85 clear # Clear the screen. TargetDirectory=/home/bozo/projects/GreatAmericanNovel cd $TargetDirectory echo "Deleting stale files in $TargetDirectory." if [ "$PWD" != "$TargetDirectory" ] then # Keep from wiping out wrong directory by accident. echo "Wrong directory!" echo "In $PWD, rather than $TargetDirectory!" echo "Bailing out!" exit $E_WRONG_DIRECTORY fi rm -rf * rm .[A-Za-z0-9]*
# Delete dotfiles.
$REPLY The default value when a variable is not supplied to read. Also applicable to select menus, but only supplies the item number of the variable chosen, not the value of the variable itself.
#!/bin/bash # reply.sh # REPLY is the default value for a 'read' command. echo echo -n "What is your favorite vegetable? " read echo "Your favorite vegetable is $REPLY." # REPLY holds the value of last "read" if and only if #+ no variable supplied. echo echo -n "What is your favorite fruit? " read fruit echo "Your favorite fruit is $fruit." echo "but..." echo "Value of \$REPLY is still $REPLY." # $REPLY is still set to its previous value because #+ the variable $fruit absorbed the new "read" value. echo exit 0
$SHLVL Shell level, how deeply Bash is nested. [43] If, at the command-line, $SHLVL is 1, then in a script it will increment to 2. This variable is not affected by subshells. Use $BASH_SUBSHELL when you need an indication of subshell nesting. $TMOUT If the $TMOUT environmental variable is set to a non-zero value, then the shell prompt will time out after that many seconds. This will cause a logout. As of version 2.05b of Bash, it is now possible to use $TMOUT in a script in combination with read.
# Works in scripts for Bash, versions 2.05b and later. TMOUT=3 # Prompt times out at three seconds.
echo "What is your favorite song?" echo "Quickly now, you only have $TMOUT seconds to answer!" read song if [ -z "$song" ] then song="(no answer)" # Default response. fi echo "Your favorite song is $song."
There are other, more complex, ways of implementing timed input in a script. One alternative is to set up a timing loop to signal the script when it times out. This also requires a signal handling routine to trap (see Example 32-5) the interrupt generated by the timing loop (whew!).
TIMER_INTERRUPT=14 TIMELIMIT=3 # Three seconds in this instance. # May be set to different value. PrintAnswer() { if [ "$answer" = TIMEOUT ] then echo $answer else # Don't want to mix up the two instances. echo "Your favorite veggie is $answer" kill $! # Kills no-longer-needed TimerOn function #+ running in background. # $! is PID of last job running in background. fi }
TimerOn() { sleep $TIMELIMIT && kill -s 14 $$ & # Waits 3 seconds, then sends sigalarm to script. }
Int14Vector() { answer="TIMEOUT" PrintAnswer exit $TIMER_INTERRUPT } trap Int14Vector $TIMER_INTERRUPT # Timer interrupt (14) subverted for our purposes. echo "What is your favorite vegetable " TimerOn read answer PrintAnswer
Admittedly, this is a kludgy implementation of timed input. However, the "-t" option to "read" simplifies this task. See the "t-out.sh" script. However, what about timing not just single user input, but an entire script?
# If you need something really elegant ... #+ consider writing the application in C or C++, #+ using appropriate library functions, such as 'alarm' and 'setitimer.' exit 0
timedout_read() { timeout=$1 varname=$2 old_tty_settings=`stty -g` stty -icanon min 0 time ${timeout}0 eval read $varname # or just read $varname stty "$old_tty_settings" # See man page for "stty." } echo; echo -n "What's your name? Quick! " timedout_read $INTERVAL your_name # This may not work on every terminal type. # The maximum timeout depends on the terminal. #+ (it is often 25.5 seconds). echo if [ ! -z "$your_name" ] # If name input before timeout ... then echo "Your name is $your_name." else echo "Timed out." fi echo # The behavior of this script differs somewhat from "timed-input.sh." # At each keystroke, the counter resets. exit 0
TIMELIMIT=4
# 4 seconds
read -t $TIMELIMIT variable <&1
#                           ^^^
#  In this instance, "<&1" is needed for Bash 1.x and 2.x,
#+ but unnecessary for Bash 3 and later.
echo if [ -z "$variable" ] # Is null? then echo "Timed out, variable still unset." else echo "variable = $variable" fi exit 0
$UID User ID number Current user's user identification number, as recorded in /etc/passwd. This is the current user's real id, even if she has temporarily assumed another identity through su. $UID is a readonly variable, not subject to change from the command line or within a script, and is the counterpart to the id builtin.
Am I root or not?
if [ "$UID" -eq "$ROOT_UID" ] # Will the real "root" please stand up? then echo "You are root." else echo "You are just an ordinary user (but mom loves you just the same)." fi exit 0
# ============================================================= # # Code below will not execute, because the script already exited. # An alternate method of getting to the root of matters: ROOTUSER_NAME=root username=`id -nu` # Or... username=`whoami` if [ "$username" = "$ROOTUSER_NAME" ] then echo "Rooty, toot, toot. You are root." else echo "You are just a regular fella." fi
See also Example 2-3. The variables $ENV, $LOGNAME, $MAIL, $TERM, $USER, and $USERNAME are not Bash builtins. These are, however, often set as environmental variables in one of the Bash startup files. $SHELL, the name of the user's login shell, may be set from /etc/passwd or in an "init" script, and it is likewise not a Bash builtin.
tcsh% echo $LOGNAME bozo tcsh% echo $SHELL /bin/tcsh tcsh% echo $TERM rxvt bash$ echo $LOGNAME bozo bash$ echo $SHELL /bin/tcsh bash$ echo $TERM rxvt
Positional Parameters $0, $1, $2, etc. Positional parameters, passed from command line to script, passed to a function, or set to a variable (see Example 4-5 and Example 15-16) $# Number of command-line arguments [44] or positional parameters (see Example 36-2) $* All of the positional parameters, seen as a single word "$*" must be quoted. $@ Same as $*, but each parameter is a quoted string, that is, the parameters are passed on intact, without interpretation or expansion. This means, among other things, that each parameter in the argument list is seen as a separate word. Of course, "$@" should be quoted. Example 9-6. arglist: Listing arguments with $* and $@
#!/bin/bash # arglist.sh # Invoke this script with several arguments, such as "one two three" ... E_BADARGS=85 if [ ! -n "$1" ] then echo "Usage: `basename $0` argument1 argument2 etc." exit $E_BADARGS fi echo index=1 # Initialize count.
echo "Listing args with \"\$*\":"
for arg in "$*"      # Doesn't work properly if "$*" isn't quoted.
do
  echo "Arg #$index = $arg"
  let "index+=1"
done                 # $* sees the entire argument list as one word.
echo "Entire arg list seen as single word."
echo
index=1              # Reset count.
echo "Listing args with \"\$@\":" for arg in "$@" do echo "Arg #$index = $arg" let "index+=1" done # $@ sees arguments as separate words. echo "Arg list seen as separate words." echo index=1 # Reset count.
echo "Listing args with \$* (unquoted):" for arg in $* do echo "Arg #$index = $arg" let "index+=1" done # Unquoted $* sees arguments as separate words. echo "Arg list seen as separate words." exit 0
Following a shift, the $@ holds the remaining command-line parameters, lacking the previous $1, which was lost.
#!/bin/bash # Invoke with ./scriptname 1 2 3 4 5 echo "$@" shift echo "$@" shift echo "$@" # 1 2 3 4 5 # 2 3 4 5 # 3 4 5
# Each "shift" loses parameter $1. # "$@" then contains the remaining parameters.
The $@ special parameter finds use as a tool for filtering input into shell scripts. The cat "$@" construction accepts input to a script either from stdin or from files given as parameters to the script. See Example 16-24 and Example 16-25. The $* and $@ parameters sometimes display inconsistent and puzzling behavior, depending on the setting of $IFS. Example 9-7. Inconsistent $* and $@ behavior
#!/bin/bash # Erratic behavior of the "$*" and "$@" internal Bash variables, #+ depending on whether or not they are quoted.
set -- "First one" "second" "third:one" "" "Fifth: :one" # Setting the script arguments, $1, $2, $3, etc. echo echo 'IFS unchanged, using "$*"' c=0 for i in "$*" # quoted do echo "$((c+=1)): [$i]" # This line remains the same in every instance. # Echo args. done echo --echo 'IFS unchanged, using $*' c=0 for i in $* # unquoted do echo "$((c+=1)): [$i]" done echo --echo 'IFS unchanged, using "$@"' c=0 for i in "$@" do echo "$((c+=1)): [$i]" done echo --echo 'IFS unchanged, using $@' c=0 for i in $@ do echo "$((c+=1)): [$i]" done echo --IFS=: echo 'IFS=":", using "$*"' c=0 for i in "$*" do echo "$((c+=1)): [$i]" done echo --echo 'IFS=":", using $*' c=0 for i in $* do echo "$((c+=1)): [$i]" done echo --var=$* echo 'IFS=":", using "$var" (var=$*)' c=0 for i in "$var" do echo "$((c+=1)): [$i]" done echo --echo 'IFS=":", using $var (var=$*)' c=0
The $@ and $* parameters differ only when between double quotes. Example 9-8. $* and $@ when $IFS is empty
#!/bin/bash # If $IFS set, but empty, #+ then "$*" and "$@" do not echo positional params as expected. mecho () # Echo positional parameters. { echo "$1,$2,$3"; }
# The behavior of $* and $@ when $IFS is empty depends #+ on which Bash or sh version being run. # It is therefore inadvisable to depend on this "feature" in a script.
Other Special Parameters $- Flags passed to script (using set). See Example 15-16. This was originally a ksh construct adopted into Bash, and unfortunately it does not seem to work reliably in Bash scripts. One possible use for it is to have a script self-test whether it is interactive. $! PID (process ID) of last job run in background
LOG=$0.log
Or, alternately:
# This example by Matthew Sage. # Used with permission. TIMEOUT=30 count=0 # Timeout value in seconds
possibly_hanging_job & { while ((count < TIMEOUT )); do eval '[ ! -d "/proc/$!" ] && ((count = TIMEOUT))' # /proc is where information about running processes is found. # "-d" tests whether it exists (whether directory exists). # So, we're waiting for the job in question to show up. ((count++)) sleep 1 done eval '[ -d "/proc/$!" ] && kill -15 $!' # If the hanging job is running, kill it. } # -------------------------------------------------------------- # However, this may not work as specified if another process begins to run after the "hanging_job" . . . In such a case, the wrong job may be killed. Ariel Meragelman suggests the following fix.
TIMEOUT=30 count=0 # Timeout value in seconds possibly_hanging_job & { while ((count < TIMEOUT )); do eval '[ ! -d "/proc/$lastjob" ] && ((count = TIMEOUT))' lastjob=$! ((count++)) sleep 1 done eval '[ -d "/proc/$lastjob" ] && kill -15 $lastjob'
$? Exit status of a command, function, or the script itself (see Example 24-7) $$ Process ID (PID) of the script itself. [45] The $$ variable often finds use in scripts to construct "unique" temp file names (see Example 32-6, Example 16-31, and Example 15-27). This is usually simpler than invoking mktemp.
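A minimal sketch of the $$ temp-file idiom, with the mktemp alternative shown for comparison:

#!/bin/bash
# Build a "unique" scratch file name from the script's process ID.

TMPFILE=/tmp/${0##*/}.$$      # For example:  /tmp/myscript.12345
echo "scratch data" > "$TMPFILE"
# ... work with "$TMPFILE" ...
rm -f "$TMPFILE"

# The mktemp utility does the same job more robustly:
# TMPFILE=$(mktemp /tmp/myscript.XXXXXX)

exit 0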
-i integer
# Number = 3
number=three echo "Number = $number" # Number = 0 # Tries to evaluate the string "three" as an integer.
Certain arithmetic operations are permitted for declared integer variables without the need for expr or let.
n=6/3
echo "n = $n"     # n = 6/3

declare -i n
n=6/3
echo "n = $n"     # n = 2
-a array
declare -a indices
A declare -f line with no arguments in a script causes a listing of all the functions previously defined in that script.
declare -f function_name
This declares a variable as available for exporting outside the environment of the script itself. -x var=$value
declare -x var3=373
The declare command permits assigning a value to a variable in the same statement as setting its properties.
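For example (a minimal sketch; the variable names are arbitrary):

declare -i count=10                  # Integer attribute and value, one statement.
declare -r PI=3.14159                # Readonly constant, assigned at declaration.
declare -x BACKUP_DIR=/var/backups   # Exported and assigned together.

count=count+1      # Arithmetic evaluation, courtesy of the -i attribute.
echo "$count"      # 11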
However . . .
foo ()
{
  declare FOO="bar"
}

bar ()
{
  foo
  echo $FOO
}

bar   # Prints nothing.
bash$ Colors=([0]="purple" [1]="reddish-orange" [2]="light green") bash$ echo ${Colors[@]} purple reddish-orange light green bash$ declare | grep Colors Colors=([0]="purple" [1]="reddish-orange" [2]="light green")
---
$number"
# If you need a random integer greater than a lower bound,
#+ then set up a test to discard all numbers below that.
FLOOR=200
number=0   # initialize
while [ "$number" -le $FLOOR ]
do
  number=$RANDOM
done
echo "Random number greater than $FLOOR ---  $number"
echo
# Let's examine a simple alternative to the above loop, namely # let "number = $RANDOM + $FLOOR" # That would eliminate the while-loop and run faster. # But, there might be a problem with that. What is it?
# Combine above two techniques to retrieve random number between two limits. number=0 #initialize while [ "$number" -le $FLOOR ] do number=$RANDOM let "number %= $RANGE" # Scales $number down within $RANGE. done echo "Random number between $FLOOR and $RANGE --- $number" echo
# Generate binary choice, that is, "true" or "false" value. BINARY=2 T=1 number=$RANDOM let "number %= $BINARY" # Note that let "number >>= 14" gives a better random distribution #+ (right shifts out everything except last binary digit). if [ "$number" -eq $T ] then echo "TRUE" else echo "FALSE" fi echo
# Generate a toss of the dice. SPOTS=6 # Modulo 6 gives range 0 - 5. # Incrementing by 1 gives desired range of 1 - 6. # Thanks, Paulo Marcel Coelho Aragao, for the simplification. die1=0 die2=0 # Would it be better to just set SPOTS=7 and not add 1? Why or why not?
let "throw = $die1 + $die2" echo "Throw of the dice = $throw" echo
exit 0
# Pick a card, any card. Suites="Clubs Diamonds Hearts Spades" Denominations="2 3 4 5 6 7 8 9 10 Jack Queen King Ace" # Note variables spread over multiple lines.
suite=($Suites) denomination=($Denominations)
num_suites=${#suite[*]} # Count how many elements. num_denominations=${#denomination[*]} echo -n "${denomination[$((RANDOM%num_denominations))]} of " echo ${suite[$((RANDOM%num_suites))]}
Initialize_Slots () { # Zero out all elements of the array. for i in $( seq $NUMSLOTS ) do Slots[$i]=0 done echo } # Blank line at beginning of run.
# Row of slots: " |__|__|__|__|__|__|__|__|__|__|__|__|__|__|__|__|__|__|__|__|__|" " ||" # Note that if the count within any particular slot exceeds 99, #+ it messes up the display. # Running only(!) 500 passes usually avoids this.
Move ()                      # Move one unit right / left, or stay put.
{
  Move=$RANDOM               # How random is $RANDOM? Well, let's see ...
  let "Move %= RANGE"        # Normalize into range of 0 - 2.
  case "$Move" in
    0 ) ;;                   # Do nothing, i.e., stay in place.
    1 ) ((POS--));;          # Left.
    2 ) ((POS++));;          # Right.
    * ) echo -n "Error ";;   # Anomaly! (Should never occur.)
  esac
}
Play () { i=0 while [ "$i" -lt "$ROWS" ] do Move ((i++)); done SHIFT=11 let "POS += $SHIFT" (( Slots[$POS]++ )) # echo -n "$POS " }
# Why 11, and not 10? # Shift "zero position" to center. # DEBUG: echo $POS
Run () { # Outer loop. p=0 while [ "$p" -lt "$PASSES" ] do Play (( p++ )) POS=0 # Reset to zero. Why? done }
Jipe points out a set of techniques for generating random numbers within a range.
# Generate random number between 6 and 30. rnumber=$((RANDOM%25+6))
# Generate random number in the same 6 - 30 range, #+ but the number must be evenly divisible by 3. rnumber=$(((RANDOM%30/3+1)*3)) # # # Note that this will not work all the time. It fails if $RANDOM%30 returns 0. Frank Wang suggests the following alternative: rnumber=$(( RANDOM%27/3*3+6 ))
Bill Gradwohl came up with an improved formula that works for positive numbers.
rnumber=$(((RANDOM%(max-min+divisibleBy))/divisibleBy*divisibleBy+min))
Here Bill presents a versatile function that returns a random number between two specified values.
randomBetween() { # Generates a positive or negative random number #+ between $min and $max #+ and divisible by $divisibleBy. # Gives a "reasonably random" distribution of return values. # # Bill Gradwohl - Oct 1, 2003 syntax() { # Function echo echo echo echo -n echo echo echo echo -n echo echo echo echo
embedded within function. "Syntax: randomBetween [min] [max] [multiple]" "Expects up to 3 passed parameters, " "but all are completely optional." "min is the minimum value" "max is the maximum value" "multiple specifies that the answer must be " "a multiple of this value." " i.e. answer must be evenly divisible by this number." "If any value is missing, defaults area supplied as: 0 32767 1"
# Note that to get a proper distribution for the end points, #+ the range of random values has to be allowed to go between #+ 0 and abs(max-min)+divisibleBy, not just abs(max-min)+1. # The slight increase will produce the proper distribution for the #+ end points.
spread=$((max-min)) # Omair Eshkenazi points out that this test is unnecessary, #+ since max and min have already been switched around. [ ${spread} -lt 0 ] && spread=$((0-spread)) let spread+=divisibleBy randomBetweenAnswer=$(((RANDOM%spread)/divisibleBy*divisibleBy+min)) return 0 # #+ #+ # # # } # Let's test the function. min=-14 max=20 divisibleBy=3 However, Paulo Marcel Coelho Aragao points out that when $max and $min are not divisible by $divisibleBy, the formula fails. He suggests instead the following formula: rnumber = $(((RANDOM%(max-min+1)+min)/divisibleBy*divisibleBy))
# Generate an array of expected answers and check to make sure we get #+ at least one of each answer if we loop long enough. declare -a answer minimum=${min} maximum=${max} if [ $((minimum/divisibleBy*divisibleBy)) -ne ${minimum} ]; then if [ ${minimum} -lt 0 ]; then minimum=$((minimum/divisibleBy*divisibleBy)) else minimum=$((((minimum/divisibleBy)+1)*divisibleBy)) fi fi
# If max is itself not evenly divisible by $divisibleBy, #+ then fix the max to be within range. if [ $((maximum/divisibleBy*divisibleBy)) -ne ${maximum} ]; then if [ ${maximum} -lt 0 ]; then maximum=$((((maximum/divisibleBy)-1)*divisibleBy)) else maximum=$((maximum/divisibleBy*divisibleBy)) fi fi
# We need to generate only positive array subscripts, #+ so we need a displacement that will guarantee #+ positive results.
# Now loop a large number of times to see what we get. loopIt=1000 # The script author suggests 100000, #+ but that takes a good long while. for ((i=0; i<${loopIt}; ++i)); do # Note that we are specifying min and max in reversed order here to #+ make the function correct for this case. randomBetween ${max} ${min} ${divisibleBy} # Report an error if an answer is unexpected. [ ${randomBetweenAnswer} -lt ${min} -o ${randomBetweenAnswer} -gt ${max} ] \ && echo MIN or MAX error - ${randomBetweenAnswer}! [ $((randomBetweenAnswer%${divisibleBy})) -ne 0 ] \ && echo DIVISIBLE BY error - ${randomBetweenAnswer}! # Store the answer away statistically. answer[randomBetweenAnswer+disp]=$((answer[randomBetweenAnswer+disp]+1)) done
# Let's check the results for ((i=${minimum}; i<=${maximum}; i+=divisibleBy)); do [ ${answer[i+disp]} -eq 0 ] \ && echo "We never got an answer of $i." \ || echo "${i} occurred ${answer[i+disp]} times." done
exit 0
Just how random is $RANDOM? The best way to test this is to write a script that tracks the distribution of "random" numbers generated by $RANDOM. Let's roll a $RANDOM die a few times . . .
while [ "$throw" -lt "$MAXTHROWS" ] do let "die1 = RANDOM % $PIPS" update_count $die1 let "throw += 1" done print_result exit $? # # #+ # # #+ The scores should distribute evenly, assuming RANDOM is random. With $MAXTHROWS at 600, all should cluster around 100, plus-or-minus 20 or so. Keep in mind that RANDOM is a ***pseudorandom*** generator, and not a spectacularly good one at that.
# Randomness is a deep and complex subject. # Sufficiently long "random" sequences may exhibit #+ chaotic and other "non-random" behavior. # # # # Exercise (easy): --------------Rewrite this script to flip a coin 1000 times. Choices are "HEADS" and "TAILS."
As we have seen in the last example, it is best to reseed the RANDOM generator each time it is invoked. Using the same seed for RANDOM repeats the same series of numbers. [48] (This mirrors the behavior of the random() function in C.)
random_numbers () { count=0 while [ "$count" -lt "$MAXCOUNT" ] do number=$RANDOM echo -n "$number " let "count += 1" done } echo; echo RANDOM=1 random_numbers # Setting RANDOM seeds the random number generator.
echo; echo "Trying again with same random seed ..." RANDOM=1 random_numbers # Same seed for RANDOM . . . # . . . reproduces the exact same number series. # # When is it useful to duplicate a "random" series?
echo; echo RANDOM=2 random_numbers echo; echo # RANDOM=$$ seeds RANDOM from process id of script. # It is also possible to seed RANDOM from 'time' or 'date' commands. # Getting fancy... SEED=$(head -1 /dev/urandom | od -N 1 | awk '{ print $2 }') # Pseudo-random output fetched #+ from /dev/urandom (system pseudo-random device-file), #+ then converted to line of printable (octal) numbers by "od", #+ finally "awk" retrieves just one number for SEED. RANDOM=$SEED random_numbers echo; echo exit 0 # Trying again, but with a different seed . . . # . . . gives a different number series.
The /dev/urandom pseudo-device file provides a method of generating much more "random" pseudorandom numbers than the $RANDOM variable. dd if=/dev/urandom of=targetfile bs=1 count=XX creates a file of well-scattered pseudorandom numbers. However, assigning these numbers to a variable in a script requires a workaround, such as filtering through od (as in above example, Example 16-14, and Example A-36), or even piping to md5sum (see Example 36-14).
There are also other ways to generate pseudorandom numbers in a script. Awk provides a convenient means of doing this.
AWKSCRIPT=' { srand(); print rand() } '
#           Command(s) / parameters passed to awk.
#           Note that srand() seeds awk's random number generator.

echo -n "Random number between 0 and 1 = "
echo | awk "$AWKSCRIPT"
# What happens if you leave out the 'echo'?

exit 0
# Exercises: # --------# 1) Using a loop construct, print out 10 different random numbers. # (Hint: you must reseed the srand() function with a different seed #+ in each pass through the loop. What happens if you omit this?) # 2) Using an integer multiplier as a scaling factor, generate random numbers #+ in the range of 10 to 100. # 3) Same as exercise #2, above, but generate random integers this time.
The date command also lends itself to generating pseudorandom integer sequences.
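A minimal sketch, using the seconds and nanoseconds fields (the %N format requires GNU date):

#!/bin/bash
# Pseudorandom integers derived from the "date" command.

echo "Epoch seconds: $(date +%s)"    # Changes once per second.
echo "Nanoseconds:   $(date +%N)"    # Changes on every invocation (GNU date).

# Scale the nanosecond field down to a 0 - 99 range.
# The 10# prefix keeps a leading zero from being read as octal.
number=$(( 10#$(date +%N) % 100 ))
echo "Random number between 0 and 99 = $number"

exit 0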
len=${#line} if [[ "$len" -lt "$MINLEN" && "$line" =~ [*{\.}]$ ]] # if [[ "$len" -lt "$MINLEN" && "$line" =~ \[*\.\] ]] # An update to Bash broke the previous version of this script. Ouch! # Thank you, Halim Srama, for pointing this out and suggesting a fix. then echo # Add a blank line immediately fi #+ after a short line terminated by a period. done exit # Exercises: # --------# 1) The script usually inserts a blank line at the end
Length of Matching Substring at Beginning of String expr match "$string" '$substring' $substring is a regular expression. expr "$string" : '$substring' $substring is a regular expression.
stringZ=abcABC123ABCabc # |------| # 12345678 echo `expr match "$stringZ" 'abc[A-Z]*.2'` echo `expr "$stringZ" : 'abc[A-Z]*.2'` # 8 # 8
Index expr index $string $substring Numerical position in $string of first character in $substring that matches.
stringZ=abcABC123ABCabc
#       123456 ...
echo `expr index "$stringZ" C12`     # 6
                                     # C position.

echo `expr index "$stringZ" 1c`      # 3
# 'c' (in #3 position) matches before '1'.
This is the near equivalent of strchr() in C. Substring Extraction ${string:position} Extracts substring from $string at $position. If the $string parameter is "*" or "@", then this extracts the positional parameters, [49] starting at $position. ${string:position:length} Extracts $length characters of substring from $string at $position.
stringZ=abcABC123ABCabc # 0123456789..... # 0-based indexing. echo ${stringZ:0} echo ${stringZ:1} echo ${stringZ:7} echo ${stringZ:7:3} # abcABC123ABCabc # bcABC123ABCabc # 23ABCabc # 23A # Three characters of substring.
The position and length arguments can be "parameterized," that is, represented as a variable, rather than as a numerical constant.
str1=$( echo "$str0" | md5sum | md5sum ) # Doubly scramble ^^^^^^ ^^^^^^ #+ by piping and repiping to md5sum. randstring="${str1:$POS:$LEN}" # Can parameterize ^^^^ ^^^^ echo "$randstring" exit $? # bozo$ ./rand-string.sh my-password # 1bdd88c4 # No, this is not recommended #+ as a method of generating hack-proof passwords.
If the $string parameter is "*" or "@", then this extracts a maximum of $length positional parameters, starting at $position.
echo ${*:2} echo ${@:2} echo ${*:2:3} # Echoes second and following positional parameters. # Same as above. # Echoes three positional parameters, starting at second.
expr substr $string $position $length Extracts $length characters from $string starting at $position.
stringZ=abcABC123ABCabc # 123456789...... # 1-based indexing. echo `expr substr $stringZ 1 2` echo `expr substr $stringZ 4 3` # ab # ABC
expr match "$string" '\($substring\)' Extracts $substring at beginning of $string, where $substring is a regular expression. expr "$string" : '\($substring\)' Extracts $substring at beginning of $string, where $substring is a regular expression.
stringZ=abcABC123ABCabc # ======= echo `expr match "$stringZ" '\(.[b-c]*[A-Z]..[0-9]\)'` echo `expr "$stringZ" : '\(.[b-c]*[A-Z]..[0-9]\)'` echo `expr "$stringZ" : '\(.......\)'` # All of the above forms give an identical result. # abcABC1 # abcABC1 # abcABC1
expr match "$string" '.*\($substring\)' Extracts $substring at end of $string, where $substring is a regular expression. expr "$string" : '.*\($substring\)' Extracts $substring at end of $string, where $substring is a regular expression.
stringZ=abcABC123ABCabc # ====== echo `expr match "$stringZ" '.*\([A-C][A-C][A-C][a-c]*\)'` echo `expr "$stringZ" : '.*\(......\)'` # ABCabc # ABCabc
Substring Removal ${string#substring} Deletes shortest match of $substring from front of $string. ${string##substring} Deletes longest match of $substring from front of $string.
stringZ=abcABC123ABCabc # |----| shortest # |----------| longest echo ${stringZ#a*C} # 123ABCabc # Strip out shortest match between 'a' and 'C'. echo ${stringZ##a*C} # abc # Strip out longest match between 'a' and 'C'.
# You can parameterize the substrings. X='a*C' echo ${stringZ#$X} echo ${stringZ##$X} # 123ABCabc # abc # As above.
${string%substring} Deletes shortest match of $substring from back of $string. For example:
# Rename all filenames in $PWD with "TXT" suffix to a "txt" suffix. # For example, "file1.TXT" becomes "file1.txt" . . . SUFF=TXT suff=txt for i in $(ls *.$SUFF) do mv -f $i ${i%.$SUFF}.$suff # Leave unchanged everything *except* the shortest pattern match #+ starting from the right-hand-side of the variable $i . . . done ### This could be condensed into a "one-liner" if desired. # Thank you, Rory Winston.
${string%%substring} Deletes longest match of $substring from back of $string.
echo ${stringZ%b*c} # abcABC123ABCa # Strip out shortest match between 'b' and 'c', from back of $stringZ. echo ${stringZ%%b*c} # a # Strip out longest match between 'b' and 'c', from back of $stringZ.
# If directory name given as a script argument... # Otherwise use current working directory.
# Assumes all files in the target directory are MacPaint image files, #+ with a ".mac" filename suffix. for file in $directory/* do # Filename globbing.
OFILEPREF=${1%%ra} # Strip off the "ra" suffix. OFILESUFF=wav # Suffix for wav file. OUTFILE="$OFILEPREF""$OFILESUFF" E_NOARGS=85 if [ -z "$1" ] # Must specify a filename to convert. then echo "Usage: `basename $0` [filename]" exit $E_NOARGS fi
########################################################################## mplayer "$1" -ao pcm:file=$OUTFILE oggenc "$OUTFILE" # Correct file extension automatically added by oggenc. ########################################################################## rm "$OUTFILE" # Delete intermediate *.wav file. # If you want to keep it, comment out above line.
exit $? # # # #+ # #+ Note: ---On a Website, simply clicking on a *.ram streaming audio file usually only downloads the URL of the actual *.ra audio file. You can then use "wget" or something similar to download the *.ra file itself.
getopt_simple() { echo "getopt_simple()" echo "Parameters are '$*'" until [ -z "$1" ] do echo "Processing parameter of: '$1'" if [ ${1:0:1} = '/' ] then tmp=${1:1} # Strip off leading '/' . . . parameter=${tmp%%=*} # Extract name. value=${tmp##*=} # Extract value. echo "Parameter: '$parameter', value: '$value'" eval $parameter=$value fi shift done } # Pass all options to getopt_simple(). getopt_simple $* echo "test is '$test'" echo "test2 is '$test2'" exit 0 --sh getopt_example.sh /test=value1 /test2=value2 Parameters are '/test=value1 /test2=value2' Processing parameter of: '/test=value1' Parameter: 'test', value: 'value1' Processing parameter of: '/test2=value2' Parameter: 'test2', value: 'value2' test is 'value1' test2 is 'value2' # See also, UseGetOpt.sh, a modified version of this script.
Substring Replacement ${string/substring/replacement} Replace first match of $substring with $replacement. [50] ${string//substring/replacement} Replace all matches of $substring with $replacement.
stringZ=abcABC123ABCabc

echo ${stringZ/abc/xyz}       # xyzABC123ABCabc
                              # Replaces first match of 'abc' with 'xyz'.
echo ${stringZ//abc/xyz}      # xyzABC123ABCxyz
                              # Replaces all matches of 'abc' with 'xyz'.
# Can the match and replacement strings be parameterized?
match=abc
repl=000
echo ${stringZ/$match/$repl}    # 000ABC123ABCabc
#   Yes!       ^      ^
echo ${stringZ//$match/$repl}   # 000ABC123ABC000
#              ^      ^
echo
# What happens if no $replacement string is supplied? echo ${stringZ/abc} # ABC123ABCabc echo ${stringZ//abc} # ABC123ABC # A simple deletion takes place.
${string/#substring/replacement} If $substring matches front end of $string, substitute $replacement for $substring. ${string/%substring/replacement} If $substring matches back end of $string, substitute $replacement for $substring.
stringZ=abcABC123ABCabc

echo ${stringZ/#abc/XYZ}      # XYZABC123ABCabc
                              # Replaces front-end match of 'abc' with 'XYZ'.
echo ${stringZ/%abc/XYZ}      # abcABC123ABCXYZ
                              # Replaces back-end match of 'abc' with 'XYZ'.
May be used for concatenating variables with strings.
your_id=${USER}-on-${HOSTNAME} echo "$your_id" # echo "Old \$PATH = $PATH" PATH=${PATH}:/opt/bin # Add /opt/bin to $PATH for duration of script. echo "New \$PATH = $PATH"
echo ${username-`whoami`} # Echoes the result of `whoami`, if variable $username is still unset.
${parameter-default} and ${parameter:-default} are almost equivalent. The extra : makes a difference only when parameter has been declared, but is null.
#!/bin/bash # param-sub.sh # Whether a variable has been declared #+ affects triggering of the default option #+ even if the variable is null. username0= echo "username0 has been declared, but is set to null." echo "username0 = ${username0-`whoami`}" # Will not echo. echo echo username1 has not been declared. echo "username1 = ${username1-`whoami`}" # Will echo. username2= echo "username2 has been declared, but is set to null." echo "username2 = ${username2:-`whoami`}" # ^ # Will echo because of :- rather than just - in condition test. # Compare to first instance, above.
The default parameter construct finds use in providing "missing" command-line arguments in scripts.
DEFAULT_FILENAME=generic.data filename=${1:-$DEFAULT_FILENAME} # If not otherwise specified, the following command block operates #+ on the file "generic.data". # Begin-Command-Block # ... # ... # ... # End-Command-Block
# From "hanoi2.bash" example: DISKS=${1:-E_NOPARAM} # Must specify how many disks. # Set $DISKS to $1 command-line-parameter, #+ or to $E_NOPARAM if that is unset.
See also Example 3-4, Example 31-2, and Example A-6. Compare this method with using an and list to supply a default command-line argument. ${parameter=default}, ${parameter:=default} If parameter not set, set it to default. Both forms nearly equivalent. The : makes a difference only when $parameter has been declared and is null, [51] as above.
echo ${var=abc} # abc echo ${var=xyz} # abc # $var had already been set to abc, so it did not change.
${parameter+alt_value}, ${parameter:+alt_value} If parameter set, use alt_value, else use null string. Both forms nearly equivalent. The : makes a difference only when parameter has been declared and is null, see below.
echo "###### \${parameter+alt_value} ########" echo a=${param1+xyz} echo "a = $a" param2= a=${param2+xyz}
# a =
# a = xyz
echo echo "###### \${parameter:+alt_value} ########" echo a=${param4:+xyz} echo "a = $a"
# a =
param5= a=${param5:+xyz} echo "a = $a" # a = # Different result from param6=123 a=${param6:+xyz} echo "a = $a"
a=${param5+xyz}
# a = xyz
${parameter?err_msg}, ${parameter:?err_msg} If parameter set, use it, else print err_msg and abort the script with an exit status of 1. Both forms nearly equivalent. The : makes a difference only when parameter has been declared and is null, as above.
: ${HOSTNAME?} ${USER?} ${HOME?} ${MAIL?} echo echo "Name of the machine is $HOSTNAME." echo "You are $USER." echo "Your home directory is $HOME." echo "Your mail INBOX is located in $MAIL." echo echo "If you are reading this message," echo "critical environmental variables have been set." echo echo # -----------------------------------------------------# The ${variablename?} construction can also check #+ for variables set within the script. ThisVariable=Value-of-ThisVariable # Note, by the way, that string variables may be set #+ to characters disallowed in their names. : ${ThisVariable?} echo "Value of ThisVariable is $ThisVariable".
: ${ZZXy23AB?"ZZXy23AB has not been set."} # Since ZZXy23AB has not been set, #+ then the script terminates with an error message. # You can specify the error message. # : ${variablename?"ERROR MESSAGE"}
# Compare these methods of checking whether a variable has been set #+ with "set -u" . . .
echo "You will not see this message, because script already terminated." HERE=0 exit $HERE
# Check the exit status, both with and without command-line parameter. # If command-line parameter present, then "$?" is 0. # If not, then "$?" is 1.
Parameter substitution and/or expansion. The following expressions are the complement to the match in expr string operations (see Example 16-9). These particular ones are used mostly in parsing file path names. Variable length / Substring removal ${#var} String length (number of characters in $var). For an array, ${#array} is the length of the first element in the array. Exceptions:
${#*} and ${#@} give the number of positional parameters. For an array, ${#array[*]} and ${#array[@]} give the number of elements in the array. Example 10-9. Length of a variable
#!/bin/bash # length.sh E_NO_ARGS=65 if [ $# -eq 0 ] # Must have command-line args to demo script. then echo "Please invoke this script with one or more command-line arguments." exit $E_NO_ARGS fi var01=abcdEFGH28ij echo "var01 = ${var01}" echo "Length of var01 = ${#var01}" # Now, let's try embedding a space. var02="abcd EFGH28ij" echo "var02 = ${var02}" echo "Length of var02 = ${#var02}" echo "Number of command-line arguments passed to script = ${#@}" echo "Number of command-line arguments passed to script = ${#*}" exit 0
${var#Pattern}, ${var##Pattern} ${var#Pattern} Remove from $var the shortest part of $Pattern that matches the front end of $var.
${var##Pattern} Remove from $var the longest part of $Pattern that matches the front end of $var. A usage illustration from Example A-7:
# Function from "days-between.sh" example. # Strips leading zero(s) from argument passed. strip_leading_zero () # Strip possible leading zero(s) { #+ from argument passed. return=${1#0} # The "1" refers to "$1" -- passed arg. } # The "0" is what to remove from "$1" -- strips zeros.
${var%Pattern}, ${var%%Pattern} ${var%Pattern} Remove from $var the shortest part of $Pattern that matches the back end of $var.
${var%%Pattern} Remove from $var the longest part of $Pattern that matches the back end of $var. Version 2 of Bash added additional options.
var1=abcd12345abc6789 pattern1=a*c # * (wild card) matches everything between a - c. echo echo "var1 = $var1" echo "var1 = ${var1}"
# abcd12345abc6789 # abcd12345abc6789 # (alternate form) echo "Number of characters in ${var1} = ${#var1}" echo echo "pattern1 = $pattern1" # a*c (everything between 'a' and 'c') echo "--------------" echo '${var1#$pattern1} =' "${var1#$pattern1}" # d12345abc6789 # Shortest possible match, strips out first 3 characters abcd12345abc6789 # ^^^^^ |-| echo '${var1##$pattern1} =' "${var1##$pattern1}" # 6789 # Longest possible match, strips out first 12 characters abcd12345abc6789 # ^^^^^ |----------| echo; echo; echo pattern2=b*9 # everything between 'b' and '9' echo "var1 = $var1" # Still abcd12345abc6789 echo echo "pattern2 = $pattern2" echo "--------------" echo '${var1%pattern2} =' "${var1%$pattern2}" # abcd12345a
# Remember, # and ## work from the left end (beginning) of string, # % and %% work from the right end. echo exit 0
E_BADARGS=65 case $# in 0|1) # The vertical bar means "or" in this context. echo "Usage: `basename $0` old_file_suffix new_file_suffix" exit $E_BADARGS # If 0 or 1 arg, then bail out. ;; esac
for filename in *.$1 # Traverse list of files ending with 1st argument. do mv $filename ${filename%$1}$2 # Strip off part of filename matching 1st argument, #+ then append 2nd argument. done exit 0
Variable expansion / Substring replacement These constructs have been adopted from ksh. ${var:pos} Variable var expanded, starting from offset pos. ${var:pos:len} Expansion to a max of len characters of variable var, from offset pos. See Example A-13 for an example of the creative use of this operator. ${var/Pattern/Replacement} First match of Pattern, within var replaced with Replacement. If Replacement is omitted, then the first match of Pattern is replaced by nothing, that is, deleted. ${var//Pattern/Replacement} Global replacement. All matches of Pattern, within var replaced with Replacement. As above, if Replacement is omitted, then all occurrences of Pattern are replaced by nothing, that is, deleted.
var1 = $t"
t=${var1%*-*} echo "var1 (with everything from the last - on stripped out) = $t" echo # ------------------------------------------path_name=/home/bozo/ideas/thoughts.for.today # ------------------------------------------echo "path_name = $path_name" t=${path_name##/*/} echo "path_name, stripped of prefixes = $t" # Same effect as t=`basename $path_name` in this particular case. # t=${path_name%/}; t=${t##*/} is a more general solution, #+ but still fails sometimes. # If $path_name ends with a newline, then `basename $path_name` will not work, #+ but the above expression will. # (Thanks, S.C.) t=${path_name%/*.*} # Same effect as t=`dirname $path_name` echo "path_name, stripped of suffixes = $t" # These will fail in some cases, such as "../", "/foo////", # "foo/", "/". # Removing suffixes, especially when the basename has no suffix, #+ but the dirname does, also complicates matters. # (Thanks, S.C.) echo t=${path_name:11} echo "$path_name, with first 11 chars stripped off = $t" t=${path_name:11:5} echo "$path_name, with first 11 chars stripped off, length 5 = $t" echo t=${path_name/bozo/clown}
\"today\" deleted = $t" all o's capitalized = $t" all o's deleted = $t"
${var/#Pattern/Replacement} If prefix of var matches Pattern, then substitute Replacement for Pattern. ${var/%Pattern/Replacement} If suffix of var matches Pattern, then substitute Replacement for Pattern.
# Match at prefix (beginning) of string. v1=${v0/#abc/ABCDEF} # abc1234zip1234abc # |-| echo "v1 = $v1" # ABCDEF1234zip1234abc # |----| # Match at suffix (end) of string. v2=${v0/%abc/ABCDEF} # abc1234zip123abc # |-| echo "v2 = $v2" # abc1234zip1234ABCDEF # |----| echo # ---------------------------------------------------# Must match at beginning / end of string, #+ otherwise no replacement results. # ---------------------------------------------------v3=${v0/#123/000} # Matches, but not at beginning. echo "v3 = $v3" # abc1234zip1234abc # NO REPLACEMENT. v4=${v0/%123/000} # Matches, but not at end. echo "v4 = $v4" # abc1234zip1234abc # NO REPLACEMENT. exit 0
${!varprefix*}, ${!varprefix@} Matches names of all previously declared variables beginning with varprefix.
# This is a variation on indirect reference, but with a * or @.
# Bash, version 2.04, adds this feature.

xyz23=whatever
xyz24=

a=${!xyz*}        # Expands to the *names* of declared variables beginning with "xyz".
echo "a = $a"     # a = xyz23 xyz24
11.1. Loops
A loop is a block of code that iterates [52] a list of commands as long as the loop control condition is true. for loops for arg in [list] This is the basic looping construct. It differs significantly from its C counterpart.
for arg in [list] do command(s)... done During each pass through the loop, arg takes on the value of each successive variable in the list.
for arg in "$var1" # In pass 1 of the # In pass 2 of the # In pass 3 of the # ... # In pass N of the "$var2" "$var3" ... "$varN" loop, arg = $var1 loop, arg = $var2 loop, arg = $var3 loop, arg = $varN
If do is on same line as for, there needs to be a semicolon after list. for arg in [list] ; do
Each [list] element may contain multiple parameters. This is useful when processing parameters in groups. In such cases, use the set command (see Example 15-16) to force parsing of each [list] element and assignment of each component to the positional parameters.
Example 11-2. for loop with two parameters in each [list] element
#!/bin/bash
# Planets revisited.
# Associate the name of each planet with its distance from the sun.

for planet in "Mercury 36" "Venus 67" "Earth 93" "Mars 142" "Jupiter 483"
do
  set -- $planet   #  Parses variable "planet"
                   #+ and sets positional parameters.
  #  The "--" prevents nasty surprises if $planet is null or
  #+ begins with a dash.

  #  May need to save original positional parameters,
  #+ since they get overwritten.
  #  One way of doing this is to use an array,
  #  original_params=("$@")

  echo "$1		$2,000,000 miles from the sun"
  #-------two tabs---concatenate zeroes onto parameter $2
done
If the [list] in a for loop contains wild cards (* and ?) used in filename expansion, then globbing takes place.
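A minimal sketch (the *.txt pattern is only an example):

#!/bin/bash
# Globbing in the [list] of a for loop.

for file in *.txt          # Expands to all matching filenames in $PWD.
do
  if [ -f "$file" ]        # Guards against a literal "*.txt" when nothing matches.
  then
    echo "Found: $file"
  fi
done

exit 0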
Omitting the in [list] part of a for loop causes the loop to operate on $@ -- the positional parameters. A particularly clever illustration of this is Example A-15. See also Example 15-17.
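A minimal sketch: with in [list] omitted, the loop implicitly walks through "$@".

#!/bin/bash
# Invoke with several arguments, e.g.:  ./scriptname alpha beta gamma

index=1
for arg                    # Equivalent to:  for arg in "$@"
do
  echo "Argument #$index = $arg"
  let "index+=1"
done

exit 0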
It is possible to use command substitution to generate the [list] in a for loop. See also Example 16-54, Example 11-10 and Example 16-48.
Example 11-6. Generating the [list] in a for loop with command substitution
#!/bin/bash # for-loopcmd.sh: for-loop with [list] #+ generated by command substitution. NUMBERS="9 7 3 8 37.53" for number in `echo $NUMBERS` do echo -n "$number " done echo exit 0 # for number in 9 7 3 8 37.53
Here is a somewhat more complex example of using command substitution to create the [list].
IFS=$'\012'
# Per suggestion of Anton Filippov. # was: IFS="\n" for word in $( strings "$2" | grep "$1" ) # The "strings" command lists strings in binary files. # Output then piped to "grep", which tests for desired string. do echo $word done # As S.C. points out, lines 23 - 30 could be replaced with the simpler # strings "$2" | grep "$1" | tr -s "$IFS" '[\n*]'
# Try something like "./bin-grep.sh mem /bin/ls" #+ to exercise this script. exit 0
# # # # #
exit $?
for file in $( find $directory -type f -name '*' | sort ) do strings -f $file | grep "$fstring" | sed -e "s%$directory%%" # In the "sed" expression, #+ it is necessary to substitute for the normal "/" delimiter #+ because "/" happens to be one of the characters filtered out. # Failure to do so gives an error message. (Try it.) done exit $? # # # #+ Exercise (easy): --------------Convert this script to take command-line parameters for $directory and $fstring.
A final example of [list] / command substitution, but this time the "command" is a function.
generate_list () { echo "one two three" } for word in $(generate_list) do echo "$word" done # one # two # three # Let "word" grab output of function.
exit 0
# -------------------------------------------------------# Jean Helou proposes the following alternative: echo "symbolic links in directory \"$directory\"" # Backup of the current IFS. One can never be too cautious. OLDIFS=$IFS IFS=: for file in $(find $directory -type l -printf "%p$IFS") do # ^^^^^^^^^^^^^^^^ echo "$file" done|sort # And, James "Mike" Conley suggests modifying Helou's code thusly: OLDIFS=$IFS IFS='' # Null IFS means no word breaks for file in $( find $directory -type l ) do echo $file done | sort # This works in the "pathological" case of a directory name having #+ an embedded colon. # "This also fixes the pathological case of the directory name having #+ a colon (or space in earlier example) as well."
The stdout of a loop may be redirected to a file, as this slight modification to the previous example shows. Example 11-11. Symbolic links in a directory, saved to a file
#!/bin/bash # symlinks.sh: Lists symbolic links in a directory. OUTFILE=symlinks.list directory=${1-`pwd`} # Defaults to current working directory, #+ if not otherwise specified. # save-file
echo "symbolic links in directory \"$directory\"" > "$OUTFILE" echo "---------------------------" >> "$OUTFILE" for file in "$( find $directory -type l )" do echo "$file" done | sort >> "$OUTFILE" # ^^^^^^^^^^^^^ # echo "Output file = $OUTFILE" exit $? # -type l = symbolic links
There is an alternative syntax to a for loop that will look very familiar to C programmers. This requires double parentheses.
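A minimal sketch of the double-parentheses form:

#!/bin/bash
# C-style for loop.

LIMIT=10

for ((a=1; a <= LIMIT; a++))   # No $ prefix needed inside (( )).
do
  echo -n "$a "
done                           # 1 2 3 4 5 6 7 8 9 10

echo
exit 0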
See also Example 27-16, Example 27-17, and Example A-6. --Now, a for loop used in a "real-life" context.
-ne $EXPECTED_ARGS ] for proper number of command-line args. "Usage: `basename $0` phone# text-file" $E_BADARGS
if [ ! -f "$2" ] then echo "File $2 is not a text file." # File is not a regular file, or does not exist. exit $E_BADARGS
# Concatenate the converted files. # Uses wild card (filename "globbing") #+ in variable list.
do fil="$fil $file" done efax -d "$MODEM_PORT" -t "T$1" $fil # Finally, do the work. # Trying adding -o1 if above line fails.
# As S.C. points out, the for-loop can be eliminated with # efax -d /dev/ttyS2 -o1 -t "T$1" $2.0* #+ but it's not quite as instructive [grin]. exit $? # Also, efax sends diagnostic messages to stdout.
while This construct tests for a condition at the top of a loop, and keeps looping as long as that condition is true (returns a 0 exit status). In contrast to a for loop, a while loop finds use in situations where the number of loop repetitions is not known beforehand. while [ condition ] do command(s)... done The bracket construct in a while loop is nothing more than our old friend, the test brackets used in an if/then test. In fact, a while loop can legally use the more versatile double-brackets construct (while [[ condition ]]).
As is the case with for loops, placing the do on the same line as the condition test requires a semicolon. while [ condition ] ; do Note that the test brackets are not mandatory in a while loop. See, for example, the getopts construct.
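A minimal, bracketless sketch built around getopts (the option letters -v and -f are arbitrary):

#!/bin/bash
# Parse a -v (verbose) flag and a -f <file> option; no test brackets needed.

while getopts "vf:" option
do
  case $option in
    v ) echo "Verbose mode on.";;
    f ) echo "Input file: $OPTARG";;
    * ) echo "Usage: `basename $0` [-v] [-f filename]"; exit 1;;
  esac
done

exit 0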
var0=`expr $var0 + 1`
A while loop may have multiple conditions. Only the final condition determines when the loop terminates. This necessitates a slightly different loop syntax, however.
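A minimal sketch, assuming nothing beyond the syntax described here: only the final test, on $var1, decides whether the loop continues.

#!/bin/bash
# while loop with multiple conditions.

var1=unset
previous=$var1

while echo "previous variable = $previous"   # First "condition" -- always true.
      previous=$var1                         # Second "condition" -- always true.
      [ "$var1" != end ]                     # Only this last test ends the loop.
do
  echo "Input variable #1 (end to exit):"
  read var1
done

exit 0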
As with a for loop, a while loop may employ C-style syntax by using the double-parentheses construct.
while [ "$a" -le $LIMIT ] do echo -n "$a " let "a+=1" done # No surprises, so far. echo; echo # +=================================================================+ # Now, we'll repeat with C-like syntax. ((a = 1)) # a=1 # Double parentheses permit space when setting a variable, as in C. while (( a <= LIMIT )) # Double parentheses, do #+ and no "$" preceding variables. echo -n "$a " ((a += 1)) # let "a+=1" # Yes, indeed. # Double parentheses permit incrementing a variable with C-like syntax. done echo # C and Java programmers can feel right at home in Bash. exit 0
Similar to the if-test construct, a while loop can omit the test brackets.
while condition do command(s) ... done
By coupling the power of the read command with a while loop, we get the handy while read construct, useful for reading and parsing files.
cat $filename | while read line do ... done # Supply input from a file. # As long as there is another line to read ...
# =========== Snippet from "sd.sh" example script ========== # while read value # Read one data point at a time. do rt=$(echo "scale=$SC; $rt + $value" | bc) (( ct++ )) done am=$(echo "scale=$SC; $rt / $ct" | bc) echo $am; return $ct # This function "returns" TWO values! # Caution: This little trick will not work if $ct > 255! # To handle a larger number of data points, #+ simply comment out the "return $ct" above. } <"$datafile" # Feed in data file.
A while loop may have its stdin redirected to a file by a < at its end. A while loop may have its stdin supplied by a pipe. until This construct tests for a condition at the top of a loop, and keeps looping as long as that condition is false (opposite of while loop). until [ condition-is-true ] do command(s)... done Note that an until loop tests for the terminating condition at the top of the loop, differing from a similar construct in some programming languages. As is the case with for loops, placing the do on the same line as the condition test requires a semicolon. until [ condition-is-true ] ; do
# As with "for" and "while" loops, #+ an "until" loop permits C-like test constructs. LIMIT=10 var=0 until (( var > LIMIT )) do # ^^ ^ ^ ^^ No brackets, no $ prefixing variables. echo -n "$var " (( var++ )) done # 0 1 2 3 4 5 6 7 8 9 10
exit 0
How to choose between a for loop or a while loop or until loop? In C, you would typically use a for loop when the number of loop iterations is known beforehand. With Bash, however, the situation is fuzzier. The Bash for loop is more loosely structured and more flexible than its equivalent in other languages. Therefore, feel free to use whatever type of loop gets the job done in the simplest way.
# Beginning of outer loop. for a in 1 2 3 4 5 do echo "Pass $outer in outer loop." echo "---------------------" inner=1 # Reset inner loop counter. # =============================================== # Beginning of inner loop. for b in 1 2 3 4 5 do echo "Pass $inner in inner loop." let "inner+=1" # Increment inner loop counter. done # End of inner loop. # =============================================== let "outer+=1" # Increment outer loop counter. echo # Space between output blocks in pass of outer loop. done # End of outer loop. exit 0
See Example 27-11 for an illustration of nested while loops, and Example 27-13 to see a while loop nested inside an until loop.
# Exercise: # Why does the loop print up to 20? echo; echo echo Printing Numbers 1 through 20, but something happens after 2. ################################################################## # Same loop, but substituting 'break' for 'continue'. a=0 while [ "$a" -le "$LIMIT" ] do a=$(($a+1)) if [ "$a" -gt 2 ] then break # Skip entire rest of loop. fi echo -n "$a " done echo; echo; echo exit 0
The break command may optionally take a parameter. A plain break terminates only the innermost loop in which it is embedded, but a break N breaks out of N levels of loop.
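A minimal sketch of break N escaping two levels of nesting at once:

#!/bin/bash
# break 2: bail out of both the inner and the outer loop.

for outer in 1 2 3 4 5
do
  echo -n "Group $outer: "
  for inner in 1 2 3 4 5
  do
    if [ "$inner" -eq 3 ]
    then
      break 2        # A plain "break" would end only the inner loop.
    fi
    echo -n "$inner "
  done
done                 # Output:  Group 1: 1 2

echo
exit 0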
"
# --------------------------------------------------------
The continue command, similar to break, optionally takes a parameter. A plain continue cuts short the current iteration within its loop and begins the next. A continue N terminates all remaining iterations at its loop level and continues with the next iteration at the loop, N levels above.
# -------------------------------------------------------------------for inner in 1 2 3 4 5 6 7 8 9 10 # inner loop do if [[ "$inner" -eq 7 && "$outer" = "III" ]] then continue 2 # Continue at loop on 2nd level, that is "outer loop". # Replace above line with a simple "continue" # to see normal loop behavior. fi echo -n "$inner " # 7 8 9 10 will not echo on "Group III." done # -------------------------------------------------------------------done echo; echo # Exercise: # Come up with a meaningful use for "continue N" in a script. exit 0
while true do for n in .iso.* do [ "$n" = ".iso.opts" ] && continue beta=${n#.iso.} [ -r .Iso.$beta ] && continue [ -r .lock.$beta ] && sleep 10 && continue lockfile -r0 .lock.$beta || continue echo -n "$beta: " `date` run-isotherm $beta date ls -alF .Iso.$beta [ -r .Iso.$beta ] && rm -f .lock.$beta continue 2 done break done exit 0 # The details, in particular the sleep N, are particular to my #+ application, but the general pattern is: while true do for job in {pattern} do {job already done or running} && continue {mark job as running, do job, mark job as done} continue 2 done break # Or something like `sleep 600' to avoid termination. done # #+ #+ #+ #+ #+ #+ #+ #+ #+ #+ This way the script will stop only when there are no more jobs to do (including jobs that were added during runtime). Through the use of appropriate lockfiles it can be run on several machines concurrently without duplication of calculations [which run a couple of hours in my case, so I really want to avoid this]. Also, as search always starts again from the beginning, one can encode priorities in the file names. Of course, one could also do this without `continue 2', but then one would have to actually check whether or not some job was done (so that we should immediately look for the next job) or not (in which case we terminate or sleep for a long time before checking for a new job).
The continue N construct is difficult to understand and tricky to use in any meaningful context. It is probably best avoided.
esac
Quoting the variables is not mandatory, since word splitting does not take place. Each test line ends with a right paren ). [54] Each condition block ends with a double semicolon ;;. If a condition tests true, then the associated commands execute and the case block terminates. The entire case block ends with an esac (case spelled backwards). Example 11-24. Using case
#!/bin/bash
# Testing ranges of characters.

echo; echo "Hit a key, then hit return."
read Keypress

case "$Keypress" in
  [[:lower:]]   ) echo "Lowercase letter";;
  [[:upper:]]   ) echo "Uppercase letter";;
  [0-9]         ) echo "Digit";;
  *             ) echo "Punctuation, whitespace, or other";;
esac      #  Allows ranges of characters in [square brackets],
          #+ or POSIX ranges in [[double square brackets]].

#  In the first version of this example,
#+ the tests for lowercase and uppercase characters were
exit 0
read person case "$person" in # Note variable is quoted. "E" | "e" ) # Accept upper or lowercase input. echo echo "Roland Evans" echo "4321 Flash Dr." echo "Hardscrabble, CO 80753" echo "(303) 734-9874" echo "(303) 734-9892 fax" echo "[email protected]" echo "Business partner & old friend" ;; # Note double semicolon to terminate each option. "J" | "j" ) echo echo "Mildred Jones" echo "249 E. 7th St., Apt. 19" echo "New York, NY 10009" echo "(212) 533-2814" echo "(212) 533-9972 fax" echo "[email protected]" echo "Ex-girlfriend" echo "Birthday: Feb. 11" ;;
exit 0
# Otherwise, $1.
while [ $# -gt 0 ]; do # Until you run out of parameters . . . case "$1" in -d|--debug) # "-d" or "--debug" parameter? DEBUG=1 ;; -c|--conf) CONFFILE="$2" shift if [ ! -f $CONFFILE ]; then echo "Error: Supplied file doesn't exist!" exit $E_CONFFILE # File not found error. fi ;; esac
# From Stefano Falsetto's "Log2Rot" script, #+ part of his "rottlog" package. # Used with permission.
exit 0
isalpha () # Tests whether *first character* of input string is alphabetic. { if [ -z "$1" ] # No argument passed? then return $FAILURE fi case "$1" in [a-zA-Z]*) return $SUCCESS;; # Begins with a letter? * ) return $FAILURE;; esac } # Compare this with "isalpha ()" function in C.
isalpha2 () # Tests whether *entire string* is alphabetic. { [ $# -eq 1 ] || return $FAILURE case $1 in *[!a-zA-Z]*|"") return $FAILURE;; *) return $SUCCESS;; esac } isdigit () # Tests whether *entire string* is numerical. { # In other words, tests for integer variable. [ $# -eq 1 ] || return $FAILURE case $1 in *[!0-9]*|"") return $FAILURE;; *) return $SUCCESS;; esac }
check_var () # Front-end to isalpha (). { if isalpha "$@" then echo "\"$*\" begins with an alpha character." if isalpha2 "$@" then # No point in testing if first char is non-alpha. echo "\"$*\" contains only alpha characters."
# Command substitution.
check_var $a check_var $b check_var $c check_var $d check_var $e check_var $f check_var # No argument passed, so what happens? # digit_check $g digit_check $h digit_check $i
exit 0
# Exercise: # -------# Write an 'isfloat ()' function that tests for floating point numbers. # Hint: The function duplicates 'isdigit ()', #+ but adds a test for a mandatory decimal point.
select
The select construct, adopted from the Korn Shell, is yet another tool for building menus.

select variable [in list]
do
  command...
  break
done

This prompts the user to enter one of the choices presented in the variable list. Note that select uses the $PS3 prompt (#? ) by default, but this may be changed.
If in list is omitted, then select uses the list of command line arguments ($@) passed to the script or the function containing the select construct. Compare this to the behavior of a for variable [in list] construct with the in list omitted. Example 11-30. Creating menus using select in a function
#!/bin/bash PS3='Choose your favorite vegetable: ' echo choice_of() { select vegetable # [in list] omitted, so 'select' uses arguments passed to function. do
The output of commands can be used as arguments to another command, to set a variable, and even for generating the argument list in a for loop.
rm `cat filename`   # "filename" contains a list of files to delete.
#
# S. C. points out that "arg list too long" error might result.
# Better is              xargs rm -- < filename
# ( -- covers those cases where "filename" begins with a "-" )

textfile_listing=`ls *.txt`
# Variable contains names of all *.txt files in current working directory.
echo $textfile_listing

textfile_listing2=$(ls *.txt)   # The alternative form of command substitution.
echo $textfile_listing2
# Same result.
A possible problem with putting a list of files into a single string is that a newline may creep in. A safer way to assign a list of files to a parameter is with an array. shopt -s nullglob # If no match, filename expands to nothing. textfile_listing=( *.txt ) Thanks, S.C.
Command substitution invokes a subshell. Command substitution may result in word splitting.
COMMAND `echo a b`      # 2 args: a and b
COMMAND "`echo a b`"    # 1 arg: "a b"

COMMAND `echo`          # no arg
COMMAND "`echo`"        # one empty arg
# Thanks, S.C.
Even when there is no word splitting, command substitution can remove trailing newlines.
# cd "`pwd`" # However... # This should always work.
# Disable "canonical" mode for terminal. # Also, disable *local* echo. key=$(dd bs=1 count=1 2> /dev/null) # Using 'dd' to get a keypress. stty "$old_tty_setting" # Restore old setting. echo "You hit ${#key} key." # ${#variable} = number of characters in $variable # # Hit any key except RETURN, and the output is "You hit 1 key." # Hit RETURN, and it's "You hit 0 key." # The newline gets eaten in the command substitution. #Code snippet by Stphane Chazelas.
Using echo to output an unquoted variable set with command substitution removes trailing newline characters from the output of the reassigned command(s). This can cause unpleasant surprises.
dir_listing=`ls -l`
echo $dir_listing     # unquoted
# Expecting a nicely ordered directory listing.

# However, what you get is:
# total 3 -rw-rw-r-- 1 bozo bozo 30 May 13 17:15 1.txt -rw-rw-r-- 1 bozo
# bozo 51 May 15 20:57 t2.sh -rwxr-xr-x 1 bozo bozo 217 Mar 5 21:13 wi.sh

# The newlines disappeared.
echo "$dir_listing"   # quoted
# -rw-rw-r--    1 bozo       30 May 13 17:15 1.txt
# -rw-rw-r--    1 bozo       51 May 15 20:57 t2.sh
# -rwxr-xr-x    1 bozo      217 Mar  5 21:13 wi.sh
Command substitution even permits setting a variable to the contents of a file, using either redirection or the cat command.
variable1=`<file1`      #  Set "variable1" to contents of "file1".
variable2=`cat file2`   #  Set "variable2" to contents of "file2".
                        #  This, however, forks a new process,
                        #+ so the line of code executes slower than the above version.
# Note that the variables may contain embedded whitespace, #+ or even (horrors), control characters.
if [ -f /fsckoptions ]; then fsckoptions=`cat /fsckoptions` ... fi # # if [ -e "/proc/ide/${disk[$device]}/media" ] ; then hdmedia=`cat /proc/ide/${disk[$device]}/media` ... fi # # if [ ! -n "`uname -r | grep -- "-"`" ]; then ktag="`cat /proc/version`" ... fi # # if [ $usb = "1" ]; then sleep 5 mouseoutput=`cat /proc/bus/usb/devices 2>/dev/null|grep -E "^I.*Cls=03.*Prot=02"` kbdoutput=`cat /proc/bus/usb/devices 2>/dev/null|grep -E "^I.*Cls=03.*Prot=01"` ... fi
Do not set a variable to the contents of a long text file unless you have a very good reason for doing so. Do not set a variable to the contents of a binary file, even as a joke.
dangerous_variable=`cat /boot/vmlinuz`
echo "string-length of \$dangerous_variable = ${#dangerous_variable}" # string-length of $dangerous_variable = 794151 # (Newer kernels are bigger.) # Does not give same count as 'wc -c /boot/vmlinuz'. # echo "$dangerous_variable" # Don't try this! It would hang the script.
# The document author is aware of no useful applications for #+ setting a variable to the contents of a binary file.
Notice that a buffer overrun does not occur. This is one instance where an interpreted language, such as Bash, provides more protection from programmer mistakes than a compiled language. Command substitution permits setting a variable to the output of a loop. The key to this is grabbing the output of an echo command within the loop.
i=0 variable2=`while [ "$i" -lt 10 ] do echo -n "$i" # Again, the necessary 'echo'. let "i += 1" # Increment. done` echo "variable2 = $variable2" # variable2 = 0123456789
# Demonstrates that it's possible to embed a loop #+ within a variable declaration. exit 0
Command substitution makes it possible to extend the toolset available to Bash. It is simply a matter of writing a program or script that outputs to stdout (like a well-behaved UNIX tool should) and assigning that output to a variable.
/* "Hello, world." C program */

#include <stdio.h>

int main()
{
  printf( "Hello, world.\n" );
  return (0);
}

bash$ gcc -o hello hello.c
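Assuming the hello binary compiled above sits in the current working directory, its output can then be captured like that of any other tool (a minimal, hypothetical follow-up):

#!/bin/bash
# Using the user-compiled "hello" program via command substitution.
# Assumes ./hello was built as shown above.

greeting=$(./hello)
echo "The C program said: $greeting"
# The C program said: Hello, world.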
# Setting a variable to the contents of a text file.
File_contents1=$(cat $file1)
File_contents2=$(<$file2)      # Bash permits this also.
The $(...) form of command substitution treats a double backslash in a different way than `...`.
bash$ echo `echo \\`
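For comparison, the $(...) form (a short sketch of the expected output; worth verifying on your own Bash version, since backslash handling inside backquotes is a classic portability wrinkle):

bash$ echo $(echo \\)
\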
E_NOARGS=86 E_BADARG=87 MINLEN=7 if [ -z "$1" ] then echo "Usage $0 LETTERSET" exit $E_NOARGS # Script needs a command-line argument. elif [ ${#1} -lt $MINLEN ] then echo "Argument must have at least $MINLEN letters." exit $E_BADARG fi
FILTER='.......' # Must have at least 7 letters. # 1234567 Anagrams=( $(echo $(anagram $1 | grep $FILTER) ) )
7+ letter anagrams found" # First anagram. # Second anagram. # Etc. # To list all the anagrams in a single line . . .
# echo "${Anagrams[*]}"
# Look ahead to the Arrays chapter for enlightenment on #+ what's going on here. # See also the agram.sh script for an exercise in anagram finding. exit $?
Examples of command substitution in shell scripts: 1. Example 11-7 2. Example 11-26 3. Example 9-16 4. Example 16-3 5. Example 16-22 6. Example 16-17 7. Example 16-54 8. Example 11-13 9. Example 11-10 10. Example 16-32 11. Example 20-8 12. Example A-16 13. Example 29-3 14. Example 16-47 15. Example 16-48 16. Example 16-49
Arithmetic expansion with double parentheses, and using let The use of backticks (backquotes) in arithmetic expansion has been superseded by double parentheses -- ((...)) and $((...)) -- and also by the very convenient let construction.
z=$(($z+3))
z=$((z+3))
# You may also use operations within double parentheses without assignment.

n=0
echo "n = $n"     # n = 0

(( n += 1 ))      # Increment.
#  (( $n += 1 )) is incorrect!
echo "n = $n"     # n = 1
# Quotes permit the use of spaces in variable assignment. # The 'let' operator actually performs arithmetic evaluation, #+ rather than expansion.
Examples of arithmetic expansion in scripts: 1. Example 16-9 2. Example 11-14 3. Example 27-1 4. Example 27-11 5. Example A-16
Fellow Linux user, greetings! You are reading something which will bring you luck and good fortune. Just e-mail a copy of this document to 10 of your friends. Before making the copies, send a 100-line Bash script to the first person on the list at the bottom of this letter. Then delete their name and add yours to the bottom of the list. Don't break the chain! Make the copies within 48 hours. Wilfred P. of Brooklyn failed to send out his ten copies and woke the next morning to find his job description changed to "COBOL programmer." Howard L. of Newport News sent out his ten copies and within a month had enough hardware to build a 100-node Beowulf cluster dedicated to playing Tuxracer. Amelia V. of Chicago laughed at this letter and broke the chain. Shortly thereafter, a fire broke out in her terminal and she now spends her days writing documentation for MS Windows. Don't break the chain! Send out your ten copies today!
Courtesy 'NIX "fortune cookies", with some alterations and many apologies
Part 4. Commands
Mastering the commands on your Linux machine is an indispensable prelude to writing effective shell scripts. This section covers the following commands: . (See also source) ac adduser agetty agrep ar arch at autoload awk (See also Using awk for math operations) badblocks banner basename batch bc bg bind bison builtin bzgrep bzip2 cal caller cat cd chattr chfn chgrp chkconfig chmod chown chroot cksum clear clock cmp col colrm column comm command compgen Part 4. Commands 172
Advanced Bash-Scripting Guide complete compress coproc cp cpio cron crypt csplit cu cut date dc dd debugfs declare depmod df dialog diff diff3 diffstat dig dirname dirs disown dmesg doexec dos2unix du dump dumpe2fs e2fsck echo egrep enable enscript env eqn eval exec exit (Related topic: exit status) expand export expr factor false fdformat fdisk fg fgrep file Part 4. Commands 173
Advanced Bash-Scripting Guide find finger flex flock fmt fold free fsck ftp fuser getfacl getopt getopts gettext getty gnome-mount grep groff groupmod groups (Related topic: the $GROUPS variable) gs gzip halt hash hdparm head help hexdump host hostid hostname (Related topic: the $HOSTNAME variable) hwclock iconv id (Related topic: the $UID variable) ifconfig info infocmp init insmod install ip ipcalc iptables iwconfig jobs join jot kill killall last lastcomm Part 4. Commands 174
Advanced Bash-Scripting Guide lastlog ldd less let lex lid ln locate lockfile logger logname logout logrotate look losetup lp ls lsdev lsmod lsof lspci lsusb ltrace lynx lzcat lzma m4 mail mailstats mailto make MAKEDEV man mapfile mcookie md5sum merge mesg mimencode mkbootdisk mkdir mkdosfs mke2fs mkfifo mkisofs mknod mkswap mktemp mmencode modinfo modprobe Part 4. Commands 175
Advanced Bash-Scripting Guide more mount msgfmt mv nc netconfig netstat newgrp nice nl nm nmap nohup nslookup objdump od openssl passwd paste patch (Related topic: diff) pathchk pax pgrep pidof ping pkill popd pr printenv printf procinfo ps pstree ptx pushd pwd (Related topic: the $PWD variable) quota rcp rdev rdist read readelf readlink readonly reboot recode renice reset resize restore rev Part 4. Commands 176
Advanced Bash-Scripting Guide rlogin rm rmdir rmmod route rpm rpm2cpio rsh rsync runlevel run-parts rx rz sar scp script sdiff sed seq service set setfacl setquota setserial setterm sha1sum shar shopt shred shutdown size skill sleep slocate snice sort source sox split sq ssh stat strace strings strip stty su sudo sum suspend swapoff Part 4. Commands 177
Advanced Bash-Scripting Guide swapon sx sync sz tac tail tar tbl tcpdump tee telinit telnet Tex texexec time times tmpwatch top touch tput tr traceroute true tset tsort tty tune2fs type typeset ulimit umask umount uname unarc unarj uncompress unexpand uniq units unlzma unrar unset unsq unzip uptime usbmodules useradd userdel usermod users usleep Part 4. Commands 178
Advanced Bash-Scripting Guide uucp uudecode uuencode uux vacation vdir vmstat vrfy w wait wall watch wc wget whatis whereis which who whoami whois write xargs xrandr yacc yes zcat zdiff zdump zegrep zfgrep zgrep zip Table of Contents 15. Internal Commands and Builtins 15.1. Job Control Commands 16. External Filters, Programs and Commands 16.1. Basic Commands 16.2. Complex Commands 16.3. Time / Date Commands 16.4. Text Processing Commands 16.5. File and Archiving Commands 16.6. Communications Commands 16.7. Terminal Control Commands 16.8. Math Commands 16.9. Miscellaneous Commands 17. System and Administrative Commands 17.1. Analyzing a System Script
Part 4. Commands
When a command or the shell itself initiates (or spawns) a new subprocess to carry out a task, this is called forking. This new process is the child, and the process that forked it off is the parent. While the child process is doing its work, the parent process is still executing. Note that while a parent process gets the process ID of the child process, and can thus pass arguments to it, the reverse is not true. This can create problems that are subtle and hard to track down.
PIDS=$(pidof sh $0) # Process IDs of the various instances of this script. P_array=( $PIDS ) # Put them in an array (why?). echo $PIDS # Show process IDs of parent and child processes. let "instances = ${#P_array[*]} - 1" # Count elements, less 1. # Why subtract 1? echo "$instances instance(s) of this script running." echo "[Hit Ctl-C to exit.]"; echo
sleep 1    # Wait.
sh $0      # Play it again, Sam.

exit 0     # Not necessary; script will never get to here.
           # Why not?
#  After exiting with a Ctl-C,
#+ do all the spawned instances of the script die?
#  If so, why?

#  Note:
#  ----
#  Be careful not to run this script too long.
#  It will eventually eat up too many system resources.

#  Is having a script spawn multiple instances of itself
#+ an advisable scripting technique?
#  Why or why not?
Generally, a Bash builtin does not fork a subprocess when it executes within a script. An external system command or filter in a script usually will fork a subprocess. A builtin may be a synonym to a system command of the same name, but Bash reimplements it internally. For example, the Bash echo command is not the same as /bin/echo, although their behavior is almost identical.
A keyword is a reserved word, token or operator. Keywords have a special meaning to the shell, and indeed are the building blocks of the shell's syntax. As examples, for, while, do, and ! are keywords. Similar to a builtin, a keyword is hard-coded into Bash, but unlike a builtin, a keyword is not in itself a command, but a subunit of a command construct. [59] I/O echo prints (to stdout) an expression or variable (see Example 4-1).
echo Hello echo $a
An echo requires the -e option to print escaped characters. See Example 5-2. Normally, each echo command prints a terminal newline, but the -n option suppresses this.
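A quick sketch of both options (not one of the numbered examples in this book):

#!/bin/bash

echo "one\ntwo"         # Prints the \n literally -- still a single line.
echo -e "one\ntwo"      # -e interprets the escape; output spans two lines.
echo -n "no newline"    # -n suppresses the terminal newline . . .
echo " -- same line."   #+ so this continues on the same line.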
An echo, in combination with command substitution can set a variable. a=`echo "HELLO" | tr A-Z a-z` See also Example 16-22, Example 16-3, Example 16-47, and Example 16-48. Be aware that echo `command` deletes any linefeeds that the output of command generates. The $IFS (internal field separator) variable normally contains \n (linefeed) as one of its set of whitespace characters. Bash therefore splits the output of command at linefeeds into arguments to echo. Then echo outputs these arguments, separated by spaces.
bash$ ls -l /usr/share/apps/kjezz/sounds
-rw-r--r--    1 root     root         1407 Nov  7  2000 reflect.au
-rw-r--r--    1 root     root          362 Nov  7  2000 seconds.au
bash$ echo `ls -l /usr/share/apps/kjezz/sounds` total 40 -rw-r--r-- 1 root root 716 Nov 7 2000 reflect.au -rw-r--r-- 1 root root ...
This command is a shell builtin, and not the same as /bin/echo, although its behavior is similar.
bash$ type -a echo echo is a shell builtin echo is /bin/echo
printf
The printf, formatted print, command is an enhanced echo. It is a limited variant of the C language printf() library function, and its syntax is somewhat different.

printf format-string... parameter...

This is the Bash builtin version of the /bin/printf or /usr/bin/printf command. See the printf manpage (of the system command) for in-depth coverage.
Older versions of Bash may not support printf.

Example 15-2. printf in action
#!/bin/bash # printf demo declare -r PI=3.14159265358979 declare -r DecimalConstant=31373 Message1="Greetings," Message2="Earthling." echo printf "Pi to 2 decimal places = %1.2f" $PI echo printf "Pi to 9 decimal places = %1.9f" $PI printf "\n" # Read-only variable, i.e., a constant.
# It even rounds off correctly. # Prints a line feed, # Equivalent to 'echo' . . . # Inserts tab (\t).
printf "Constant = \t%d\n" $DecimalConstant printf "%s %s \n" $Message1 $Message2 echo # ==========================================# # Simulation of C function, sprintf(). # Loading a variable with a formatted string. echo Pi12=$(printf "%1.12f" $PI) echo "Pi to 12 decimal places = $Pi12" Msg=`printf "%s %s \n" $Message1 $Message2` echo $Msg; echo $Msg
# Roundoff error!
# As it happens, the 'sprintf' function can now be accessed #+ as a loadable module to Bash, #+ but this is not portable. exit 0
See also Example 36-15. read "Reads" the value of a variable from stdin, that is, interactively fetches input from the keyboard. The -a option lets read get array variables (see Example 27-6).
echo

# A single 'read' statement can set multiple variables.
echo -n "Enter the values of variables 'var2' and 'var3' "
echo -n "(separated by a space or tab): "
read var2 var3
echo "var2 = $var2   var3 = $var3"
#  If you input only one value,
#+ the other variable(s) will remain unset (null).

exit 0
A read without an associated variable assigns its input to the dedicated variable $REPLY.
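A minimal sketch of $REPLY in action:

#!/bin/bash

echo -n "Enter a value: "
read                          # No variable supplied . . .
echo "You entered: $REPLY"    #+ so the input lands in $REPLY.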
# This example is similar to the "reply.sh" script. # However, this one shows that $REPLY is available #+ even after a 'read' to a variable in the conventional way.
# ================================================================= # # # In some instances, you might wish to discard the first value read. In such cases, simply ignore the $REPLY variable.
{ # Code block. read read line2 } <$0 echo "Line 2 of echo "$line2" echo
# Line 1, to be discarded. # Line 2, saved in variable. this script is:" # # read-novar.sh # #!/bin/bash line discarded.
Normally, inputting a \ suppresses a newline during input to a read. The -r option causes an inputted \ to be interpreted literally.
echo "var1 = $var1" # var1 = first line second line # For each line terminated by a "\" #+ you get a prompt on the next line to continue feeding characters into var1. echo; echo echo "Enter another string terminated by a \\ , then press <ENTER>." read -r var2 # The -r option causes the "\" to be read literally. # first line \ echo "var2 = $var2" # var2 = first line \
The read command has some interesting options that permit echoing a prompt and even reading keystrokes without hitting ENTER.
# Read a keypress without hitting ENTER.
read -s -n1 -p "Hit a key " keypress
echo; echo "Keypress was "\"$keypress\""."

# -s option means do not echo input.
# -n N option means accept only N characters of input.
# -p option means echo the following prompt before reading input.

# Using these options is tricky, since they need to be in the correct order.
The -n option to read also allows detection of the arrow keys and certain of the other unusual keys.
echo "  Some other key pressed."
exit $OTHER

# ========================================= #

#  Mark Alexander came up with a simplified
#+ version of the above script (Thank you!).
#  It eliminates the need for grep.

#!/bin/bash

uparrow=$'\x1b[A'
downarrow=$'\x1b[B'
leftarrow=$'\x1b[D'
rightarrow=$'\x1b[C'

read -s -n3 -p "Hit an arrow key: " x

case "$x" in
$uparrow)
   echo "You pressed up-arrow"
   ;;
$downarrow)
   echo "You pressed down-arrow"
   ;;
$leftarrow)
   echo "You pressed left-arrow"
   ;;
$rightarrow)
   echo "You pressed right-arrow"
   ;;
The -n option to read will not detect the ENTER (newline) key. The -t option to read permits timed input (see Example 9-4 and Example A-41). The -u option takes the file descriptor of the target file.
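A hedged sketch of the -t option (the timeout value and prompt text here are arbitrary):

#!/bin/bash

if read -t 5 -p "Answer within 5 seconds: " answer
then
  echo "You said: $answer"
else
  echo          # Move past the prompt line.
  echo "Too slow!  ('read' timed out and returned a nonzero exit status.)"
fi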
The read command may also "read" its variable value from a file redirected to stdin. If the file contains more than one line, only the first line is assigned to the variable. If read has more than one parameter, then each of these variables gets assigned a successive whitespace-delineated string. Caution!
#  Setting the $IFS variable within the loop itself
#+ eliminates the need for storing the original $IFS
#+ in a temporary variable.
#  Thanks, Dim Segebart, for pointing this out.

echo "------------------------------------------------"
echo "List of all users:"

while IFS=: read name passwd uid gid fullname ignore
do
  echo "$name ($fullname)"
done </etc/passwd   # I/O redirection.

echo
echo "\$IFS still $IFS"

exit 0
Piping output to a read, using echo to set variables will fail. Yet, piping the output of cat seems to work.
#############################################

./readpipe.sh

{#!/bin/sh}
{last="(null)"}
{cat $0 |}
{while read line}
{do}
{echo "{$line}"}
{last=$line}
{done}
{printf "\nAll done, last: $last\n"}

All done, last: (null)

The variable (last) is set within the loop/subshell but its value does not persist outside the loop.
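A minimal sketch of the pitfall, with a here-string as one possible workaround (the <<< operator requires a reasonably recent Bash):

#!/bin/bash

echo "hello" | read var1   # 'read' runs in a subshell created by the pipe . . .
echo "var1 = $var1"        # var1 =        (empty -- the assignment is lost)

read var2 <<< "hello"      # Here-string: no pipe, no subshell.
echo "var2 = $var2"        # var2 = hello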
The gendiff script, usually found in /usr/bin on many Linux distros, pipes the output of find to a while read construct.
find $1 \( -name "*$2" -o -name ".*$2" \) -print | while read f; do . . .
It is possible to paste text into the input field of a read (but not multiple lines!). See Example A-38.
Filesystem
cd
The familiar cd change directory command finds use in scripts where execution of a command requires being in a specified directory.
(cd /source/directory && tar cf - . ) | (cd /dest/directory && tar xpvf -)
[from the previously cited example by Alan Cox] The -P (physical) option to cd causes it to ignore symbolic links. cd - changes to $OLDPWD, the previous working directory.
The cd command does not function as expected when presented with two forward slashes.
bash$ cd //
bash$ pwd
//
The output should, of course, be /. This is a problem both from the command-line and in a script. pwd Print Working Directory. This gives the user's (or script's) current directory (see Example 15-9). The effect is identical to reading the value of the builtin variable $PWD. pushd, popd, dirs This command set is a mechanism for bookmarking working directories, a means of moving back and forth through directories in an orderly manner. A pushdown stack is used to keep track of directory names. Options allow various manipulations of the directory stack. pushd dir-name pushes the path dir-name onto the directory stack (to the top of the stack) and simultaneously changes the current working directory to dir-name popd removes (pops) the top directory path name off the directory stack and simultaneously changes the current working directory to the directory now at the top of the stack. dirs lists the contents of the directory stack (compare this with the $DIRSTACK variable). A successful pushd or popd will automatically invoke dirs. Scripts that require various changes to the current working directory without hard-coding the directory name changes can make good use of these commands. Note that the implicit $DIRSTACK array variable, accessible from within a script, holds the contents of the directory stack.
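A rough sketch of the bookmarking behavior (the directories used here are arbitrary placeholders):

#!/bin/bash

pushd /tmp > /dev/null   # Save the current directory, cd to /tmp.
pushd /etc > /dev/null   # Save /tmp, cd to /etc.
dirs                     # Top of stack first: /etc /tmp <starting directory>

popd > /dev/null         # Back to /tmp.
pwd
popd > /dev/null         # Back to where we started.
pwd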
Variables let The let command carries out arithmetic operations on variables. [60] In many cases, it functions as a less complex version of expr.
let "a <<= 3"       # Equivalent to  let "a = a << 3"
echo "\"\$a\" (=16) left-shifted 3 places = $a"
                    # 128

let "a /= 4"        # Equivalent to  let "a = a / 4"
echo "128 / 4 = $a" # 32

let "a -= 5"        # Equivalent to  let "a = a - 5"
echo "32 - 5 = $a"  # 27

let "a %= 8"        # Equivalent to  let "a = a % 8"
echo "270 modulo 8 = $a  (270 / 8 = 33, remainder $a)"
                    # 6

# Does "let" permit C-style operators?
# Yes, just as the (( ... )) double-parentheses construct does.

let a++             # C-style (post) increment.
echo "6++ = $a"     # 6++ = 7
let a--             # C-style decrement.
# Trinary operator.

# Note that $a is 6, see above.
let "t = a<7?7:11"  # True
echo $t             # 7

let a++
let "t = a<7?7:11"  # False
echo $t             # 11

exit
The let command can, in certain contexts, return a surprising exit status.
# Evgeniy Ivanov points out: var=0 echo $?
# 0 # As expected.
# 0 # As expected.
# However, as Jeff Gorak points out, #+ this is part of the design spec for 'let' . . . # "If the last ARG evaluates to 0, let returns 1; # let returns 0 otherwise." ['help let']
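A brief sketch of that rule in practice:

#!/bin/bash

let "n = 1"
echo $?        # 0   (the expression evaluated to 1, a nonzero value)

let "n = 0"
echo $?        # 1   (the last expression evaluated to 0, so 'let' returns 1)

let "n += 0"   # n is still 0 . . .
echo $?        # 1   . . . so the same surprise occurs.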
eval eval arg1 [arg2] ... [argN] Combines the arguments in an expression or list of expressions and evaluates them. Any variables within the expression are expanded. The net result is to convert a string into a command. The eval command can be used for code generation from the command-line or within a script.
bash$ command_string="ps ax"

bash$ process="ps ax"

bash$ eval "$command_string" | grep "$process"
26973 pts/3    R+     0:00 grep --color ps ax
# When LF's not preserved, it may make it easier to parse output, #+ using utilities such as "awk". echo echo "===========================================================" echo eval "`seq 3 | sed -e 's/.*/echo var&=ABCDEFGHIJ/'`" # var1=ABCDEFGHIJ # var2=ABCDEFGHIJ # var3=ABCDEFGHIJ echo echo "===========================================================" echo
# Now, showing how to do something useful with "eval" . . . # (Thank you, E. Choroba!) version=3.4 # Can we split the version into major and minor #+ part in one command? echo "version = $version" eval major=${version/./;minor=} # Replaces '.' in version by ';minor=' # The substitution yields '3; minor=4' #+ so eval does minor=4, major=3
choose_array () { eval array_member=\${arr${array_number}[element_number]} # ^ ^^^^^^^^^^^^ # Using eval to construct the name of a variable, #+ in this particular case, an array name. echo "Element $element_number of array $array_number is $array_member" } # Function can be rewritten to take parameters. array_number=0 element_number=3 choose_array array_number=2 element_number=4 choose_array array_number=3 element_number=4 choose_array # First array. # 13 # Third array. # 34 # Null array (arr3 not allocated). # (null)
while [ "$param" -le "$params" ] do echo -n "Command-line parameter " echo -n \$$param # Gives only the *name* of variable. # ^^^ # $1, $2, $3, etc. # Why? # \$ escapes the first "$"
# On to the next.
# =================================================

$ sh echo-params.sh first second third fourth fifth
Command-line parameter $1 = first
Command-line parameter $2 = second
Command-line parameter $3 = third
Command-line parameter $4 = fourth
Command-line parameter $5 = fifth
killppp="eval kill -9 `ps ax | awk '/ppp/ { print $1 }'`" # -------- process ID of ppp ------$killppp # This variable is now a command.
# The following operations must be done as root user. chmod 666 /dev/$SERPORT # Restore r+w permissions, or else what? # Since doing a SIGKILL on ppp changed the permissions on the serial port, #+ we restore permissions to previous state. rm /var/lock/LCK..$SERPORT exit $? # Exercises: # --------# 1) Have script check whether root user is invoking it. # 2) Do a check on whether the process to be killed #+ is actually running before attempting to kill it. # 3) Write an alternate version of this script based on 'fuser': #+ if [ fuser -s /dev/modem ]; then . . . # Remove the serial port lock file. Why?
setvar_rot_13 var "foobar"   # Run "foobar" through rot13.
echo $var                    # sbbone

setvar_rot_13 var "$var"     # Run "sbbone" through rot13.
                             # Back to original variable.
echo $var                    # foobar
Here is another example of using eval to evaluate a complex expression, this one from an earlier version of YongYe's Tetris game script.
eval ${1}+=\"${x} ${y} \"
Example A-53 uses eval to convert array elements into a command list. The eval command occurs in the older version of indirect referencing.
eval var=\$$var
The eval command can be used to parameterize brace expansion. The eval command can be risky, and normally should be avoided when there exists a reasonable alternative. An eval $COMMANDS executes the contents of COMMANDS, which may contain such unpleasant surprises as rm -rf *. Running an eval on unfamiliar code written by persons unknown is living dangerously. set The set command changes the value of internal script variables/options. One use for this is to toggle option flags which help determine the behavior of the script. Another application for it is to reset the positional parameters that a script sees as the result of a command (set `command`). The script can then parse the fields of the command output.
set `uname -a` # Sets the positional parameters to the output # of the command `uname -a` echo echo +++++ echo $_ # +++++ # Flags set in script. echo $# hB # Anomalous behavior? echo echo "Positional parameters after set \`uname -a\` :" # $1, $2, $3, etc. reinitialized to result of `uname -a` echo "Field #1 of 'uname -a' = $1" echo "Field #2 of 'uname -a' = $2" echo "Field #3 of 'uname -a' = $3" echo \#\#\# echo $_ # ### echo exit 0
Spaces escaped Spaces not escaped Saving old IFS and setting new one.
until [ $# -eq 0 ] do # Step through positional parameters. echo "### k0 = "$k"" # Before k=$1:$k; # Append each pos param to loop variable. # ^ echo "### k = "$k"" # After echo shift; done set $k # echo echo $# # echo Set new positional parameters. Count of positional parameters.
# Restore IFS.
Question: Is it necessary to set a new IFS, internal field separator, in order for this script to work properly? What happens if you don't? Try it. And, why use the new IFS -- a colon -- in line 17, to append to the loop variable? What is the purpose of this?
Invoking set without any options or arguments simply lists all the environmental and other variables that have been initialized.
bash$ set AUTHORCOPY=/home/bozo/posts BASH=/bin/bash BASH_VERSION=$'2.05.8(1)-release' ... XAUTHORITY=/home/bozo/.Xauthority _=/etc/bashrc variable22=abc variable23=xzy
Using set with the -- option explicitly assigns the contents of a variable to the positional parameters. If no variable follows the -- it unsets the positional parameters.
# one # two
# ======================================================

set --
#  Unsets positional parameters if no variable specified.

first_param=$1
second_param=$2
echo "first parameter = $first_param"
echo "second parameter = $second_param"

exit 0
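A minimal sketch of loading the positional parameters from a variable (the variable name and contents here are arbitrary):

#!/bin/bash

variable="one two three"
set -- $variable    # Unquoted, so the string splits into three words.

echo "\$1 = $1"     # one
echo "\$2 = $2"     # two
echo "\$3 = $3"     # three
echo "\$# = $#"     # 3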
See also Example 11-2 and Example 16-56. unset The unset command deletes a shell variable, effectively setting it to null. Note that this command does not affect positional parameters.
bash$ unset PATH

bash$ echo $PATH

bash$

#  Unset.
#  In this particular context, same effect as:  variable=
#  $variable is null.
In most contexts, an undeclared variable and one that has been unset are equivalent. However, the ${parameter:-default} parameter substitution construct can distinguish between the two.
export
The export [61] command makes variables available to all child processes of the running script or shell. One important use of the export command is in startup files, to initialize and make accessible environmental variables to subsequent user processes. Unfortunately, there is no way to export variables back to the parent process, to the process that called or invoked the script or shell.
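A small sketch of that one-way flow (the variable name and messages are arbitrary):

#!/bin/bash

MESSAGE="visible in children"
export MESSAGE

bash -c 'echo "Child sees: $MESSAGE"'   # Child sees: visible in children

bash -c 'MESSAGE="changed in child"'    # The change happens in the child only.
echo "Parent still has: $MESSAGE"       # Parent still has: visible in children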
ARGS=2 E_WRONGARGS=85 if [ $# -ne "$ARGS" ] # Check for proper number of command-line args. then echo "Usage: `basename $0` filename column-number" exit $E_WRONGARGS fi filename=$1 column_number=$2 #===== Same as original script, up to this point =====# export column_number # Export column number to environment, so it's available for retrieval.
# ----------------------------------------------awkscript='{ total += $ENVIRON["column_number"] } END { print total }' # Yes, a variable can hold an awk script. # ----------------------------------------------# Now, run the awk script. awk "$awkscript" "$filename" # Thanks, Stephane Chazelas.
It is possible to initialize and export variables in the same operation, as in export var1=xxx. However, as Greg Keraunen points out, in certain situations this may have a different effect than setting a variable, then exporting it.
bash$ export var=(a b); echo ${var[0]}
(a b)
A variable to be exported may require special treatment. See Example M-2. declare, typeset The declare and typeset commands specify and/or restrict properties of variables. readonly Same as declare -r, sets a variable as read-only, or, in effect, as a constant. Attempts to change the variable fail with an error message. This is the shell analog of the C language const type qualifier. getopts This powerful tool parses command-line arguments passed to the script. This is the Bash analog of the getopt external command and the getopt library function familiar to C programmers. It permits passing and concatenating multiple options [62] and associated arguments to a script (for example scriptname -abc -e /usr/local).
The getopts construct uses two implicit variables. $OPTIND is the argument pointer (OPTion INDex) and $OPTARG (OPTion ARGument) the (optional) argument attached to an option. A colon following the option name in the declaration tags that option as having an associated argument. A getopts construct usually comes packaged in a while loop, which processes the options and arguments one at a time, then increments the implicit $OPTIND variable to point to the next.
1. The arguments passed from the command-line to the script must be preceded by a dash (-). It is the prefixed - that lets getopts recognize command-line arguments as options. In fact, getopts will not process arguments without the prefixed -, and will terminate option processing at the first argument encountered lacking them. 2. The getopts template differs slightly from the standard while loop, in that it lacks condition brackets. 3. The getopts construct is a highly functional replacement for the traditional getopt external command.
while getopts ":abcde:fg" Option
# Initial declaration.
# a, b, c, d, e, f, and g are the options (flags) expected.
# Here we observe how 'getopts' processes command-line arguments to script. # The arguments are parsed as "options" (flags) and associated arguments. # Try invoking this script with: # 'scriptname -mn' # 'scriptname -oq qOption' (qOption can be some arbitrary string.) # 'scriptname -qXXX -r' # # 'scriptname -qr' #+ - Unexpected result, takes "r" as the argument to option "q" # 'scriptname -q -r' #+ - Unexpected result, same as above # 'scriptname -mnop -mnop' - Unexpected result # (OPTIND is unreliable at stating where an option came from.) # # If an option expects an argument ("flag:"), then it will grab #+ whatever is next on the command-line. NO_ARGS=0 E_OPTERROR=85 if [ $# -eq "$NO_ARGS" ] # Script invoked with no command-line args? then echo "Usage: `basename $0` options (-mnopqrs)" exit $E_OPTERROR # Exit and explain usage. # Usage: scriptname -options # Note: dash (-) necessary fi
while getopts ":mnopq:rs" Option do case $Option in m ) echo "Scenario #1: option n | o ) echo "Scenario #2: option p ) echo "Scenario #3: option q ) echo "Scenario #4: option
shift $(($OPTIND - 1)) # Decrements the argument pointer so it points to next argument. # $1 now references the first non-option item supplied on the command-line #+ if one exists. exit $? # As Bill Gradwohl states, # "The getopts mechanism allows one to specify: scriptname -mnop -mnop #+ but there is no reliable way to differentiate what came #+ from where by using OPTIND." # There are, however, workarounds.
Script Behavior source, . (dot command) This command, when invoked from the command-line, executes a script. Within a script, a source file-name loads the file file-name. Sourcing a file (dot-command) imports code into the script, appending to the script (same effect as the #include directive in a C program). The net result is the same as if the "sourced" lines of code were physically present in the body of the script. This is useful in situations when multiple scripts use a common data file or function library.
exit $?
File data-file for Example 15-22, above. Must be present in same directory.
# This is a data file loaded by a script. # Files of this type may contain variables, functions, etc. # It loads with a 'source' or '.' command from a shell script. # Let's initialize some variables. variable1=23 variable2=474 variable3=5 variable4=97 message1="Greetings from *** line $LINENO *** of the data file!" message2="Enough for now. Goodbye." print_message () { # Echoes any message passed to it. if [ -z "$1" ] then return 1 # Error, if argument missing. fi echo until [ -z "$1" ] do # Step through arguments passed to function. echo -n "$1" # Echo args one at a time, suppressing line feeds. echo -n " " # Insert spaces between words. shift # Next one. done echo return 0 }
If the sourced file is itself an executable script, then it will run, then return control to the script that called it. A sourced executable script may use a return for this purpose.
It is even possible for a script to source itself, though this does not seem to have any practical applications.
echo -n "$pass_count " # At first execution pass, this just echoes two blank spaces,
echo exit 0 # The net effect is counting from 1 to 100. # Very impressive.
# Exercise: # -------# Write a script that uses this trick to actually do something useful.
exit Unconditionally terminates a script. [63] The exit command may optionally take an integer argument, which is returned to the shell as the exit status of the script. It is good practice to end all but the simplest scripts with an exit 0, indicating a successful run. If a script terminates with an exit lacking an argument, the exit status of the script is the exit status of the last command executed in the script, not counting the exit. This is equivalent to an exit $?. An exit command may also be used to terminate a subshell. exec This shell builtin replaces the current process with a specified command. Normally, when the shell encounters a command, it forks off a child process to actually execute the command. Using the exec builtin, the shell does not fork, and the command exec'ed replaces the shell. When used in a script, therefore, it forces an exit from the script when the exec'ed command terminates. [64]
An exec also serves to reassign file descriptors. For example, exec <zzz-file replaces stdin with the file zzz-file. The -exec option to find is not the same as the exec shell builtin. shopt This command permits changing shell options on the fly (see Example 25-1 and Example 25-2). It often appears in the Bash startup files, but also has its uses in scripts. Needs version 2 or later of Bash.
shopt -s cdspell
# Allows minor misspelling of directory names with 'cd'
# Option -s sets, -u unsets.

cd /hpme  # Oops! Mistyped '/home'.
pwd       # /home
          # The shell corrected the misspelling.
caller Putting a caller command inside a function echoes to stdout information about the caller of that function.
#!/bin/bash function1 () {
Line number that the function was called from. Invoked from "main" part of script. Name of calling script.
A caller command can also return caller information from a script sourced within another script. Analogous to a function, this is a "subroutine call." You may find this command useful in debugging. Commands true A command that returns a successful (zero) exit status, but does nothing else.
bash$ true bash$ echo $? 0
# Endless loop while true # alias for ":" do operation-1 operation-2 ... operation-n # Need a way to break out of loop or script will hang. done
false A command that returns an unsuccessful exit status, but does nothing else.
bash$ false bash$ echo $? 1
# Testing "false" if false then echo "false evaluates \"true\"" else echo "false evaluates \"false\"" fi # false evaluates "false"
# Looping while "false" (null loop) while false do # The following code will not execute.
type [cmd] Similar to the which external command, type cmd identifies "cmd." Unlike which, type is a Bash builtin. The useful -a option to type identifies keywords and builtins, and also locates system commands with identical names.
bash$ type '[' [ is a shell builtin bash$ type -a '[' [ is a shell builtin [ is /usr/bin/[
The type command can be useful for testing whether a certain command exists. hash [cmds] Records the path name of specified commands -- in the shell hash table [65] -- so the shell or script will not need to search the $PATH on subsequent calls to those commands. When hash is called with no arguments, it simply lists the commands that have been hashed. The -r option resets the hash table. bind The bind builtin displays or modifies readline [66] key bindings. help Gets a short usage summary of a shell builtin. This is the counterpart to whatis, but for builtins. The display of help information got a much-needed update in the version 4 release of Bash.
bash$ help exit exit: exit [n] Exit the shell with a status of N. If N is omitted, the exit status is that of the last command executed.
"1" is the job number (jobs are maintained by the current shell). "1384" is the PID or process ID number (processes are maintained by the system). To kill this job/process, either a kill %1 or a kill 1384 works. Thanks, S.C. disown Remove job(s) from the shell's table of active jobs. fg, bg The fg command switches a job running in the background into the foreground. The bg command restarts a suspended job, and runs it in the background. If no job number is specified, then the fg or bg command acts upon the currently running job. wait Suspend script execution until all jobs running in background have terminated, or until the job number or process ID specified as an option terminates. Returns the exit status of waited-for command. You may use the wait command to prevent a script from exiting before a background job finishes executing (this would create a dreaded orphan process).
echo "Updating 'locate' database..."
echo "This may take a while."

updatedb /usr &     # Must be run as root.

wait
# Don't run the rest of the script until 'updatedb' finished.
# You want the database updated before looking up the file name.

locate $1

#  Without the 'wait' command, in the worst-case scenario,
#+ the script would exit while 'updatedb' was still running,
#+ leaving it as an orphan process.
Optionally, wait can take a job identifier as an argument, for example, wait%1 or wait $PPID. [67] See the job id table.
Within a script, running a command in the background with an ampersand (&) may cause the script to hang until ENTER is hit. This seems to occur with commands that write to stdout. It can be a major annoyance.
#!/bin/bash # test.sh ls -l & echo "Done." bash$ ./test.sh Done. [bozo@localhost test-scripts]$ total 1 -rwxr-xr-x 1 bozo bozo _
As far as I can tell, such scripts don't actually hang. It just seems that they do because the background command writes text to the console after the prompt. The user gets the impression that the prompt was never displayed. Here's the sequence of events:

1. Script launches background command.
2. Script exits.
3. Shell displays the prompt.
4. Background command continues running and writing text to the console.
5. Background command finishes.
6. User doesn't see a prompt at the bottom of the output, thinks script is hanging.

Placing a wait after the background command seems to remedy this.
#!/bin/bash # test.sh ls -l & echo "Done." wait bash$ ./test.sh Done. [bozo@localhost test-scripts]$ total 1 -rwxr-xr-x 1 bozo bozo
Redirecting the output of the command to a file or even to /dev/null also takes care of this problem.
suspend
This has a similar effect to Control-Z, but it suspends the shell (the shell's parent process should resume it at an appropriate time).
logout
Exit a login shell, optionally specifying an exit status.
times
Gives statistics on the system time elapsed when executing commands, in the following form:
0m0.020s 0m0.020s
This capability is of relatively limited value, since it is not common to profile and benchmark shell scripts. kill Forcibly terminate a process by sending it an appropriate terminate signal (see Example 17-6).
echo "This line will not echo." # Instead, the shell sends a "Terminated" message to stdout. exit 0 # Normal exit? No!
# After this script terminates prematurely, #+ what exit status does it return? # # sh self-destruct.sh # echo $? # 143 # # 143 = 128 + 15 # TERM signal
kill -l lists all the signals (as does the file /usr/include/asm/signal.h). A kill -9 is a sure kill, which will usually terminate a process that stubbornly refuses to die with a plain kill. Sometimes, a kill -15 works. A zombie process, that is, a child process that has terminated, but that the parent process has not (yet) killed, cannot be killed by a logged-on user -- you can't kill something that is already dead -- but init will generally clean it up sooner or later. killall The killall command kills a running process by name, rather than by process ID. If there are multiple instances of a particular command running, then doing a killall on that command will terminate them all. This refers to the killall command in /usr/bin, not the killall script in /etc/rc.d/init.d. command The command directive disables aliases and functions for the command immediately following it.
bash$ command ls
This is one of three shell directives that affect script command processing. The others are builtin and enable.
builtin
Invoking builtin BUILTIN_COMMAND runs the command BUILTIN_COMMAND as a shell builtin, temporarily disabling both functions and external system commands with the same name.
enable
This either enables or disables a shell builtin command. As an example, enable -n kill disables the shell builtin kill, so that when Bash subsequently encounters kill, it invokes the external command /bin/kill. The -a option to enable lists all the shell builtins, indicating whether or not they are enabled. The -f filename option lets enable load a builtin as a shared library (DLL) module from a properly compiled object file. [68]
autoload
This is a port to Bash of the ksh autoloader. With autoload in place, a function with an autoload declaration will load from an external file at its first invocation. [69] This saves system resources. Note that autoload is not a part of the core Bash installation. It needs to be loaded in with enable -f (see above).
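A hedged sketch of enable and builtin in action (the path reported by type may differ from distro to distro):

#!/bin/bash

enable -n kill     # Disable the builtin; 'kill' now resolves to the external command.
type kill          # kill is /bin/kill   (or /usr/bin/kill, depending on the system)

enable kill        # Re-enable the builtin.
builtin kill -l > /dev/null     # Force the builtin version, bypassing any
                                #+ function or external command named 'kill'.
echo "Exit status of the builtin: $?"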
Table 15-1. Job identifiers

  Notation    Meaning
  %N          Job number [N]
  %S          Invocation (command-line) of job begins with string S
  %?S         Invocation (command-line) of job contains within it string S
  %%          "current" job (last job stopped in foreground or started in background)
  %+          "current" job (last job stopped in foreground or started in background)
  %-          Last job
  $!          Last background process
bash$ ls -lv
total 0
(listing of eight empty files)
The ls command returns a non-zero exit status when attempting to list a non-existent file.
bash$ ls abc ls: abc: No such file or directory
bash$ echo $? 2
Example 16-1. Using ls to create a table of contents for burning a CDR disk
SPEED=10 # May use higher speed if your hardware supports it. IMAGEFILE=cdimage.iso CONTENTSFILE=contents # DEVICE=/dev/cdrom For older versions of cdrecord DEVICE="1,0,0" DEFAULTDIR=/opt # This is the directory containing the data to be burned. # Make sure it exists. # Exercise: Add a test for this. # Uses Joerg Schilling's "cdrecord" package: # http://www.fokus.fhg.de/usr/schilling/cdrecord.html # If this script invoked as an ordinary user, may need to suid cdrecord #+ chmod u+s /usr/bin/cdrecord, as root. # Of course, this creates a security hole, though a relatively minor one. if [ -z "$1" ] then IMAGE_DIRECTORY=$DEFAULTDIR # Default directory, if not specified on command-line. else IMAGE_DIRECTORY=$1 fi # Create a "table of contents" file. ls -lRF $IMAGE_DIRECTORY > $IMAGE_DIRECTORY/$CONTENTSFILE # The "l" option gives a "long" file listing. # The "R" option makes the listing recursive. # The "F" option marks the file types (directories get a trailing /). echo "Creating table of contents." # Create an image file preparatory to burning it onto the CDR. mkisofs -r -o $IMAGEFILE $IMAGE_DIRECTORY echo "Creating ISO9660 file system image ($IMAGEFILE)." # Burn the CDR. echo "Burning the disk." echo "Please be patient, this will take a while." wodim -v -isosize dev=$DEVICE $IMAGEFILE # In newer Linux distros, the "wodim" utility assumes the #+ functionality of "cdrecord." exitcode=$? echo "Exit code = $exitcode" exit $exitcode
cat, tac cat, an acronym for concatenate, lists a file to stdout. When combined with redirection (> or >>), it is commonly used to concatenate files.
# Uses of 'cat'
cat filename
cat file.1 file.2 file.3 > file.123
The -n option to cat inserts consecutive numbers before all lines of the target file(s). The -b option numbers only the non-blank lines. The -v option echoes nonprintable characters, using ^ notation. The -s option squeezes multiple consecutive blank lines into a single blank line.
See also Example 16-28 and Example 16-24.
In a pipe, it may be more efficient to redirect the stdin to a file, rather than to cat the file.
cat filename | tr a-z A-Z

tr a-z A-Z < filename   #  Same effect, but starts one less process,
                        #+ and also dispenses with the pipe.
tac is the inverse of cat, listing a file backwards from its end.
rev
rev reverses each line of a file, and outputs to stdout. This does not have the same effect as tac, as it preserves the order of the lines, but flips each one around (mirror image).
bash$ cat file1.txt
This is line 1.
This is line 2.
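Continuing with the two-line file above, the contrast between tac and rev might look like this (a sketch of the expected output):

bash$ tac file1.txt
This is line 2.
This is line 1.

bash$ rev file1.txt
.1 enil si sihT
.2 enil si sihT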
cp This is the file copy command. cp file1 file2 copies file1 to file2, overwriting file2 if it already exists (see Example 16-6). Particularly useful are the -a archive flag (for copying an entire directory tree), the -u update flag (which prevents overwriting identically-named newer files), and the -r and -R recursive flags.
cp -u source_dir/* dest_dir # "Synchronize" dest_dir to source_dir #+ by copying over all newer and not previously existing files.
mv This is the file move command. It is equivalent to a combination of cp and rm. It may be used to move multiple files to a directory, or even to rename a directory. For some examples of using mv in a script, see Example 10-11 and Example A-2. When used in a non-interactive script, mv takes the -f (force) option to bypass user input. When a directory is moved to a preexisting directory, it becomes a subdirectory of the destination directory.
bash$ mv source_directory target_directory
rm Delete (remove) a file or files. The -f option forces removal of even readonly files, and is useful for bypassing user input in a script.
The rm command will, by itself, fail to remove filenames beginning with a dash. Why? Because rm sees a dash-prefixed filename as an option.
bash$ rm -badname rm: invalid option -- b Try `rm --help' for more information.
One clever workaround is to precede the filename with a " -- " (the end-of-options flag).
bash$ rm -- -badname
When used with the recursive flag -r, this command removes files all the way down the directory tree from the current directory. A careless rm -rf * can wipe out a big chunk of a directory structure. rmdir Remove directory. The directory must be empty of all files -- including "invisible" dotfiles [71] -- for this command to succeed. mkdir Make directory, creates a new directory. For example, mkdir -p project/programs/December creates the named directory. The -p option automatically creates any necessary parent directories. chmod Changes the attributes of an existing file or directory (see Example 15-14).
chmod +x filename # Makes "filename" executable for all users. chmod u+s filename # Sets "suid" bit on "filename" permissions. # An ordinary user may execute "filename" with same privileges as the file's owner. # (This does not apply to shell scripts.) chmod 644 filename # Makes "filename" readable/writable to owner, readable to others #+ (octal mode). chmod 444 filename # Makes "filename" read-only for all. # Modifying the file (for example, with a text editor) #+ not allowed for a user who does not own the file (except for root), #+ and even the file owner must force a file-save #+ if she modifies the file.
chmod 1777 directory-name # Gives everyone read, write, and execute permission in directory, #+ however also sets the "sticky bit". # This means that only the owner of the directory, #+ owner of the file, and, of course, root #+ can delete any particular file in that directory. chmod 111 directory-name # Gives everyone execute-only permission in a directory. # This means that you can execute and READ the files in that directory #+ (execute permission necessarily includes read permission #+ because you can't execute a file without being able to read it). # But you can't list the files or search for them with the "find" command. # These restrictions do not apply to root. chmod 000 directory-name # No permissions at all for that directory. # Can't read, write, or execute files in it. # Can't even list files in it or "cd" to it. # But, you can rename (mv) the directory #+ or delete it (rmdir) if it is empty. # You can even symlink to files in the directory, #+ but you can't read, write, or execute the symlinks. # These restrictions do not apply to root.
chattr Change file attributes. This is analogous to chmod above, but with different options and a different invocation syntax, and it works only on ext2/ext3 filesystems. One particularly interesting chattr option is i. A chattr +i filename marks the file as immutable. The file cannot be modified, linked to, or deleted, not even by root. This file attribute can be set or removed only by root. In a similar fashion, the a option marks the file as append only.
root# chattr +i file1.txt
root# rm file1.txt
rm: remove write-protected regular file `file1.txt'? y
rm: cannot remove `file1.txt': Operation not permitted
If a file has the s (secure) attribute set, then when it is deleted its block is overwritten with binary zeroes. [72] If a file has the u (undelete) attribute set, then when it is deleted, its contents can still be retrieved (undeleted). If a file has the c (compress) attribute set, then it will automatically be compressed on writes to disk, and uncompressed on reads. The file attributes set with chattr do not show in a file listing (ls -l). ln Creates links to pre-existing files. A "link" is a reference to a file, an alternate name for it. The ln command permits referencing the linked file by more than one name and is a superior alternative to aliasing (see Example 4-6).
The ln command creates only a reference, a pointer to the file, only a few bytes in size.
The ln command is most often used with the -s, symbolic or "soft" link flag. Advantages of using the -s flag are that it permits linking across file systems or to directories. The syntax of the command is a bit tricky. For example: ln -s oldfile newfile links the previously existing oldfile to the newly created link, newfile. If a file named newfile has previously existed, an error message will result. Which type of link to use? As John Macdonald explains it: Both of these [types of links] provide a certain measure of dual reference -- if you edit the contents of the file using any name, your changes will affect both the original name and either a hard or soft new name. The difference between them occurs when you work at a higher level. The advantage of a hard link is that the new name is totally independent of the old name -- if you remove or rename the old name, that does not affect the hard link, which continues to point to the data while it would leave a soft link hanging pointing to the old name which is no longer there. The advantage of a soft link is that it can refer to a different file system (since it is just a reference to a file name, not to actual data). And, unlike a hard link, a symbolic link can refer to a directory. Links give the ability to invoke a script (or any other type of executable) with multiple names, and have that script behave according to how it was invoked.
HELLO_CALL=65
GOODBYE_CALL=66

if [ $0 = "./goodbye" ]
then
  echo "Good-bye!"
  # Some other goodbye-type commands, as appropriate.
  exit $GOODBYE_CALL
fi

echo "Hello!"
# Some other hello-type commands, as appropriate.
exit $HELLO_CALL
man, info These commands access the manual and information pages on system commands and installed utilities. When available, the info pages usually contain more detailed descriptions than do the man pages. There have been various attempts at "automating" the writing of man pages. For a script that makes a tentative first step in that direction, see Example A-39.
If COMMAND contains {}, then find substitutes the full path name of the selected file for "{}".
find ~/ -name 'core*' -exec rm {} \;
# Removes all core dump files from user's home directory.

find /home/bozo/projects -mtime -1
#                               ^  Note minus sign!
#  Lists all files in /home/bozo/projects directory tree
#+ that were modified within the last day (current_day - 1).
#
find /home/bozo/projects -mtime 1
#  Same as above, but modified *exactly* one day ago.
#
#  mtime = last modification time of the target file
#  ctime = last status change time (via 'chmod' or otherwise)
#  atime = last access time

DIR=/home/bozo/junk_files
find "$DIR" -type f -atime +5 -exec rm {} \;
#                                       ^^
#  Curly brackets are placeholder for the path name output by "find."
#
#  Deletes all files in "/home/bozo/junk_files"
#+ that have not been accessed in *at least* 5 days (plus sign ... +5).
#
#  "-type filetype", where
#  f = regular file
#  d = directory
#  l = symbolic link, etc.
find /etc -exec grep '[0-9][0-9]*[.][0-9][0-9]*[.][0-9][0-9]*[.][0-9][0-9]*' {} \;

# Finds all IP addresses (xxx.xxx.xxx.xxx) in /etc directory files.
# There are a few extraneous hits. Can they be filtered out?

# Possibly by:

find /etc -type f -exec cat '{}' \; | tr -c '.[:digit:]' '\n' \
| grep '^[^.][^.]*\.[^.][^.]*\.[^.][^.]*\.[^.][^.]*$'
#
#  [:digit:] is one of the character classes
#+ introduced with the POSIX 1003.2 standard.

# Thanks, Stéphane Chazelas.
The -exec option to find should not be confused with the exec shell builtin. Example 16-3. Badname, eliminate file names in current directory containing bad characters and whitespace.
#!/bin/bash
# badname.sh
# Delete filenames in current directory containing bad characters.

for filename in *
do
  badname=`echo "$filename" | sed -n /[\+\{\;\"\\\=\?~\(\)\<\>\&\*\|\$]/p`
# badname=`echo "$filename" | sed -n '/[+{;"\=?~()<>&*|$]/p'`  also works.
# Deletes files containing these nasties:     + { ; " \ = ? ~ ( ) < > & * | $
#
  rm $badname 2>/dev/null
#             ^^^^^^^^^^^ Error messages deep-sixed.
done

# Now, take care of files containing all manner of whitespace.
find . -name "* *" -exec rm -f {} \;
# The path name of the file that _find_ finds replaces the "{}".
# The '\' ensures that the ';' is interpreted literally, as end of command.

exit 0

#---------------------------------------------------------------------
# Commands below this line will not execute because of _exit_ command.

# An alternative to the above script:
find . -name '*[+{;"\\=?~()<>&*|$ ]*' -maxdepth 0 \
-exec rm -f '{}' \;
#  The "-maxdepth 0" option ensures that _find_ will not search
#+ subdirectories below $PWD.
# (Thanks, S.C.)
if [ $# -ne "$ARGCOUNT" ]
then
  echo "Usage: `basename $0` filename"
  exit $E_WRONGARGS
fi

if [ ! -e "$1" ]
then
  echo "File \""$1"\" does not exist."
  exit $E_FILE_NOT_EXIST
fi

inum=`ls -i | grep "$1" | awk '{print $1}'`
# inum = inode (index node) number of file
# -----------------------------------------------------------------------
# Every file has an inode, a record that holds its physical address info.
# -----------------------------------------------------------------------

echo; echo -n "Are you absolutely sure you want to delete \"$1\" (y/n)? "
# The '-v' option to 'rm' also asks this.
read answer
case "$answer" in
[nN]) echo "Changed your mind, huh?"
      exit $E_CHANGED_MIND
      ;;
*)    echo "Deleting file \"$1\".";;
esac

find . -inum $inum -exec rm {} \;
#                           ^^
#  Curly brackets are placeholder
#+ for text output by "find."

echo "File "\"$1"\" deleted!"

exit 0
for file in $( find "$directory" -perm "$permissions" )
do
  ls -ltF --author "$file"
done
See Example 16-30, Example 3-4, and Example 11-9 for scripts using find. Its manpage provides more detail on this complex and powerful command. xargs A filter for feeding arguments to a command, and also a tool for assembling the commands themselves. It breaks a data stream into small enough chunks for filters and commands to process. Consider it as a powerful replacement for backquotes. In situations where command substitution fails with a too many arguments error, substituting xargs often works. [73] Normally, xargs reads from stdin or from a pipe, but it can also take its input from a file. The default command for xargs is echo. This means that input piped to xargs may have linefeeds and other whitespace characters stripped out.
bash$ ls -l
total 0
-rw-rw-r--   1 bozo  bozo   0 Jan 29 23:58 file1
-rw-rw-r--   1 bozo  bozo   0 Jan 29 23:58 file2


bash$ ls -l | xargs
total 0 -rw-rw-r-- 1 bozo bozo 0 Jan 29 23:58 file1 -rw-rw-r-- 1 bozo bozo 0 Jan...
bash$ find ~/mail -type f | xargs grep "Linux"
./misc:User-Agent: slrn/0.9.8.1 (Linux)
./sent-mail-jul-2005: hosted by the Linux Documentation Project.
./sent-mail-jul-2005: (Linux Documentation Project Site, rtf version)
./sent-mail-jul-2005: Subject: Criticism of Bozo's Windows/Linux article
./sent-mail-jul-2005: while mentioning that the Linux ext2/ext3 filesystem
. . .
ls | xargs -p -l gzip gzips every file in current directory, one at a time, prompting before each operation.
Note that xargs processes the arguments passed to it sequentially, one at a time.
bash$ find /usr/bin | xargs file
/usr/bin:                  directory
/usr/bin/foomatic-ppd-options:          . . .
An interesting xargs option is -n NN, which limits to NN the number of arguments passed. ls | xargs -n 8 echo lists the files in the current directory in 8 columns. Another useful option is -0, in combination with find -print0 or grep -lZ. This allows handling arguments containing whitespace or quotes.

find / -type f -print0 | xargs -0 grep -liwZ GUI | xargs -0 rm -f
grep -rliwZ GUI / | xargs -0 rm -f

Either of the above will remove any file containing "GUI". (Thanks, S.C.) Or:
cat /proc/"$pid"/"$OPTION" | xargs -0 echo
#  Formats output:         ^^^^^^^^^^^^^^^
#  From Han Holl's fixup of "get-commandline.sh"
#+ script in "/dev and /proc" chapter.
The -P option to xargs permits running processes in parallel. This speeds up execution in a machine with a multicore CPU.
#!/bin/bash

ls *gif | xargs -t -n1 -P2 gif2png
# Converts all the gif images in current directory to png.

# Options:
# =======
# -t    Print command to stderr.
# -n1   At most 1 argument per command line.
# -P2   Run up to 2 processes simultaneously.
#  Exercise:
#  --------
#  Modify this script to track changes in /var/log/messages
#+ at intervals of 20 minutes.
#  Hint: Use the "watch" command.

exit 0
PROCESS_NAME="$1"
ps ax | grep "$PROCESS_NAME" | awk '{print $1}' | xargs -i kill {} 2&>/dev/null
#                                                       ^^      ^^

# ---------------------------------------------------------------
# Notes:
# -i is the "replace strings" option to xargs.
# The curly brackets are the placeholder for the replacement.
# 2&>/dev/null suppresses unwanted error messages.
#
# Can  grep "$PROCESS_NAME"  be replaced by  pidof "$PROCESS_NAME" ?
# ---------------------------------------------------------------

exit $?

#  The "killall" command has the same effect as this script,
#+ but using it is not quite as educational.
# Check for input file on command-line.
ARGS=1
E_BADARGS=85
E_NOFILE=86

if [ $# -ne "$ARGS" ]  # Correct number of arguments passed to script?
then
  echo "Usage: `basename $0` filename"
  exit $E_BADARGS
fi

if [ ! -f "$1" ]       # Does file exist?
then
  echo "File \"$1\" does not exist."
  exit $E_NOFILE
fi
#####################################################
cat "$1" | xargs -n1 | \
#  List the file, one word per line.
expr All-purpose expression evaluator: Concatenates and evaluates the arguments according to the operation given (arguments must be separated by spaces). Operations may be arithmetic, comparison, string, or logical.

expr 3 + 5 returns 8

expr 5 % 3 returns 2

expr 1 / 0 returns the error message, expr: division by zero. Illegal arithmetic operations not allowed.

expr 5 \* 3 returns 15. The multiplication operator must be escaped when used in an arithmetic expression with expr.

y=`expr $y + 1` Increment a variable, with the same effect as let y=y+1 and y=$(($y+1)). This is an example of arithmetic expansion.

z=`expr substr $string $position $length` Extract substring of $length characters, starting at $position.

Example 16-9. Using expr
#!/bin/bash
# Demonstrating some of the uses of 'expr'
# =======================================

echo

# Arithmetic Operators
# ---------- ---------

echo "Arithmetic Operators"
echo
a=`expr 5 + 3`
echo "5 + 3 = $a"
a=3

b=`expr $a \> 10`
echo 'b=`expr $a \> 10`, therefore...'
echo "If a > 10, b = 0 (false)"
echo "b = $b"    # 0  ( 3 ! -gt 10 )
echo

b=`expr $a \< 10`
echo "If a < 10, b = 1 (true)"
echo "b = $b"    # 1  ( 3 -lt 10 )
echo
# Note escaping of operators.

b=`expr $a \<= 3`
echo "If a <= 3, b = 1 (true)"
echo "b = $b"    # 1  ( 3 -le 3 )
# There is also a "\>=" operator (greater than or equal to).
echo
echo
# String Operators
# ------ ---------

echo "String Operators"
echo

a=1234zipper43231
echo "The string being operated upon is \"$a\"."

# length: length of string
#  The default behavior of the 'match' operations is to
#+ search for the specified match at the BEGINNING of the string.
#
#       Using Regular Expressions ...
b=`expr match "$a" '[0-9]*'`        #  Numerical count.
echo Number of digits at the beginning of \"$a\" is $b.
b=`expr match "$a" '\([0-9]*\)'`    #  Note that escaped parentheses
#                   ==      ==      #+ trigger substring match.
echo "The digits at the beginning of \"$a\" are \"$b\"."

echo

exit 0
The : (null) operator can substitute for match. For example, b=`expr $a : [0-9]*` is the exact equivalent of b=`expr match $a [0-9]*` in the above listing.
#!/bin/bash

echo
echo "String operations using \"expr \$string : \" construct"
echo "==================================================="
echo

a=1234zipper5FLIPPER43231

echo "The string being operated upon is \"`expr "$a" : '\(.*\)'`\"."
#     Escaped parentheses grouping operator.            ==  ==
#
#  ***************************
#+ Escaped parentheses
#+ match a substring
#  ***************************

#  If no escaped parentheses ...
#+ then 'expr' converts the string operand to an integer.

echo "Length of \"$a\" is `expr "$a" : '.*'`."   # Length of string

echo "Number of digits at the beginning of \"$a\" is `expr "$a" : '[0-9]*'`."

# ------------------------------------------------------------------------- #

echo

echo "The digits at the beginning of \"$a\" are `expr "$a" : '\([0-9]*\)'`."
The above script illustrates how expr uses the escaped parentheses -- \( ... \) -- grouping operator in tandem with regular expression parsing to match a substring. Here is another example, this time from "real life."
# Strip the whitespace from the beginning and end.
LRFDATE=`expr "$LRFDATE" : '[[:space:]]*\(.*\)[[:space:]]*$'`

#  From Peter Knowles' "booklistgen.sh" script
#+ for converting files to Sony Librie/PRS-50X format.
#  (http://booklistgensh.peterknowles.com)
Perl, sed, and awk have far superior string parsing facilities. A short sed or awk "subroutine" within a script (see Section 36.2) is an attractive alternative to expr. See Section 10.1 for more on using expr in string operations.
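As a rough illustration of that point -- a minimal sketch, not one of this book's examples -- the leading-digits extraction shown above with expr match can also be done in pure Bash with the [[ ... =~ ... ]] regex operator (Bash version 3 and later), avoiding the external expr call entirely.

#!/bin/bash
# Sketch only; assumes Bash 3.0+ for the =~ operator and BASH_REMATCH.

a=1234zipper43231

if [[ $a =~ ^([0-9]*) ]]
then
  b=${BASH_REMATCH[1]}   # Captured group: the digits at the beginning.
fi

echo "The digits at the beginning of \"$a\" are \"$b\"."   # 1234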
diff () {
    printf '%s' $(( $(date -u -d"$TARGET" +%s) - $(date -u -d"$CURRENT" +%s) ))
#   Difference, in seconds, between TARGET and CURRENT.
}
CURRENT=$(date -u -d '2007-09-01 17:30:24' '+%F %T.%N %Z')
TARGET=$(date -u -d'2007-12-25 12:30:00' '+%F %T.%N %Z')
# %F = full date, %T = %H:%M:%S, %N = nanoseconds, %Z = time zone.

printf '\nIn 2007, %s ' \
  "$(date -d"$CURRENT + $(( $(diff) /$MPHR /$MPHR /$HPD / 2 )) days" '+%d %B')"
#  %d = day of month, %B = name of month         ^ halfway
printf 'was halfway between %s ' "$(date -d"$CURRENT" '+%d %B')"
printf 'and %s\n' "$(date -d"$TARGET" '+%d %B')"

printf '\nOn %s at %s, there were\n' \
  $(date -u -d"$CURRENT" +%F) $(date -u -d"$CURRENT" +%T)
DAYS=$(( $(diff) / $MPHR / $MPHR / $HPD ))
CURRENT=$(date -d"$CURRENT +$DAYS days" '+%F %T.%N %Z')
HOURS=$(( $(diff) / $MPHR / $MPHR ))
CURRENT=$(date -d"$CURRENT +$HOURS hours" '+%F %T.%N %Z')
MINUTES=$(( $(diff) / $MPHR ))
CURRENT=$(date -d"$CURRENT +$MINUTES minutes" '+%F %T.%N %Z')
printf '%s days, %s hours, ' "$DAYS" "$HOURS"
printf '%s minutes, and %s seconds ' "$MINUTES" "$(diff)"
printf 'until Christmas Dinner!\n\n'

#  Exercise:
#  --------
#  Rewrite the diff () function to accept passed parameters,
The date command has quite a number of output options. For example %N gives the nanosecond portion of the current time. One interesting use for this is to generate random integers.
date +%N | sed -e 's/000$//' -e 's/^0//'
#          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
#  Strip off leading and trailing zeroes, if present.
#  Length of generated integer depends on
#+ how many zeroes stripped off.

# 115281032
# 63408725
# 394504284
# The 'TZ' parameter permits overriding the default time zone.
date                 # Mon Mar 28 21:42:16 MST 2005
TZ=EST date          # Mon Mar 28 23:42:16 EST 2005
# Thanks, Frank Kannemann and Pete Sjoberg, for the tip.
SixDaysAgo=$(date --date='6 days ago')
OneMonthAgo=$(date --date='1 month ago')
OneYearAgo=$(date --date='1 year ago')
See also Example 3-4 and Example A-43. zdump Time zone dump: echoes the time in a specified time zone.
bash$ zdump EST
EST  Tue Sep 18 22:09:22 2001 EST
time Outputs verbose timing statistics for executing a command. time ls -l / gives something like this:
real    0m0.067s
user    0m0.004s
sys     0m0.005s
See also the very similar times command in the previous section. As of version 2.0 of Bash, time became a shell reserved word, with slightly altered behavior in a pipeline. touch Utility for updating access/modification times of a file to current system time or other specified time, but also useful for creating a new file. The command touch zzz will create a new file of zero
length, named zzz, assuming that zzz did not previously exist. Time-stamping empty files in this way is useful for storing date information, for example in keeping track of modification times on a project. The touch command is equivalent to : >> newfile or >> newfile (for ordinary files). Before doing a cp -u (copy/update), use touch to update the time stamp of files you don't wish overwritten. As an example, if the directory /home/bozo/tax_audit contains the files spreadsheet-051606.data, spreadsheet-051706.data, and spreadsheet-051806.data, then doing a touch spreadsheet*.data will protect these files from being overwritten by files with the same names during a cp -u /home/bozo/financial_info/spreadsheet*data /home/bozo/tax_audit. at The at job control command executes a given set of commands at a specified time. Superficially, it resembles cron, however, at is chiefly useful for one-time execution of a command set. at 2pm January 15 prompts for a set of commands to execute at that time. These commands should be shell-script compatible, since, for all practical purposes, the user is typing in an executable shell script a line at a time. Input terminates with a Ctl-D. Using either the -f option or input redirection (<), at reads a command list from a file. This file is an executable shell script, though it should, of course, be non-interactive. Particularly clever is including the run-parts command in the file to execute a different set of scripts.
bash$ at 2:30 am Friday < at-jobs.list
job 2 at 2000-10-27 02:30
batch The batch job control command is similar to at, but it runs a command list when the system load drops below .8. Like at, it can read commands from a file with the -f option.
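A minimal sketch of that usage (the job-list filename here is hypothetical):

bash$ batch -f batch-jobs.list
#  Queues the commands in "batch-jobs.list" to run
#+ as soon as the system load average permits.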
The concept of batch processing dates back to the era of mainframe computers. It means running a set of commands without user intervention. cal Prints a neatly formatted monthly calendar to stdout. Will do current year or a large range of past and future years. sleep This is the shell equivalent of a wait loop. It pauses for a specified number of seconds, doing nothing. It can be useful for timing or in processes running in the background, checking for a specific event every so often (polling), as in Example 32-6.
sleep 3 # Pauses 3 seconds.
The sleep command defaults to seconds, but minutes, hours, or days may also be specified.
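A brief sketch of specifying other units, assuming the GNU coreutils version of sleep (which accepts s, m, h, and d suffixes):

sleep 3m    # Pauses 3 minutes.
sleep 2h    # Pauses 2 hours.
sleep 1d    # Pauses 1 day.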
The watch command may be a better choice than sleep for running commands at timed intervals. usleep Microsleep (the u may be read as the Greek mu, or micro- prefix). This is the same as sleep, above, but "sleeps" in microsecond intervals. It can be used for fine-grained timing, or for polling an ongoing process at very frequent intervals.
usleep 30 # Pauses 30 microseconds.
This command is part of the Red Hat initscripts / rc-scripts package. The usleep command does not provide particularly accurate timing, and is therefore unsuitable for critical timing loops. hwclock, clock The hwclock command accesses or adjusts the machine's hardware clock. Some options require root privileges. The /etc/rc.d/rc.sysinit startup file uses hwclock to set the system time from the hardware clock at bootup. The clock command is a synonym for hwclock.
The useful -c option to uniq prefixes each line of the input file with its number of occurrences.
bash$ cat testfile
This line occurs only once.
This line occurs twice.
This line occurs twice.
This line occurs three times.
This line occurs three times.
This line occurs three times.
bash$ uniq -c testfile
      1 This line occurs only once.
      2 This line occurs twice.
      3 This line occurs three times.
bash$ sort testfile | uniq -c | sort -nr
      3 This line occurs three times.
      2 This line occurs twice.
      1 This line occurs only once.
The sort INPUTFILE | uniq -c | sort -nr command string produces a frequency of occurrence listing on the INPUTFILE file (the -nr options to sort cause a reverse numerical sort). This template finds use in analysis of log files and dictionary lists, and wherever the lexical structure of a document needs to be examined.
# Check for input file on command-line.
ARGS=1
E_BADARGS=85
E_NOFILE=86

if [ $# -ne "$ARGS" ]  # Correct number of arguments passed to script?
then
  echo "Usage: `basename $0` filename"
  exit $E_BADARGS
fi

if [ ! -f "$1" ]       # Check if file exists.
then
  echo "File \"$1\" does not exist."
  exit $E_NOFILE
fi
########################################################
# main ()
sed -e 's/\.//g'  -e 's/\,//g' -e 's/ /\
/g' "$1" | tr 'A-Z' 'a-z' | sort | uniq -c | sort -nr
#                           =========================
#                            Frequency of occurrence

#  Filter out periods and commas, and
#+ change space between words to linefeed,
#+ then shift characters to lowercase, and
#+ finally prefix occurrence count and sort numerically.

#  Arun Giridhar suggests modifying the above to:
bash$ ./wf.sh testfile
      6 this
      6 occurs
      6 line
      3 times
      3 three
      2 twice
      1 only
      1 once
expand, unexpand The expand filter converts tabs to spaces. It is often used in a pipe. The unexpand filter converts spaces to tabs. This reverses the effect of expand. cut A tool for extracting fields from files. It is similar to the print $N command set in awk, but more limited. It may be simpler to use cut in a script than awk. Particularly important are the -d (delimiter) and -f (field specifier) options. Using cut to obtain a listing of the mounted filesystems:
cut -d ' ' -f1,2 /etc/mtab
Using cut to extract message headers from an e-mail folder:
bash$ grep '^Subject:' read-messages | cut -c10-80
Re: Linux suitable for mission-critical apps?
MAKE MILLIONS WORKING AT HOME!!!
Spam complaint
Re: Spam complaint
cut -d ' ' -f2,3 filename is equivalent to awk -F'[ ]' '{ print $2, $3 }' filename. It is even possible to specify a linefeed as a delimiter. The trick is to actually embed a linefeed (RETURN) in the command sequence.
bash$ cut -d'
' -f3,7,19 testfile
This is line 3 of testfile.
This is line 7 of testfile.
This is line 19 of testfile.
Thank you, Jaka Kranjc, for pointing this out. See also Example 16-48. paste Tool for merging together different files into a single, multi-column file. In combination with cut, useful for creating system log files. join Consider this a special-purpose cousin of paste. This powerful utility allows merging two files in a meaningful fashion, which essentially creates a simple version of a relational database. The join command operates on exactly two files, but pastes together only those lines with a common tagged field (usually a numerical label), and writes the result to stdout. The files to be joined should be sorted according to the tagged field for the matchups to work properly.
File: 1.data

100 Shoes
200 Laces
300 Socks

File: 2.data

100 $40.00
200 $1.00
300 $2.00
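A minimal sketch of joining these two files on their common first field; the output shown is what such a run would be expected to produce.

bash$ join 1.data 2.data
100 Shoes $40.00
200 Laces $1.00
300 Socks $2.00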
The tagged field appears only once in the output. head lists the beginning of a file to stdout. The default is 10 lines, but a different number can be specified. The command has a number of interesting options. Example 16-13. Which files are scripts?
#!/bin/bash
# script-detector.sh: Detects scripts within a directory.

TESTCHARS=2    # Test first 2 characters.
SHABANG='#!'   # Scripts begin with a "sha-bang."
for file in *  # Traverse all the files in current directory.
do
  if [[ `head -c$TESTCHARS "$file"` = "$SHABANG" ]]
  #      head -c2                       #!
  #  The '-c' option to "head" outputs a specified
  #+ number of characters, rather than lines (the default).
  then
    echo "File \"$file\" is a script."
  else
    echo "File \"$file\" is *not* a script."
  fi
done

exit 0

#  Exercises:
#  ---------
#  1) Modify this script to take as an optional argument
#+    the directory to scan for scripts
#+    (rather than just the current working directory).
#
#  2) As it stands, this script gives "false positives" for
#+    Perl, awk, and other scripting language scripts.
#     Correct this.
# =================================================================== #
# The author of this script explains the action of 'sed', as follows.

# head -c4 /dev/urandom | od -N4 -tu4 | sed -ne '1s/.* //p'
# ----------------------------------> |

# Assume output up to "sed" --------> |
# is 0000000 1198195154\n

#  sed begins reading characters: 0000000 1198195154\n.
#  Here it finds a newline character,
#+ so it is ready to process the first line (0000000 1198195154).
#  It looks at its <range><action>s. The first and only one is

#   range     action
#   1         s/.* //p

#  The line number is in the range, so it executes the action:
#+ tries to substitute the longest string ending with a space in the line
#  ("0000000 ") with nothing (//), and if it succeeds, prints the result
#  ("p" is a flag to the "s" command here, this is different
#+  from the "p" command).

#  sed is now ready to continue reading its input. (Note that before
#+ continuing, if -n option had not been passed, sed would have printed
#+ the line once again).

#  Now, sed reads the remainder of the characters, and finds the
#+ end of the file.
#  It is now ready to process its 2nd line (which is also numbered '$' as
#+ it's the last one).
#  It sees it is not matched by any <range>, so its job is done.

#  In a few words, this sed command means:
#  "On the first line only, remove any character up to the right-most space,
#+ then print it."

# A better way to do this would have been:
#           sed -e 's/.* //;q'

# Here, two <range><action>s (could have been written
#           sed -e 's/.* //' -e q):

#   range                    action
#   nothing (matches line)   s/.* //
#   nothing (matches line)   q (quit)
See also Example 16-39. tail lists the (tail) end of a file to stdout. The default is 10 lines, but this can be changed with the -n option. Commonly used to keep track of changes to a system logfile, using the -f option, which outputs lines appended to the file.
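A brief sketch of that logfile-monitoring usage (the logfile path is only an example):

tail -f /var/log/messages
# Continuously displays lines as they are appended to the logfile.
# Terminate with a Ctl-C.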
To list a specific line of a text file, pipe the output of head to tail -n 1. For example head -n 8 database.txt | tail -n 1 lists the 8th line of the file database.txt. To set a variable to a given block of a text file:
var=$(head -n $m $filename | tail -n $n)

# filename = name of file
# m = from beginning of file, number of lines to end of block
# n = number of lines to set variable to (trim from end of block)
Newer implementations of tail deprecate the older tail -$LINES filename usage. The standard tail -n $LINES filename is correct. See also Example 16-5, Example 16-39 and Example 32-6. grep A multi-purpose file search tool that uses Regular Expressions. It was originally a command/filter in the venerable ed line editor: g/re/p -- global - regular expression - print.

grep pattern [file...]
Search the target file(s) for occurrences of pattern, where pattern may be literal text or a Regular Expression.
bash$ grep '[rst]ystem.$' osinfo.txt
The GPL governs the distribution of the Linux operating system.
The -i option causes a case-insensitive search. The -w option matches only whole words. The -l option lists only the files in which matches were found, but not the matching lines. The -r (recursive) option searches files in the current working directory and all subdirectories below it. The -n option lists the matching lines, together with line numbers.
bash$ grep -n Linux osinfo.txt
2:This is a file containing information about Linux.
6:The GPL governs the distribution of the Linux operating system.
The -c (--count) option gives a numerical count of matches, rather than actually listing the matches.
grep -c txt *.sgml # (number of occurrences of "txt" in "*.sgml" files)
# grep -cz .
#            ^ dot
# means count (-c) zero-separated (-z) items matching "."
# that is, non-empty ones (containing at least 1 character).
#
printf 'a b\nc d\n\n\n\n\n\000\n\000e\000\000\nf' | grep -cz .     # 3
printf 'a b\nc d\n\n\n\n\n\000\n\000e\000\000\nf' | grep -cz '$'   # 5
printf 'a b\nc d\n\n\n\n\n\000\n\000e\000\000\nf' | grep -cz '^'   # 5
#
printf 'a b\nc d\n\n\n\n\n\000\n\000e\000\000\nf' | grep -c '$'    # 9
# By default, newline chars (\n) separate items to match.

# Note that the -z option is GNU "grep" specific.
# Thanks, S.C.
The --color (or --colour) option marks the matching string in color (on the console or in an xterm window). Since grep prints out each entire line containing the matching pattern, this lets you see exactly what is being matched. See also the -o option, which shows only the matching portion of the line(s).
Example 16-16. Printing out the From lines in stored e-mail messages
#!/bin/bash
# from.sh

#  Emulates the useful 'from' utility in Solaris, BSD, etc.
#  Echoes the "From" header line in all messages
#+ in your e-mail directory.
MAILDIR=~/mail/*               #  No quoting of variable. Why?
# Maybe check if-exists $MAILDIR:   if [ -d $MAILDIR ] . . .
GREP_OPTS="-H -A 5 --color"    #  Show file, plus extra context lines
                               #+ and display "From" in color.
TARGETSTR="^From"              #  "From" at beginning of line.

for file in $MAILDIR           #  No quoting of variable.
do
  grep $GREP_OPTS "$TARGETSTR" "$file"
  #    ^^^^^^^^^^              #  Again, do not quote this variable.
  echo
done

exit $?

#  You might wish to pipe the output of this script to 'more'
#+ or redirect it to a file . . .
When invoked with more than one target file given, grep specifies which file contains matches.
bash$ grep Linux osinfo.txt misc.txt
osinfo.txt:This is a file containing information about Linux.
osinfo.txt:The GPL governs the distribution of the Linux operating system.
misc.txt:The Linux operating system is steadily gaining in popularity.
To force grep to show the filename when searching only one target file, simply give /dev/null as the second file.
bash$ grep Linux osinfo.txt /dev/null
osinfo.txt:This is a file containing information about Linux.
osinfo.txt:The GPL governs the distribution of the Linux operating system.
If there is a successful match, grep returns an exit status of 0, which makes it useful in a condition test in a script, especially in combination with the -q option to suppress output.
SUCCESS=0                      # if grep lookup succeeds
word=Linux
filename=data.file

grep -q "$word" "$filename"    # "-q" suppresses output.
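Continuing from the fragment above, a minimal sketch of testing that exit status (the echo messages here are hypothetical):

if [ $? -eq $SUCCESS ]         # Exit status of the grep command, above.
then
  echo "\"$word\" found in \"$filename\"."
else
  echo "\"$word\" not found in \"$filename\"."
fi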
Example 32-6 demonstrates how to use grep to search for a word pattern in a system logfile.
How can grep search for two (or more) separate patterns? What if you want grep to display all lines in a file or files that contain both "pattern1" and "pattern2"? One method is to pipe the result of grep pattern1 to grep pattern2. For example, given the following file:
# Filename: tstfile

This is a sample file.
This is an ordinary text file.
This file does not contain any unusual text.
This file is not unusual.
Now, let's search this file for lines containing both "file" and "text" . . .
bash$ grep file tstfile
# Filename: tstfile
This is a sample file.
This is an ordinary text file.
This file does not contain any unusual text.
This file is not unusual.


bash$ grep file tstfile | grep text
This is an ordinary text file.
This file does not contain any unusual text.
E_NOPATT=71
DICT=/usr/share/dict/word.lst
#    ^^^^^^^^  Looks for word list here.
#    ASCII word list, one word per line.
#    If you happen to need an appropriate list,
#+   download the author's "yawl" word list package.
#    http://ibiblio.org/pub/Linux/libs/yawl-0.3.2.tar.gz
#    or
#    http://bash.deta.in/yawl-0.3.2.tar.gz
if [ -z "$1" ]    #  If no word pattern specified
then              #+ as a command-line argument . . .
  echo            #+ . . . then . . .
  echo "Usage:"   #+ Usage message.
  echo
  echo ""$0" \"pattern,\""
  echo "where \"pattern\" is in the form"
  echo "xxx..x.x..."
  echo
  echo "The x's represent known letters,"
  echo "and the periods are unknown letters (blanks)."
  echo "Letters and periods can be in any position."
  echo "For example, try:   sh cw-solver.sh w...i....n"
  echo
  exit $E_NOPATT
fi
exit $?  # Script terminates here.
#  If there are too many words generated,
#+ redirect the output to a file.

$ sh cw-solver.sh w...i....n

wellington
workingman
workingmen
egrep -- extended grep -- is the same as grep -E. This uses a somewhat different, extended set of Regular Expressions, which can make the search a bit more flexible. It also allows the boolean | (or) operator.
bash$ egrep 'matches|Matches' file.txt
Line 1 matches.
Line 3 Matches.
Line 4 contains matches, but also Matches
fgrep -- fast grep -- is the same as grep -F. It does a literal string search (no Regular Expressions), which generally speeds things up a bit. On some Linux distros, egrep and fgrep are symbolic links to, or aliases for grep, but invoked with the -E and -F options, respectively. Example 16-19. Looking up definitions in Webster's 1913 Dictionary
#!/bin/bash
# dict-lookup.sh

#  This script looks up definitions in the 1913 Webster's Dictionary.
#  This Public Domain dictionary is available for download
#+ from various sites, including
#+ Project Gutenberg (http://www.gutenberg.org/etext/247).
#
#  Convert it from DOS to UNIX format (with only LF at end of line)
#+ before using it with this script.
#  Store the file in plain, uncompressed ASCII text.
#  Set DEFAULT_DICTFILE variable below to path/filename.
E_BADARGS=85
MAXCONTEXTLINES=50                        # Maximum number of lines to show.
DEFAULT_DICTFILE="/usr/share/dict/webster1913-dict.txt"
                                          # Default dictionary file pathname.
if [[ -z $(echo "$1" | sed -n '/^[A-Z]/p') ]]
#  Must at least specify word to look up, and
#+ it must start with an uppercase letter.
then
  echo "Usage: `basename $0` Word-to-define [dictionary-file]"
  echo
  echo "Note: Word to look up must start with capital letter,"
  echo "with the rest of the word in lowercase."
  echo "--------------------------------------------"
  echo "Examples: Abandon, Dictionary, Marking, etc."
  exit $E_BADARGS
fi
# ---------------------------------------------------------
Definition=$(fgrep -A $MAXCONTEXTLINES "$1 \\" "$dictfile")
#                  Definitions in form "Word \..."
#
#  And, yes, "fgrep" is fast enough
#+ to search even a very large text file.
#  Now, snip out just the definition block.

echo "$Definition" |
sed -n '1,/^[A-Z]/p' |
#  Print from first line of output
#+ to the first line of the next entry.
sed '$d' | sed '$d'
#  Delete last two lines of output
#+ (blank line and first line of next entry).
# ---------------------------------------------------------

exit $?

#  Exercises:
#  ---------
#  1)  Modify the script to accept any type of alphabetic input
#+     (uppercase, lowercase, mixed case), and convert it
#+     to an acceptable format for processing.
#
#  2)  Convert the script to a GUI application,
#+     using something like 'gdialog' or 'zenity' . . .
See also Example A-41 for an example of speedy fgrep lookup on a large text file. agrep (approximate grep) extends the capabilities of grep to approximate matching. The search string may differ by a specified number of characters from the resulting matches. This utility is not part of the core Linux distribution.
To search compressed files, use zgrep, zegrep, or zfgrep. These also work on non-compressed files, though slower than plain grep, egrep, fgrep. They are handy for searching through a mixed set of files, some compressed, some not.
To search bzipped files, use bzgrep. look The command look works like grep, but does a lookup on a "dictionary," a sorted word list. By default, look searches for a match in /usr/dict/words, but a different dictionary file may be specified.
echo
echo "Testing file $file"
echo

while [ "$word" != end ]    # Last word in data file.
do                          # ^^^
  read word                 # From data file, because of redirection at end of loop.
  look $word > /dev/null    # Don't want to display lines in dictionary file.
  #  Searches for words in the file /usr/share/dict/words
  #+ (usually a link to linux.words).
  lookup=$?                 # Exit status of 'look' command.

  if [ "$lookup" -eq 0 ]
  then
    echo "\"$word\" is valid."
  else
    echo "\"$word\" is invalid."
  fi
done <"$file"               # Redirects stdin to $file, so "reads" come from there.

echo

exit 0
# Stephane Chazelas proposes the following, more concise alternative:

while read word && [[ $word != end ]]
do
  if look "$word" > /dev/null
  then
    echo "\"$word\" is valid."
  else
    echo "\"$word\" is invalid."
  fi
done <"$file"

exit 0
sed, awk Scripting languages especially suited for parsing text files and command output. May be embedded singly or in combination in pipes and shell scripts. sed Non-interactive "stream editor", permits using many ex commands in batch mode. It finds many uses in shell scripts. awk Programmable file extractor and formatter, good for manipulating and/or extracting fields (columns) in structured text files. Its syntax is similar to C. wc wc gives a "word count" on a file or I/O stream:
bash$ wc /usr/share/doc/sed-4.1.2/README
13  70  447 README
[13 lines  70 words  447 characters]
wc -w gives only the word count. wc -l gives only the line count. wc -c gives only the byte count. wc -m gives only the character count. wc -L gives only the length of the longest line. Using wc to count how many .txt files are in current working directory:
$ ls *.txt | wc -l
#  Will work as long as none of the "*.txt" files
#+ have a linefeed embedded in their name.

#  Alternative ways of doing this are:
#      find . -maxdepth 1 -name \*.txt -print0 | grep -cz .
#      (shopt -s nullglob; set -- *.txt; echo $#)
#
#  Thanks, S.C.
Using wc to total up the size of all the files whose names begin with letters in the range d - h
bash$ wc [d-h]* | grep total | awk '{print $3}'
71832
Using wc to count the instances of the word "Linux" in the main source file for this book.
bash$ grep Linux abs-book.sgml | wc -l
50
See also Example 16-39 and Example 20-8. Certain commands include some of the functionality of wc as options.
... | grep foo | wc -l
# This frequently used construct can be more concisely rendered.

... | grep -c foo
# Just use the "-c" (or "--count") option of grep.

# Thanks, S.C.
tr character translation filter. Must use quoting and/or brackets, as appropriate. Quotes prevent the shell from reinterpreting the special characters in tr command sequences. Brackets should be quoted to prevent expansion by the shell. Either tr "A-Z" "*" <filename or tr A-Z \* <filename changes all the uppercase letters in filename to asterisks (writes to stdout). On some systems this may not work, but tr A-Z '[**]' will.
The --squeeze-repeats (or -s) option deletes all but the first instance of a string of consecutive characters. This option is useful for removing excess whitespace.
bash$ echo "XXXXX" | tr --squeeze-repeats 'X'
X
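A brief sketch of the more common whitespace-squeezing use (the sample string is made up):

bash$ echo "too    many     spaces" | tr -s ' '
too many spaces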
The -c "complement" option inverts the character set to match. With this option, tr acts only upon those characters not matching the specified set.
bash$ echo "acfdeb123" | tr -c b-d +
+c+d+b++++
Example 16-21. toupper: Transforms a file to all uppercase.
exit 0

#  Exercise:
#  Rewrite this script to give the option of changing a file
#+ to *either* upper or lowercase.
#  Hint: Use either the "case" or "select" command.
for filename in *
do
  fname=`basename $filename`
  n=`echo $fname | tr A-Z a-z`
  if [ "$fname" != "$n" ]
  then
    mv $fname $n
  fi
done

exit $?
# Code below this line will not execute because of "exit".
#---------------------------------------------------------#
# To run it, delete script above line.

# The above script will not work on filenames containing blanks or newlines.
# Stephane Chazelas therefore suggests the following alternative:
for filename in *
tr -d $CR < $1 > $NEWFILENAME
# Delete CR's and write to new file.

echo "Original DOS text file is \"$1\"."
echo "Converted UNIX text file is \"$NEWFILENAME\"."

exit 0

# Exercise:
# --------
# Change the above script to convert from UNIX to DOS.
key=ETAOINSHRDLUBCFGJMQPVWZYXK
# The "key" is nothing more than a scrambled alphabet.
# Changing the "key" changes the encryption.

# The 'cat "$@"' construction gets input either from stdin or from files.
# If using stdin, terminate input with a Control-D.
# Otherwise, specify filename as command-line parameter.

cat "$@" | tr "a-z" "A-Z" | tr "A-Z" "$key"
#          | to uppercase |       encrypt
# Will work on lowercase, uppercase, or mixed-case quotes.
# Passes non-alphabetic characters through unchanged.
# Try this script with something like:
# "Nothing so needs reforming as other people's habits."
# --Mark Twain
#
# Output is:
# "CFPHRCS QF CIIOQ MINFMBRCS EQ FPHIM GIFGUI'Q HETRPQ."
# --BEML PZERC

#  This simple-minded cipher can be broken by an average 12-year old
#+ using only pencil and paper.

exit 0

#  Exercise:
#  --------
#  Modify the script so that it will either encrypt or decrypt,
#+ depending on command-line argument(s).
tr variants The tr utility has two historic variants. The BSD version does not use brackets (tr a-z A-Z), but the SysV one does (tr '[a-z]' '[A-Z]'). The GNU version of tr resembles the BSD one. fold A filter that wraps lines of input to a specified width. This is especially useful with the -s option, which breaks lines at word spaces (see Example 16-26 and Example A-1). fmt Simple-minded file formatter, used as a filter in a pipe to "wrap" long lines of text output.
See also Example 16-5. A powerful alternative to fmt is Kamil Toman's par utility, available from http://www.cs.berkeley.edu/~amc/Par/. col This deceptively named filter removes reverse line feeds from an input stream. It also attempts to replace whitespace with equivalent tabs. The chief use of col is in filtering the output from certain text processing utilities, such as groff and tbl. column Column formatter. This filter transforms list-type text output into a "pretty-printed" table by inserting tabs at appropriate places.
(printf "PERMISSIONS LINKS OWNER GROUP SIZE MONTH DAY HH:MM PROG-NAME\n" \
; ls -l | sed 1d) | column -t
#         ^^^^^^           ^^

#  The "sed 1d" in the pipe deletes the first line of output,
#+ which would be "total        N",
#+ where "N" is the total number of files found by "ls -l".

#  The -t option to "column" pretty-prints a table.
colrm Column removal filter. This removes columns (characters) from a file and writes the file, lacking the range of specified columns, back to stdout. colrm 2 4 <filename removes the second through fourth characters from each line of the text file filename. If the file contains tabs or nonprintable characters, this may cause unpredictable behavior. In such cases, consider using expand and unexpand in a pipe preceding colrm. nl Line numbering filter: nl filename lists filename to stdout, but inserts consecutive numbers at the beginning of each non-blank line. If filename is omitted, it operates on stdin. The output of nl is very similar to cat -b, since, by default, nl does not number blank lines.
nl `basename $0`

echo; echo

# Now, let's try it with 'cat -n'
cat -n `basename $0`
# The difference is that 'cat -n' numbers the blank lines.
# Note that 'nl -ba' will also do so.

exit 0
# -----------------------------------------------------------------
pr Print formatting filter. This will paginate files (or stdout) into sections suitable for hard copy printing or viewing on screen. Various options permit row and column manipulation, joining lines, setting margins, numbering lines, adding page headers, and merging files, among other things. The pr command combines much of the functionality of nl, paste, fold, column, and expand. pr -o 5 --width=65 fileZZZ | more gives a nice paginated listing to screen of fileZZZ with margins set at 5 and 65. A particularly useful option is -d, forcing double-spacing (same effect as sed -G). gettext The GNU gettext package is a set of utilities for localizing and translating the text output of programs into foreign languages. While originally intended for C programs, it now supports quite a number of programming and scripting languages. The gettext program works on shell scripts. See the info page.
msgfmt A program for generating binary message catalogs. It is used for localization. iconv A utility for converting file(s) to a different encoding (character set). Its chief use is for localization.
# Convert a string from UTF-8 to UTF-16 and print to the BookList
function write_utf8_string {
    STRING=$1
    BOOKLIST=$2
    echo -n "$STRING" | iconv -f UTF8 -t UTF16 | \
    cut -b 3- | tr -d \\n >> "$BOOKLIST"
}

#  From Peter Knowles' "booklistgen.sh" script
#+ for converting files to Sony Librie/PRS-50X format.
#  (http://booklistgensh.peterknowles.com)
recode Consider this a fancier version of iconv, above. This very versatile utility converts files to a different encoding scheme. Note that recode is not part of the standard Linux installation. TeX, gs TeX and Postscript are text markup languages used for preparing copy for printing or formatted video display. TeX is Donald Knuth's elaborate typesetting system. It is often convenient to write a shell script encapsulating all the options and arguments passed to one of these markup languages. Ghostscript (gs) is a GPL-ed Postscript interpreter. texexec Utility for processing TeX and pdf files. Found in /usr/bin on many Linux distros, it is actually a shell wrapper that calls Perl to invoke Tex.
texexec --pdfarrange --result=Concatenated.pdf *pdf
#  Concatenates all the pdf files in the current working directory
#+ into the merged file, Concatenated.pdf . . .
#  (The --pdfarrange option repaginates a pdf file. See also --pdfcombine.)
#  The above command-line could be parameterized and put into a shell script.
enscript Utility for converting a plain text file to PostScript. For example, enscript filename.txt -p filename.ps produces the PostScript output file filename.ps. groff, tbl, eqn Yet another text markup and display formatting language is groff. This is the enhanced GNU version of the venerable UNIX roff/troff display and typesetting package. Manpages use groff. The tbl table processing utility is considered part of groff, as its function is to convert table markup into groff commands. The eqn equation processing utility is likewise part of groff, and its function is to convert equation markup into groff commands.
See also Example A-39. lex, yacc The lex lexical analyzer produces programs for pattern matching. This has been replaced by the nonproprietary flex on Linux systems.
The yacc utility creates a parser based on a set of specifications. This has been replaced by the nonproprietary bison on Linux systems.
1. -c create (a new archive)
2. -x extract (files from existing archive)
3. --delete delete (files from existing archive) This option will not work on magnetic tape devices.
4. -r append (files to existing archive)
5. -A append (tar files to existing archive)
6. -t list (contents of existing archive)
7. -u update archive
8. -d compare archive with specified filesystem
9. --after-date only process files with a date stamp after specified date
10. -z gzip the archive (compress or uncompress, depending on whether it is combined with the -c or -x option)
11. -j bzip2 the archive

It may be difficult to recover data from a corrupted gzipped tar archive. When archiving important files, make multiple backups.

shar Shell archiving utility. The text files in a shell archive are concatenated without compression, and the resultant archive is essentially a shell script, complete with #!/bin/sh header, containing all the necessary unarchiving commands, as well as the files themselves. Shar archives still show up in Usenet newsgroups, but otherwise shar has been replaced by tar/gzip. The unshar command unpacks shar archives. The mailshar command is a Bash script that uses shar to concatenate multiple files into a single one for e-mailing. This script supports compression and uuencoding.

ar Creation and manipulation utility for archives, mainly used for binary object file libraries.

rpm The Red Hat Package Manager, or rpm utility provides a wrapper for source or binary archives. It includes commands for installing and checking the integrity of packages, among other things. A simple rpm -i package_name.rpm usually suffices to install a package, though there are many more options available. rpm -qf identifies which package a file originates from.
bash$ rpm -qf /bin/ls
coreutils-5.2.1-31
rpm -qa gives a complete list of all installed rpm packages on a given system. An rpm -qa package_name lists only the package(s) corresponding to package_name.
bash$ rpm -qa
redhat-logos-1.1.3-1
glibc-2.2.4-13
cracklib-2.7-12
dosfstools-2.7-1
bash$ rpm -qa docbook | grep docbook