Linux Advanced Bash-Scripting Guide
Advanced Bash-Scripting Guide
An in-depth exploration of the art of shell scripting
Version 6.2.01
12 May 2010
Mendel Cooper
This tutorial assumes no previous knowledge of scripting or programming, but progresses rapidly toward an
intermediate/advanced level of instruction . . . all the while sneaking in little nuggets of UNIX® wisdom and
lore. It serves as a textbook, a manual for self-study, and a reference and source of knowledge on shell
scripting techniques. The exercises and heavily-commented examples invite active reader participation, under
the premise that the only way to really learn scripting is to write scripts.
This book is suitable for classroom use as a general introduction to programming concepts.
Dedication
For Anita, the source of all the magic
Table of Contents
Part 1. Introduction
1. Shell Programming!
2. Starting Off With a Sha-Bang
2.1. Invoking the script
2.2. Preliminary Exercises
Part 2. Basics
3. Special Characters
4. Introduction to Variables and Parameters
4.1. Variable Substitution
4.2. Variable Assignment
4.3. Bash Variables Are Untyped
4.4. Special Variable Types
5. Quoting
5.1. Quoting Variables
5.2. Escaping
6. Exit and Exit Status
7. Tests
7.1. Test Constructs
7.2. File test operators
7.3. Other Comparison Operators
7.4. Nested if/then Condition Tests
7.5. Testing Your Knowledge of Tests
8. Operations and Related Topics
8.1. Operators
8.2. Numerical Constants
8.3. The Double-Parentheses Construct
8.4. Operator Precedence
Part 3. Beyond the Basics
9. Another Look at Variables
9.1. Internal Variables
9.2. Typing variables: declare or typeset
9.3. $RANDOM: generate random integer
10. Manipulating Variables
10.1. Manipulating Strings
10.2. Parameter Substitution
11. Loops and Branches
11.1. Loops
11.2. Nested Loops
11.3. Loop Control
11.4. Testing and Branching
12. Command Substitution
13. Arithmetic Expansion
14. Recess Time
Part 4. Commands
15. Internal Commands and Builtins
15.1. Job Control Commands
16. External Filters, Programs and Commands
16.1. Basic Commands
16.2. Complex Commands
16.3. Time / Date Commands
16.4. Text Processing Commands
16.5. File and Archiving Commands
16.6. Communications Commands
16.7. Terminal Control Commands
16.8. Math Commands
16.9. Miscellaneous Commands
17. System and Administrative Commands
17.1. Analyzing a System Script
Part 5. Advanced Topics
18. Regular Expressions
18.1. A Brief Introduction to Regular Expressions
18.2. Globbing
19. Here Documents
19.1. Here Strings
20. I/O Redirection
20.1. Using exec
20.2. Redirecting Code Blocks
20.3. Applications
21. Subshells
22. Restricted Shells
23. Process Substitution
24. Functions
24.1. Complex Functions and Function Complexities
24.2. Local Variables
24.3. Recursion Without Local Variables
25. Aliases
26. List Constructs
27. Arrays
28. Indirect References
29. /dev and /proc
29.1. /dev
29.2. /proc
30. Of Zeros and Nulls
31. Debugging
32. Options
33. Gotchas
34. Scripting With Style
34.1. Unofficial Shell Scripting Stylesheet
35. Miscellany
35.1. Interactive and non-interactive shells and scripts
35.2. Shell Wrappers
35.3. Tests and Comparisons: Alternatives
35.4. A script calling itself (recursion)
35.5. "Colorizing" Scripts
35.6. Optimizations
35.7. Assorted Tips
35.8. Security Issues
35.9. Portability Issues
35.10. Shell Scripting Under Windows
36. Bash, versions 2, 3, and 4
36.1. Bash, version 2
36.2. Bash, version 3
36.3. Bash, version 4
37. Endnotes
37.1. Author's Note
37.2. About the Author
37.3. Where to Go For Help
37.4. Tools Used to Produce This Book
37.4.1. Hardware
37.4.2. Software and Printware
37.5. Credits
37.6. Disclaimer
Bibliography
A. Contributed Scripts
B. Reference Cards
C. A Sed and Awk Micro-Primer
C.1. Sed
C.2. Awk
D. Exit Codes With Special Meanings
E. A Detailed Introduction to I/O and I/O Redirection
F. Command-Line Options
F.1. Standard Command-Line Options
F.2. Bash Command-Line Options
G. Important Files
H. Important System Directories
I. An Introduction to Programmable Completion
J. Localization
K. History Commands
L. A Sample .bashrc File
M. Converting DOS Batch Files to Shell Scripts
N. Exercises
N.1. Analyzing Scripts
N.2. Writing Scripts
O. Revision History
P. Download and Mirror Sites
Q. To Do List
R. Copyright
S. ASCII Table
Index
List of Tables
8-1. Operator Precedence
15-1. Job identifiers
32-1. Bash options
35-1. Numbers representing colors in Escape Sequences
B-1. Special Shell Variables
B-2. TEST Operators: Binary Comparison
B-3. TEST Operators: Files
B-4. Parameter Substitution and Expansion
B-5. String Operations
B-6. Miscellaneous Constructs
C-1. Basic sed operators
C-2. Examples of sed operators
D-1. Reserved Exit Codes
M-1. Batch file keywords / variables / operators, and their shell equivalents
M-2. DOS commands and their UNIX equivalents
O-1. Revision History
List of Examples
2-1. cleanup: A script to clean up the log files in /var/log
2-2. cleanup: An improved clean-up script
2-3. cleanup: An enhanced and generalized version of above scripts.
3-1. Code blocks and I/O redirection
3-2. Saving the output of a code block to a file
3-3. Running a loop in the background
3-4. Backup of all files changed in last day
4-1. Variable assignment and substitution
4-2. Plain Variable Assignment
4-3. Variable Assignment, plain and fancy
4-4. Integer or string?
4-5. Positional Parameters
4-6. wh, whois domain name lookup
4-7. Using shift
5-1. Echoing Weird Variables
5-2. Escaped Characters
6-1. exit / exit status
6-2. Negating a condition using !
7-1. What is truth?
7-2. Equivalence of test, /usr/bin/test, [ ], and /usr/bin/[
7-3. Arithmetic Tests using (( ))
7-4. Testing for broken links
7-5. Arithmetic and string comparisons
7-6. Testing whether a string is null
7-7. zmore
8-1. Greatest common divisor
8-2. Using Arithmetic Operations
8-3. Compound Condition Tests Using && and ||
8-4. Representation of numerical constants
8-5. C-style manipulation of variables
9-1. $IFS and whitespace
9-2. Timed Input
9-3. Once more, timed input
9-4. Timed read
9-5. Am I root?
9-6. arglist: Listing arguments with $* and $@
9-7. Inconsistent $* and $@ behavior
9-8. $* and $@ when $IFS is empty
9-9. Underscore variable
9-10. Using declare to type variables
9-11. Generating random numbers
9-12. Picking a random card from a deck
9-13. Brownian Motion Simulation
9-14. Random between values
9-15. Rolling a single die with RANDOM
9-16. Reseeding RANDOM
9-17. Pseudorandom numbers, using awk
10-1. Inserting a blank line between paragraphs in a text file
10-2. Generating an 8-character "random" string
10-3. Converting graphic file formats, with filename change
10-4. Converting streaming audio files to ogg
10-5. Emulating getopt
10-6. Alternate ways of extracting and locating substrings
10-7. Using parameter substitution and error messages
10-8. Parameter substitution and "usage" messages
10-9. Length of a variable
10-10. Pattern matching in parameter substitution
10-11. Renaming file extensions:
10-12. Using pattern matching to parse arbitrary strings
10-13. Matching patterns at prefix or suffix of string
11-1. Simple for loops
11-2. for loop with two parameters in each [list] element
11-3. Fileinfo: operating on a file list contained in a variable
11-4. Operating on files with a for loop
11-5. Missing in [list] in a for loop
11-6. Generating the [list] in a for loop with command substitution
11-7. A grep replacement for binary files
11-8. Listing all users on the system
11-9. Checking all the binaries in a directory for authorship
11-10. Listing the symbolic links in a directory
11-11. Symbolic links in a directory, saved to a file
11-12. A C-style for loop
11-13. Using efax in batch mode
11-14. Simple while loop
11-15. Another while loop
11-16. while loop with multiple conditions
11-17. C-style syntax in a while loop
11-18. until loop
11-19. Nested Loop
11-20. Effects of break and continue in a loop
11-21. Breaking out of multiple loop levels
11-22. Continuing at a higher loop level
11-23. Using continue N in an actual task
11-24. Using case
11-25. Creating menus using case
11-26. Using command substitution to generate the case variable
11-27. Simple string matching
11-28. Checking for alphabetic input
11-29. Creating menus using select
11-30. Creating menus using select in a function
12-1. Stupid script tricks
12-2. Generating a variable from a loop
12-3. Finding anagrams
15-1. A script that spawns multiple instances of itself
15-2. printf in action
15-3. Variable assignment, using read
15-4. What happens when read has no variable
15-5. Multi-line input to read
15-6. Detecting the arrow keys
15-7. Using read with file redirection
15-8. Problems reading from a pipe
15-9. Changing the current working directory
15-10. Letting let do arithmetic.
15-11. Showing the effect of eval
15-12. Using eval to select among variables
15-13. Echoing the command-line parameters
15-14. Forcing a log-off
15-15. A version of rot13
15-16. Using set with positional parameters
15-17. Reversing the positional parameters
15-18. Reassigning the positional parameters
15-19. "Unsetting" a variable
15-20. Using export to pass a variable to an embedded awk script
15-21. Using getopts to read the options/arguments passed to a script
15-22. "Including" a data file
15-23. A (useless) script that sources itself
15-24. Effects of exec
15-25. A script that exec's itself
15-26. Waiting for a process to finish before proceeding
15-27. A script that kills itself
16-1. Using ls to create a table of contents for burning a CDR disk
16-2. Hello or Good-bye
16-3. Badname, eliminate file names in current directory containing bad characters and whitespace.
16-4. Deleting a file by its inode number
16-5. Logfile: Using xargs to monitor system log
16-6. Copying files in current directory to another
16-7. Killing processes by name
16-8. Word frequency analysis using xargs
16-9. Using expr
16-10. Using date
16-11. Date calculations
16-12. Word Frequency Analysis
16-13. Which files are scripts?
16-14. Generating 10-digit random numbers
16-15. Using tail to monitor the system log
16-16. Printing out the From lines in stored e-mail messages
16-17. Emulating grep in a script
16-18. Crossword puzzle solver
16-19. Looking up definitions in Webster's 1913 Dictionary
16-20. Checking words in a list for validity
16-21. toupper: Transforms a file to all uppercase.
16-22. lowercase: Changes all filenames in working directory to lowercase.
16-23. du: DOS to UNIX text file conversion.
16-24. rot13: ultra-weak encryption.
16-25. Generating "Crypto-Quote" Puzzles
16-26. Formatted file listing.
16-27. Using column to format a directory listing
16-28. nl: A self-numbering script.
16-29. manview: Viewing formatted manpages
16-30. Using cpio to move a directory tree
16-31. Unpacking an rpm archive
16-32. Stripping comments from C program files
16-33. Exploring /usr/X11R6/bin
16-34. An "improved" strings command
16-35. Using cmp to compare two files within a script.
16-36. basename and dirname
16-37. A script that copies itself in sections
16-38. Checking file integrity
16-39. Uudecoding encoded files
16-40. Finding out where to report a spammer
16-41. Analyzing a spam domain
16-42. Getting a stock quote
16-43. Updating FC4
16-44. Using ssh
16-45. A script that mails itself
16-46. Generating prime numbers
16-47. Monthly Payment on a Mortgage
16-48. Base Conversion
16-49. Invoking bc using a here document
16-50. Calculating PI
16-51. Converting a decimal number to hexadecimal
16-52. Factoring
16-53. Calculating the hypotenuse of a triangle
16-54. Using seq to generate loop arguments
16-55. Letter Count"
16-56. Using getopt to parse command-line options
16-57. A script that copies itself
16-58. Exercising dd
16-59. Capturing Keystrokes
16-60. Securely deleting a file
16-61. Filename generator
16-62. Converting meters to miles
16-63. Using m4
17-1. Setting a new password
17-2. Setting an erase character
17-3. secret password: Turning off terminal echoing
17-4. Keypress detection
17-5. Checking a remote server for identd
17-6. pidof helps kill a process
17-7. Checking a CD image
17-8. Creating a filesystem in a file
17-9. Adding a new hard drive
17-10. Using umask to hide an output file from prying eyes
17-11. killall, from /etc/rc.d/init.d
19-1. broadcast: Sends message to everyone logged in
19-2. dummyfile: Creates a 2-line dummy file
19-3. Multi-line message using cat
19-4. Multi-line message, with tabs suppressed
19-5. Here document with replaceable parameters
19-6. Upload a file pair to Sunsite incoming directory
19-7. Parameter substitution turned off
19-8. A script that generates another script
19-9. Here documents and functions
19-10. "Anonymous" Here Document
19-11. Commenting out a block of code
19-12. A self-documenting script
19-13. Prepending a line to a file
19-14. Parsing a mailbox
20-1. Redirecting stdin using exec
20-2. Redirecting stdout using exec
20-3. Redirecting both stdin and stdout in the same script with exec
20-4. Avoiding a subshell
20-5. Redirected while loop
20-6. Alternate form of redirected while loop
20-7. Redirected until loop
20-8. Redirected for loop
20-9. Redirected for loop (both stdin and stdout redirected)
20-10. Redirected if/then test
20-11. Data file names.data for above examples
20-12. Logging events
21-1. Variable scope in a subshell
21-2. List User Profiles
21-3. Running parallel processes in subshells
22-1. Running a script in restricted mode
23-1. Code block redirection without forking
24-1. Simple functions
24-2. Function Taking Parameters
24-3. Functions and command-line args passed to the script
24-4. Passing an indirect reference to a function
24-5. Dereferencing a parameter passed to a function
24-6. Again, dereferencing a parameter passed to a function
24-7. Maximum of two numbers
24-8. Converting numbers to Roman numerals
24-9. Testing large return values in a function
24-10. Comparing two large integers
24-11. Real name from username
24-12. Local variable visibility
24-13. Demonstration of a simple recursive function
24-14. Another simple demonstration
24-15. Recursion, using a local variable
24-16. The Fibonacci Sequence
24-17. The Towers of Hanoi
25-1. Aliases within a script
25-2. unalias: Setting and unsetting an alias
26-1. Using an and list to test for command-line arguments
26-2. Another command-line arg test using an and list
26-3. Using or lists in combination with an and list
27-1. Simple array usage
27-2. Formatting a poem
27-3. Various array operations
27-4. String operations on arrays
27-5. Loading the contents of a script into an array
27-6. Some special properties of arrays
27-7. Of empty arrays and empty elements
27-8. Initializing arrays
27-9. Copying and concatenating arrays
27-10. More on concatenating arrays
27-11. The Bubble Sort
27-12. Embedded arrays and indirect references
27-13. The Sieve of Eratosthenes
27-14. The Sieve of Eratosthenes, Optimized
27-15. Emulating a push-down stack
27-16. Complex array application: Exploring a weird mathematical series
27-17. Simulating a two-dimensional array, then tilting it
28-1. Indirect Variable References
28-2. Passing an indirect reference to awk
29-1. Using /dev/tcp for troubleshooting
29-2. Playing music
29-3. Finding the process associated with a PID
29-4. On-line connect status
30-1. Hiding the cookie jar
30-2. Setting up a swapfile using /dev/zero
30-3. Creating a ramdisk
31-1. A buggy script
31-2. Missing keyword
31-3. test24: another buggy script
31-4. Testing a condition with an assert
31-5. Trapping at exit
31-6. Cleaning up after Control-C
31-7. Tracing a variable
31-8. Running multiple processes (on an SMP box)
33-1. Numerical and string comparison are not equivalent
33-2. Subshell Pitfalls
33-3. Piping the output of echo to a read
35-1. shell wrapper
35-2. A slightly more complex shell wrapper
35-3. A generic shell wrapper that writes to a logfile
35-4. A shell wrapper around an awk script
35-5. A shell wrapper around another awk script
35-6. Perl embedded in a Bash script
35-7. Bash and Perl scripts combined
35-8. A (useless) script that recursively calls itself
35-9. A (useful) script that recursively calls itself
35-10. Another (useful) script that recursively calls itself
35-11. A "colorized" address database
35-12. Drawing a box
35-13. Echoing colored text
35-14. A "horserace" game
35-15. A Progress Bar
35-16. Return value trickery
35-17. Even more return value trickery
35-18. Passing and returning arrays
35-19. Fun with anagrams
35-20. Widgets invoked from a shell script
35-21. Test Suite
36-1. String expansion
36-2. Indirect variable references - the new way
36-3. Simple database application, using indirect variable referencing
36-4. Using arrays and other miscellaneous trickery to deal four random hands from a deck of cards
36-5. A simple address database
36-6. A somewhat more elaborate address database
36-7. Testing characters
A-1. mailformat: Formatting an e-mail message
A-2. rn: A simple-minded file renaming utility
A-3. blank-rename: Renames filenames containing blanks
A-4. encryptedpw: Uploading to an ftp site, using a locally encrypted password
A-5. copy-cd: Copying a data CD
A-6. Collatz series
A-7. days-between: Days between two dates
A-8. Making a dictionary
A-9. Soundex conversion
A-10. Game of Life
A-11. Data file for Game of Life
A-12. behead: Removing mail and news message headers
A-13. password: Generating random 8-character passwords
A-14. fifo: Making daily backups, using named pipes
A-15. Generating prime numbers using the modulo operator
A-16. tree: Displaying a directory tree
A-17. tree2: Alternate directory tree script
A-18. string functions: C-style string functions
A-19. Directory information
A-20. Library of hash functions
A-21. Colorizing text using hash functions
A-22. More on hash functions
A-23. Mounting USB keychain storage devices
A-24. Converting to HTML
A-25. Preserving weblogs
A-26. Protecting literal strings
A-27. Unprotecting literal strings
A-28. Spammer Identification
A-29. Spammer Hunt
A-30. Making wget easier to use
A-31. A podcasting script
A-32. Nightly backup to a firewire HD
A-33. An expanded cd command
A-34. A soundcard setup script
A-35. Locating split paragraphs in a text file
A-36. Insertion sort
A-37. Standard Deviation
A-38. A pad file generator for shareware authors
A-39. A man page editor
A-40. Petals Around the Rose
A-41. Quacky: a Perquackey-type word game
A-42. Nim
A-43. A command-line stopwatch
A-44. An all-purpose shell scripting homework assignment solution
A-45. The Knight's Tour
A-46. Magic Squares
A-47. Fifteen Puzzle
A-48. The Towers of Hanoi, graphic version
A-49. The Towers of Hanoi, alternate graphic version
A-50. An alternate version of the getopt-simple.sh script
A-51. The version of the UseGetOpt.sh example used in the Tab Expansion appendix
A-52. Cycling through all the possible color backgrounds
A-53. Basics Reviewed
C-1. Counting Letter Occurrences
I-1. Completion script for UseGetOpt.sh
L-1. Sample .bashrc file
M-1. VIEWDATA.BAT: DOS Batch File
M-2. viewdata.sh: Shell Script Conversion of VIEWDATA.BAT
Q-1. Print the server environment
S-1. A script that generates an ASCII table
Part 1. Introduction
The shell is a command interpreter. More than just the insulating layer between the operating system kernel
and the user, it's also a fairly powerful programming language. A shell program, called a script, is an
easy-to-use tool for building applications by "gluing together" system calls, tools, utilities, and compiled
binaries. Virtually the entire repertoire of UNIX commands, utilities, and tools is available for invocation by a
shell script. If that were not enough, internal shell commands, such as testing and loop constructs, lend
additional power and flexibility to scripts. Shell scripts are especially well suited for administrative system
tasks and other routine repetitive tasks not requiring the bells and whistles of a full-blown tightly structured
programming language.
Table of Contents
1. Shell Programming!
2. Starting Off With a Sha-Bang
1. Shell Programming!
A working knowledge of shell scripting is essential to anyone wishing to become reasonably proficient at
system administration, even if they do not anticipate ever having to actually write a script. Consider that as a
Linux machine boots up, it executes the shell scripts in /etc/rc.d to restore the system configuration and
set up services. A detailed understanding of these startup scripts is important for analyzing the behavior of a
system, and possibly modifying it.
The craft of scripting is not hard to master, since the scripts can be built in bite-sized sections and there is only
a fairly small set of shell-specific operators and options [1] to learn. The syntax is simple and straightforward,
similar to that of invoking and chaining together utilities at the command line, and there are only a few "rules"
governing their use. Most short scripts work right the first time, and debugging even the longer ones is
straightforward.
In the 1970s, the BASIC language enabled anyone reasonably computer proficient
to write programs on an early generation of microcomputers. Decades later, the Bash
scripting language enables anyone with a rudimentary knowledge of Linux or UNIX to do the same
on much more powerful machines.
A shell script is a quick-and-dirty method of prototyping a complex application. Getting even a limited subset
of the functionality to work in a script is often a useful first stage in project development. This way, the
structure of the application can be tested and played with, and the major pitfalls found before proceeding to
the final coding in C, C++, Java, Perl, or Python.
Shell scripting hearkens back to the classic UNIX philosophy of breaking complex projects into simpler
subtasks, of chaining together components and utilities. Many consider this a better, or at least more
esthetically pleasing approach to problem solving than using one of the new generation of high powered
all-in-one languages, such as Perl, which attempt to be all things to all people, but at the cost of forcing you to
alter your thinking processes to fit the tool.
According to Herbert Mayer, "a useful language needs arrays, pointers, and a generic mechanism for building
data structures." By these criteria, shell scripting falls somewhat short of being "useful." Or, perhaps not. . . .
• Resource-intensive tasks, especially where speed is a factor (sorting, hashing, recursion [2] ...)
• Procedures involving heavy-duty math operations, especially floating point arithmetic, arbitrary
precision calculations, or complex numbers (use C++ or FORTRAN instead)
• Cross-platform portability required (use C or Java instead)
• Complex applications, where structured programming is a necessity (type-checking of variables,
function prototypes, etc.)
• Mission-critical applications upon which you are betting the future of the company
• Situations where security is important, where you need to guarantee the integrity of your system and
protect against intrusion, cracking, and vandalism
• Project consists of subcomponents with interlocking dependencies
• Extensive file operations required (Bash is limited to serial file access, and that only in a
particularly clumsy and inefficient line-by-line fashion.)
• Need native support for multi-dimensional arrays
• Need data structures, such as linked lists or trees
• Need to generate / manipulate graphics or GUIs
• Need direct access to system hardware
• Need port or socket I/O
• Need to use libraries or interface with legacy code
• Proprietary, closed-source applications (Shell scripts put the source code right out in the open for all
the world to see.)
If any of the above applies, consider a more powerful scripting language -- perhaps Perl, Tcl, Python, Ruby
-- or possibly a compiled language such as C, C++, or Java. Even then, prototyping the application as a
shell script might still be a useful development step.
We will be using Bash, an acronym for "Bourne-Again shell" and a pun on Stephen Bourne's now classic
Bourne shell. Bash has become a de facto standard for shell scripting on most flavors of UNIX. Most of the
principles this book covers apply equally well to scripting with other shells, such as the Korn Shell, from
which Bash derives some of its features, [3] and the C Shell and its variants. (Note that C Shell programming
is not recommended due to certain inherent problems, as pointed out in an October, 1993 Usenet post by Tom
Christiansen.)
What follows is a tutorial on shell scripting. It relies heavily on examples to illustrate various features of the
shell. The example scripts work -- they've been tested, insofar as was possible -- and some of them are even
useful in real life. The reader can play with the actual working code of the examples in the source archive
(scriptname.sh or scriptname.bash), [4] give them execute permission (chmod u+rx
scriptname), then run them to see what happens. Should the source archive not be available, then
cut-and-paste from the HTML or pdf rendered versions. Be aware that some of the scripts presented here
introduce features before they are explained, and this may require the reader to temporarily skip ahead for
enlightenment.
Unless otherwise noted, the author of this book wrote the example scripts that follow.
2. Starting Off With a Sha-Bang
In the simplest case, a script is nothing more than a list of system commands stored in a file. At the very least,
this saves the effort of retyping that particular sequence of commands each time it is invoked.
1 # Cleanup
2 # Run as root, of course.
3
4 cd /var/log
5 cat /dev/null > messages
6 cat /dev/null > wtmp
7 echo "Logs cleaned up."
There is nothing unusual here, only a set of commands that could just as easily have been invoked one by one
from the command-line on the console or in a terminal window. The advantages of placing the commands in a
script go far beyond not having to retype them time and again. The script becomes a program -- a tool -- and it
can easily be modified or customized for a particular application.
1 #!/bin/bash
2 # Proper header for a Bash script.
3
4 # Cleanup, version 2
5
6 # Run as root, of course.
7 # Insert code here to print error message and exit if not root.
8
9 LOG_DIR=/var/log
10 # Variables are better than hard-coded values.
11 cd $LOG_DIR
12
13 cat /dev/null > messages
14 cat /dev/null > wtmp
15
16
17 echo "Logs cleaned up."
18
19 exit # The right and proper method of "exiting" from a script.
Now that's beginning to look like a real script. But we can go even farther . . .
1 #!/bin/bash
2 # Cleanup, version 3
3
4 # Warning:
5 # -------
6 # This script uses quite a number of features that will be explained
7 #+ later on.
8 # By the time you've finished the first half of the book,
9 #+ there should be nothing mysterious about it.
10
11
12
13 LOG_DIR=/var/log
14 ROOT_UID=0 # Only users with $UID 0 have root privileges.
15 LINES=50 # Default number of lines saved.
16 E_XCD=86 # Can't change directory?
17 E_NOTROOT=87 # Non-root exit error.
18
19
20 # Run as root, of course.
21 if [ "$UID" -ne "$ROOT_UID" ]
22 then
23 echo "Must be root to run this script."
24 exit $E_NOTROOT
25 fi
26
27 if [ -n "$1" ]
28 # Test whether command-line argument is present (non-empty).
29 then
30 lines=$1
31 else
32 lines=$LINES # Default, if not specified on command-line.
33 fi
34
35
36 # Stephane Chazelas suggests the following,
37 #+ as a better way of checking command-line arguments,
38 #+ but this is still a bit advanced for this stage of the tutorial.
39 #
40 # E_WRONGARGS=85 # Non-numerical argument (bad argument format).
41 #
42 # case "$1" in
43 # "" ) lines=50;;
44 # *[!0-9]*) echo "Usage: `basename $0` file-to-cleanup"; exit $E_WRONGARGS;;
45 # * ) lines=$1;;
46 # esac
47 #
48 #* Skip ahead to "Loops" chapter to decipher all this.
49
50
51 cd $LOG_DIR
52
53 if [ `pwd` != "$LOG_DIR" ] # or if [ "$PWD" != "$LOG_DIR" ]
54 # Not in /var/log?
55 then
56 echo "Can't change to $LOG_DIR."
57 exit $E_XCD
58 fi # Doublecheck if in right directory before messing with log file.
59
60 # Far more efficient is:
61 #
62 # cd /var/log || {
63 # echo "Cannot change to necessary directory." >&2
64 # exit $E_XCD;
65 # }
66
67
68
69
70 tail -n $lines messages > mesg.temp # Save last section of message log file.
71 mv mesg.temp messages # Becomes the new log file.
72
73
74 # cat /dev/null > messages
75 #* No longer needed, as the above method is safer.
76
77 cat /dev/null > wtmp # ': > wtmp' and '> wtmp' have the same effect.
78 echo "Logs cleaned up."
79
80 exit 0
81 # A zero return value from the script upon exit indicates success
82 #+ to the shell.
Since you may not wish to wipe out the entire system log, this version of the script keeps the last section of
the message log intact. You will constantly discover ways of fine-tuning previously written scripts for
increased effectiveness.
***
The sha-bang ( #!) [1] at the head of a script tells your system that this file is a set of commands to be fed to
the command interpreter indicated. The #! is actually a two-byte [2] magic number, a special marker that
designates a file type, or in this case an executable shell script (type man magic for more details on this
fascinating topic). Immediately following the sha-bang is a path name. This is the path to the program that
interprets the commands in the script, whether it be a shell, a programming language, or a utility. This
command interpreter then executes the commands in the script, starting at the top (the line following the
sha-bang line), and ignoring comments. [3]
1 #!/bin/sh
2 #!/bin/bash
3 #!/usr/bin/perl
4 #!/usr/bin/tcl
5 #!/bin/sed -f
6 #!/bin/awk -f
Each of the above script header lines calls a different command interpreter, be it /bin/sh, the default shell
(bash in a Linux system) or otherwise. [4] Using #!/bin/sh, the default Bourne shell in most commercial
variants of UNIX, makes the script portable to non-Linux machines, though you sacrifice Bash-specific
features. The script will, however, conform to the POSIX [5] sh standard.
Note that the path given at the "sha-bang" must be correct, otherwise an error message -- usually "Command
not found." -- will be the only result of running the script. [6]
#! can be omitted if the script consists only of a set of generic system commands, using no internal shell
directives. The second example, above, requires the initial #!, since the variable assignment line, lines=50,
uses a shell-specific construct. [7] Note again that #!/bin/sh invokes the default shell interpreter, which
defaults to /bin/bash on a Linux machine.
This tutorial encourages a modular approach to constructing a script. Make note of and collect
"boilerplate" code snippets that might be useful in future scripts. Eventually you will build quite an
extensive library of nifty routines. As an example, the following script prolog tests whether the script has
been invoked with the correct number of parameters.
E_WRONG_ARGS=85
Number_of_expected_args=4       # Set this to however many arguments the script expects.
script_parameters="-a -h -m -z"
#                  -a = all, -h = help, etc.

if [ $# -ne $Number_of_expected_args ]
then
  echo "Usage: `basename $0` $script_parameters"
  # `basename $0` is the script's filename.
  exit $E_WRONG_ARGS
fi
Many times, you will write a script that carries out one particular task. The first script in this chapter is
an example. Later, it might occur to you to generalize the script to do other, similar tasks. Replacing the
literal ("hard-wired") constants by variables is a step in that direction, as is replacing repetitive code
blocks by functions.
2.1. Invoking the script
Having written the script, you can invoke it by sh scriptname, [8] or alternatively bash scriptname.
(Not recommended is using sh <scriptname, since this effectively disables reading from stdin within
the script.) Much more convenient is to make the script itself directly executable with a chmod.
Either:
chmod 555 scriptname (gives everyone read/execute permission) [9]
or
chmod +rx scriptname (gives everyone read/execute permission)
chmod u+rx scriptname (gives only the script owner read/execute permission)
Having made the script executable, you may now test it by ./scriptname. [10] If it begins with a
"sha-bang" line, invoking the script calls the correct command interpreter to run it.
As a final step, after testing and debugging, you would likely want to move it to /usr/local/bin (as root,
of course), to make the script available to yourself and all other users as a systemwide executable. The script
could then be invoked by simply typing scriptname [ENTER] from the command-line.
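As a minimal sketch of that sequence (scriptname here stands for whatever your script is actually called):

chmod u+rx scriptname          # Make the script executable for its owner.
./scriptname                   # Test it from the current directory first.
cp scriptname /usr/local/bin   # Then, as root, install it system-wide.
scriptname                     # Now it runs from anywhere in your $PATH.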
Notes
[1] Also seen in the literature as she-bang or sh-bang. This derives from the concatenation of the tokens sharp
(#) and bang (!).
[2] Some flavors of UNIX (those based on 4.2 BSD) allegedly take a four-byte magic number, requiring a blank
after the ! -- #! /bin/sh. According to Sven Mascheck this is probably a myth.
[3] The #! line in a shell script will be the first thing the command interpreter (sh or bash) sees. Since this line
begins with a #, it will be correctly interpreted as a comment when the command interpreter finally executes
the script. The line has already served its purpose - calling the command interpreter.
If, in fact, the script includes an extra #! line, then bash will interpret it as a comment.
1 #!/bin/bash
2
3 echo "Part 1 of script."
4 a=1
5
6 #!/bin/bash
7 # This does *not* launch a new script.
8
9 echo "Part 2 of script."
10 echo $a # Value of $a stays at 1.
[4] This allows some cute tricks.
1 #!/bin/rm
2 # Self-deleting script.
3
4 # Nothing much seems to happen when you run this... except that the file disappears.
5
6 WHATEVER=85
7
8 echo "This line will never print (betcha!)."
9
10 exit $WHATEVER # Doesn't matter. The script will not exit here.
11 # Try an echo $? after script termination.
12 # You'll get a 0, not a 85.
Also, try starting a README file with a #!/bin/more, and making it executable. The result is a self-listing
documentation file. (A here document using cat is possibly a better alternative -- see Example 19-3).
[5] Portable Operating System Interface, an attempt to standardize UNIX-like OSes. The POSIX specifications
are listed on the Open Group site.
[6] To avoid this possibility, a script may begin with a #!/usr/bin/env bash sha-bang line. This may be useful on
UNIX machines where bash is not located in /bin.
[7] If Bash is your default shell, then the #! isn't necessary at the beginning of a script. However, if launching a
script from a different shell, such as tcsh, then you will need the #!.
[8] Caution: invoking a Bash script by sh scriptname turns off Bash-specific extensions, and the script may
therefore fail to execute.
[9] A script needs read, as well as execute permission for it to run, since the shell needs to be able to read it.
[10] Why not simply invoke the script with scriptname? If the directory you are in ($PWD) is where
scriptname is located, why doesn't this work? This fails because, for security reasons, the current
directory (./) is not by default included in a user's $PATH. It is therefore necessary to explicitly invoke the
script in the current directory with a ./scriptname.
3. Special Characters

#
Comments. Lines beginning with a # (with the exception of #!) are comments and will not be
executed.
1 case "$variable" in
2 abc) echo "\$variable = abc" ;;
3 xyz) echo "\$variable = xyz" ;;
4 esac
;;&, ;&
Terminators in a case option (version 4+ of Bash).
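For example, a brief sketch of how the new terminators behave (the variable and patterns here are arbitrary):

# Requires Bash, version 4 or later.
fruit=apple

case "$fruit" in
  a*) echo "Begins with a."  ;;&  # ;;& -- go on and test the remaining patterns.
  *e) echo "Ends with e."    ;&   # ;&  -- fall through to the next command list.
  *q) echo "This prints too, because of the fall-through above." ;;
esac

# Output:
# Begins with a.
# Ends with e.
# This prints too, because of the fall-through above.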
.
"dot" command [period]. Equivalent to source (see Example 15-22). This is a bash builtin.
.
"dot", as a component of a filename. When working with filenames, a leading dot is the prefix of a
"hidden" file, a file that an ls will not normally show.
bash$ ls -al
total 14
drwxrwxr-x 2 bozo bozo 1024 Aug 29 20:54 ./
drwx------ 52 bozo bozo 3072 Aug 29 20:51 ../
-rw-r--r-- 1 bozo bozo 4034 Jul 18 22:04 data1.addressbook
-rw-r--r-- 1 bozo bozo 4602 May 25 13:58 data1.addressbook.bak
-rw-r--r-- 1 bozo bozo 877 Dec 17 2000 employment.addressbook
-rw-rw-r-- 1 bozo bozo 0 Aug 29 20:54 .hidden-file
When considering directory names, a single dot represents the current working directory, and two dots
denote the parent directory.
bash$ pwd
/home/bozo/projects
bash$ cd .
bash$ pwd
/home/bozo/projects
bash$ cd ..
bash$ pwd
/home/bozo/
The dot often appears as the destination (directory) of a file movement command, in this context
meaning current directory.
bash$ cp /home/bozo/current_work/junk/* .
\
escape [backslash]. A quoting mechanism for single characters.
\X escapes the character X. This has the effect of "quoting" X, equivalent to 'X'. The \ may be used to
quote " and ', so they are expressed literally.
:
null command [colon]. This is the shell equivalent of a "NOP" (no op, a do-nothing operation). It
may be considered a synonym for the shell builtin true. The ":" command is itself a Bash builtin, and
its exit status is true (0).
1 :
2 echo $? # 0
Endless loop:
1 while :
2 do
3 operation-1
4 operation-2
5 ...
6 operation-n
7 done
8
9 # Same as:
10 # while true
11 # do
12 # ...
13 # done
Placeholder in if/then test:
1 if condition
2 then : # Do nothing and branch ahead
3 else # Or else ...
4 take-some-action
5 fi
Provide a placeholder where a binary operation is expected, see Example 8-2 and default parameters.
1 : ${username=`whoami`}
2 # ${username=`whoami`} Gives an error without the leading :
3 # unless "username" is a command or builtin...
Provide a placeholder where a command is expected in a here document. See Example 19-10.
In combination with the > redirection operator, truncates a file to zero length, without changing its
permissions. If the file did not previously exist, creates it.
In combination with the >> redirection operator, has no effect on a pre-existing target file (: >>
target_file). If the file did not previously exist, creates it.
This applies to regular files, not pipes, symlinks, and certain special files.
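For example (data.log is just an arbitrary filename):

: > data.log    # File "data.log" now exists, with length zero.
> data.log      # Same effect in Bash, though not portable to all shells.
: >> data.log   # Creates "data.log" if absent; leaves an existing file untouched.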
May be used to begin a comment line, although this is not recommended. Using # for a comment
turns off error checking for the remainder of that line, so almost anything may appear in a comment.
However, this is not the case with :.
!
reverse (or negate) the sense of a test or exit status [bang]. The ! operator inverts the exit status of
the command to which it is applied, and also inverts the meaning of a test operator.
In yet another context, from the command line, the ! invokes the Bash history mechanism (see
Appendix K). Note that within a script, the history mechanism is disabled.
*
wild card [asterisk]. The * character serves as a "wild card" for filename expansion in globbing. By
itself, it matches every filename in a given directory.
bash$ echo *
abs-book.sgml add-drive.sh agram.sh alias.sh
The * also represents any number (or zero) characters in a regular expression.
*
arithmetic operator. In the context of arithmetic operations, the * denotes multiplication.
** A double asterisk can represent the exponentiation operator or extended file-match globbing.
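A quick illustration of both arithmetic meanings:

let "area = 9 * 7"
echo $area            # 63
echo $(( 2 ** 10 ))   # 1024  (2 raised to the 10th power)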
?
test operator. Within certain expressions, the ? indicates a test for a condition.
In a double-parentheses construct, the ? can serve as an element of a C-style trinary operator, ?:.
1 (( var0 = var1<98?9:21 ))
2 # ^ ^
3
4 # if [ "$var1" -lt 98 ]
5 # then
6 # var0=9
7 # else
8 # var0=21
9 # fi
In a parameter substitution expression, the ? tests whether a variable has been set.
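For instance (undefined_var is simply a placeholder name):

: ${HOME?"HOME is not set"}              # Succeeds silently, since HOME is normally set.
echo ${undefined_var?"needs a value"}    # Prints an error message and aborts the script.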
?
wild card. The ? character serves as a single-character "wild card" for filename expansion in
globbing, as well as representing one character in an extended regular expression.
$
Variable substitution (contents of a variable).
1 var1=5
2 var2=23skidoo
3
4 echo $var1 # 5
5 echo $var2 # 23skidoo
( )
command group [parentheses]. A listing of commands within parentheses starts a subshell.

Variables inside parentheses, within the subshell, are not visible to the rest of the
script. The parent process, the script, cannot read variables created in the child
process, the subshell.
1 a=123
2 ( a=321; )
3
4 echo "a = $a" # a = 123
5 # "a" within parentheses acts like a local variable.
array initialization.

Array=(element1 element2 element3)

{xxx,yyy,zzz,...}
Brace expansion. A command may act upon a comma-separated list of file specs within braces.
Filename expansion (globbing) applies to the file specs between the braces.
No spaces allowed within the braces unless the spaces are quoted or escaped.

{a..z}
Extended Brace expansion.
1 echo {a..z} # a b c d e f g h i j k l m n o p q r s t u v w x y z
2 # Echoes characters between a and z.
3
4 echo {0..3} # 0 1 2 3
5 # Echoes characters between 0 and 3.
The {a..z} extended brace expansion construction is a feature introduced in version 3 of Bash.
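A few illustrative brace expansions (the filenames are hypothetical):

echo {apple,banana,cherry}_pie    # apple_pie banana_pie cherry_pie
echo file{1..3}.txt               # file1.txt file2.txt file3.txt
cp hypothetical.{txt,bak}         # Expands to: cp hypothetical.txt hypothetical.bak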
{}
Block of code [curly brackets]. Also referred to as an inline group, this construct, in effect, creates
an anonymous function (a function without a name). However, unlike in a "standard" function, the
variables inside a code block remain visible to the remainder of the script.
bash$ { local a;
a=123; }
bash: local: can only be used in a function
1 a=123
2 { a=321; }
3 echo "a = $a" # a = 321 (value inside code block)
4
5 # Thanks, S.C.
The code block enclosed in braces may have I/O redirected to and from it.
1 #!/bin/bash
2 # Reading lines in /etc/fstab.
3
4 File=/etc/fstab
5
6 {
7 read line1
8 read line2
9 } < $File
10
11 echo "First line in $File is:"
12 echo "$line1"
13 echo
14 echo "Second line in $File is:"
15 echo "$line2"
16
17 exit 0
18
19 # Now, how do you parse the separate fields of each line?
20 # Hint: use awk, or . . .
21 # . . . Hans-Joerg Diers suggests using the "set" Bash builtin.
1 #!/bin/bash
2 # rpm-check.sh
3
4 # Queries an rpm file for description, listing,
5 #+ and whether it can be installed.
6 # Saves output to a file.
7 #
8 # This script illustrates using a code block.
9
10 SUCCESS=0
11 E_NOARGS=65
12
13 if [ -z "$1" ]
14 then
15 echo "Usage: `basename $0` rpm-file"
16 exit $E_NOARGS
17 fi
18
19 { # Begin code block.
20 echo
21 echo "Archive Description:"
22 rpm -qpi $1 # Query description.
23 echo
24 echo "Archive Listing:"
25 rpm -qpl $1 # Query listing.
26 echo
27 rpm -i --test $1 # Query whether rpm file can be installed.
28 if [ "$?" -eq $SUCCESS ]
29 then
30 echo "$1 can be installed."
31 else
32 echo "$1 cannot be installed."
33 fi
34 echo # End code block.
35 } > "$1.test" # Redirects output of everything in block to file.
36
37 echo "Results of rpm test in file $1.test"
38
39 # See rpm man page for explanation of options.
40
41 exit 0
{}
placeholder (used after xargs -i or -I). The {} double curly brackets are a placeholder for output text.

1 ls . | xargs -i -t cp ./{} $1
2 # ^^ ^^
3
4 # From "ex42.sh" (copydir.sh) example.
anchor id="semicolonesc">
{} \;
pathname. Mostly used in find constructs. This is not a shell builtin.
The ";" ends the -exec option of a find command sequence. It needs to be escaped to
protect it from interpretation by the shell.
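A typical sketch (the *.tmp pattern is arbitrary):

find . -name "*.tmp" -exec rm {} \;
# Deletes every "*.tmp" file in the current directory tree.
# Each matched pathname replaces the {}, and the escaped ";" ends the -exec clause.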
[]
test.
Test expression between [ ]. Note that [ is part of the shell builtin test (and a synonym for it), not a
link to the external command /usr/bin/test.
[[ ]]
test.
Test expression between [[ ]]. More flexible than the single-bracket [ ] test, this is a shell keyword.
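For example, [[ ]] allows pattern matching, which a single-bracket test does not:

t="report_2010.txt"
if [[ $t == report*.txt ]]    # Pattern match -- not possible with [ ].
then
  echo "Looks like a report file."
fi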
[ ]
array element.
In the context of an array, brackets set off the numbering of each element of that array.
1 Array[1]=slot_1
2 echo ${Array[1]}
[]
range of characters. As part of a regular expression, brackets delimit a range of characters to match.

$[ ... ]
integer expansion. Evaluate integer expression between $[ ].
1 a=3
2 b=7
3
4 echo $[$a+$b] # 10
5 echo $[$a*$b] # 21
Note that this usage is deprecated, and has been replaced by the (( ... )) construct.
(( ))
integer expansion. Expand and evaluate integer expression between (( )).
>, &>, >&, >>, <, <>
redirection.
scriptname >filename redirects the output of scriptname to the file filename. Overwrite filename if
it already exists.

command &>filename redirects both the stdout and the stderr of command to filename.
This is useful for suppressing output when testing for a condition. For example, let us
test whether a certain command exists.
bash$ type bogus_command &>/dev/null

bash$ echo $?
1
Or in a script:
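One minimal possibility (bogus_command is just a stand-in for whatever command you are checking for):

if type bogus_command &>/dev/null
then
  echo "bogus_command is available."
else
  echo "bogus_command not found."   # Both stdout and stderr of 'type' were discarded.
fi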
process substitution.
>(command)
<(command)
In a different context, the "<" and ">" characters act as string comparison operators.
In yet another context, the "<" and ">" characters act as integer comparison operators. See also
Example 16-9.
<<
redirection used in a here document.
<<<
redirection used in a here string.
<, >
ASCII comparison.
1 veg1=carrots
2 veg2=tomatoes
3
4 if [[ "$veg1" < "$veg2" ]]
5 then
6 echo "Although $veg1 precede $veg2 in the dictionary,"
7 echo -n "this does not necessarily imply anything "
8 echo "about my culinary preferences."
9 else
10 echo "What kind of dictionary are you using, anyhow?"
11 fi
\<, \>
word boundary in a regular expression.
|
pipe. Passes the output (stdout) of a previous command to the input (stdin) of the next one, or to
the shell. This is a method of chaining commands together.
1 echo ls -l | sh
2 # Passes the output of "echo ls -l" to the shell,
3 #+ with the same result as a simple "ls -l".
4
5
6 cat *.lst | sort | uniq
7 # Merges and sorts all ".lst" files, then deletes duplicate lines.
A pipe, as a classic method of interprocess communication, sends the stdout of one process to the
stdin of another. In a typical case, a command, such as cat or echo, pipes a stream of data to a
filter, a command that transforms its input for processing. [5]
For an interesting note on the complexity of using UNIX pipes, see the UNIX FAQ, Part 3.
The output of a command or commands may be piped to a script.
1 #!/bin/bash
2 # uppercase.sh : Changes input to uppercase.
3
4 tr 'a-z' 'A-Z'
5 # Letter ranges must be quoted
6 #+ to prevent filename generation from single-letter filenames.
7
8 exit 0
Now, let us pipe the output of ls -l to this script.
bash$ ls -l | ./uppercase.sh
-RW-RW-R-- 1 BOZO BOZO 109 APR 7 19:49 1.TXT
-RW-RW-R-- 1 BOZO BOZO 109 APR 14 16:48 2.TXT
-RW-R--R-- 1 BOZO BOZO 725 APR 20 20:56 DATA-FILE
The stdout of each process in a pipe must be read as the stdin of the next. If this
is not the case, the data stream will block, and the pipe will not behave as expected.
1 variable="initial_value"
2 echo "new_value" | read variable
3 echo "variable = $variable" # variable = initial_value
If one of the commands in the pipe aborts, this prematurely terminates execution of the
pipe. Called a broken pipe, this condition sends a SIGPIPE signal.
>|
force redirection (even if the noclobber option is set). This will forcibly overwrite an existing file.
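For example (out.txt is an arbitrary filename):

set -o noclobber
echo "first try" > out.txt     # Creates out.txt.
echo "second try" > out.txt    # Error: cannot overwrite existing file.
echo "second try" >| out.txt   # The >| overrides noclobber and overwrites the file.
set +o noclobber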
||
OR logical operator. In a test construct, the || operator causes a return of 0 (success) if either of the
linked test conditions is true.
&
Run job in background. A command followed by an & will run in the background.
Within a script, commands and even loops may run in the background.
1 #!/bin/bash
2 # background-loop.sh
3
4 for i in 1 2 3 4 5 6 7 8 9 10 # First loop.
5 do
6 echo -n "$i "
7 done & # Run this loop in background.
8 # Will sometimes execute after second loop.
9
10 echo # This 'echo' sometimes will not display.
11
12 for i in 11 12 13 14 15 16 17 18 19 20 # Second loop.
13 do
14 echo -n "$i "
15 done
16
17 echo # This 'echo' sometimes will not display.
18
19 # ======================================================
20
21 # The expected output from the script:
22 # 1 2 3 4 5 6 7 8 9 10
23 # 11 12 13 14 15 16 17 18 19 20
24
25 # Sometimes, though, you get:
26 # 11 12 13 14 15 16 17 18 19 20
27 # 1 2 3 4 5 6 7 8 9 10 bozo $
28 # (The second 'echo' doesn't execute. Why?)
29
30 # Occasionally also:
31 # 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20
32 # (The first 'echo' doesn't execute. Why?)
33
34 # Very rarely something like:
35 # 11 12 13 1 2 3 4 5 6 7 8 9 10 14 15 16 17 18 19 20
36 # The foreground loop preempts the background one.
37
38 exit 0
39
40 # Nasimuddin Ansari suggests adding sleep 1
41 #+ after the echo -n "$i" in lines 6 and 14,
42 #+ for some real fun.
A command run in the background within a script may cause the script to hang,
waiting for a keystroke. Fortunately, there is a remedy for this.
&&
AND logical operator. In a test construct, the && operator causes a return of 0 (success) only if both
the linked test conditions are true.
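A short sketch of both operators in test constructs (the filename is arbitrary):

filename=/etc/passwd
[ -r "$filename" ] && echo "$filename is readable."
[ -w "$filename" ] || echo "$filename is not writable by you."

if [ -r "$filename" ] && [ -s "$filename" ]
then
  echo "$filename is readable and non-empty."
fi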
-
option, prefix. Option flag for a command or filter. Prefix for an operator. Prefix for a default
parameter in parameter substitution.
COMMAND -[Option1][Option2][...]
ls -al
--
The double-dash -- prefixes long (verbatim) options to commands.

sort --ignore-leading-blanks

Used with a Bash builtin, it means the end of options to that particular command.
This provides a handy means of removing files whose names begin with a dash.
bash$ ls -l
-rw-r--r-- 1 bozo bozo 0 Nov 25 12:29 -badname
bash$ rm -- -badname
bash$ ls -l
total 0
The double-dash is also used in conjunction with set.
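For instance, a sketch of resetting the positional parameters from a variable whose value begins with a dash:

var="-n some text"
set -- $var                            # $1, $2, $3 are now "-n", "some", "text".
echo "First positional parameter: $1"
# Without the --, set would try to interpret "-n" as one of its own options.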
-
redirection from/to stdin or stdout [dash].

bash$ cat -
abc
abc
...
Ctl-D
As expected, cat - echoes stdin, in this case keyboarded user input, to stdout. But, does I/O
redirection using - have real-world applications?
bash$ file
Usage: file [-bciknvzL] [-f namefile] [-m magicfiles] file...
Add a "-" for a more useful result. This causes the shell to await user input.
bash$ file -
abc
standard input: ASCII text
bash$ file -
#!/bin/bash
standard input: Bourne-Again shell script text executable
Now the command accepts input from stdin and analyzes it.
The "-" can be used to pipe stdout to other commands. This permits such stunts as prepending lines
to a file.
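A small sketch of the idea (header text and filenames are hypothetical):

echo "--- report header ---" | cat - body.txt > report.txt
# cat reads its stdin (the piped header line) first,
#+ then appends the contents of body.txt after it.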
Filenames beginning with "-" may cause problems when coupled with the "-"
redirection operator. A script should check for this and add an appropriate prefix to
such filenames, for example ./-FILENAME, $PWD/-FILENAME, or
$PATHNAME/-FILENAME.
If the value of a variable begins with a -, this may likewise create problems.
1 var="-n"
2 echo $var
3 # Has the effect of "echo -n", and outputs nothing.
-
previous working directory. A cd - command changes to the previous working directory. This uses
the $OLDPWD environmental variable.
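For example (the directory names are illustrative; cd - also prints the directory it changes to):
bash$ pwd
/home/bozo/projects
bash$ cd /tmp
bash$ cd -
/home/bozo/projects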
Do not confuse the "-" used in this sense with the "-" redirection operator just
discussed. The interpretation of the "-" depends on the context in which it appears.
-
Minus. Minus sign in an arithmetic operation.
=
Equals. Assignment operator
1 a=28
2 echo $a # 28
In a different context, the "=" is a string comparison operator.
+
Plus. Addition arithmetic operator.
In a different context, the + is a Regular Expression operator.
+
Option. Option flag for a command or filter.
Certain commands and builtins use the + to enable certain options and the - to disable them. In
parameter substitution, the + prefixes an alternate value that a variable expands to.
%
modulo. Modulo (remainder of a division) arithmetic operation.
~
home directory [tilde]. This corresponds to the $HOME internal variable.
bash$ echo ~
/home/bozo
bash$ echo ~/
/home/bozo/
bash$ echo ~:
/home/bozo:
~+
current working directory. This corresponds to the $PWD internal variable.
~-
previous working directory. This corresponds to the $OLDPWD internal variable.
=~
regular expression match. This operator was introduced with version 3 of Bash.
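A minimal sketch of its use within the [[ ... ]] test construct ($input is an arbitrary variable name):
1 if [[ "$input" =~ ^[0-9]+$ ]]
2 then
3   echo "\"$input\" consists only of digits."
4 fi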
^
beginning-of-line. In a regular expression, a "^" addresses the beginning of a line of text.
^, ^^
Uppercase conversion in parameter substitution (added in version 4 of Bash).
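For example:
bash$ var=linux
bash$ echo ${var^}
Linux
bash$ echo ${var^^}
LINUX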
Control Characters
change the behavior of the terminal or text display. A control character is a CONTROL + key
combination (pressed simultaneously). A control character may also be written in octal or
hexadecimal notation, following an escape.
◊ Ctl-A
Moves cursor to beginning of line of text (on the command-line).
◊ Ctl-B
Backspace (nondestructive).
◊
Ctl-C
Break. Terminate a foreground job.
◊
Ctl-D
When typing text on the console or in an xterm window, Ctl-D erases the character under
the cursor. When there are no characters present, Ctl-D logs out of the session, as expected.
In an xterm window, this has the effect of closing the window.
◊ Ctl-E
Moves cursor to end of line of text (on the command-line).
◊ Ctl-G
BEL. On some old-time teletype terminals, this would actually ring a bell. In an xterm it
might beep.
◊
Ctl-H
Rubout (destructive backspace). Erases characters the cursor backs over while backspacing.
1 #!/bin/bash
2 # Embedding Ctl-H in a string.
3
4 a="^H^H" # Two Ctl-H's -- backspaces
5 # ctl-V ctl-H, using vi/vim
6 echo "abcdef" # abcdef
7 echo
8 echo -n "abcdef$a " # abcd f
9 # Space at end ^ ^ Backspaces twice.
10 echo
11 echo -n "abcdef$a" # abcdef
12 # No space at end ^ Doesn't backspace (why?).
13 # Results may not be quite as expected.
14 echo; echo
15
16 # Constantin Hagemeier suggests trying:
17 # a=$'\010\010'
18 # a=$'\b\b'
19 # a=$'\x08\x08'
20 # But, this does not change the results.
◊ Ctl-I
Horizontal tab.
◊
Ctl-J
Newline (line feed). In a script, may also be expressed in octal notation -- '\012' or in
hexadecimal -- '\x0a'.
◊ Ctl-K
Vertical tab.
When typing text on the console or in an xterm window, Ctl-K erases from the character
under the cursor to end of line. Within a script, Ctl-K may behave differently, as in Lee
Maschmeyer's example, below.
◊ Ctl-L
Formfeed (clear the terminal screen). In a terminal, this has the same effect as the clear
command. When sent to a printer, a Ctl-L causes an advance to end of the paper sheet.
◊
Ctl-M
Carriage return.
1 #!/bin/bash
2 # Thank you, Lee Maschmeyer, for this example.
3
4 read -n 1 -s -p \
5 $'Control-M leaves cursor at beginning of this line. Press Enter. \x0d'
6 # Of course, '0d' is the hex equivalent of Control-M.
7 echo >&2 # The '-s' makes anything typed silent,
8 #+ so it is necessary to go to new line explicitly.
9
10 read -n 1 -s -p $'Control-J leaves cursor on next line. \x0a'
11 # '0a' is the hex equivalent of Control-J, linefeed.
12 echo >&2
13
14 ###
15
16 read -n 1 -s -p $'And Control-K\x0bgoes straight down.'
17 echo >&2 # Control-K is vertical tab.
18
19 # A better example of the effect of a vertical tab is:
20
21 var=$'\x0aThis is the bottom line\x0bThis is the top line\x0a'
22 echo "$var"
23 # This works the same way as the above example. However:
24 echo "$var" | col
25 # This causes the right end of the line to be higher than the left end.
26 # It also explains why we started and ended with a line feed --
27 #+ to avoid a garbled screen.
28
29 # As Lee Maschmeyer explains:
30 # --------------------------
31 # In the [first vertical tab example] . . . the vertical tab
32 #+ makes the printing go straight down without a carriage return.
33 # This is true only on devices, such as the Linux console,
34 #+ that can't go "backward."
35 # The real purpose of VT is to go straight UP, not down.
36 # It can be used to print superscripts on a printer.
37 # The col utility can be used to emulate the proper behavior of VT.
38
39 exit 0
◊ Ctl-N
Erases a line of text recalled from history buffer [6] (on the command-line).
◊ Ctl-O
Issues a newline (on the command-line).
◊ Ctl-Q
Resume (XON). This resumes stdin in a terminal.
◊ Ctl-S
Suspend (XOFF). This freezes stdin in a terminal. (Use Ctl-Q to restore input.)
◊ Ctl-T
Reverses the position of the character the cursor is on with the previous character (on the
command-line).
◊ Ctl-U
Erase a line of input, from the cursor backward to beginning of line. In some settings, Ctl-U
erases the entire line of input, regardless of cursor position.
◊ Ctl-V
When inputting text, Ctl-V permits inserting control characters. For example, the following
two are equivalent:
1 echo -e '\x0a'
2 echo <Ctl-V><Ctl-J>
Ctl-V is primarily useful from within a text editor.
◊ Ctl-W
When typing text on the console or in an xterm window, Ctl-W erases from the character
under the cursor backwards to the first instance of whitespace. In some settings, Ctl-W
erases backwards to first non-alphanumeric character.
◊ Ctl-X
In certain word processing programs, cuts highlighted text and copies it to the clipboard.
◊ Ctl-Y
Pastes back text previously erased (with Ctl-U or Ctl-W).
Whitespace
Functions as a separator between commands and arguments.
$IFS, the special variable separating fields of input to certain commands. It defaults to whitespace.
UNIX filters can target and operate on whitespace using the POSIX character class [:space:].
Notes
[1] An operator is an agent that carries out an operation. Some examples are the common arithmetic
operators, + - * /. In Bash, there is some overlap between the concepts of operator and keyword.
[2]
A PID, or process ID, is a number assigned to a running process. The PIDs of running processes may
be viewed with a ps command.
Variables appear in arithmetic operations and manipulation of quantities, and in string parsing.
4.1. Variable Substitution
The name of a variable is a placeholder for its value, the data it holds. Referencing (retrieving) its value is
called variable substitution.
Let us carefully distinguish between the name of a variable and its value. If variable1 is the name
of a variable, then $variable1 is a reference to its value, the data item it contains. [1]
bash$ variable1=23
bash$ echo variable1
variable1
bash$ echo $variable1
23
Enclosing a referenced value in double quotes (" ... ") does not interfere with variable substitution.
This is called partial quoting, sometimes referred to as "weak quoting." Using single quotes (' ... ')
causes the variable name to be used literally, and no substitution will take place. This is full quoting,
sometimes referred to as 'strong quoting.' See Chapter 5 for a detailed discussion.
Note that $variable is actually a simplified form of ${variable}. In contexts where the
$variable syntax causes an error, the longer form may work (see Section 10.2, below).
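For instance, braces are needed when the variable name would otherwise run into adjacent text:
1 var=23
2 echo "$varskidoo"   # (nothing)   Bash looks for a variable named "varskidoo".
3 echo "${var}skidoo" # 23skidoo    The braces delimit the variable name.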
1 #!/bin/bash
2 # ex9.sh
3
4 # Variables: assignment and substitution
5
6 a=375
7 hello=$a
8
9 #-------------------------------------------------------------------------
10 # No space permitted on either side of = sign when initializing variables.
11 # What happens if there is a space?
12
13 # "VARIABLE =value"
14 # ^
15 #% Script tries to run "VARIABLE" command with one argument, "=value".
16
17 # "VARIABLE= value"
18 # ^
19 #% Script tries to run "value" command with
20 #+ the environmental variable "VARIABLE" set to "".
21 #-------------------------------------------------------------------------
22
23
24 echo hello # hello
25 # Not a variable reference, just the string "hello" . . .
26
27 echo $hello # 375
28 # ^ This *is* a variable reference.
29 echo ${hello} # 375
30 # Also a variable reference, as above.
31
32 # Quoting . . .
33 echo "$hello" # 375
34 echo "${hello}" # 375
35
36 echo
37
38 hello="A B C D"
39 echo $hello # A B C D
40 echo "$hello" # A B C D
41 # As you see, echo $hello and echo "$hello" give different results.
42 # Why?
43 # =======================================
44 # Quoting a variable preserves whitespace.
45 # =======================================
46
47 echo
48
49 echo '$hello' # $hello
50 # ^ ^
51 # Variable referencing disabled (escaped) by single quotes,
52 #+ which causes the "$" to be interpreted literally.
53
54 # Notice the effect of different types of quoting.
55
56
57 hello= # Setting it to a null value.
58 echo "\$hello (null value) = $hello"
59 # Note that setting a variable to a null value is not the same as
60 #+ unsetting it, although the end result is the same (see below).
61
62 # --------------------------------------------------------------
63
64 # It is permissible to set multiple variables on the same line,
65 #+ if separated by white space.
66 # Caution, this may reduce legibility, and may not be portable.
67
68 var1=21 var2=22 var3=$V3
69 echo
70 echo "var1=$var1 var2=$var2 var3=$var3"
71
72 # May cause problems with older versions of "sh" . . .
73
74 # --------------------------------------------------------------
75
76 echo; echo
77
78 numbers="one two three"
79 # ^ ^
80 other_numbers="1 2 3"
81 # ^ ^
82 # If there is whitespace embedded within a variable,
83 #+ then quotes are necessary.
84 # other_numbers=1 2 3 # Gives an error message.
85 echo "numbers = $numbers"
86 echo "other_numbers = $other_numbers" # other_numbers = 1 2 3
87 # Escaping the whitespace also works.
88 mixed_bag=2\ ---\ Whatever
89 # ^ ^ Space after escape (\).
90
91 echo "$mixed_bag" # 2 --- Whatever
92
93 echo; echo
94
95 echo "uninitialized_variable = $uninitialized_variable"
96 # Uninitialized variable has null value (no value at all!).
97 uninitialized_variable= # Declaring, but not initializing it --
98 #+ same as setting it to a null value, as above.
99 echo "uninitialized_variable = $uninitialized_variable"
100 # It still has a null value.
101
102 uninitialized_variable=23 # Set it.
103 unset uninitialized_variable # Unset it.
104 echo "uninitialized_variable = $uninitialized_variable"
105 # It still has a null value.
106 echo
107
108 exit 0
An uninitialized variable has a "null" value -- no assigned value at all (not zero!).
1 if [ -z "$unassigned" ]
2 then
3 echo "\$unassigned is NULL."
4 fi # $unassigned is NULL.
Using a variable before assigning a value to it may cause problems. It is nevertheless
possible to perform arithmetic operations on an uninitialized variable.
Notes
[1] Technically, the name of a variable is called an lvalue, meaning that it appears on the left side of an
assignment statement, as in VARIABLE=23. A variable's value is an rvalue, meaning that it appears on
the right side of an assignment statement, as in VAR2=$VARIABLE.
A variable's name is, in fact, a reference, a pointer to the memory location(s) where the actual data
associated with that variable is kept.
Do not confuse this with = and -eq, which test, rather than assign!
1 #!/bin/bash
2 # Naked variables
3
4 echo
5
6 # When is a variable "naked", i.e., lacking the '$' in front?
7 # When it is being assigned, rather than referenced.
8
9 # Assignment
10 a=879
11 echo "The value of \"a\" is $a."
12
13 # Assignment using 'let'
14 let a=16+5
15 echo "The value of \"a\" is now $a."
16
17 echo
18
19 # In a 'for' loop (really, a type of disguised assignment):
20 echo -n "Values of \"a\" in the loop are: "
21 for a in 7 8 9 11
22 do
23 echo -n "$a "
24 done
25
26 echo
27 echo
28
29 # In a 'read' statement (also a type of assignment):
30 echo -n "Enter \"a\" "
31 read a
32 echo "The value of \"a\" is now $a."
33
34 echo
35
36 exit 0
1 #!/bin/bash
2
3 a=23 # Simple case
4 echo $a
5 b=$a
6 echo $b
7
8 # Now, getting a little bit fancier (command substitution).
9
10 a=`echo Hello!` # Assigns result of 'echo' command to 'a' ...
11 echo $a
12 # Note that including an exclamation mark (!) within a
13 #+ command substitution construct will not work from the command-line,
14 #+ since this triggers the Bash "history mechanism."
15 # Inside a script, however, the history functions are disabled.
16
17 a=`ls -l` # Assigns result of 'ls -l' command to 'a'
18 echo $a # Unquoted, however, it removes tabs and newlines.
19 echo
20 echo "$a" # The quoted variable preserves whitespace.
21 # (See the chapter on "Quoting.")
22
23 exit 0
Variable assignment using the $(...) mechanism (a newer method than backquotes). This is actually a
form of command substitution.
1 # From /etc/rc.d/rc.local
2 R=$(cat /etc/redhat-release)
3 arch=$(uname -m)
Unlike many other programming languages, Bash does not segregate its variables by "type." Essentially, Bash
variables are character strings, but, depending on context, Bash permits arithmetic operations and
comparisons on variables. The determining factor is whether the value of a variable contains only digits.
1 #!/bin/bash
2 # int-or-string.sh
3
4 a=2334 # Integer.
5 let "a += 1"
6 echo "a = $a " # a = 2335
7 echo # Integer, still.
8
9
10 b=${a/23/BB} # Substitute "BB" for "23".
11 # This transforms $b into a string.
12 echo "b = $b" # b = BB35
13 declare -i b # Declaring it an integer doesn't help.
14 echo "b = $b" # b = BB35
15
16 let "b += 1" # BB35 + 1
17 echo "b = $b" # b = 1
18 echo # Bash sets the "integer value" of a string to 0.
19
20 c=BB34
21 echo "c = $c" # c = BB34
22 d=${c/BB/23} # Substitute "23" for "BB".
23 # This makes $d an integer.
24 echo "d = $d" # d = 2334
25 let "d += 1" # 2334 + 1
26 echo "d = $d" # d = 2335
27 echo
28
29
30 # What about null variables?
31 e='' # ... Or e="" ... Or e=
32 echo "e = $e" # e =
33 let "e += 1" # Arithmetic operations allowed on a null variable?
34 echo "e = $e" # e = 1
35 echo # Null variable transformed into an integer.
36
37 # What about undeclared variables?
38 echo "f = $f" # f =
39 let "f += 1" # Arithmetic operations allowed?
40 echo "f = $f" # f = 1
41 echo # Undeclared variable transformed into an integer.
42 #
43 # However ...
44 let "f /= $undecl_var" # Divide by zero?
45 # let: f /= : syntax error: operand expected (error token is " ")
46 # Syntax error! Variable $undecl_var is not set to zero here!
47 #
48 # But still ...
49 let "f /= 0"
50 # let: f /= 0: division by 0 (error token is "0")
51 # Expected behavior.
52
53
54 # Bash (usually) sets the "integer value" of null to zero
55 #+ when performing an arithmetic operation.
56 # But, don't try this at home, folks!
57 # It's undocumented and probably non-portable behavior.
58
59
60 # Conclusion: Variables in Bash are untyped,
61 #+ with all attendant consequences.
62
63 exit $?
Untyped variables are both a blessing and a curse. They permit more flexibility in scripting and make it easier
to grind out lines of code (and give you enough rope to hang yourself!). However, they likewise permit subtle
errors to creep in and encourage sloppy programming habits.
To lighten the burden of keeping track of variable types in a script, Bash does permit declaring variables.
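A minimal sketch, using the declare builtin to give a variable the integer attribute:
1 declare -i count   # Restrict "count" to integer values.
2 count=7
3 count+=3           # With the integer attribute, += does arithmetic addition . . .
4 echo "$count"      # 10
5 #                  . . . rather than string concatenation ("73").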
In a more general context, each process has an "environment", that is, a group of
variables that the process may reference. In this sense, the shell behaves like any other
process.
Every time a shell starts, it creates shell variables that correspond to its own
environmental variables. Updating or adding new environmental variables causes the
shell to update its environment, and all the shell's child processes (the commands it
executes) inherit this environment.
The space allotted to the environment is limited. Creating too many environmental
variables or ones that use up excessive space may cause problems.
bash$ du
bash: /usr/bin/du: Argument list too long
(Thank you, Stéphane Chazelas for the clarification, and for providing the above
example.)
If a script sets environmental variables, they need to be "exported," that is, reported to the
environment local to the script. This is the function of the export command.
A script can export variables only to child processes, that is, only to commands or
processes which that particular script initiates. A script invoked from the
command-line cannot export variables back to the command-line environment.
Child processes cannot export variables back to the parent processes that spawned
them.
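A minimal sketch (the variable name MESSAGE is arbitrary):
1 #!/bin/bash
2 MESSAGE="Hello from the parent script."
3 export MESSAGE              # Now part of the environment passed to child processes.
4 bash -c 'echo "$MESSAGE"'   # A child shell can read the exported variable.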
positional parameters
arguments passed to the script from the command line: $0, $1, $2, $3 . . .
$0 is the name of the script itself, $1 is the first argument, $2 the second, $3 the third, and so forth.
[2] After $9, the arguments must be enclosed in brackets, for example, ${10}, ${11}, ${12}.
1 #!/bin/bash
2
3 # Call this script with at least 10 parameters, for example
4 # ./scriptname 1 2 3 4 5 6 7 8 9 10
5 MINPARAMS=10
6
7 echo
8
9 echo "The name of this script is \"$0\"."
10 # Adds ./ for current directory
11 echo "The name of this script is \"`basename $0`\"."
12 # Strips out path name info (see 'basename')
13
14 echo
15
16 if [ -n "$1" ] # Tested variable is quoted.
17 then
18 echo "Parameter #1 is $1" # Need quotes to escape #
19 fi
20
21 if [ -n "$2" ]
22 then
23 echo "Parameter #2 is $2"
24 fi
25
26 if [ -n "$3" ]
27 then
28 echo "Parameter #3 is $3"
29 fi
30
31 # ...
32
33
34 if [ -n "${10}" ] # Parameters > $9 must be enclosed in {brackets}.
35 then
36 echo "Parameter #10 is ${10}"
37 fi
38
39 echo "-----------------------------------"
40 echo "All the command-line parameters are: "$*""
41
42 if [ $# -lt "$MINPARAMS" ]
43 then
44 echo
45 echo "This script needs at least $MINPARAMS command-line arguments!"
46 fi
47
48 echo
49
50 exit 0
Bracket notation for positional parameters leads to a fairly simple way of referencing the last
argument passed to a script on the command-line. This also requires indirect referencing.
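A minimal sketch of that technique:
1 args=$#            # Number of positional parameters.
2 lastarg=${!args}   # Indirect reference: expands to the last positional parameter.
3 # Or, more compactly:
4 lastarg=${!#}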
1 #!/bin/bash
2 # ex18.sh
3
4 # Does a 'whois domain-name' lookup on any of 3 alternate servers:
5 # ripe.net, cw.net, radb.net
6
7 # Place this script -- renamed 'wh' -- in /usr/local/bin
8
9 # Requires symbolic links:
10 # ln -s /usr/local/bin/wh /usr/local/bin/wh-ripe
11 # ln -s /usr/local/bin/wh /usr/local/bin/wh-apnic
12 # ln -s /usr/local/bin/wh /usr/local/bin/wh-tucows
13
14 E_NOARGS=75
15
16
17 if [ -z "$1" ]
18 then
19 echo "Usage: `basename $0` [domain-name]"
20 exit $E_NOARGS
21 fi
22
23 # Check script name and call proper server.
24 case `basename $0` in # Or: case ${0##*/} in
25 "wh" ) whois [email protected];;
26 "wh-ripe" ) whois [email protected];;
27 "wh-apnic" ) whois [email protected];;
28 "wh-cw" ) whois [email protected];;
29 * ) echo "Usage: `basename $0` [domain-name]";;
30 esac
31
32 exit $?
---
The shift command reassigns the positional parameters, in effect shifting them to the left one notch.
The old $1 disappears, but $0 (the script name) does not change. If you use a large number of
positional parameters to a script, shift lets you access those past 10, although {bracket} notation also
permits this.
1 #!/bin/bash
2 # shft.sh: Using 'shift' to step through all the positional parameters.
3
4 # Name this script something like shft.sh,
5 #+ and invoke it with some parameters.
6 #+ For example:
7 # sh shft.sh a b c def 23 Skidoo
8
9 until [ -z "$1" ] # Until all parameters used up . . .
10 do
11 echo -n "$1 "
12 shift
13 done
14
15 echo # Extra linefeed.
16
17 # But, what happens to the "used-up" parameters?
18 echo "$2"
19 # Nothing echoes!
20 # When $2 shifts into $1 (and there is no $3 to shift into $2)
21 #+ then $2 remains empty.
22 # So, it is not a parameter *copy*, but a *move*.
23
24 exit
25
26 # See also the echo-params.sh script for a "shiftless"
27 #+ alternative method of stepping through the positional params.
The shift command can take a numerical parameter indicating how many positions to shift.
1 #!/bin/bash
2 # shift-past.sh
3
4 shift 3 # Shift 3 positions.
5 # n=3; shift $n
6 # Has the same effect.
7
8 echo "$1"
9
10 exit 0
11
12 # ======================== #
13
14
15 $ sh shift-past.sh 1 2 3 4 5
16 4
17
18 # However, as Eleni Fragkiadaki, points out,
19 #+ attempting a 'shift' past the number of
20 #+ positional parameters ($#) returns an exit status of 1,
21 #+ and the positional parameters themselves do not change.
22 # This means possibly getting stuck in an endless loop. . . .
23 # For example:
24 # until [ -z "$1" ]
25 # do
26 # echo -n "$1 "
27 # shift 20 # If less than 20 pos params,
28 # done #+ then loop never ends!
29 #
30 # When in doubt, add a sanity check. . . .
31 # shift 20 || break
32 # ^^^^^^^^
The shift command works in a similar fashion on parameters passed to a function. See
Example 35-16.
Notes
From the command-line, $0 is the name of the shell:
bash$ echo $0
bash
tcsh% echo $0
tcsh
Chapter 5. Quoting
Quoting means just that, bracketing a string in quotes. This has the effect of protecting special characters in
the string from reinterpretation or expansion by the shell or shell script. (A character is "special" if it has an
interpretation other than its literal meaning. For example, the asterisk * represents a wild card character in
globbing and Regular Expressions).
bash$ ls -l [Vv]*
-rw-rw-r-- 1 bozo bozo 324 Apr 2 15:05 VIEWDATA.BAT
-rw-rw-r-- 1 bozo bozo 507 May 4 14:25 vartrace.sh
-rw-rw-r-- 1 bozo bozo 539 Apr 14 17:11 viewdata.sh
bash$ ls -l '[Vv]*'
ls: [Vv]*: No such file or directory
In everyday speech or writing, when we "quote" a phrase, we set it apart and give it special meaning. In a
Bash script, when we quote a string, we set it apart and protect its literal meaning.
Certain programs and utilities reinterpret or expand special characters in a quoted string. An important use of
quoting is protecting a command-line parameter from the shell, but still letting the calling program expand it.
Use double quotes to prevent word splitting. [3] An argument enclosed in double quotes presents itself as a
single word, even if it contains whitespace separators.
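A minimal sketch (the filename is hypothetical):
1 filename="two words.txt"
2 touch "$filename"   # Creates a single file, "two words.txt".
3 touch $filename     # Unquoted: word splitting creates two files, "two" and "words.txt".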
1 #!/bin/bash
2 # weirdvars.sh: Echoing weird variables.
3
4 echo
5
6 var="'(]\\{}\$\""
7 echo $var # '(]\{}$"
8 echo "$var" # '(]\{}$" Doesn't make a difference.
9
10 echo
11
12 IFS='\'
13 echo $var # '(] {}$" \ converted to space. Why?
14 echo "$var" # '(]\{}$"
15
16 # Examples above supplied by Stephane Chazelas.
17
18 echo
19
20 var2="\\\\\""
21 echo $var2 # "
22 echo "$var2" # \\"
23 echo
24 # But ... var2="\\\\"" is illegal. Why?
25 var3='\\\\'
26 echo "$var3" # \\\\
27 # Strong quoting works, though.
28
29 exit
Single quotes (' ') operate similarly to double quotes, but do not permit referencing variables, since the special
meaning of $ is turned off. Within single quotes, every special character except ' gets interpreted literally.
Consider single quotes ("full quoting") to be a stricter method of quoting than double quotes ("partial
quoting").
Since even the escape character (\) gets a literal interpretation within single quotes, trying to enclose a
single quote within single quotes will not yield the expected result.
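The usual workaround, sketched below, is to close the quoted string, insert an escaped single quote, and reopen it:
1 # echo 'Don't do this.'    # Unbalanced quotes -- the shell keeps waiting for a closing '.
2 echo 'Don'\''t do this.'   # Don't do this.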
Notes
[1] Unless there is a file named first in the current working directory. Yet another reason to quote.
(Thank you, Harald Koenig, for pointing this out.)
[2]
Encapsulating "!" within double quotes gives an error when used from the command line. This is
interpreted as a history command. Within a script, though, this problem does not occur, since the Bash
history mechanism is disabled then.
Of more concern is the apparently inconsistent behavior of \ within double quotes, and especially
following an echo -e command.
Double quotes following an echo sometimes escape \. Moreover, the -e option to echo causes the "\t"
to be interpreted as a tab.
(Thank you, Wayne Pollock, for pointing this out, and Geoff Lee and Daniel Barclay for explaining it.)
[3] "Word splitting," in this context, means dividing a character string into separate and discrete arguments.
5.2. Escaping
Escaping is a method of quoting single characters. The escape (\) preceding a character tells the shell to
interpret that character literally.
With certain commands and utilities, such as echo and sed, escaping a character may have the opposite
effect - it can toggle on a special meaning for that character.
Special meanings of certain escaped characters
used with echo and sed
\n
means newline
\r
means return
\t
means tab
\v
means vertical tab
\b
means backspace
\a
means alert (beep or flash)
\0xx
translates to the octal ASCII equivalent of 0nn, where nn is a string of digits
1 #!/bin/bash
2 # escaped.sh: escaped characters
3
4 echo; echo
5
6 # Escaping a newline.
7 # ------------------
8
9 echo ""
10
11 echo "This will print
12 as two lines."
13 # This will print
14 # as two lines.
15
16 echo "This will print \
17 as one line."
18 # This will print as one line.
19
20 echo; echo
21
22 echo "============="
23
24
25 echo "\v\v\v\v" # Prints \v\v\v\v literally.
26 # Use the -e option with 'echo' to print escaped characters.
27 echo "============="
28 echo "VERTICAL TABS"
29 echo -e "\v\v\v\v" # Prints 4 vertical tabs.
30 echo "=============="
31
32 echo "QUOTATION MARK"
33 echo -e "\042" # Prints " (quote, octal ASCII character 42).
34 echo "=============="
35
36 # The $'\X' construct makes the -e option unnecessary.
37 echo; echo "NEWLINE AND BEEP"
38 echo $'\n' # Newline.
39 echo $'\a' # Alert (beep).
40
41 echo "==============="
42 echo "QUOTATION MARKS"
43 # Version 2 and later of Bash permits using the $'\nnn' construct.
44 # Note that in this case, '\nnn' is an octal value.
45 echo $'\t \042 \t' # Quote (") framed by tabs.
46
47 # It also works with hexadecimal values, in an $'\xhhh' construct.
48 echo $'\t \x22 \t' # Quote (") framed by tabs.
49 # Thank you, Greg Keraunen, for pointing this out.
50 # Earlier Bash versions allowed '\x022'.
51 echo "==============="
52 echo
53
54
55
56
57
58 # Assigning ASCII characters to a variable.
59 # ----------------------------------------
60 quote=$'\042' # " assigned to a variable.
61 echo "$quote This is a quoted string, $quote and this lies outside the quotes."
62
63 echo
64
65 # Concatenating ASCII chars in a variable.
66 triple_underline=$'\137\137\137' # 137 is octal ASCII code for '_'.
67 echo "$triple_underline UNDERLINE $triple_underline"
68
69 echo
70
71 ABC=$'\101\102\103\010' # 101, 102, 103 are octal A, B, C.
72 echo $ABC
73
74 echo; echo
75
76 escape=$'\033' # 033 is octal for escape.
77 echo "\"escape\" echoes as $escape"
78 # no visible output.
79
80 echo; echo
81
82 exit 0
See Example 36-1 for another example of the $' ... ' string-expansion construct.
\"
gives the quote its literal meaning
\$
gives the dollar sign its literal meaning (variable name following \$ will not be referenced)
\\
gives the backslash its literal meaning
Elements of a string assigned to a variable may be escaped, but the escape character alone may not be
assigned to a variable.
1 variable=\
2 echo "$variable"
3 # Will not work - gives an error message:
4 # test.sh: : command not found
5 # A "naked" escape cannot safely be assigned to a variable.
6 #
7 # What actually happens here is that the "\" escapes the newline and
8 #+ the effect is variable=echo "$variable"
9 #+ invalid variable assignment
10
11 variable=\
12 23skidoo
13 echo "$variable" # 23skidoo
14 # This works, since the second line
15 #+ is a valid variable assignment.
16
17 variable=\
18 # \^ escape followed by space
19 echo "$variable" # space
20
21 variable=\\
22 echo "$variable" # \
23
24 variable=\\\
25 echo "$variable"
26 # Will not work - gives an error message:
27 # test.sh: \: command not found
28 #
29 # First escape escapes second one, but the third one is left "naked",
30 #+ with same result as first instance, above.
31
32 variable=\\\\
33 echo "$variable" # \\
34 # Second and fourth escapes escaped.
35 # This is o.k.
Escaping a space can prevent word splitting in a command's argument list.
The escape also provides a means of writing a multi-line command. Normally, each separate line constitutes a
different command, but an escape at the end of a line escapes the newline character, and the command
sequence continues on to the next line.
1 echo "foo
2 bar"
3 #foo
4 #bar
5
6 echo
7
8 echo 'foo
9 bar' # No difference yet.
10 #foo
11 #bar
12
13 echo
14
15 echo foo\
16 bar # Newline escaped.
17 #foobar
18
19 echo
20
21 echo "foo\
22 bar" # Same here, as \ still interpreted as escape within weak quotes.
23 #foobar
24
25 echo
26
27 echo 'foo\
28 bar' # Escape character \ taken literally because of strong quoting.
29 #foo\
30 #bar
31
32 # Examples suggested by Stéphane Chazelas.
... there are dark corners in the Bourne shell, and people use all of them.
--Chet Ramey
The exit command terminates a script, just as in a C program. It can also return a value, which is available to
the script's parent process.
Every command returns an exit status (sometimes referred to as a return status or exit code). A successful
command returns a 0, while an unsuccessful one returns a non-zero value that usually can be interpreted as an
error code. Well-behaved UNIX commands, programs, and utilities return a 0 exit code upon successful
completion, though there are some exceptions.
Likewise, functions within a script and the script itself return an exit status. The last command executed in the
function or script determines the exit status. Within a script, an exit nnn command may be used to deliver
an nnn exit status to the shell (nnn must be an integer in the 0 - 255 range).
When a script ends with an exit that has no parameter, the exit status of the script is the exit status of the
last command executed in the script (previous to the exit).
1 #!/bin/bash
2
3 COMMAND_1
4
5 . . .
6
7 COMMAND_LAST
8
9 # Will exit with status of last command.
10
11 exit
The equivalent of a bare exit is exit $? or even just omitting the exit.
1 #!/bin/bash
2
3 COMMAND_1
4
5 . . .
6
7 COMMAND_LAST
8
9 # Will exit with status of last command.
10
11 exit $?
1 #!/bin/bash
2
3 COMMAND1
4
5 . . .
6
7 COMMAND_LAST
8
9 # Will exit with status of last command.
$? reads the exit status of the last command executed. After a function returns, $? gives the exit status of the
last command executed in the function. This is Bash's way of giving functions a "return value." [1]
Following the execution of a pipe, a $? gives the exit status of the last command executed.
After a script terminates, a $? from the command-line gives the exit status of the script, that is, the last
command executed in the script, which is, by convention, 0 on success or an integer in the range 1 - 255 on
error.
1 #!/bin/bash
2
3 echo hello
4 echo $? # Exit status 0 returned because command executed successfully.
5
6 lskdf # Unrecognized command.
7 echo $? # Non-zero exit status returned because command failed to execute.
8
9 echo
10
11 exit 113 # Will return 113 to shell.
12 # To verify this, type "echo $?" after script terminates.
13
14 # By convention, an 'exit 0' indicates success,
15 #+ while a non-zero exit value means an error or anomalous condition.
$? is especially useful for testing the result of a command in a script (see Example 16-35 and Example 16-20).
The !, the logical not qualifier, reverses the outcome of a test or command, and this affects its exit status.
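A minimal sketch:
1 true                                       # The "true" builtin.
2 echo "exit status of \"true\" = $?"        # 0
3 ! true
4 echo "exit status of \"! true\" = $?"      # 1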
Certain exit status codes have reserved meanings and should not be user-specified in a script.
Notes
Chapter 7. Tests
Every reasonably complete programming language can test for a condition, then act according to the result of
the test. Bash has the test command, various bracket and parenthesis operators, and the if/then construct.
7.1. Test Constructs
• An if/then construct tests whether the exit status of a list of commands is 0 (since 0 means "success"
by UNIX convention), and if so, executes one or more commands.
• There exists a dedicated command called [ (left bracket special character). It is a synonym for test,
and a builtin for efficiency reasons. This command considers its arguments as comparison expressions
or file tests and returns an exit status corresponding to the result of the comparison (0 for true, 1 for
false).
• With version 2.02, Bash introduced the [[ ... ]] extended test command, which performs comparisons
in a manner more familiar to programmers from other languages. Note that [[ is a keyword, not a
command.
1 #!/bin/bash
2
3 # Tip:
4 # If you're unsure of how a certain condition would evaluate,
5 #+ test it in an if-test.
6
7 echo
8
9 echo "Testing \"0\""
10 if [ 0 ] # zero
11 then
12 echo "0 is true."
13 else # Or else ...
14 echo "0 is false."
15 fi # 0 is true.
16
17 echo
18
19 echo "Testing \"1\""
20 if [ 1 ] # one
21 then
22 echo "1 is true."
23 else
24 echo "1 is false."
25 fi # 1 is true.
26
27 echo
28
29 echo "Testing \"-1\""
30 if [ -1 ] # minus one
31 then
32 echo "-1 is true."
33 else
34 echo "-1 is false."
35 fi # -1 is true.
36
37 echo
38
39 echo "Testing \"NULL\""
40 if [ ] # NULL (empty condition)
41 then
42 echo "NULL is true."
43 else
44 echo "NULL is false."
45 fi # NULL is false.
46
47 echo
48
49 echo "Testing \"xyz\""
50 if [ xyz ] # string
51 then
52 echo "Random string is true."
53 else
54 echo "Random string is false."
55 fi # Random string is true.
56
57 echo
58
59 echo "Testing \"\$xyz\""
60 if [ $xyz ] # Tests if $xyz is null, but...
61 # it's only an uninitialized variable.
62 then
63 echo "Uninitialized variable is true."
64 else
65 echo "Uninitialized variable is false."
66 fi # Uninitialized variable is false.
67
68 echo
69
70 echo "Testing \"-n \$xyz\""
71 if [ -n "$xyz" ] # More pedantically correct.
72 then
73 echo "Uninitialized variable is true."
74 else
75 echo "Uninitialized variable is false."
76 fi # Uninitialized variable is false.
77
78 echo
79
80
81 xyz= # Initialized, but set to null value.
82
83 echo "Testing \"-n \$xyz\""
84 if [ -n "$xyz" ]
85 then
86 echo "Null variable is true."
87 else
88 echo "Null variable is false."
89 fi # Null variable is false.
90
91
92 echo
93
94
95 # When is "false" true?
96
97 echo "Testing \"false\""
98 if [ "false" ] # It seems that "false" is just a string.
99 then
100 echo "\"false\" is true." #+ and it tests true.
101 else
102 echo "\"false\" is false."
103 fi # "false" is true.
104
105 echo
106
107 echo "Testing \"\$false\"" # Again, uninitialized variable.
108 if [ "$false" ]
109 then
110 echo "\"\$false\" is true."
111 else
112 echo "\"\$false\" is false."
113 fi # "$false" is false.
114 # Now, we get the expected result.
115
116 # What would happen if we tested the uninitialized variable "$true"?
117
118 echo
119
120 exit 0
1 if [ condition-true ]
2 then
3 command 1
4 command 2
5 ...
6 else # Or else ...
7 # Adds default code block executing if original condition tests false.
8 command 3
9 command 4
10 ...
11 fi
When if and then are on same line in a condition test, a semicolon must terminate the if statement. Both if
and then are keywords. Keywords (or commands) begin statements, and before a new statement on the
same line begins, the old one must terminate.
1 if [ -x "$filename" ]; then
Else if and elif
elif
elif is a contraction for else if. The effect is to nest an inner if/then construct within an outer one.
1 if [ condition1 ]
2 then
3 command1
4 command2
5 command3
6 elif [ condition2 ]
7 # Same as else if
8 then
9 command4
10 command5
11 else
12 default-command
13 fi
The test command is a Bash builtin which tests file types and compares strings. Therefore, in a Bash
script, test does not call the external /usr/bin/test binary, which is part of the sh-utils package.
Likewise, [ does not call /usr/bin/[, which is linked to /usr/bin/test.
bash$ type test
test is a shell builtin
bash$ type '['
[ is a shell builtin
bash$ type '[['
[[ is a shell keyword
bash$ type ']]'
]] is a shell keyword
bash$ type ']'
bash: type: ]: not found
If, for some reason, you wish to use /usr/bin/test in a Bash script, then specify it by full
pathname.
1 #!/bin/bash
2
3 echo
4
5 if test -z "$1"
6 then
7 echo "No command-line arguments."
8 else
9 echo "First command-line argument is $1."
10 fi
11
12 echo
13
14 if /usr/bin/test -z "$1" # Equivalent to "test" builtin.
15 # ^^^^^^^^^^^^^ # Specifying full pathname.
16 then
17 echo "No command-line arguments."
18 else
19 echo "First command-line argument is $1."
20 fi
21
22 echo
23
24 if [ -z "$1" ] # Functionally identical to above code blocks.
25 # if [ -z "$1" should work, but...
26 #+ Bash responds to a missing close-bracket with an error message.
27 then
28 echo "No command-line arguments."
29 else
30 echo "First command-line argument is $1."
31 fi
32
33 echo
34
35
36 if /usr/bin/[ -z "$1" ] # Again, functionally identical to above.
37 # if /usr/bin/[ -z "$1" # Works, but gives an error message.
38 # # Note:
39 # This has been fixed in Bash, version 3.x.
40 then
41 echo "No command-line arguments."
42 else
43 echo "First command-line argument is $1."
44 fi
45
46 echo
47
48 exit 0
The [[ ]] construct is the more versatile Bash version of [ ]. This is the extended test command, adopted from
ksh88.
No filename expansion or word splitting takes place between [[ and ]], but there is parameter expansion and
command substitution.
1 file=/etc/passwd
2
3 if [[ -e $file ]]
4 then
5 echo "Password file exists."
6 fi
Using the [[ ... ]] test construct, rather than [ ... ] can prevent many logic errors in scripts. For example, the
&&, ||, <, and > operators work within a [[ ]] test, despite giving an error within a [ ] construct.
Arithmetic evaluation of octal / hexadecimal constants takes place automatically within a [[ ... ]] construct.
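A minimal sketch of that difference:
1 decimal=15
2 octal=017   # = 15 (decimal)
3 if [[ "$decimal" -eq "$octal" ]]
4 then
5   echo "$decimal equals $octal"   # 15 equals 017, because [[ ]] evaluates the octal constant.
6 fi   # Within single brackets, [ "$decimal" -eq "$octal" ] tests unequal (017 read as decimal 17).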
Following an if, neither the test command nor the test brackets ( [ ] or [[ ]] ) are strictly necessary.
1 dir=/home/bozo
2
3 if cd "$dir" 2>/dev/null; then # "2>/dev/null" hides error message.
4 echo "Now in $dir."
5 else
6 echo "Can't change to $dir."
7 fi
The "if COMMAND" construct returns the exit status of COMMAND.
Similarly, a condition within test brackets may stand alone without an if, when used in combination with
a list construct.
1 var1=20
2 var2=22
3 [ "$var1" -ne "$var2" ] && echo "$var1 is not equal to $var2"
4
5 home=/home/bozo
6 [ -d "$home" ] || echo "$home directory does not exist."
The (( )) construct expands and evaluates an arithmetic expression. If the expression evaluates as zero, it
returns an exit status of 1, or "false". A non-zero expression returns an exit status of 0, or "true". This is in
marked contrast to using the test and [ ] constructs previously discussed.
1 #!/bin/bash
2 # arith-tests.sh
3 # Arithmetic tests.
4
5 # The (( ... )) construct evaluates and tests numerical expressions.
6 # Exit status opposite from [ ... ] construct!
7
8 (( 0 ))
9 echo "Exit status of \"(( 0 ))\" is $?." # 1
10
11 (( 1 ))
12 echo "Exit status of \"(( 1 ))\" is $?." # 0
13
14 (( 5 > 4 )) # true
15 echo "Exit status of \"(( 5 > 4 ))\" is $?." # 0
16
17 (( 5 > 9 )) # false
18 echo "Exit status of \"(( 5 > 9 ))\" is $?." # 1
19
20 (( 5 == 5 )) # true
21 echo "Exit status of \"(( 5 == 5 ))\" is $?." # 0
22 # (( 5 = 5 )) gives an error message.
23
24 (( 5 - 5 )) # 0
25 echo "Exit status of \"(( 5 - 5 ))\" is $?." # 1
26
27 (( 5 / 4 )) # Division o.k.
28 echo "Exit status of \"(( 5 / 4 ))\" is $?." # 0
29
30 (( 1 / 2 )) # Division result < 1.
31 echo "Exit status of \"(( 1 / 2 ))\" is $?." # Rounded off to 0.
32 # 1
33
34 (( 1 / 0 )) 2>/dev/null # Illegal division by 0.
35 # ^^^^^^^^^^^
36 echo "Exit status of \"(( 1 / 0 ))\" is $?." # 1
37
38 # What effect does the "2>/dev/null" have?
39 # What would happen if it were removed?
40 # Try removing it, then rerunning the script.
41
42 # ======================================= #
43
44 # (( ... )) also useful in an if-then test.
45
46 var1=5
47 var2=4
48
49 if (( var1 > var2 ))
50 then #^ ^ Note: Not $var1, $var2. Why?
51 echo "$var1 is greater than $var2"
52 fi # 5 is greater than 4
53
54 exit 0
Notes
[1] A token is a symbol or short string with a special meaning attached to it (a meta-meaning). In Bash,
certain tokens, such as [ and . (dot-command), may expand to keywords and commands.
7.2. File test operators
Returns true if...
-e
file exists
-a
file exists
This is identical in effect to -e. It has been "deprecated," [1] and its use is discouraged.
-f
file is a regular file (not a directory or device file)
-s
file is not zero size
-d
file is a directory
-b
file is a block device
-c
file is a character device
-t
file (descriptor) is associated with a terminal device
This test option may be used to check whether the stdin [ -t 0 ] or stdout [ -t 1 ] in a
given script is a terminal.
-r
file has read permission (for the user running the test)
-w
file has write permission (for the user running the test)
-x
file has execute permission (for the user running the test)
-g
set-group-id (sgid) flag set on file or directory
If a directory has the sgid flag set, then a file created within that directory belongs to the group that
owns the directory, not necessarily to the group of the user who created the file. This may be useful
for a directory shared by a workgroup.
-u
set-user-id (suid) flag set on file
A binary owned by root with set-user-id flag set runs with root privileges, even when an
ordinary user invokes it. [2] This is useful for executables (such as pppd and cdrecord) that need to
access system hardware. Lacking the suid flag, these binaries could not be invoked by a non-root
user.
-k
sticky bit set
Commonly known as the sticky bit, the save-text-mode flag is a special type of file permission. If a
file has this flag set, that file will be kept in cache memory, for quicker access. [3] If set on a
directory, it restricts write permission. Setting the sticky bit adds a t to the permissions on the file or
directory listing.
If a user does not own a directory that has the sticky bit set, but has write permission in that directory,
she can only delete those files that she owns in it. This keeps users from inadvertently overwriting or
deleting each other's files in a publicly accessible directory, such as /tmp. (The owner of the
directory or root can, of course, delete or rename files there.)
-O
you are owner of file
-G
group-id of file same as yours
-N
file modified since it was last read
f1 -nt f2
file f1 is newer than f2
f1 -ot f2
file f1 is older than f2
f1 -ef f2
files f1 and f2 are hard links to the same file
!
"not" -- reverses the sense of the tests above (returns true if condition absent).
1 #!/bin/bash
2 # broken-link.sh
3 # Written by Lee bigelow <[email protected]>
4 # Used in ABS Guide with permission.
5
6 # A pure shell script to find dead symlinks and output them quoted
7 #+ so they can be fed to xargs and dealt with :)
8 #+ eg. sh broken-link.sh /somedir /someotherdir|xargs rm
9 #
10 # This, however, is a better method:
11 #
12 # find "somedir" -type l -print0|\
13 # xargs -r0 file|\
14 # grep "broken symbolic"|
15 # sed -e 's/^\|: *broken symbolic.*$/"/g'
16 #
17 #+ but that wouldn't be pure Bash, now would it.
18 # Caution: beware the /proc file system and any circular links!
19 ################################################################
20
21
22 # If no args are passed to the script set directories-to-search
23 #+ to current directory. Otherwise set the directories-to-search
24 #+ to the args passed.
25 ######################
26
27 [ $# -eq 0 ] && directorys=`pwd` || directorys=$@
28
29
30 # Setup the function linkchk to check the directory it is passed
31 #+ for files that are links and don't exist, then print them quoted.
32 # If one of the elements in the directory is a subdirectory then
33 #+ send that subdirectory to the linkcheck function.
34 ##########
35
36 linkchk () {
37 for element in $1/*; do
38 [ -h "$element" -a ! -e "$element" ] && echo \"$element\"
39 [ -d "$element" ] && linkchk $element
40 # Of course, '-h' tests for symbolic link, '-d' for directory.
41 done
42 }
43
44 # Send each arg that was passed to the script to the linkchk() function
45 #+ if it is a valid directory. If not, then print the error message
46 #+ and usage info.
47 ##################
48 for directory in $directorys; do
49 if [ -d $directory ]
50 then linkchk $directory
51 else
52 echo "$directory is not a directory"
53 echo "Usage: $0 dir1 dir2 ..."
54 fi
55 done
56
57 exit $?
Example 30-1, Example 11-7, Example 11-3, Example 30-3, and Example A-1 also illustrate uses of the file
test operators.
Notes
[1] Deprecate:
To pray against, as an evil;
to seek to avert by prayer;
to desire the removal of;
to seek deliverance from;
to express deep regret for;
to disapprove of strongly.
[2] Be aware that suid binaries may open security holes. The suid flag has no effect on shell scripts.
[3] On Linux systems, the sticky bit is no longer used for files, only on directories.
integer comparison
-eq
is equal to
if [ "$a" -eq "$b" ]
string comparison
=
is equal to
if [ "$a" = "$b" ]
==
is equal to
if [ "$a" == "$b" ]
The == comparison operator behaves differently within a double-brackets test than within
single brackets.
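A minimal sketch of the difference (pattern matching happens only between double brackets):
1 [[ $a == z* ]]     # True if $a starts with a "z" (pattern matching).
2 [[ $a == "z*" ]]   # True if $a is equal to z* (literal matching).
3 [ "$a" == "z*" ]   # True if $a is equal to z* (literal matching).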
!=
is not equal to
if [ "$a" != "$b" ]
1 #!/bin/bash
2
3 a=4
4 b=5
5
6 # Here "a" and "b" can be treated either as integers or strings.
7 # There is some blurring between the arithmetic and string comparisons,
8 #+ since Bash variables are not strongly typed.
9
10 # Bash permits integer operations and comparisons on variables
11 #+ whose value consists of all-integer characters.
12 # Caution advised, however.
13
14 echo
15
16 if [ "$a" -ne "$b" ]
17 then
18 echo "$a is not equal to $b"
19 echo "(arithmetic comparison)"
20 fi
21
22 echo
23
24 if [ "$a" != "$b" ]
25 then
26 echo "$a is not equal to $b."
27 echo "(string comparison)"
28 # "4" != "5"
29 # ASCII 52 != ASCII 53
30 fi
31
32 # In this particular instance, both "-ne" and "!=" work.
33
34 echo
35
36 exit 0
1 #!/bin/bash
2 # str-test.sh: Testing null strings and unquoted strings,
3 #+ but not strings and sealing wax, not to mention cabbages and kings . . .
4
5 # Using if [ ... ]
6
7 # If a string has not been initialized, it has no defined value.
8 # This state is called "null" (not the same as zero!).
9
10 if [ -n $string1 ] # string1 has not been declared or initialized.
11 then
12 echo "String \"string1\" is not null."
13 else
14 echo "String \"string1\" is null."
15 fi # Wrong result.
16 # Shows $string1 as not null, although it was not initialized.
17
18 echo
19
20 # Let's try it again.
21
22 if [ -n "$string1" ] # This time, $string1 is quoted.
23 then
24 echo "String \"string1\" is not null."
25 else
26 echo "String \"string1\" is null."
27 fi # Quote strings within test brackets!
28
29 echo
30
31 if [ $string1 ] # This time, $string1 stands naked.
32 then
33 echo "String \"string1\" is not null."
34 else
35 echo "String \"string1\" is null."
36 fi # This works fine.
37 # The [ ... ] test operator alone detects whether the string is null.
38 # However it is good practice to quote it (if [ "$string1" ]).
39 #
40 # As Stephane Chazelas points out,
41 # if [ $string1 ] has one argument, "]"
42 # if [ "$string1" ] has two arguments, the empty "$string1" and "]"
43
44
45 echo
46
47
48 string1=initialized
49
50 if [ $string1 ] # Again, $string1 stands unquoted.
51 then
52 echo "String \"string1\" is not null."
53 else
54 echo "String \"string1\" is null."
55 fi # Again, gives correct result.
56 # Still, it is better to quote it ("$string1"), because . . .
57
58
59 string1="a = b"
60
61 if [ $string1 ] # Again, $string1 stands unquoted.
62 then
63 echo "String \"string1\" is not null."
64 else
65 echo "String \"string1\" is null."
66 fi # Not quoting "$string1" now gives wrong result!
67
68 exit 0 # Thank you, also, Florian Wisser, for the "heads-up".
1 #!/bin/bash
2 # zmore
3
4 # View gzipped files with 'more' filter.
5
6 E_NOARGS=65
7 E_NOTFOUND=66
8 E_NOTGZIP=67
9
10 if [ $# -eq 0 ] # same effect as: if [ -z "$1" ]
11 # $1 can exist, but be empty: zmore "" arg2 arg3
12 then
13 echo "Usage: `basename $0` filename" >&2
14 # Error message to stderr.
15 exit $E_NOARGS
16 # Returns 65 as exit status of script (error code).
17 fi
18
19 filename=$1
20
21 if [ ! -f "$filename" ] # Quoting $filename allows for possible spaces.
22 then
23 echo "File $filename not found!" >&2 # Error message to stderr.
24 exit $E_NOTFOUND
25 fi
26
27 if [ ${filename##*.} != "gz" ]
28 # Using bracket in variable substitution.
29 then
30 echo "File $1 is not a gzipped file!"
31 exit $E_NOTGZIP
32 fi
33
34 zcat $1 | more
35
36 # Uses the 'more' filter.
37 # May substitute 'less' if desired.
38
39 exit $? # Script returns exit status of pipe.
40 # Actually "exit $?" is unnecessary, as the script will, in any case,
41 #+ return the exit status of the last command executed.
compound comparison
-a
logical and
exp1 -a exp2 returns true if both exp1 and exp2 are true.
-o
logical or
exp1 -o exp2 returns true if either exp1 or exp2 is true.
These are similar to the Bash comparison operators && and ||, used within double brackets.
1 if [ "$expr1" -a "$expr2" ]
2 then
3 echo "Both expr1 and expr2 are true."
4 else
5 echo "Either expr1 or expr2 is false."
6 fi
But, as rihad points out:
1 [ 1 -eq 1 ] && [ -n "`echo true 1>&2`" ] # true
2 [ 1 -eq 2 ] && [ -n "`echo true 1>&2`" ] # (no output)
3 # ^^^^^^^ False condition. So far, everything as expected.
4
5 # However ...
6 [ 1 -eq 2 -a -n "`echo true 1>&2`" ] # true
7 # ^^^^^^^ False condition. So, why "true" output?
8
9 # Is it because both condition clauses within brackets evaluate?
10 [[ 1 -eq 2 && -n "`echo true 1>&2`" ]] # (no output)
11 # No, that's not it.
12
13 # Apparently && and || "short-circuit" while -a and -o do not.
Refer to Example 8-3, Example 27-17, and Example A-29 to see compound comparison operators in action.
Notes
[1] As S.C. points out, in a compound test, even quoting the string variable might not suffice. [ -n
"$string" -o "$a" = "$b" ] may cause an error with some versions of Bash if $string is
empty. The safe way is to append an extra character to possibly empty variables, [ "x$string" !=
x -o "x$a" = "x$b" ] (the "x's" cancel out).
1 a=3
2
3 if [ "$a" -gt 0 ]
4 then
5 if [ "$a" -lt 5 ]
6 then
7 echo "The value of \"a\" lies somewhere between 0 and 5."
8 fi
9 fi
10
11 # Same result as:
12
13 if [ "$a" -gt 0 ] && [ "$a" -lt 5 ]
14 then
15 echo "The value of \"a\" lies somewhere between 0 and 5."
16 fi
Example 36-4 demonstrates a nested if/then condition test.
1 if [ -f $HOME/.Xclients ]; then
2 exec $HOME/.Xclients
3 elif [ -f /etc/X11/xinit/Xclients ]; then
4 exec /etc/X11/xinit/Xclients
5 else
6 # failsafe settings. Although we should never get here
7 # (we provide fallbacks in Xclients as well) it can't hurt.
8 xclock -geometry 100x100-5+5 &
9 xterm -geometry 80x50-50+150 &
10 if [ -f /usr/bin/netscape -a -f /usr/share/doc/HTML/index.html ]; then
11 netscape /usr/share/doc/HTML/index.html &
12 fi
13 fi
Explain the test constructs in the above snippet, then examine an updated version of the file,
/etc/X11/xinit/xinitrc, and analyze the if/then test constructs there. You may need to refer ahead to
the discussions of grep, sed, and regular expressions.
8.1. Operators
variable assignment
Initializing or changing the value of a variable
=
All-purpose assignment operator, which works for both arithmetic and string assignments.
1 var=27
2 category=minerals # No spaces allowed after the "=".
Do not confuse the "=" assignment operator with the = test operator.
1 # = as a test operator
2
3 if [ "$string1" = "$string2" ]
4 then
5 command
6 fi
7
8 # if [ "X$string1" = "X$string2" ] is safer,
9 #+ to prevent an error message should one of the variables be empty.
10 # (The prepended "X" characters cancel out.)
arithmetic operators
+
plus
-
minus
*
multiplication
/
division
**
exponentiation
%
modulo, or mod (returns the remainder of an integer division operation)
bash$ expr 5 % 3
2
This operator finds use in, among other things, generating numbers within a specific range (see
Example 9-11 and Example 9-15) and formatting program output (see Example 27-16 and Example
A-6). It can even be used to generate prime numbers, (see Example A-15). Modulo turns up
surprisingly often in numerical recipes.
Example 8-1. Greatest common divisor
1 #!/bin/bash
2 # gcd.sh: greatest common divisor
3 # Uses Euclid's algorithm
4
5 # The "greatest common divisor" (gcd) of two integers
6 #+ is the largest integer that will divide both, leaving no remainder.
7
8 # Euclid's algorithm uses successive division.
9 # In each pass,
10 #+ dividend <--- divisor
11 #+ divisor <--- remainder
12 #+ until remainder = 0.
13 # The gcd = dividend, on the final pass.
14 #
15 # For an excellent discussion of Euclid's algorithm, see
16 #+ Jim Loy's site, http://www.jimloy.com/number/euclids.htm.
17
18
19 # ------------------------------------------------------
20 # Argument check
21 ARGS=2
22 E_BADARGS=85
23
24 if [ $# -ne "$ARGS" ]
25 then
26 echo "Usage: `basename $0` first-number second-number"
27 exit $E_BADARGS
28 fi
29 # ------------------------------------------------------
30
31
32 gcd ()
33 {
34
35 dividend=$1 # Arbitrary assignment.
36 divisor=$2 #! It doesn't matter which of the two is larger.
37 # Why not?
38
39 remainder=1 # If an uninitialized variable is used inside
40 #+ test brackets, an error message results.
41
42 until [ "$remainder" -eq 0 ]
43 do # ^^^^^^^^^^ Must be previously initialized!
44 let "remainder = $dividend % $divisor"
45 dividend=$divisor # Now repeat with 2 smallest numbers.
46 divisor=$remainder
47 done # Euclid's algorithm
48
49 } # Last $dividend is the gcd.
50
51
52 gcd $1 $2
53
54 echo; echo "GCD of $1 and $2 = $dividend"; echo
55
56
57 # Exercises :
58 # ---------
59 # 1) Check command-line arguments to make sure they are integers,
60 #+ and exit the script with an appropriate error message if not.
61 # 2) Rewrite the gcd () function to use local variables.
62
63 exit 0
+=
plus-equal (increment variable by a constant)
let "var += 5" results in var being incremented by 5.
-=
minus-equal (decrement variable by a constant)
*=
times-equal (multiply variable by a constant)
/=
slash-equal (divide variable by a constant)
%=
mod-equal (remainder of dividing variable by a constant)
Arithmetic operators often occur in an expr or let expression.
1 #!/bin/bash
2 # Counting to 11 in 10 different ways.
3
4 n=1; echo -n "$n "
5
6 let "n = $n + 1" # let "n = n + 1" also works.
7 echo -n "$n "
8
9
10 : $((n = $n + 1))
11 # ":" necessary because otherwise Bash attempts
12 #+ to interpret "$((n = $n + 1))" as a command.
13 echo -n "$n "
14
15 (( n = n + 1 ))
16 # A simpler alternative to the method above.
17 # Thanks, David Lombard, for pointing this out.
18 echo -n "$n "
19
20 n=$(($n + 1))
21 echo -n "$n "
22
23 : $[ n = $n + 1 ]
24 # ":" necessary because otherwise Bash attempts
25 #+ to interpret "$[ n = $n + 1 ]" as a command.
26 # Works even if "n" was initialized as a string.
27 echo -n "$n "
28
29 n=$[ $n + 1 ]
30 # Works even if "n" was initialized as a string.
31 #* Avoid this type of construct, since it is obsolete and nonportable.
32 # Thanks, Stephane Chazelas.
33 echo -n "$n "
34
35 # Now for C-style increment operators.
36 # Thanks, Frank Wang, for pointing this out.
37
38 let "n++" # let "++n" also works.
39 echo -n "$n "
40
41 (( n++ )) # (( ++n )) also works.
42 echo -n "$n "
43
44 : $(( n++ )) # : $(( ++n )) also works.
45 echo -n "$n "
46
47 : $[ n++ ] # : $[ ++n ] also works
48 echo -n "$n "
49
50 echo
51
52 exit 0
Integer variables in older versions of Bash were signed long (32-bit) integers, in the range of
-2147483648 to 2147483647. An operation that took a variable outside these limits gave an erroneous
result.
Bash does not understand floating point arithmetic. It treats numbers containing a decimal point as
strings.
1 a=1.5
2
3 let "b = $a + 1.3" # Error.
4 # t2.sh: let: b = 1.5 + 1.3: syntax error in expression
5 # (error token is ".5 + 1.3")
6
7 echo "b = $b" # b=1
Use bc in scripts that need floating point calculations or math library functions.
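A minimal sketch, handing the calculation off to bc:
1 a=1.5
2 b=$(echo "$a + 1.3" | bc)   # bc does the floating-point arithmetic.
3 echo "b = $b"               # b = 2.8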
bitwise operators. The bitwise operators seldom make an appearance in shell scripts. Their chief use seems to
be manipulating and testing values read from ports or sockets. "Bit flipping" is more relevant to compiled
languages, such as C and C++, which provide direct access to system hardware.
bitwise operators
<<
bitwise left shift (multiplies by 2 for each shift position)
<<=
left-shift-equal
let "var <<= 2" results in var left-shifted 2 bits (multiplied by 4)
>>
bitwise right shift (divides by 2 for each shift position)
>>=
right-shift-equal (inverse of <<=)
&
bitwise AND
&=
bitwise AND-equal
|
bitwise OR
|=
bitwise OR-equal
~
bitwise NOT
^
bitwise XOR
^=
bitwise XOR-equal
logical (boolean) operators
!
NOT
1 if [ ! -f $FILENAME ]
2 then
3 ...
&&
AND
1 if [ $condition1 ] && [ $condition2 ]
2 # Same as: if [ $condition1 -a $condition2 ]
3 # Returns true if both condition1 and condition2 hold true...
4
5 if [[ $condition1 && $condition2 ]] # Also works.
6 # Note that && operator not permitted inside brackets
7 #+ of a [ ... ] construct.
||
OR
1 if [ $condition1 ] || [ $condition2 ]
2 # Same as: if [ $condition1 -o $condition2 ]
3 # Returns true if either condition1 or condition2 holds true...
4
5 if [[ $condition1 || $condition2 ]] # Also works.
6 # Note that || operator not permitted inside brackets
7 #+ of a [ ... ] construct.
Bash tests the exit status of each statement linked with a logical operator.
1 #!/bin/bash
2
3 a=24
4 b=47
5
6 if [ "$a" -eq 24 ] && [ "$b" -eq 47 ]
7 then
8 echo "Test #1 succeeds."
9 else
10 echo "Test #1 fails."
11 fi
12
13 # ERROR: if [ "$a" -eq 24 && "$b" -eq 47 ]
14 #+ attempts to execute ' [ "$a" -eq 24 '
15 #+ and fails to find a matching ']'.
16 #
17 # Note: if [[ $a -eq 24 && $b -eq 24 ]] works.
18 # The double-bracket if-test is more flexible
19 #+ than the single-bracket version.
20 # (The "&&" has a different meaning in line 17 than in line 6.)
21 # Thanks, Stephane Chazelas, for pointing this out.
22
23
24 if [ "$a" -eq 98 ] || [ "$b" -eq 47 ]
25 then
26 echo "Test #2 succeeds."
27 else
28 echo "Test #2 fails."
29 fi
30
31
32 # The -a and -o options provide
33 #+ an alternative compound condition test.
34 # Thanks to Patrick Callahan for pointing this out.
35
36
37 if [ "$a" -eq 24 -a "$b" -eq 47 ]
38 then
39 echo "Test #3 succeeds."
40 else
41 echo "Test #3 fails."
42 fi
43
44
45 if [ "$a" -eq 98 -o "$b" -eq 47 ]
46 then
47 echo "Test #4 succeeds."
48 else
49 echo "Test #4 fails."
50 fi
51
52
53 a=rhino
54 b=crocodile
55 if [ "$a" = rhino ] && [ "$b" = crocodile ]
56 then
57 echo "Test #5 succeeds."
58 else
59 echo "Test #5 fails."
60 fi
61
62 exit 0
The && and || operators also find use in an arithmetic context.
bash$ echo $(( 1 && 2 )) $((3 && 0)) $((4 || 0)) $((0 || 0))
1 0 1 0
miscellaneous operators
,
Comma operator
The comma operator chains together two or more arithmetic operations. All the operations are
evaluated (with possible side effects [1]), but only the last operation is returned.
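A minimal sketch:
1 let "t = ((5 + 3, 7 - 1, 15 - 4))"
2 echo "t = $t"   # t = 11
3 # Only the result of the last operation (15 - 4) is returned.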
Notes
[1] Side effects are, of course, unintended -- and usually undesirable -- consequences.
1 #!/bin/bash
2 # numbers.sh: Representation of numbers in different bases.
3
4 # Decimal: the default
5 let "dec = 32"
6 echo "decimal number = $dec" # 32
7 # Nothing out of the ordinary here.
8
9
10 # Octal: numbers preceded by '0' (zero)
11 let "oct = 032"
12 echo "octal number = $oct" # 26
13 # Expresses result in decimal.
14 # --------- ------ -- -------
15
16
17 # Hexadecimal: numbers preceded by '0x' or '0X'
18 let "hex = 0x32"
19 echo "hexadecimal number = $hex" # 50
20
21 echo $((0x9abc)) # 39612
22 # ^^ ^^ double-parentheses arithmetic expansion/evaluation
23 # Expresses result in decimal.
24
25
26
27 # Other bases: BASE#NUMBER
28 # BASE between 2 and 64.
29 # NUMBER must use symbols within the BASE range, see below.
30
31
32 let "bin = 2#111100111001101"
33 echo "binary number = $bin" # 31181
34
35 let "b32 = 32#77"
36 echo "base-32 number = $b32" # 231
37
38 let "b64 = 64#@_"
39 echo "base-64 number = $b64" # 4031
40 # This notation only works for a limited range (2 - 64) of ASCII characters.
41 # 10 digits + 26 lowercase characters + 26 uppercase characters + @ + _
42
43
44 echo
45
46 echo $((36#zz)) $((2#10101010)) $((16#AF16)) $((53#1aA))
47 # 1295 170 44822 3375
48
49
50 # Important note:
51 # --------------
52 # Using a digit out of range of the specified base notation
53 #+ gives an error message.
54
55 let "bad_oct = 081"
56 # (Partial) error message output:
57 # bad_oct = 081: value too great for base (error token is "081")
58 # Octal numbers use only digits in the range 0 - 7.
59
60 exit $? # Thanks, Rich Bartell and Stephane Chazelas, for clarification.
61
62 # $ sh numbers.sh
63 # $ echo $?
64 # 1
Similar to the let command, the (( ... )) construct permits arithmetic expansion and evaluation. In its simplest
form, a=$(( 5 + 3 )) would set a to 5 + 3, or 8. However, this double-parentheses construct is also a
mechanism for allowing C-style manipulation of variables in Bash, for example, (( var++ )).
1 #!/bin/bash
2 # c-vars.sh
3 # Manipulating a variable, C-style, using the (( ... )) construct.
4
5
6 echo
7
8 (( a = 23 )) # Setting a value, C-style,
9 #+ with spaces on both sides of the "=".
10 echo "a (initial value) = $a" # 23
11
12 (( a++ )) # Post-increment 'a', C-style.
13 echo "a (after a++) = $a" # 24
14
15 (( a-- )) # Post-decrement 'a', C-style.
16 echo "a (after a--) = $a" # 23
17
18
19 (( ++a )) # Pre-increment 'a', C-style.
20 echo "a (after ++a) = $a" # 24
21
22 (( --a )) # Pre-decrement 'a', C-style.
23 echo "a (after --a) = $a" # 23
24
25 echo
26
27 ########################################################
28 # Note that, as in C, pre- and post-decrement operators
29 #+ have different side-effects.
30
31 n=1; let --n && echo "True" || echo "False" # False
32 n=1; let n-- && echo "True" || echo "False" # True
33
34 # Thanks, Jeroen Domburg.
35 ########################################################
36
37 echo
38
39 (( t = a<45?7:11 )) # C-style trinary operator.
40 # ^ ^ ^
41 echo "If a < 45, then t = 7, else t = 11." # a = 23
42 echo "t = $t " # t = 7
43
44 echo
45
46
47 # -----------------
48 # Easter Egg alert!
49 # -----------------
50 # Chet Ramey seems to have snuck a bunch of undocumented C-style
51 #+ constructs into Bash (actually adapted from ksh, pretty much).
52 # In the Bash docs, Ramey calls (( ... )) shell arithmetic,
53 #+ but it goes far beyond that.
54 # Sorry, Chet, the secret is out.
55
56 # See also "for" and "while" loops using the (( ... )) construct.
57
58 # These work only with version 2.04 or later of Bash.
59
60 exit
In a script, operations execute in order of precedence: the higher precedence operations execute before the
lower precedence ones. [1]
• The "My Dear Aunt Sally" mantra (multiply, divide, add, subtract) for the familiar arithmetic
operations.
• The compound logical operators, &&, ||, -a, and -o have low precedence.
• The order of evaluation of equal-precedence operators is usually left-to-right.
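A couple of quick checks (a sketch, not taken from the original examples) confirm those rules inside (( ... )):
echo $(( 1 + 2 * 3 ))      # 7    Multiplication binds more tightly than addition.
echo $(( (1 + 2) * 3 ))    # 9    Parentheses force the grouping explicitly.
echo $(( 0 && 1 || 1 ))    # 1    && is evaluated before ||.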
Now, let's utilize our knowledge of operator precedence to analyze a couple of lines from the
/etc/init.d/functions file, as found in the Fedora Core Linux distro.
Notes
[1] Precedence, in this context, has approximately the same meaning as priority.
bash4$ echo $$
11015
But ...
1 #!/bin/bash4
2
3 echo "\$\$ outside of subshell = $$" # 9602
4 echo "\$BASH_SUBSHELL outside of subshell = $BASH_SUBSHELL" # 0
5 echo "\$BASHPID outside of subshell = $BASHPID" # 9602
6
7 echo
8
9 ( echo "\$\$ inside of subshell = $$" # 9602
10 echo "\$BASH_SUBSHELL inside of subshell = $BASH_SUBSHELL" # 1
11 echo "\$BASHPID inside of subshell = $BASHPID" ) # 9603
12 # Note that $$ returns PID of parent process.
$BASH_VERSINFO[n]
A 6-element array containing version information about the installed release of Bash. This is similar
to $BASH_VERSION, below, but a bit more detailed.
Checking $BASH_VERSION is a good method of determining which shell is running. $SHELL does
not necessarily give the correct answer.
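For example (an interactive sketch; the version numbers shown are placeholders for whatever your installed Bash reports):
bash$ echo $BASH_VERSION
4.2.45(1)-release

bash$ echo ${BASH_VERSINFO[0]} ${BASH_VERSINFO[1]} ${BASH_VERSINFO[2]}
4 2 45

bash$ echo ${BASH_VERSINFO[5]}     # Same value as $MACHTYPE.
x86_64-unknown-linux-gnu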
$CDPATH
A colon-separated list of search paths available to the cd command, similar in function to the $PATH
variable for binaries. The $CDPATH variable may be set in the local ~/.bashrc file.
bash$ cd bash-doc
bash: cd: bash-doc: No such file or directory
bash$ CDPATH=/usr/share/doc
bash$ cd bash-doc
/usr/share/doc/bash-doc
$DIRSTACK
The top value in the directory stack [1] (affected by pushd and popd)
This builtin variable corresponds to the dirs command; however, dirs shows the entire contents of the
directory stack.
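A short interactive sketch (directory names are placeholders):
bash$ pushd /usr/share/doc
/usr/share/doc ~

bash$ echo $DIRSTACK               # Top of the stack only.
/usr/share/doc

bash$ dirs                         # The entire stack.
/usr/share/doc ~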
$EDITOR
The default editor invoked by a script, usually vi or emacs.
$EUID
"effective" user ID number
Identification number of whatever identity the current user has assumed, perhaps by means of su.
$FUNCNAME
Name of the current function
1 xyz23 ()
2 {
3 echo "$FUNCNAME now executing." # xyz23 now executing.
4 }
5
6 xyz23
7
8 echo "FUNCNAME = $FUNCNAME" # FUNCNAME =
9 # Null value outside a function.
See also Example A-50.
$GLOBIGNORE
A list of filename patterns to be excluded from matching in globbing.
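For instance (a sketch with hypothetical filenames):
bash$ ls
junk.tmp  notes.txt  report.txt

bash$ GLOBIGNORE="*.tmp"           # Exclude *.tmp files from glob matches.
bash$ echo *
notes.txt report.txt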
$GROUPS
Groups current user belongs to
This is a listing (array) of the group id numbers for the current user, as recorded in /etc/passwd and
/etc/group.
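For example (group numbers will differ from user to user):
bash$ echo ${GROUPS[*]}
1000 24 27 30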
$HOME
Home directory of the user, usually /home/username (see Example 10-7)
$HOSTNAME
The hostname command assigns the system host name at bootup in an init script. However, the
gethostname() function sets the Bash internal variable $HOSTNAME. See also Example 10-7.
$HOSTTYPE
host type
$IFS
internal field separator
This variable determines how Bash recognizes fields, or word boundaries, when it interprets character
strings.
$IFS defaults to whitespace (space, tab, and newline), but may be changed, for example, to parse a
comma-separated data file. Note that $* uses the first character held in $IFS. See Example 5-1.
$IFS does not handle whitespace the same as it does other characters.
Example 9-1. $IFS and whitespace
1 #!/bin/bash
2 # ifs.sh
3
4
5 var1="a+b+c"
6 var2="d-e-f"
7 var3="g,h,i"
8
9 IFS=+
10 # The plus sign will be interpreted as a separator.
11 echo $var1 # a b c
12 echo $var2 # d-e-f
13 echo $var3 # g,h,i
14
15 echo
16
17 IFS="-"
18 # The plus sign reverts to default interpretation.
19 # The minus sign will be interpreted as a separator.
20 echo $var1 # a+b+c
21 echo $var2 # d e f
22 echo $var3 # g,h,i
23
24 echo
25
26 IFS=","
27 # The comma will be interpreted as a separator.
28 # The minus sign reverts to default interpretation.
29 echo $var1 # a+b+c
30 echo $var2 # d-e-f
31 echo $var3 # g h i
32
33 echo
34
35 IFS=" "
36 # The space character will be interpreted as a separator.
37 # The comma reverts to default interpretation.
38 echo $var1 # a+b+c
39 echo $var2 # d-e-f
40 echo $var3 # g,h,i
41
42 # ======================================================== #
43
44 # However ...
45 # $IFS treats whitespace differently than other characters.
46
47 output_args_one_per_line()
48 {
49 for arg
50 do
51 echo "[$arg]"
52 done # ^ ^ Embed within brackets, for your viewing pleasure.
53 }
54
55 echo; echo "IFS=\" \""
56 echo "-------"
57
58 IFS=" "
59 var=" a b c "
60 # ^ ^^ ^^^
61 output_args_one_per_line $var # output_args_one_per_line `echo " a b c "`
62 # [a]
63 # [b]
64 # [c]
65
66
67 echo; echo "IFS=:"
68 echo "-----"
69
70 IFS=:
71 var=":a::b:c:::" # Same pattern as above,
72 # ^ ^^ ^^^ #+ but substituting ":" for " " ...
73 output_args_one_per_line $var
74 # []
75 # [a]
76 # []
77 # [b]
78 # [c]
79 # []
80 # []
81
82 # Note "empty" brackets.
83 # The same thing happens with the "FS" field separator in awk.
84
85
86 echo
87
88 exit
See also Example 16-41, Example 11-7, and Example 19-14 for instructive examples of using $IFS.
$IGNOREEOF
Ignore EOF: how many end-of-files (control-D) the shell will ignore before logging out.
$LC_COLLATE
Often set in the .bashrc or /etc/profile files, this variable controls collation order in filename
expansion and pattern matching. If mishandled, LC_COLLATE can cause unexpected results in
filename globbing.
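A sketch of the kind of surprise this can cause (filenames are hypothetical, and the exact behavior depends on the locale and Bash version):
bash$ ls
File1  file2

bash$ export LC_COLLATE=en_US ; echo [a-z]*
File1 file2                        # Locale collation may interleave upper and lower case.

bash$ export LC_COLLATE=C ; echo [a-z]*
file2                              # C collation matches lowercase letters only.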
$PATH
Path to binaries
When given a command, the shell automatically does a hash table search on the directories listed in
the path for the executable. The path is stored in the environmental variable, $PATH, a list of
directories, separated by colons. Normally, the system stores the $PATH definition in
/etc/profile and/or ~/.bashrc (see Appendix G).
The current "working directory", ./, is usually omitted from the $PATH as a security
measure.
$PIPESTATUS
Array variable holding exit status(es) of last executed foreground pipe.
The members of the $PIPESTATUS array hold the exit status of each respective command executed
in a pipe. $PIPESTATUS[0] holds the exit status of the first command in the pipe,
$PIPESTATUS[1] the exit status of the second command, and so on.
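A short sketch (not one of the original listings) of capturing the whole array immediately after a pipe:
ls nosuchfile 2>/dev/null | wc -l
status=( "${PIPESTATUS[@]}" )      # Copy it at once -- the very next command overwrites it.
echo "ls exit status: ${status[0]}, wc exit status: ${status[1]}"
# ls exit status: 2, wc exit status: 0    (the exact ls error code may vary)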
The $PIPESTATUS variable may contain an erroneous 0 value in a login shell (in
releases prior to 3.0 of Bash).
tcsh% bash

bash$ who | grep nobody | sort
bash$ echo ${PIPESTATUS[*]}
0
The above lines contained in a script would produce the expected 0 1 0 output.
Thank you, Wayne Pollock for pointing this out and supplying the above example.
The $PIPESTATUS variable gives unexpected results in some contexts.
bash$ ls | bogus_command | wc
bash: bogus_command: command not found
0 0 0
Chet Ramey attributes the above output to the behavior of ls. If ls writes to a pipe
whose output is not read, then SIGPIPE kills it, and its exit status is 141. Otherwise
its exit status is 0, as expected. This likewise is the case for tr.
$PIPESTATUS is a "volatile" variable. It needs to be captured immediately after the
pipe in question, before any other command intervenes.
bash$ ls | bogus_command | wc
bash: bogus_command: command not found
0 0 0
The pipefail option may be useful in cases where $PIPESTATUS does not give the
desired information.
$PPID
The $PPID of a process is the process ID (pid) of its parent process. [2]
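For example (a quick interactive sketch; the PID and process name are placeholders):
bash$ echo $PPID                   # PID of this shell's parent process.
2456

bash$ ps -p $PPID -o comm=         # Name of that parent process.
gnome-terminal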
$PWD
Working directory (directory you are in at the time)
This is the analog to the pwd builtin command.
1 #!/bin/bash
2
3 E_WRONG_DIRECTORY=83
4
5 clear # Clear screen.
6
7 TargetDirectory=/home/bozo/projects/GreatAmericanNovel
8
9 cd $TargetDirectory
10 echo "Deleting stale files in $TargetDirectory."
11
12 if [ "$PWD" != "$TargetDirectory" ]
13 then # Keep from wiping out wrong directory by accident.
14 echo "Wrong directory!"
15 echo "In $PWD, rather than $TargetDirectory!"
16 echo "Bailing out!"
17 exit $E_WRONG_DIRECTORY
18 fi
19
20 rm -rf *
21 rm .[A-Za-z0-9]* # Delete dotfiles.
22 # rm -f .[^.]* ..?* to remove filenames beginning with multiple dots.
23 # (shopt -s dotglob; rm -f *) will also work.
24 # Thanks, S.C. for pointing this out.
25
26 # A filename (`basename`) may contain all characters in the 0 - 255 range,
27 #+ except "/".
28 # Deleting files beginning with weird characters, such as -
29 #+ is left as an exercise.
30
31 echo
32 echo "Done."
33 echo "Old files deleted in $TargetDirectory."
34 echo
35
36 # Various other operations here, as necessary.
37
38 exit $?
$REPLY
The default value when a variable is not supplied to read. Also applicable to select menus, but only
supplies the item number of the variable chosen, not the value of the variable itself.
1 #!/bin/bash
2 # reply.sh
3
4 # REPLY is the default value for a 'read' command.
5
6 echo
7 echo -n "What is your favorite vegetable? "
8 read
9
10 echo "Your favorite vegetable is $REPLY."
11 # REPLY holds the value of last "read" if and only if
12 #+ no variable supplied.
13
14 echo
15 echo -n "What is your favorite fruit? "
16 read fruit
17 echo "Your favorite fruit is $fruit."
18 echo "but..."
19 echo "Value of \$REPLY is still $REPLY."
20 # $REPLY is still set to its previous value because
21 #+ the variable $fruit absorbed the new "read" value.
22
23 echo
24
25 exit 0
$SECONDS
The number of seconds the script has been running.
1 #!/bin/bash
2
3 TIME_LIMIT=10
4 INTERVAL=1
5
6 echo
7 echo "Hit Control-C to exit before $TIME_LIMIT seconds."
8 echo
9
10 while [ "$SECONDS" -le "$TIME_LIMIT" ]
11 do
12 if [ "$SECONDS" -eq 1 ]
13 then
14 units=second
15 else
16 units=seconds
17 fi
18
19 echo "This script has been running $SECONDS $units."
20 # On a slow or overburdened machine, the script may skip a count
21 #+ every once in a while.
22 sleep $INTERVAL
23 done
24
25 echo -e "\a" # Beep!
26
27 exit 0
$SHELLOPTS
The list of enabled shell options, a readonly variable.
$SHLVL
Shell level, how deeply Bash is nested. [3] If, at the command-line, $SHLVL is 1, then in a script it
will increment to 2.
This variable is not affected by subshells. Use $BASH_SUBSHELL when you need
an indication of subshell nesting.
$TMOUT
If the $TMOUT environmental variable is set to a non-zero value time, then the shell prompt will
time out after $time seconds. This will cause a logout.
As of version 2.05b of Bash, it is now possible to use $TMOUT in a script in combination with read.
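A minimal sketch of that technique (not one of the numbered example scripts below):
#!/bin/bash
TMOUT=3                            # 'read' now times out after 3 seconds.

echo "Quick -- name a color:"
if read color
then
  echo "You said: $color"
else
  echo "Too slow!"                 # On timeout, 'read' returns a nonzero exit status.
fi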
There are other, more complex, ways of implementing timed input in a script. One alternative is to set
up a timing loop to signal the script when it times out. This also requires a signal handling routine to
trap (see Example 31-5) the interrupt generated by the timing loop (whew!).
1 #!/bin/bash
2 # timed-input.sh
3
4 # TMOUT=3 Also works, as of newer versions of Bash.
5
6 TIMER_INTERRUPT=14
7 TIMELIMIT=3 # Three seconds in this instance.
8 # May be set to different value.
9
10 PrintAnswer()
11 {
12 if [ "$answer" = TIMEOUT ]
13 then
14 echo $answer
15 else # Don't want to mix up the two instances.
16 echo "Your favorite veggie is $answer"
17 kill $! # Kills no-longer-needed TimerOn function
18 #+ running in background.
19 # $! is PID of last job running in background.
20 fi
21
22 }
23
24
25 TimerOn()
26 {
27 sleep $TIMELIMIT && kill -s 14 $$ &
28 # Waits 3 seconds, then sends SIGALRM (signal 14) to the script.
29 }
30
31
32 Int14Vector()
33 {
34 answer="TIMEOUT"
35 PrintAnswer
36 exit $TIMER_INTERRUPT
37 }
38
39 trap Int14Vector $TIMER_INTERRUPT
40 # Timer interrupt (14) subverted for our purposes.
41
42 echo "What is your favorite vegetable "
43 TimerOn
44 read answer
45 PrintAnswer
46
47
48 # Admittedly, this is a kludgy implementation of timed input.
49 # However, the "-t" option to "read" simplifies this task.
50 # See the "t-out.sh" script.
51 # However, what about timing not just single user input,
52 #+ but an entire script?
53
54 # If you need something really elegant ...
55 #+ consider writing the application in C or C++,
56 #+ using appropriate library functions, such as 'alarm' and 'setitimer.'
57
58 exit 0
1 #!/bin/bash
2 # timeout.sh
3
4 # Written by Stephane Chazelas,
5 #+ and modified by the document author.
6
7 INTERVAL=5 # timeout interval
8
9 timedout_read() {
10 timeout=$1
11 varname=$2
12 old_tty_settings=`stty -g`
13 stty -icanon min 0 time ${timeout}0
14 eval read $varname # or just read $varname
15 stty "$old_tty_settings"
16 # See man page for "stty."
17 }
18
19 echo; echo -n "What's your name? Quick! "
20 timedout_read $INTERVAL your_name
21
22 # This may not work on every terminal type.
23 # The maximum timeout depends on the terminal.
24 #+ (it is often 25.5 seconds).
25
26 echo
27
28 if [ ! -z "$your_name" ] # If name input before timeout ...
29 then
30 echo "Your name is $your_name."
31 else
32 echo "Timed out."
33 fi
34
35 echo
36
37 # The behavior of this script differs somewhat from "timed-input.sh."
38 # At each keystroke, the counter resets.
39
40 exit 0
1 #!/bin/bash
2 # t-out.sh
3 # Inspired by a suggestion from "syngin seven" (thanks).
4
5
6 TIMELIMIT=4 # 4 seconds
7
8 read -t $TIMELIMIT variable <&1
9 # ^^^
10 # In this instance, "<&1" is needed for Bash 1.x and 2.x,
11 # but unnecessary for Bash 3.x.
12
13 echo
14
15 if [ -z "$variable" ] # Is null?
16 then
17 echo "Timed out, variable still unset."
18 else
19 echo "variable = $variable"
20 fi
21
22 exit 0
$UID
User ID number
This is the current user's real id, even if she has temporarily assumed another identity through su.
$UID is a readonly variable, not subject to change from the command line or within a script, and is
the counterpart of the id command.
1 #!/bin/bash
2 # am-i-root.sh: Am I root or not?
3
4 ROOT_UID=0 # Root has $UID 0.
5
6 if [ "$UID" -eq "$ROOT_UID" ] # Will the real "root" please stand up?
7 then
8 echo "You are root."
9 else
10 echo "You are just an ordinary user (but mom loves you just the same)."
11 fi
12
13 exit 0
14
15
16 # ============================================================= #
17 # Code below will not execute, because the script already exited.
18
19 # An alternate method of getting to the root of matters:
20
21 ROOTUSER_NAME=root
22
23 username=`id -nu` # Or... username=`whoami`
24 if [ "$username" = "$ROOTUSER_NAME" ]
25 then
26 echo "Rooty, toot, toot. You are root."
27 else
28 echo "You are just a regular fella."
29 fi
Positional Parameters
1 #!/bin/bash
2 # arglist.sh
3 # Invoke this script with several arguments, such as "one two three".
4
5 E_BADARGS=65
6
7 if [ ! -n "$1" ]
8 then
9 echo "Usage: `basename $0` argument1 argument2 etc."
10 exit $E_BADARGS
11 fi
12
13 echo
14
15 index=1 # Initialize count.
16
17 echo "Listing args with \"\$*\":"
18 for arg in "$*" # Doesn't work properly if "$*" isn't quoted.
19 do
20 echo "Arg #$index = $arg"
21 let "index+=1"
22 done # $* sees all arguments as single word.
23 echo "Entire arg list seen as single word."
24
25 echo
26
27 index=1 # Reset count.
28 # What happens if you forget to do this?
29
30 echo "Listing args with \"\$@\":"
31 for arg in "$@"
32 do
33 echo "Arg #$index = $arg"
34 let "index+=1"
35 done # $@ sees arguments as separate words.
36 echo "Arg list seen as separate words."
37
38 echo
39
40 index=1 # Reset count.
41
42 echo "Listing args with \$* (unquoted):"
43 for arg in $*
44 do
45 echo "Arg #$index = $arg"
46 let "index+=1"
47 done # Unquoted $* sees arguments as separate words.
48 echo "Arg list seen as separate words."
49
50 exit 0
Following a shift, the $@ holds the remaining command-line parameters, lacking the previous $1,
which was lost.
1 #!/bin/bash
2 # Invoke with ./scriptname 1 2 3 4 5
3
4 echo "$@" # 1 2 3 4 5
5 shift
6 echo "$@" # 2 3 4 5
7 shift
8 echo "$@" # 3 4 5
9
10 # Each "shift" loses parameter $1.
11 # "$@" then contains the remaining parameters.
The $@ special parameter finds use as a tool for filtering input into shell scripts. The cat "$@"
construction accepts input to a script either from stdin or from files given as parameters to the
script. See Example 16-24 and Example 16-25.
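A bare-bones sketch of that idiom (a hypothetical filter, not one of the referenced examples):
#!/bin/bash
# upcase.sh: Uppercases its input.

cat "$@" | tr 'a-z' 'A-Z'

# ./upcase.sh file1 file2     reads the named files.
# some_command | ./upcase.sh  reads from stdin instead.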
1 #!/bin/bash
2
3 # Erratic behavior of the "$*" and "$@" internal Bash variables,
4 #+ depending on whether they are quoted or not.
5 # Inconsistent handling of word splitting and linefeeds.
6
7
8 set -- "First one" "second" "third:one" "" "Fifth: :one"
9 # Setting the script arguments, $1, $2, etc.
10
11 echo
12
13 echo 'IFS unchanged, using "$*"'
14 c=0
15 for i in "$*" # quoted
16 do echo "$((c+=1)): [$i]" # This line remains the same in every instance.
17 # Echo args.
18 done
19 echo ---
20
21 echo 'IFS unchanged, using $*'
22 c=0
23 for i in $* # unquoted
24 do echo "$((c+=1)): [$i]"
25 done
26 echo ---
27
28 echo 'IFS unchanged, using "$@"'
29 c=0
30 for i in "$@"
31 do echo "$((c+=1)): [$i]"
32 done
33 echo ---
34
35 echo 'IFS unchanged, using $@'
36 c=0
37 for i in $@
38 do echo "$((c+=1)): [$i]"
39 done
40 echo ---
41
42 IFS=:
43 echo 'IFS=":", using "$*"'
44 c=0
45 for i in "$*"
46 do echo "$((c+=1)): [$i]"
47 done
48 echo ---
49
50 echo 'IFS=":", using $*'
51 c=0
52 for i in $*
53 do echo "$((c+=1)): [$i]"
54 done
55 echo ---
56
57 var=$*
58 echo 'IFS=":", using "$var" (var=$*)'
59 c=0
60 for i in "$var"
61 do echo "$((c+=1)): [$i]"
62 done
63 echo ---
64
65 echo 'IFS=":", using $var (var=$*)'
66 c=0
67 for i in $var
68 do echo "$((c+=1)): [$i]"
69 done
70 echo ---
71
72 var="$*"
73 echo 'IFS=":", using $var (var="$*")'
74 c=0
75 for i in $var
76 do echo "$((c+=1)): [$i]"
77 done
78 echo ---
79
80 echo 'IFS=":", using "$var" (var="$*")'
81 c=0
82 for i in "$var"
83 do echo "$((c+=1)): [$i]"
84 done
85 echo ---
86
87 echo 'IFS=":", using "$@"'
88 c=0
89 for i in "$@"
90 do echo "$((c+=1)): [$i]"
91 done
92 echo ---
93
94 echo 'IFS=":", using $@'
95 c=0
96 for i in $@
97 do echo "$((c+=1)): [$i]"
98 done
99 echo ---
100
101 var=$@
102 echo 'IFS=":", using $var (var=$@)'
103 c=0
104 for i in $var
105 do echo "$((c+=1)): [$i]"
106 done
107 echo ---
108
109 echo 'IFS=":", using "$var" (var=$@)'
110 c=0
111 for i in "$var"
112 do echo "$((c+=1)): [$i]"
113 done
114 echo ---
115
116 var="$@"
117 echo 'IFS=":", using "$var" (var="$@")'
118 c=0
119 for i in "$var"
120 do echo "$((c+=1)): [$i]"
121 done
122 echo ---
123
124 echo 'IFS=":", using $var (var="$@")'
125 c=0
126 for i in $var
127 do echo "$((c+=1)): [$i]"
128 done
129
130 echo
131
132 # Try this script with ksh or zsh -y.
133
134 exit 0
135
136 # This example script by Stephane Chazelas,
137 # and slightly modified by the document author.
1 #!/bin/bash
2
3 # If $IFS set, but empty,
4 #+ then "$*" and "$@" do not echo positional params as expected.
5
6 mecho () # Echo positional parameters.
7 {
8 echo "$1,$2,$3";
9 }
10
11
12 IFS="" # Set, but empty.
13 set a b c # Positional parameters.
14
15 mecho "$*" # abc,,
16 # ^^
17 mecho $* # a,b,c
18
19 mecho $@ # a,b,c
20 mecho "$@" # a,b,c
21
22 # The behavior of $* and $@ when $IFS is empty depends
23 #+ on which Bash or sh version is being run.
24 # It is therefore inadvisable to depend on this "feature" in a script.
25
26
27 # Thanks, Stephane Chazelas.
28
29 exit
$-
Flags passed to script (using set). See Example 15-16.
This was originally a ksh construct adopted into Bash, and unfortunately it does not
seem to work reliably in Bash scripts. One possible use for it is to have a script
self-test whether it is interactive.
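One way to attempt such a self-test (a sketch; as noted above, the result is not guaranteed to be reliable):
case "$-" in
  *i*) echo "This shell is interactive." ;;
  *)   echo "This shell is non-interactive." ;;
esac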
$!
PID (process ID) of last job run in background
1 LOG=$0.log
2
3 COMMAND1="sleep 100"
4
5 echo "Logging PIDs background commands for script: $0" >> "$LOG"
6 # So they can be monitored, and killed as necessary.
7 echo >> "$LOG"
8
9 # Logging commands.
10
11 echo -n "PID of \"$COMMAND1\": " >> "$LOG"
12 ${COMMAND1} &
13 echo $! >> "$LOG"
14 # PID of "sleep 100": 1506
15
16 # Thank you, Jacques Lederer, for suggesting this.
$! is also useful for job control, for example, for killing a runaway background job after a timeout.
$_
Special variable set to final argument of previous command executed.
1 #!/bin/bash
2
3 echo $_ # /bin/bash
4 # Just called /bin/bash to run the script.
5 # Note that this will vary according to
6 #+ how the script is invoked.
7
8 du >/dev/null # So no output from command.
9 echo $_ # du
10
11 ls -al >/dev/null # So no output from command.
12 echo $_ # -al (last argument)
13
14 :
15 echo $_ # :
$?
Exit status of a command, function, or the script itself (see Example 24-7)
$$
Process ID (PID) of the script itself. [5] The $$ variable often finds use in scripts to construct
"unique" temp file names (see Example 31-6, Example 16-31, and Example 15-27). This is usually
simpler than invoking mktemp.
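For instance (a minimal sketch; where available, mktemp remains the safer choice):
TMPFILE=/tmp/${0##*/}.$$           # E.g., /tmp/myscript.sh.12345
echo "scratch data" > "$TMPFILE"
# ... work with "$TMPFILE" ...
rm -f "$TMPFILE"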
Notes
[1] A stack register is a set of consecutive memory locations, such that the values stored (pushed) are
retrieved (popped) in reverse order. The last value stored is the first retrieved. This is sometimes called
a LIFO (last-in-first-out) or pushdown stack.
[2] The PID of the currently running script is $$, of course.
[3] Somewhat analogous to recursion, in this context nesting refers to a pattern embedded within a larger
pattern. One of the definitions of nest, according to the 1913 edition of Webster's Dictionary, illustrates
this beautifully: "A collection of boxes, cases, or the like, of graduated size, each put within the one next
larger."
[4] The words "argument" and "parameter" are often used interchangeably. In the context of this document,
they have the same precise meaning: a variable passed to a script or function.
[5] Within a script, inside a subshell, $$ returns the PID of the script, not the subshell.
The declare or typeset builtins, which are exact synonyms, permit modifying the properties of variables. This
is a very weak form of the typing [1] available in certain programming languages. The declare command is
specific to version 2 or later of Bash. The typeset command also works in ksh scripts.
declare/typeset options
-r readonly
(declare -r var1 works the same as readonly var1)
This is the rough equivalent of the C const type qualifier. An attempt to change the value of a
readonly variable fails with an error message.
1 declare -r var1=1
2 echo "var1 = $var1" # var1 = 1
3
4 (( var1++ )) # x.sh: line 4: var1: readonly variable
-i integer
1 declare -i number
2 # The script will treat subsequent occurrences of "number" as an integer.
3
4 number=3
5 echo "Number = $number" # Number = 3
6
7 number=three
8 echo "Number = $number" # Number = 0
9 # Tries to evaluate the string "three" as an integer.
Certain arithmetic operations are permitted for declared integer variables without the need for expr or
let.
1 n=6/3
2 echo "n = $n" # n = 6/3
3
4 declare -i n
5 n=6/3
6 echo "n = $n" # n = 2
-a array
1 declare -a indices
The variable indices will be treated as an array.
-f function(s)
1 declare -f
A declare -f line with no arguments in a script causes a listing of all the functions previously
defined in that script.
1 declare -f function_name
A declare -f function_name in a script lists just the function named.
-x export
1 declare -x var3
This declares a variable as available for exporting outside the environment of the script itself.
-x var=$value
1 declare -x var3=373
The declare command permits assigning a value to a variable in the same statement as setting its
properties.
1 #!/bin/bash
2
3 func1 ()
4 {
5 echo This is a function.
6 }
7
8 declare -f # Lists the function above.
9
10 echo
11
12 declare -i var1 # var1 is an integer.
13 var1=2367
14 echo "var1 declared as $var1"
15 var1=var1+1 # Integer declaration eliminates the need for 'let'.
16 echo "var1 incremented by 1 is $var1."
17 # Attempt to change variable declared as integer.
18 echo "Attempting to change var1 to floating point value, 2367.1."
19 var1=2367.1 # Results in error message, with no change to variable.
20 echo "var1 is still $var1"
21
22 echo
23
24 declare -r var2=13.36 # 'declare' permits setting a variable property
25 #+ and simultaneously assigning it a value.
26 echo "var2 declared as $var2" # Attempt to change readonly variable.
27 var2=13.37 # Generates error message, and exit from script.
28
29 echo "var2 is still $var2" # This line will not execute.
30
31 exit 0 # Script will not exit here.
1 foo ()
2 {
3 FOO="bar"
4 }
5
6 bar ()
7 {
8 foo
9 echo $FOO
10 }
11
12 bar # Prints bar.
However . . .
1 foo (){
2 declare FOO="bar"
3 }
4
5 bar ()
6 {
7 foo
8 echo $FOO
9 }
10
11 bar # Prints nothing.
12
13
14 # Thank you, Michael Iatrou, for pointing this out.
bash$ zzy=68
bash$ declare | grep zzy
zzy=68
Notes
[1] In this context, typing a variable means to classify it and restrict its properties. For example, a variable
declared or typed as an integer is no longer available for string operations.
1 declare -i intvar
2
3 intvar=23
4 echo "$intvar" # 23
5 intvar=stringval
6 echo "$intvar" # 0
$RANDOM is an internal Bash function (not a constant) that returns a pseudorandom [1] integer in the range 0
- 32767. It should not be used to generate an encryption key.
1 #!/bin/bash
2
3 # $RANDOM returns a different random integer at each invocation.
4 # Nominal range: 0 - 32767 (signed 16-bit integer).
5
6 MAXCOUNT=10
7 count=1
8
9 echo
10 echo "$MAXCOUNT random numbers:"
11 echo "-----------------"
12 while [ "$count" -le $MAXCOUNT ] # Generate 10 ($MAXCOUNT) random integers.
13 do
14 number=$RANDOM
15 echo $number
16 let "count += 1" # Increment count.
17 done
18 echo "-----------------"
19
20 # If you need a random int within a certain range, use the 'modulo' operator.
21 # This returns the remainder of a division operation.
22
23 RANGE=500
24
25 echo
26
27 number=$RANDOM
28 let "number %= $RANGE"
29 # ^^
30 echo "Random number less than $RANGE --- $number"
31
32 echo
33
34
35
36 # If you need a random integer greater than a lower bound,
37 #+ then set up a test to discard all numbers below that.
38
39 FLOOR=200
40
41 number=0 #initialize
42 while [ "$number" -le $FLOOR ]
43 do
44 number=$RANDOM
45 done
46 echo "Random number greater than $FLOOR --- $number"
47 echo
48
49 # Let's examine a simple alternative to the above loop, namely
50 # let "number = $RANDOM + $FLOOR"
51 # That would eliminate the while-loop and run faster.
52 # But, there might be a problem with that. What is it?
53
54
55
56 # Combine above two techniques to retrieve random number between two limits.
57 number=0 #initialize
58 while [ "$number" -le $FLOOR ]
59 do
60 number=$RANDOM
61 let "number %= $RANGE" # Scales $number down within $RANGE.
62 done
63 echo "Random number between $FLOOR and $RANGE --- $number"
64 echo
65
66
67
68 # Generate binary choice, that is, "true" or "false" value.
69 BINARY=2
70 T=1
71 number=$RANDOM
72
73 let "number %= $BINARY"
74 # Note that let "number >>= 14" gives a better random distribution
75 #+ (right shifts out everything except last binary digit).
76 if [ "$number" -eq $T ]
77 then
78 echo "TRUE"
79 else
80 echo "FALSE"
81 fi
82
83 echo
84
85
86 # Generate a toss of the dice.
87 SPOTS=6 # Modulo 6 gives range 0 - 5.
88 # Incrementing by 1 gives desired range of 1 - 6.
89 # Thanks, Paulo Marcel Coelho Aragao, for the simplification.
90 die1=0
91 die2=0
92 # Would it be better to just set SPOTS=7 and not add 1? Why or why not?
93
94 # Tosses each die separately, and so gives correct odds.
95
96 let "die1 = $RANDOM % $SPOTS +1" # Roll first one.
97 let "die2 = $RANDOM % $SPOTS +1" # Roll second one.
98 # Which arithmetic operation, above, has greater precedence --
99 #+ modulo (%) or addition (+)?
100
101
102 let "throw = $die1 + $die2"
103 echo "Throw of the dice = $throw"
104 echo
105
106
107 exit 0
1 #!/bin/bash
2 # pick-card.sh
3
4 # This is an example of choosing random elements of an array.
5
6
7 # Pick a card, any card.
8
9 Suites="Clubs
10 Diamonds
11 Hearts
12 Spades"
13
14 Denominations="2
15 3
16 4
17 5
18 6
19 7
20 8
21 9
22 10
23 Jack
24 Queen
25 King
26 Ace"
27
28 # Note variables spread over multiple lines.
29
30
31 suite=($Suites) # Read into array variable.
32 denomination=($Denominations)
33
34 num_suites=${#suite[*]} # Count how many elements.
35 num_denominations=${#denomination[*]}
36
37 echo -n "${denomination[$((RANDOM%num_denominations))]} of "
38 echo ${suite[$((RANDOM%num_suites))]}
39
40
41 # bozo$ sh pick-card.sh
42 # Jack of Clubs
43
44
45 # Thank you, "jipe," for pointing out this use of $RANDOM.
46 exit 0
1 #!/bin/bash
2 # brownian.sh
3 # Author: Mendel Cooper
4 # Reldate: 10/26/07
5 # License: GPL3
6
7 # ----------------------------------------------------------------
8 # This script models Brownian motion:
9 #+ the random wanderings of tiny particles in a fluid,
10 #+ as they are buffeted by random currents and collisions.
11 #+ This is colloquially known as the "Drunkard's Walk."
12
13 # It can also be considered as a stripped-down simulation of a
14 #+ Galton Board, a slanted board with a pattern of pegs,
15 #+ down which rolls a succession of marbles, one at a time.
16 #+ At the bottom is a row of slots or catch basins in which
17 #+ the marbles come to rest at the end of their journey.
18 # Think of it as a kind of bare-bones Pachinko game.
19 # As you see by running the script,
20 #+ most of the marbles cluster around the center slot.
21 #+ This is consistent with the expected binomial distribution.
22 # As a Galton Board simulation, the script
23 #+ disregards such parameters as
24 #+ board tilt-angle, rolling friction of the marbles,
25 #+ angles of impact, and elasticity of the pegs.
26 # To what extent does this affect the accuracy of the simulation?
27 # ----------------------------------------------------------------
28
29 PASSES=500 # Number of particle interactions / marbles.
30 ROWS=10 # Number of "collisions" (or horiz. peg rows).
31 RANGE=3 # 0 - 2 output range from $RANDOM.
32 POS=0 # Left/right position.
33 RANDOM=$$ # Seeds the random number generator from PID
34 #+ of script.
35
36 declare -a Slots # Array holding cumulative results of passes.
37 NUMSLOTS=21 # Number of slots at bottom of board.
38
39
40 Initialize_Slots () { # Zero out all elements of the array.
41 for i in $( seq $NUMSLOTS )
42 do
43 Slots[$i]=0
44 done
45
46 echo # Blank line at beginning of run.
47 }
48
49
50 Show_Slots () {
51 echo -n " "
52 for i in $( seq $NUMSLOTS ) # Pretty-print array elements.
53 do
54 printf "%3d" ${Slots[$i]} # Allot three spaces per result.
55 done
56
57 echo # Row of slots:
58 echo " |__|__|__|__|__|__|__|__|__|__|__|__|__|__|__|__|__|__|__|__|__|"
59 echo " ^^"
60 echo # Note that if the count within any particular slot exceeds 99,
61 #+ it messes up the display.
62 # Running only(!) 500 passes usually avoids this.
63 }
64
65
66 Move () { # Move one unit right / left, or stay put.
67 Move=$RANDOM # How random is $RANDOM? Well, let's see ...
68 let "Move %= RANGE" # Normalize into range of 0 - 2.
69 case "$Move" in
70 0 ) ;; # Do nothing, i.e., stay in place.
71 1 ) ((POS--));; # Left.
72 2 ) ((POS++));; # Right.
73 * ) echo -n "Error ";; # Anomaly! (Should never occur.)
74 esac
75 }
76
77
78 Play () { # Single pass (inner loop).
79 i=0
80 while [ "$i" -lt "$ROWS" ] # One event per row.
81 do
82 Move
83 ((i++));
84 done
85
86 SHIFT=11 # Why 11, and not 10?
87 let "POS += $SHIFT" # Shift "zero position" to center.
88 (( Slots[$POS]++ )) # DEBUG: echo $POS
89 }
90
91
92 Run () { # Outer loop.
93 p=0
94 while [ "$p" -lt "$PASSES" ]
95 do
96 Play
97 (( p++ ))
98 POS=0 # Reset to zero. Why?
99 done
100 }
101
102
103 # --------------
104 # main ()
105 Initialize_Slots
106 Run
107 Show_Slots
108 # --------------
109
110 exit $?
111
112 # Exercises:
113 # ---------
114 # 1) Show the results in a vertical bar graph, or as an alternative,
115 #+ a scattergram.
116 # 2) Alter the script to use /dev/urandom instead of $RANDOM.
117 # Will this make the results more random?
Jipe points out a set of techniques for generating random numbers within a range.
1 rnumber=$(((RANDOM%(max-min+divisibleBy))/divisibleBy*divisibleBy+min))
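For instance, plugging concrete values into that formula (a sketch, not part of the original text):
min=6; max=30; divisibleBy=3

rnumber=$(( (RANDOM%(max-min+divisibleBy))/divisibleBy*divisibleBy + min ))
echo $rnumber        # A multiple of 3 between 6 and 30, inclusive.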
Here Bill presents a versatile function that returns a random number between two specified values.
1 #!/bin/bash
2 # random-between.sh
3 # Random number between two specified values.
4 # Script by Bill Gradwohl, with minor modifications by the document author.
5 # Used with permission.
6
7
8 randomBetween() {
9 # Generates a positive or negative random number
10 #+ between $min and $max
11 #+ and divisible by $divisibleBy.
12 # Gives a "reasonably random" distribution of return values.
13 #
14 # Bill Gradwohl - Oct 1, 2003
15
16 syntax() {
17 # Function embedded within function.
18 echo
19 echo "Syntax: randomBetween [min] [max] [multiple]"
20 echo
21 echo -n "Expects up to 3 passed parameters, "
22 echo "but all are completely optional."
23 echo "min is the minimum value"
24 echo "max is the maximum value"
25 echo -n "multiple specifies that the answer must be "
26 echo "a multiple of this value."
27 echo " i.e. answer must be evenly divisible by this number."
28 echo
29 echo "If any value is missing, defaults area supplied as: 0 32767 1"
30 echo -n "Successful completion returns 0, "
31 echo "unsuccessful completion returns"
32 echo "function syntax and 1."
33 echo -n "The answer is returned in the global variable "
34 echo "randomBetweenAnswer"
35 echo -n "Negative values for any passed parameter are "
36 echo "handled correctly."
37 }
38
39 local min=${1:-0}
40 local max=${2:-32767}
41 local divisibleBy=${3:-1}
42 # Default values assigned, in case parameters not passed to function.
43
44 local x
45 local spread
46
47 # Let's make sure the divisibleBy value is positive.
48 [ ${divisibleBy} -lt 0 ] && divisibleBy=$((0-divisibleBy))
49
50 # Sanity check.
51 if [ $# -gt 3 -o ${divisibleBy} -eq 0 -o ${min} -eq ${max} ]; then
52 syntax
53 return 1
54 fi
55
56 # See if the min and max are reversed.
57 if [ ${min} -gt ${max} ]; then
58 # Swap them.
59 x=${min}
60 min=${max}
61 max=${x}
62 fi
63
64 # If min is itself not evenly divisible by $divisibleBy,
65 #+ then fix the min to be within range.
66 if [ $((min/divisibleBy*divisibleBy)) -ne ${min} ]; then
67 if [ ${min} -lt 0 ]; then
68 min=$((min/divisibleBy*divisibleBy))
69 else
70 min=$((((min/divisibleBy)+1)*divisibleBy))
71 fi
72 fi
73
74 # If max is itself not evenly divisible by $divisibleBy,
75 #+ then fix the max to be within range.
76 if [ $((max/divisibleBy*divisibleBy)) -ne ${max} ]; then
77 if [ ${max} -lt 0 ]; then
78 max=$((((max/divisibleBy)-1)*divisibleBy))
79 else
80 max=$((max/divisibleBy*divisibleBy))
81 fi
82 fi
83
84 # ---------------------------------------------------------------------
85 # Now, to do the real work.
86
87 # Note that to get a proper distribution for the end points,
88 #+ the range of random values has to be allowed to go between
89 #+ 0 and abs(max-min)+divisibleBy, not just abs(max-min)+1.
90
91 # The slight increase will produce the proper distribution for the
92 #+ end points.
93
94 # Changing the formula to use abs(max-min)+1 will still produce
95 #+ correct answers, but the randomness of those answers is faulty in
96 #+ that the number of times the end points ($min and $max) are returned
97 #+ is considerably lower than when the correct formula is used.
98 # ---------------------------------------------------------------------
99
100 spread=$((max-min))
101 # Omair Eshkenazi points out that this test is unnecessary,
102 #+ since max and min have already been switched around.
103 [ ${spread} -lt 0 ] && spread=$((0-spread))
104 let spread+=divisibleBy
105 randomBetweenAnswer=$(((RANDOM%spread)/divisibleBy*divisibleBy+min))
106
107 return 0
108
109 # However, Paulo Marcel Coelho Aragao points out that
110 #+ when $max and $min are not divisible by $divisibleBy,
111 #+ the formula fails.
112 #
113 # He suggests instead the following formula:
114 # rnumber = $(((RANDOM%(max-min+1)+min)/divisibleBy*divisibleBy))
115
116 }
117
118 # Let's test the function.
119 min=-14
120 max=20
121 divisibleBy=3
122
123
124 # Generate an array of expected answers and check to make sure we get
125 #+ at least one of each answer if we loop long enough.
126
127 declare -a answer
128 minimum=${min}
129 maximum=${max}
130 if [ $((minimum/divisibleBy*divisibleBy)) -ne ${minimum} ]; then
131 if [ ${minimum} -lt 0 ]; then
132 minimum=$((minimum/divisibleBy*divisibleBy))
133 else
134 minimum=$((((minimum/divisibleBy)+1)*divisibleBy))
135 fi
136 fi
137
138
139 # If max is itself not evenly divisible by $divisibleBy,
140 #+ then fix the max to be within range.
141
142 if [ $((maximum/divisibleBy*divisibleBy)) -ne ${maximum} ]; then
143 if [ ${maximum} -lt 0 ]; then
144 maximum=$((((maximum/divisibleBy)-1)*divisibleBy))
145 else
146 maximum=$((maximum/divisibleBy*divisibleBy))
147 fi
148 fi
149
150
151 # We need to generate only positive array subscripts,
152 #+ so we need a displacement that will guarantee
153 #+ positive results.
154
155 disp=$((0-minimum))
156 for ((i=${minimum}; i<=${maximum}; i+=divisibleBy)); do
157 answer[i+disp]=0
158 done
159
160
161 # Now loop a large number of times to see what we get.
162 loopIt=1000 # The script author suggests 100000,
163 #+ but that takes a good long while.
164
165 for ((i=0; i<${loopIt}; ++i)); do
166
167 # Note that we are specifying min and max in reversed order here to
168 #+ make the function correct for this case.
169
170 randomBetween ${max} ${min} ${divisibleBy}
171
172 # Report an error if an answer is unexpected.
173 [ ${randomBetweenAnswer} -lt ${min} -o ${randomBetweenAnswer} -gt ${max} ] \
174 && echo MIN or MAX error - ${randomBetweenAnswer}!
175 [ $((randomBetweenAnswer%${divisibleBy})) -ne 0 ] \
176 && echo DIVISIBLE BY error - ${randomBetweenAnswer}!
177
178 # Store the answer away statistically.
179 answer[randomBetweenAnswer+disp]=$((answer[randomBetweenAnswer+disp]+1))
180 done
181
182
183
184 # Let's check the results
185
186 for ((i=${minimum}; i<=${maximum}; i+=divisibleBy)); do
187 [ ${answer[i+disp]} -eq 0 ] \
188 && echo "We never got an answer of $i." \
189 || echo "${i} occurred ${answer[i+disp]} times."
190 done
191
192
193 exit 0
Just how random is $RANDOM? The best way to test this is to write a script that tracks the distribution of
"random" numbers generated by $RANDOM. Let's roll a $RANDOM die a few times . . .
1 #!/bin/bash
2 # How random is RANDOM?
3
4 RANDOM=$$ # Reseed the random number generator using script process ID.
5
6 PIPS=6 # A die has 6 pips.
7 MAXTHROWS=600 # Increase this if you have nothing better to do with your time.
8 throw=0 # Throw count.
9
10 ones=0 # Must initialize counts to zero,
11 twos=0 #+ since an uninitialized variable is null, not zero.
12 threes=0
13 fours=0
14 fives=0
15 sixes=0
16
17 print_result ()
18 {
19 echo
20 echo "ones = $ones"
21 echo "twos = $twos"
22 echo "threes = $threes"
23 echo "fours = $fours"
24 echo "fives = $fives"
25 echo "sixes = $sixes"
26 echo
27 }
28
29 update_count()
30 {
31 case "$1" in
32 0) let "ones += 1";; # Since die has no "zero", this corresponds to 1.
33 1) let "twos += 1";; # And this to 2, etc.
34 2) let "threes += 1";;
35 3) let "fours += 1";;
36 4) let "fives += 1";;
37 5) let "sixes += 1";;
38 esac
39 }
40
41 echo
42
43
44 while [ "$throw" -lt "$MAXTHROWS" ]
45 do
46 let "die1 = RANDOM % $PIPS"
47 update_count $die1
48 let "throw += 1"
49 done
50
51 print_result
52
53 exit 0
54
55 # The scores should distribute fairly evenly, assuming RANDOM is fairly random.
56 # With $MAXTHROWS at 600, all should cluster around 100, plus-or-minus 20 or so.
57 #
58 # Keep in mind that RANDOM is a pseudorandom generator,
59 #+ and not a spectacularly good one at that.
60
61 # Randomness is a deep and complex subject.
62 # Sufficiently long "random" sequences may exhibit
63 #+ chaotic and other "non-random" behavior.
64
65 # Exercise (easy):
66 # ---------------
67 # Rewrite this script to flip a coin 1000 times.
68 # Choices are "HEADS" and "TAILS".
As we have seen in the last example, it is best to reseed the RANDOM generator each time it is invoked. Using
the same seed for RANDOM repeats the same series of numbers. [2] (This mirrors the behavior of the
random() function in C.)
1 #!/bin/bash
2 # seeding-random.sh: Seeding the RANDOM variable.
3
4 MAXCOUNT=25 # How many numbers to generate.
5
6 random_numbers ()
7 {
8 count=0
9 while [ "$count" -lt "$MAXCOUNT" ]
10 do
11 number=$RANDOM
12 echo -n "$number "
13 let "count += 1"
14 done
15 }
16
17 echo; echo
18
19 RANDOM=1 # Setting RANDOM seeds the random number generator.
20 random_numbers
21
22 echo; echo
23
24 RANDOM=1 # Same seed for RANDOM...
25 random_numbers # ...reproduces the exact same number series.
26 #
27 # When is it useful to duplicate a "random" number series?
28
29 echo; echo
30
31 RANDOM=2 # Trying again, but with a different seed...
32 random_numbers # gives a different number series.
33
34 echo; echo
35
36 # RANDOM=$$ seeds RANDOM from process id of script.
37 # It is also possible to seed RANDOM from 'time' or 'date' commands.
38
39 # Getting fancy...
40 SEED=$(head -1 /dev/urandom | od -N 1 | awk '{ print $2 }')
41 # Pseudo-random output fetched
42 #+ from /dev/urandom (system pseudo-random device-file),
43 #+ then converted to line of printable (octal) numbers by "od",
44 #+ finally "awk" retrieves just one number for SEED.
45 RANDOM=$SEED
46 random_numbers
47
48 echo; echo
49
50 exit 0
The /dev/urandom pseudo-device file provides a method of generating much more "random"
pseudorandom numbers than the $RANDOM variable. dd if=/dev/urandom of=targetfile
bs=1 count=XX creates a file of well-scattered pseudorandom numbers. However, assigning these
numbers to a variable in a script requires a workaround, such as filtering through od (as in above
example, Example 16-14, and Example A-36), or even piping to md5sum (see Example 35-14).
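A compact sketch of that kind of workaround (not one of the referenced examples):
# Read a single byte from /dev/urandom and print it as an unsigned decimal, 0 - 255.
rnd=$(( $(od -An -N1 -tu1 /dev/urandom) ))   # Arithmetic expansion strips od's padding.
echo "Random byte: $rnd"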
There are also other ways to generate pseudorandom numbers in a script. Awk provides a convenient
means of doing this.
1 #!/bin/bash
2 # random2.sh: Returns a pseudorandom number in the range 0 - 1.
3 # Uses the awk rand() function.
4
5 AWKSCRIPT=' { srand(); print rand() } '
6 # Command(s) / parameters passed to awk
7 # Note that srand() reseeds awk's random number generator.
8
9
10 echo -n "Random number between 0 and 1 = "
11
12 echo | awk "$AWKSCRIPT"
13 # What happens if you leave out the 'echo'?
14
15 exit 0
16
17
18 # Exercises:
19 # ---------
20
21 # 1) Using a loop construct, print out 10 different random numbers.
22 # (Hint: you must reseed the "srand()" function with a different seed
23 #+ in each pass through the loop. What happens if you fail to do this?)
24
25 # 2) Using an integer multiplier as a scaling factor, generate random numbers
26 #+ in the range between 10 and 100.
27
28 # 3) Same as exercise #2, above, but generate random integers this time.
The date command also lends itself to generating pseudorandom integer sequences.
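For example (a sketch assuming GNU date, which supports the %N nanoseconds format):
number=$(( 10#$(date +%N) % 100 ))   # "10#" forces base 10, so leading zeros are not read as octal.
echo "Pseudorandom number, 0 - 99: $number"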
Notes
[1] True "randomness," insofar as it exists at all, can only be found in certain incompletely understood
natural phenomena, such as radioactive decay. Computers only simulate randomness, and
computer-generated sequences of "random" numbers are therefore referred to as pseudorandom.
[2] The seed of a computer-generated pseudorandom number series can be considered an identification
label. For example, think of the pseudorandom series with a seed of 23 as Series #23.
A property of a pseudorandom number series is the length of the cycle before it starts repeating itself. A
good pseudorandom generator will produce series with very long cycles.
Bash supports a surprising number of string manipulation operations. Unfortunately, these tools lack a unified
focus. Some are a subset of parameter substitution, and others fall under the functionality of the UNIX expr
command. This results in inconsistent command syntax and overlap of functionality, not to mention
confusion.
String Length
${#string}
expr length $string
These are the equivalent of strlen() in C.
expr "$string" : '.*'
1 stringZ=abcABC123ABCabc
2
3 echo ${#stringZ} # 15
4 echo `expr length $stringZ` # 15
5 echo `expr "$stringZ" : '.*'` # 15
1 #!/bin/bash
2 # paragraph-space.sh
3 # Ver. 2.0, Reldate 05Aug08
4
5 # Inserts a blank line between paragraphs of a single-spaced text file.
6 # Usage: $0 <FILENAME
7
8 MINLEN=60 # May need to change this value.
9 # Assume lines shorter than $MINLEN characters ending in a period
10 #+ terminate a paragraph. See exercises at end of script.
11
12 while read line # For as many lines as the input file has...
13 do
14 echo "$line" # Output the line itself.
15
16 len=${#line}
17 if [[ "$len" -lt "$MINLEN" && "$line" =~ \[*\.\] ]]
18 then echo # Add a blank line immediately
19 fi #+ after short line terminated by a period.
20 done
21
22 exit
23
24 # Exercises:
25 # ---------
26 # 1) The script usually inserts a blank line at the end
27 #+ of the target file. Fix this.
28 # 2) Line 17 only considers periods as sentence terminators.
29 # Modify this to include other common end-of-sentence characters,
30 #+ such as ?, !, and ".
Length of Matching Substring at Beginning of String
expr match "$string" '$substring'
$substring is a regular expression.
expr "$string" : '$substring'
$substring is a regular expression.
1 stringZ=abcABC123ABCabc
2 # |------|
3 # 12345678
4
5 echo `expr match "$stringZ" 'abc[A-Z]*.2'` # 8
6 echo `expr "$stringZ" : 'abc[A-Z]*.2'` # 8
Index
expr index $string $substring
Numerical position in $string of first character in $substring that matches.
1 stringZ=abcABC123ABCabc
2 # 123456 ...
3 echo `expr index "$stringZ" C12` # 6
4 # C position.
5
6 echo `expr index "$stringZ" 1c` # 3
7 # 'c' (in #3 position) matches before '1'.
This is the near equivalent of strchr() in C.
Substring Extraction
${string:position}
Extracts substring from $string at $position.
If the $string parameter is "*" or "@", then this extracts the positional parameters, [1] starting at
$position.
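A quick illustration (not in the original text):
set -- alpha beta gamma delta

echo ${@:2}          # beta gamma delta
echo ${@:3}          # gamma delta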
${string:position:length}
Extracts $length characters of substring from $string at $position.
1 stringZ=abcABC123ABCabc
2 # 0123456789.....
3 # 0-based indexing.
4
5 echo ${stringZ:0} # abcABC123ABCabc
6 echo ${stringZ:1} # bcABC123ABCabc
7 echo ${stringZ:7} # 23ABCabc
8
9 echo ${stringZ:7:3} # 23A
10 # Three characters of substring.
11
12
13
14 # Is it possible to index from the right end of the string?
15
16 echo ${stringZ:-4} # abcABC123ABCabc
17 # Defaults to full string, as in ${parameter:-default}.
18 # However . . .
19
20 echo ${stringZ:(-4)} # Cabc
21 echo ${stringZ: -4} # Cabc
22 # Now, it works.
23 # Parentheses or added space "escape" the position parameter.
24
25 # Thank you, Dan Jacobson, for pointing this out.
The position and length arguments can be "parameterized," that is, represented as a variable, rather
than as a numerical constant.
1 #!/bin/bash
2 # rand-string.sh
3 # Generating an 8-character "random" string.
4
5 if [ -n "$1" ] # If command-line argument present,
6 then #+ then set start-string to it.
7 str0="$1"
8 else # Else use PID of script as start-string.
9 str0="$$"
10 fi
11
12 POS=2 # Starting from position 2 in the string.
13 LEN=8 # Extract eight characters.
14
15 str1=$( echo "$str0" | md5sum | md5sum )
16 # Doubly scramble: ^^^^^^ ^^^^^^
17
18 randstring="${str1:$POS:$LEN}"
19 # Can parameterize ^^^^ ^^^^
20
21 echo "$randstring"
22
23 exit $?
24
25 # bozo$ ./rand-string.sh my-password
26 # 1bdd88c4
27
28 # No, this is not recommended
29 #+ as a method of generating hack-proof passwords.
If the $string parameter is "*" or "@", then this extracts a maximum of $length positional
parameters, starting at $position.
expr substr $string $position $length
Extracts $length characters from $string starting at $position.
1 stringZ=abcABC123ABCabc
2 # 123456789......
3 # 1-based indexing.
4
5 echo `expr substr $stringZ 1 2` # ab
6 echo `expr substr $stringZ 4 3` # ABC
expr match "$string" '\($substring\)'
Extracts $substring at beginning of $string, where $substring is a regular expression.
expr "$string" : '\($substring\)'
Extracts $substring at beginning of $string, where $substring is a regular expression.
1 stringZ=abcABC123ABCabc
2 # =======
3
4 echo `expr match "$stringZ" '\(.[b-c]*[A-Z]..[0-9]\)'` # abcABC1
5 echo `expr "$stringZ" : '\(.[b-c]*[A-Z]..[0-9]\)'` # abcABC1
6 echo `expr "$stringZ" : '\(.......\)'` # abcABC1
7 # All of the above forms give an identical result.
expr match "$string" '.*\($substring\)'
Extracts $substring at end of $string, where $substring is a regular expression.
expr "$string" : '.*\($substring\)'
Extracts $substring at end of $string, where $substring is a regular expression.
1 stringZ=abcABC123ABCabc
2 # ======
3
4 echo `expr match "$stringZ" '.*\([A-C][A-C][A-C][a-c]*\)'` # ABCabc
5 echo `expr "$stringZ" : '.*\(......\)'` # ABCabc
Substring Removal
${string#substring}
Deletes shortest match of $substring from front of $string.
${string##substring}
Deletes longest match of $substring from front of $string.
1 stringZ=abcABC123ABCabc
2 # |----| shortest
3 # |----------| longest
4
5 echo ${stringZ#a*C} # 123ABCabc
6 # Strip out shortest match between 'a' and 'C'.
7
8 echo ${stringZ##a*C} # abc
9 # Strip out longest match between 'a' and 'C'.
${string%substring}
Deletes shortest match of $substring from back of $string.
${string%%substring}
Deletes longest match of $substring from back of $string.
For example:
1 stringZ=abcABC123ABCabc
2 # || shortest
3 # |------------| longest
4
5 echo ${stringZ%b*c} # abcABC123ABCa
6 # Strip out shortest match between 'b' and 'c', from back of $stringZ.
7
8 echo ${stringZ%%b*c} # a
9 # Strip out longest match between 'b' and 'c', from back of $stringZ.
This operator is useful for generating filenames.
1 #!/bin/bash
2 # cvt.sh:
3 # Converts all the MacPaint image files in a directory to "pbm" format.
4
5 # Uses the "macptopbm" binary from the "netpbm" package,
6 #+ which is maintained by Bryan Henderson.
7 # Netpbm is a standard part of most Linux distros.
8
9 OPERATION=macptopbm
10 SUFFIX=pbm # New filename suffix.
11
12 if [ -n "$1" ]
13 then
14 directory=$1 # If directory name given as a script argument...
15 else
16 directory=$PWD # Otherwise use current working directory.
17 fi
18
19 # Assumes all files in the target directory are MacPaint image files,
20 #+ with a ".mac" filename suffix.
21
22 for file in $directory/* # Filename globbing.
23 do
24 filename=${file%.*c} # Strip ".mac" suffix off filename
25 #+ ('.*c' matches everything
26 #+ between '.' and 'c', inclusive).
27 $OPERATION $file > "$filename.$SUFFIX"
28 # Redirect conversion to new filename.
29 rm -f $file # Delete original files after converting.
30 echo "$filename.$SUFFIX" # Log what is happening to stdout.
31 done
32
33 exit 0
34
35 # Exercise:
36 # --------
37 # As it stands, this script converts *all* the files in the current
38 #+ working directory.
39 # Modify it to work *only* on files with a ".mac" suffix.
1 #!/bin/bash
2 # ra2ogg.sh: Convert streaming audio files (*.ra) to ogg.
3
4 # Uses the "mplayer" media player program:
5 # http://www.mplayerhq.hu/homepage
6 # Uses the "ogg" library and "oggenc":
7 # http://www.xiph.org/
8 #
9 # This script may need appropriate codecs installed, such as sipr.so ...
10 # Possibly also the compat-libstdc++ package.
11
12
13 OFILEPREF=${1%%ra} # Strip off the "ra" suffix.
14 OFILESUFF=wav # Suffix for wav file.
15 OUTFILE="$OFILEPREF""$OFILESUFF"
16 E_NOARGS=85
17
18 if [ -z "$1" ] # Must specify a filename to convert.
19 then
20 echo "Usage: `basename $0` [filename]"
21 exit $E_NOARGS
22 fi
23
24
25 ##########################################################################
26 mplayer "$1" -ao pcm:file=$OUTFILE
27 oggenc "$OUTFILE" # Correct file extension automatically added by oggenc.
28 ##########################################################################
29
30 rm "$OUTFILE" # Delete intermediate *.wav file.
31 # If you want to keep it, comment out above line.
32
33 exit $?
34
35 # Note:
36 # ----
37 # On a Website, simply clicking on a *.ram streaming audio file
38 #+ usually only downloads the URL of the actual *.ra audio file.
39 # You can then use "wget" or something similar
40 #+ to download the *.ra file itself.
41
42
43 # Exercises:
44 # ---------
45 # As is, this script converts only *.ra filenames.
46 # Add flexibility by permitting use of *.ram and other filenames.
47 #
48 # If you're really ambitious, expand the script
49 #+ to do automatic downloads and conversions of streaming audio files.
50 # Given a URL, batch download streaming audio files (using "wget")
51 #+ and convert them on the fly.
1 #!/bin/bash
2 # getopt-simple.sh
3 # Author: Chris Morgan
4 # Used in the ABS Guide with permission.
5
6
7 getopt_simple()
8 {
9 echo "getopt_simple()"
10 echo "Parameters are '$*'"
11 until [ -z "$1" ]
12 do
13 echo "Processing parameter of: '$1'"
14 if [ ${1:0:1} = '/' ]
15 then
16 tmp=${1:1} # Strip off leading '/' . . .
17 parameter=${tmp%%=*} # Extract name.
18 value=${tmp##*=} # Extract value.
19 echo "Parameter: '$parameter', value: '$value'"
20 eval $parameter=$value
21 fi
22 shift
23 done
24 }
25
26 # Pass all options to getopt_simple().
27 getopt_simple $*
28
29 echo "test is '$test'"
30 echo "test2 is '$test2'"
31
32 exit 0 # See also UseGetOpt.sh, a modified version of this script.
33
34 ---
35
36 sh getopt_example.sh /test=value1 /test2=value2
37
38 Parameters are '/test=value1 /test2=value2'
39 Processing parameter of: '/test=value1'
40 Parameter: 'test', value: 'value1'
41 Processing parameter of: '/test2=value2'
42 Parameter: 'test2', value: 'value2'
43 test is 'value1'
44 test2 is 'value2'
45
Substring Replacement
${string/substring/replacement}
Replace first match of $substring with $replacement. [2]
${string//substring/replacement}
Replace all matches of $substring with $replacement.
1 stringZ=abcABC123ABCabc
2
3 echo ${stringZ/abc/xyz} # xyzABC123ABCabc
4 # Replaces first match of 'abc' with 'xyz'.
5
6 echo ${stringZ//abc/xyz} # xyzABC123ABCxyz
7 # Replaces all matches of 'abc' with 'xyz'.
8
9 echo ---------------
10 echo "$stringZ" # abcABC123ABCabc
11 echo ---------------
12 # The string itself is not altered!
13
14 # Can the match and replacement strings be parameterized?
15 match=abc
16 repl=000
17 echo ${stringZ/$match/$repl} # 000ABC123ABCabc
18 # ^ ^ ^^^
19 echo ${stringZ//$match/$repl} # 000ABC123ABC000
20 # Yes! ^ ^ ^^^ ^^^
21
22 echo
23
24 # What happens if no $replacement string is supplied?
25 echo ${stringZ/abc} # ABC123ABCabc
26 echo ${stringZ//abc} # ABC123ABC
27 # A simple deletion takes place.
${string/#substring/replacement}
If $substring matches front end of $string, substitute $replacement for $substring.
${string/%substring/replacement}
If $substring matches back end of $string, substitute $replacement for $substring.
1 stringZ=abcABC123ABCabc
2
3 echo ${stringZ/#abc/XYZ} # XYZABC123ABCabc
4 # Replaces front-end match of 'abc' with 'XYZ'.
5
6 echo ${stringZ/%abc/XYZ} # abcABC123ABCXYZ
7 # Replaces back-end match of 'abc' with 'XYZ'.
A Bash script may invoke the string manipulation facilities of awk as an alternative to using its built-in
operations.
1 #!/bin/bash
2 # substring-extraction.sh
3
4 String=23skidoo1
5 # 012345678 Bash
6 # 123456789 awk
7 # Note different string indexing system:
8 # Bash numbers first character of string as 0.
9 # Awk numbers first character of string as 1.
10
11 echo ${String:2:4} # position 3 (0-1-2), 4 characters long
12 # skid
13
14 # The awk equivalent of ${string:pos:length} is substr(string,pos,length).
15 echo | awk '
16 { print substr("'"${String}"'",3,4) # skid
17 }
18 '
19 # Piping an empty "echo" to awk gives it dummy input,
20 #+ and thus makes it unnecessary to supply a filename.
21
22 echo "----"
23
24 # And likewise:
25
26 echo | awk '
27 { print index("'"${String}"'", "skid") # 3
28 } # (skid starts at position 3)
29 ' # The awk equivalent of "expr index" ...
30
31 exit 0
10.1.2. Further Reference
For more on string manipulation in scripts, refer to Section 10.2 and the relevant section of the expr command
listing.
Script examples:
1. Example 16-9
2. Example 10-9
3. Example 10-10
4. Example 10-11
5. Example 10-13
6. Example A-36
7. Example A-41
10.2. Parameter Substitution
${parameter}
Same as $parameter, i.e., value of the variable parameter. In certain contexts, only the less
ambiguous ${parameter} form works.
1 your_id=${USER}-on-${HOSTNAME}
2 echo "$your_id"
3 #
4 echo "Old \$PATH = $PATH"
5 PATH=${PATH}:/opt/bin # Add /opt/bin to $PATH for duration of script.
6 echo "New \$PATH = $PATH"
${parameter-default}, ${parameter:-default}
If parameter not set, use default.
1 var1=1
2 var2=2
3 # var3 is unset.
4
5 echo ${var1-$var2} # 1
6 echo ${var3-$var2} # 2
7 # ^ Note the $ prefix.
8
9
10
11 echo ${username-`whoami`}
12 # Echoes the result of `whoami`, if variable $username is still unset.
${parameter-default} and ${parameter:-default} are almost
equivalent. The extra : makes a difference only when parameter has been declared,
but is null.
1 #!/bin/bash
2 # param-sub.sh
3
4 # Whether a variable has been declared
5 #+ affects triggering of the default option
6 #+ even if the variable is null.
7
8 username0=
9 echo "username0 has been declared, but is set to null."
10 echo "username0 = ${username0-`whoami`}"
11 # Will not echo.
12
13 echo
14
15 echo username1 has not been declared.
16 echo "username1 = ${username1-`whoami`}"
17 # Will echo.
18
19 username2=
20 echo "username2 has been declared, but is set to null."
21 echo "username2 = ${username2:-`whoami`}"
22 # ^
23 # Will echo because of :- rather than just - in condition test.
24 # Compare to first instance, above.
25
26
27 #
28
29 # Once again:
30
31 variable=
32 # variable has been declared, but is set to null.
33
34 echo "${variable-0}" # (no output)
35 echo "${variable:-1}" # 1
36 # ^
37
38 unset variable
39
40 echo "${variable-2}" # 2
41 echo "${variable:-3}" # 3
42
43 exit 0
The default parameter construct finds use in providing "missing" command-line arguments in scripts.
1 DEFAULT_FILENAME=generic.data
2 filename=${1:-$DEFAULT_FILENAME}
3 # If not otherwise specified, the following command block operates
4 #+ on the file "generic.data".
5 # Begin-Command-Block
6 # ...
7 # ...
8 # ...
9 # End-Command-Block
10
11
12
13 # From "hanoi2.bash" example:
14 DISKS=${1:-E_NOPARAM} # Must specify how many disks.
15 # Set $DISKS to $1 command-line-parameter,
16 #+ or to $E_NOPARAM if that is unset.
See also Example 3-4, Example 30-2, and Example A-6.
Compare this method with using an and list to supply a default command-line argument.
${parameter=default}, ${parameter:=default}
If parameter not set, set it to default.
Both forms nearly equivalent. The : makes a difference only when $parameter has been declared
and is null, [1] as above.
${parameter+alt_value}, ${parameter:+alt_value}
If parameter set, use alt_value, else use null string.
Both forms nearly equivalent. The : makes a difference only when parameter has been declared
and is null, see below.
${parameter?err_msg}, ${parameter:?err_msg}
If parameter set, use it, else print err_msg and abort the script with an exit status of 1.
Both forms nearly equivalent. The : makes a difference only when parameter has been declared
and is null, as above.
1 #!/bin/bash
2
3 # Check some of the system's environmental variables.
4 # This is good preventative maintenance.
5 # If, for example, $USER, the name of the person at the console, is not set,
6 #+ the machine will not recognize you.
7
8 : ${HOSTNAME?} ${USER?} ${HOME?} ${MAIL?}
9 echo
10 echo "Name of the machine is $HOSTNAME."
11 echo "You are $USER."
12 echo "Your home directory is $HOME."
13 echo "Your mail INBOX is located in $MAIL."
14 echo
15 echo "If you are reading this message,"
16 echo "critical environmental variables have been set."
17 echo
18 echo
19
20 # ------------------------------------------------------
21
22 # The ${variablename?} construction can also check
23 #+ for variables set within the script.
24
25 ThisVariable=Value-of-ThisVariable
26 # Note, by the way, that string variables may be set
27 #+ to characters disallowed in their names.
28 : ${ThisVariable?}
29 echo "Value of ThisVariable is $ThisVariable".
30
31 echo; echo
32
33
34 : ${ZZXy23AB?"ZZXy23AB has not been set."}
35 # Since ZZXy23AB has not been set,
36 #+ then the script terminates with an error message.
37
38 # You can specify the error message.
39 # : ${variablename?"ERROR MESSAGE"}
40
41
42 # Same result with: dummy_variable=${ZZXy23AB?}
43 # dummy_variable=${ZZXy23AB?"ZZXy23AB has not been set."}
44 #
45 # echo ${ZZXy23AB?} >/dev/null
46
47 # Compare these methods of checking whether a variable has been set
48 #+ with "set -u" . . .
49
50
51
52 echo "You will not see this message, because script already terminated."
53
54 HERE=0
55 exit $HERE # Will NOT exit here.
56
57 # In fact, this script will return an exit status (echo $?) of 1.
1 #!/bin/bash
2 # usage-message.sh
3
4 : ${1?"Usage: $0 ARGUMENT"}
5 # Script exits here if command-line parameter absent,
6 #+ with following error message.
7 # usage-message.sh: 1: Usage: usage-message.sh ARGUMENT
8
9 echo "These two lines echo only if command-line parameter given."
10 echo "command-line parameter = \"$1\""
11
12 exit 0 # Will exit here only if command-line parameter present.
13
14 # Check the exit status, both with and without command-line parameter.
15 # If command-line parameter present, then "$?" is 0.
16 # If not, then "$?" is 1.
Parameter substitution and/or expansion. The following expressions are the complement to the match in
expr string operations (see Example 16-9). These particular ones are used mostly in parsing file path names.
${#var}
String length (number of characters in $var). For an array, ${#array} is the length of the first
element in the array.
Exceptions:
◊ ${#*} and ${#@} give the number of positional parameters.
◊ For an array, ${#array[*]} and ${#array[@]} give the number of elements in the array.
1 #!/bin/bash
2 # length.sh
3
4 E_NO_ARGS=65
5
6 if [ $# -eq 0 ] # Must have command-line args to demo script.
7 then
8 echo "Please invoke this script with one or more command-line arguments."
9 exit $E_NO_ARGS
10 fi
11
12 var01=abcdEFGH28ij
13 echo "var01 = ${var01}"
14 echo "Length of var01 = ${#var01}"
15 # Now, let's try embedding a space.
16 var02="abcd EFGH28ij"
17 echo "var02 = ${var02}"
18 echo "Length of var02 = ${#var02}"
19
20 echo "Number of command-line arguments passed to script = ${#@}"
21 echo "Number of command-line arguments passed to script = ${#*}"
22
23 exit 0
${var#Pattern}, ${var##Pattern}
${var#Pattern} Remove from $var the shortest part of $Pattern that matches the front end
of $var.
${var##Pattern} Remove from $var the longest part of $Pattern that matches the front end
of $var.
${var%Pattern}, ${var%%Pattern}
${var%Pattern} Remove from $var the shortest part of $Pattern that matches the back end
of $var.
${var%%Pattern} Remove from $var the longest part of $Pattern that matches the back end
of $var.
1 #!/bin/bash
2 # patt-matching.sh
3
4 # Pattern matching using the # ## % %% parameter substitution operators.
5
6 var1=abcd12345abc6789
7 pattern1=a*c # * (wild card) matches everything between a - c.
8
9 echo
10 echo "var1 = $var1" # abcd12345abc6789
11 echo "var1 = ${var1}" # abcd12345abc6789
12 # (alternate form)
13 echo "Number of characters in ${var1} = ${#var1}"
14 echo
15
16 echo "pattern1 = $pattern1" # a*c (everything between 'a' and 'c')
17 echo "--------------"
18 echo '${var1#$pattern1} =' "${var1#$pattern1}" # d12345abc6789
19 # Shortest possible match, strips out first 3 characters abcd12345abc6789
20 # ^^^^^ |-|
21 echo '${var1##$pattern1} =' "${var1##$pattern1}" # 6789
22 # Longest possible match, strips out first 12 characters abcd12345abc6789
23 # ^^^^^ |----------|
24
25 echo; echo; echo
26
27 pattern2=b*9 # everything between 'b' and '9'
28 echo "var1 = $var1" # Still abcd12345abc6789
29 echo
30 echo "pattern2 = $pattern2"
31 echo "--------------"
32 echo '${var1%$pattern2} =' "${var1%$pattern2}" # abcd12345a
33 # Shortest possible match, strips out last 6 characters abcd12345abc6789
34 # ^^^^ |----|
35 echo '${var1%%$pattern2} =' "${var1%%$pattern2}" # a
36 # Longest possible match, strips out last 12 characters abcd12345abc6789
37 # ^^^^ |-------------|
38
39 # Remember, # and ## work from the left end (beginning) of string,
40 # % and %% work from the right end.
41
42 echo
43
44 exit 0
1 #!/bin/bash
2 # rfe.sh: Renaming file extensions.
3 #
4 # rfe old_extension new_extension
5 #
6 # Example:
7 # To rename all *.gif files in working directory to *.jpg,
8 # rfe gif jpg
9
10
11 E_BADARGS=65
12
13 case $# in
14 0|1) # The vertical bar means "or" in this context.
15 echo "Usage: `basename $0` old_file_suffix new_file_suffix"
16 exit $E_BADARGS # If 0 or 1 arg, then bail out.
17 ;;
18 esac
19
20
21 for filename in *.$1
22 # Traverse list of files ending with 1st argument.
23 do
24 mv $filename ${filename%$1}$2
25 # Strip off part of filename matching 1st argument,
26 #+ then append 2nd argument.
27 done
28
29 exit 0
${var/Pattern/Replacement}
First match of Pattern, within var replaced with Replacement.
If Replacement is omitted, then the first match of Pattern is replaced by nothing, that is,
deleted.
${var//Pattern/Replacement}
Global replacement. All matches of Pattern, within var replaced with Replacement.
As above, if Replacement is omitted, then all occurrences of Pattern are replaced by nothing,
that is, deleted.
Example 10-12. Using pattern matching to parse arbitrary strings
1 #!/bin/bash
2
3 var1=abcd-1234-defg
4 echo "var1 = $var1"
5
6 t=${var1#*-*}
7 echo "var1 (with everything, up to and including first - stripped out) = $t"
8 # t=${var1#*-} works just the same,
9 #+ since # matches the shortest string,
10 #+ and * matches everything preceding, including an empty string.
11 # (Thanks, Stephane Chazelas, for pointing this out.)
12
13 t=${var1##*-*}
14 echo "If var1 contains a \"-\", returns empty string... var1 = $t"
15
16
17 t=${var1%*-*}
18 echo "var1 (with everything from the last - on stripped out) = $t"
19
20 echo
21
22 # -------------------------------------------
23 path_name=/home/bozo/ideas/thoughts.for.today
24 # -------------------------------------------
25 echo "path_name = $path_name"
26 t=${path_name##/*/}
27 echo "path_name, stripped of prefixes = $t"
28 # Same effect as t=`basename $path_name` in this particular case.
29 # t=${path_name%/}; t=${t##*/} is a more general solution,
30 #+ but still fails sometimes.
31 # If $path_name ends with a newline, then `basename $path_name` will not work,
32 #+ but the above expression will.
33 # (Thanks, S.C.)
34
35 t=${path_name%/*.*}
36 # Same effect as t=`dirname $path_name`
37 echo "path_name, stripped of suffixes = $t"
38 # These will fail in some cases, such as "../", "/foo////", # "foo/", "/".
39 # Removing suffixes, especially when the basename has no suffix,
40 #+ but the dirname does, also complicates matters.
41 # (Thanks, S.C.)
42
43 echo
44
45 t=${path_name:11}
46 echo "$path_name, with first 11 chars stripped off = $t"
47 t=${path_name:11:5}
48 echo "$path_name, with first 11 chars stripped off, length 5 = $t"
49
50 echo
51
52 t=${path_name/bozo/clown}
53 echo "$path_name with \"bozo\" replaced by \"clown\" = $t"
54 t=${path_name/today/}
55 echo "$path_name with \"today\" deleted = $t"
56 t=${path_name//o/O}
57 echo "$path_name with all o's capitalized = $t"
58 t=${path_name//o/}
59 echo "$path_name with all o's deleted = $t"
60
61 exit 0
${var/#Pattern/Replacement}
If prefix of var matches Pattern, then substitute Replacement for Pattern.
${var/%Pattern/Replacement}
If suffix of var matches Pattern, then substitute Replacement for Pattern.
1 #!/bin/bash
2 # var-match.sh:
3 # Demo of pattern replacement at prefix / suffix of string.
4
5 v0=abc1234zip1234abc # Original variable.
6 echo "v0 = $v0" # abc1234zip1234abc
7 echo
8
9 # Match at prefix (beginning) of string.
10 v1=${v0/#abc/ABCDEF} # abc1234zip1234abc
11 # |-|
12 echo "v1 = $v1" # ABCDEF1234zip1234abc
13 # |----|
14
15 # Match at suffix (end) of string.
16 v2=${v0/%abc/ABCDEF} # abc1234zip1234abc
17 # |-|
18 echo "v2 = $v2" # abc1234zip1234ABCDEF
19 # |----|
20
21 echo
22
23 # ----------------------------------------------------
24 # Must match at beginning / end of string,
25 #+ otherwise no replacement results.
26 # ----------------------------------------------------
27 v3=${v0/#123/000} # Matches, but not at beginning.
28 echo "v3 = $v3" # abc1234zip1234abc
29 # NO REPLACEMENT.
30 v4=${v0/%123/000} # Matches, but not at end.
31 echo "v4 = $v4" # abc1234zip1234abc
32 # NO REPLACEMENT.
33
34 exit 0
${!varprefix*}, ${!varprefix@}
Matches names of all previously declared variables beginning with varprefix.
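For example (a quick sketch; the variable names are arbitrary):

xyz23=something
xyz24=
echo "${!xyz*}"     # xyz23 xyz24
                    # Names (not values) of the declared variables beginning with "xyz".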
Notes
[1] If $parameter is null in a non-interactive script, it will terminate with a 127 exit status (the Bash error
code for "command not found").
What needs this iteration, woman?
--Shakespeare, Othello
Operations on code blocks are the key to structured and organized shell scripts. Looping and branching
constructs provide the tools for accomplishing this.
11.1. Loops
A loop is a block of code that iterates [1] a list of commands as long as the loop control condition is true.
for loops
During each pass through the loop, arg takes on the value of each successive variable
in the list.
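Spelled out, the basic form of the construct is:
for arg in [list]
do
 command(s)...
done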
1 #!/bin/bash
2 # Listing the planets.
3
4 for planet in Mercury Venus Earth Mars Jupiter Saturn Uranus Neptune Pluto
5 do
6 echo $planet # Each planet on a separate line.
7 done
8
9 echo; echo
10
11 for planet in "Mercury Venus Earth Mars Jupiter Saturn Uranus Neptune Pluto"
12 # All planets on same line.
13 # Entire 'list' enclosed in quotes creates a single variable.
14 # Why? Whitespace incorporated into the variable.
15 do
16 echo $planet
17 done
18
19 echo; echo "Whoops! Pluto is no longer a planet!"
20
21 exit 0
Each [list] element may contain multiple parameters. This is useful when processing parameters
in groups. In such cases, use the set command (see Example 15-16) to force parsing of each [list]
element and assignment of each component to the positional parameters.
Example 11-2. for loop with two parameters in each [list] element
1 #!/bin/bash
2 # Planets revisited.
3
4 # Associate the name of each planet with its distance from the sun.
5
6 for planet in "Mercury 36" "Venus 67" "Earth 93" "Mars 142" "Jupiter 483"
7 do
8 set -- $planet # Parses variable "planet"
9 #+ and sets positional parameters.
10 # The "--" prevents nasty surprises if $planet is null or
11 #+ begins with a dash.
12
13 # May need to save original positional parameters,
14 #+ since they get overwritten.
15 # One way of doing this is to use an array,
16 # original_params=("$@")
17
18 echo "$1 $2,000,000 miles from the sun"
19 #-------two tabs---concatenate zeroes onto parameter $2
20 done
21
22 # (Thanks, S.C., for additional clarification.)
23
24 exit 0
1 #!/bin/bash
2 # fileinfo.sh
3
4 FILES="/usr/sbin/accept
5 /usr/sbin/pwck
6 /usr/sbin/chroot
7 /usr/bin/fakefile
8 /sbin/badblocks
9 /sbin/ypbind" # List of files you are curious about.
10 # Threw in a dummy file, /usr/bin/fakefile.
11
12 echo
13
14 for file in $FILES
15 do
16
17 if [ ! -e "$file" ] # Check if file exists.
18 then
19 echo "$file does not exist."; echo
20 continue # On to next.
21 fi
22
23 ls -l $file | awk '{ print $8 " file size: " $5 }' # Print 2 fields.
24 whatis `basename $file` # File info.
25 # Note that the whatis database needs to have been set up for this to work.
26 # To do this, as root run /usr/bin/makewhatis.
27 echo
28 done
29
30 exit 0
If the [list] in a for loop contains wild cards (* and ?) used in filename expansion, then globbing
takes place.
1 #!/bin/bash
2 # list-glob.sh: Generating [list] in a for-loop, using "globbing"
3
4 echo
5
6 for file in *
7 # ^ Bash performs filename expansion
8 #+ on expressions that globbing recognizes.
9 do
10 ls -l "$file" # Lists all files in $PWD (current directory).
11 # Recall that the wild card character "*" matches every filename,
12 #+ however, in "globbing," it doesn't match dot-files.
13
14 # If the pattern matches no file, it is expanded to itself.
15 # To prevent this, set the nullglob option
16 #+ (shopt -s nullglob).
17 # Thanks, S.C.
18 done
19
20 echo; echo
21
22 for file in [jx]*
23 do
24 rm -f $file # Removes only files beginning with "j" or "x" in $PWD.
25 echo "Removed file \"$file\"".
26 done
27
28 echo
29
30 exit 0
Omitting the in [list] part of a for loop causes the loop to operate on $@ -- the positional
parameters. A particularly clever illustration of this is Example A-15. See also Example 15-17.
1 #!/bin/bash
2
3 # Invoke this script both with and without arguments,
4 #+ and see what happens.
5
6 for a
7 do
8 echo -n "$a "
9 done
10
11 # The 'in list' missing, therefore the loop operates on '$@'
12 #+ (command-line argument list, including whitespace).
13
14 echo
15
16 exit 0
It is possible to use command substitution to generate the [list] in a for loop. See also Example
16-54, Example 11-10 and Example 16-48.
Example 11-6. Generating the [list] in a for loop with command substitution
1 #!/bin/bash
2 # for-loopcmd.sh: for-loop with [list]
3 #+ generated by command substitution.
4
5 NUMBERS="9 7 3 8 37.53"
6
7 for number in `echo $NUMBERS` # for number in 9 7 3 8 37.53
8 do
9 echo -n "$number "
10 done
11
12 echo
13 exit 0
Here is a somewhat more complex example of using command substitution to create the [list].
1 #!/bin/bash
2 # bin-grep.sh: Locates matching strings in a binary file.
3
4 # A "grep" replacement for binary files.
5 # Similar effect to "grep -a"
6
7 E_BADARGS=65
8 E_NOFILE=66
9
10 if [ $# -ne 2 ]
11 then
12 echo "Usage: `basename $0` search_string filename"
13 exit $E_BADARGS
14 fi
15
16 if [ ! -f "$2" ]
17 then
18 echo "File \"$2\" does not exist."
19 exit $E_NOFILE
20 fi
21
22
23 IFS=$'\012' # Per suggestion of Anton Filippov.
24 # was: IFS="\n"
25 for word in $( strings "$2" | grep "$1" )
26 # The "strings" command lists strings in binary files.
27 # Output then piped to "grep", which tests for desired string.
28 do
29 echo $word
30 done
31
32 # As S.C. points out, lines 23 - 30 could be replaced with the simpler
33 # strings "$2" | grep "$1" | tr -s "$IFS" '[\n*]'
34
35
36 # Try something like "./bin-grep.sh mem /bin/ls"
37 #+ to exercise this script.
38
39 exit 0
1 #!/bin/bash
2 # userlist.sh
3
4 PASSWORD_FILE=/etc/passwd
5 n=1 # User number
6
7 for name in $(awk 'BEGIN{FS=":"}{print $1}' < "$PASSWORD_FILE" )
8 # Field separator = : ^^^^^^
9 # Print first field ^^^^^^^^
10 # Get input from password file ^^^^^^^^^^^^^^^^^
11 do
12 echo "USER #$n = $name"
13 let "n += 1"
14 done
15
16
17 # USER #1 = root
18 # USER #2 = bin
19 # USER #3 = daemon
20 # ...
21 # USER #30 = bozo
22
23 exit 0
24
25 # Exercise:
26 # --------
27 # How is it that an ordinary user (or a script run by same)
28 #+ can read /etc/passwd?
29 # Isn't this a security hole? Why or why not?
1 #!/bin/bash
2 # findstring.sh:
3 # Find a particular string in the binaries in a specified directory.
4
5 directory=/usr/bin/
6 fstring="Free Software Foundation" # See which files come from the FSF.
7
8 for file in $( find $directory -type f -name '*' | sort )
9 do
10 strings -f $file | grep "$fstring" | sed -e "s%$directory%%"
11 # In the "sed" expression,
12 #+ it is necessary to substitute for the normal "/" delimiter
13 #+ because "/" happens to be one of the characters filtered out.
14 # Failure to do so gives an error message. (Try it.)
15 done
16
17 exit $?
18
19 # Exercise (easy):
20 # ---------------
21 # Convert this script to take command-line parameters
22 #+ for $directory and $fstring.
A final example of [list] / command substitution, but this time the "command" is a function.
1 generate_list ()
2 {
3 echo "one two three"
4 }
5
6 for word in $(generate_list) # Let "word" grab output of function.
7 do
8 echo "$word"
9 done
10
11 # one
12 # two
13 # three
1 #!/bin/bash
2 # symlinks.sh: Lists symbolic links in a directory.
3
4
5 directory=${1-`pwd`}
6 # Defaults to current working directory,
7 #+ if not otherwise specified.
8 # Equivalent to code block below.
9 # ----------------------------------------------------------
10 # ARGS=1 # Expect one command-line argument.
11 #
12 # if [ $# -ne "$ARGS" ] # If not 1 arg...
13 # then
14 # directory=`pwd` # current working directory
15 # else
16 # directory=$1
17 # fi
18 # ----------------------------------------------------------
19
20 echo "symbolic links in directory \"$directory\""
21
22 for file in "$( find $directory -type l )" # -type l = symbolic links
23 do
24 echo "$file"
25 done | sort # Otherwise file list is unsorted.
26 # Strictly speaking, a loop isn't really necessary here,
27 #+ since the output of the "find" command is expanded into a single word.
28 # However, it's easy to understand and illustrative this way.
29
30 # As Dominik 'Aeneas' Schnitzer points out,
31 #+ failing to quote $( find $directory -type l )
32 #+ will choke on filenames with embedded whitespace.
33 # containing whitespace.
34
35 exit 0
36
37
38 # --------------------------------------------------------
39 # Jean Helou proposes the following alternative:
40
41 echo "symbolic links in directory \"$directory\""
42 # Backup of the current IFS. One can never be too cautious.
43 OLDIFS=$IFS
44 IFS=:
45
46 for file in $(find $directory -type l -printf "%p$IFS")
47 do # ^^^^^^^^^^^^^^^^
48 echo "$file"
49 done|sort
50
51 # And, James "Mike" Conley suggests modifying Helou's code thusly:
52
53 OLDIFS=$IFS
54 IFS='' # Null IFS means no word breaks
55 for file in $( find $directory -type l )
56 do
57 echo $file
58 done | sort
59
60 # This works in the "pathological" case of a directory name having
61 #+ an embedded colon.
62 # "This also fixes the pathological case of the directory name having
63 #+ a colon (or space in earlier example) as well."
64
The stdout of a loop may be redirected to a file, as this slight modification to the previous example
shows.
1 #!/bin/bash
2 # symlinks.sh: Lists symbolic links in a directory.
3
4 OUTFILE=symlinks.list # save file
5
6 directory=${1-`pwd`}
7 # Defaults to current working directory,
8 #+ if not otherwise specified.
9
10
11 echo "symbolic links in directory \"$directory\"" > "$OUTFILE"
12 echo "---------------------------" >> "$OUTFILE"
13
14 for file in "$( find $directory -type l )" # -type l = symbolic links
15 do
16 echo "$file"
17 done | sort >> "$OUTFILE" # stdout of loop
18 # ^^^^^^^^^^^^^ redirected to save file.
19
20 exit 0
There is an alternative syntax to a for loop that will look very familiar to C programmers. This
requires double parentheses.
1 #!/bin/bash
2 # Multiple ways to count up to 10.
3
4 echo
5
6 # Standard syntax.
7 for a in 1 2 3 4 5 6 7 8 9 10
8 do
9 echo -n "$a "
10 done
11
12 echo; echo
13
14 # +==========================================+
15
16 # Using "seq" ...
17 for a in `seq 10`
18 do
19 echo -n "$a "
20 done
21
22 echo; echo
23
24 # +==========================================+
25
26 # Using brace expansion ...
27 # Bash, version 3+.
28 for a in {1..10}
29 do
30 echo -n "$a "
31 done
32
33 echo; echo
34
35 # +==========================================+
36
37 # Now, let's do the same, using C-like syntax.
38
39 LIMIT=10
40
41 for ((a=1; a <= LIMIT ; a++)) # Double parentheses, and "LIMIT" with no "$".
42 do
43 echo -n "$a "
44 done # A construct borrowed from 'ksh93'.
45
46 echo; echo
47
48 # +=========================================================================+
49
50 # Let's use the C "comma operator" to increment two variables simultaneously.
51
52 for ((a=1, b=1; a <= LIMIT ; a++, b++))
53 do # The comma chains together operations.
54 echo -n "$a-$b "
55 done
56
57 echo; echo
58
59 exit 0
See also Example 27-16, Example 27-17, and Example A-6.
---
1 #!/bin/bash
2 # Faxing (must have 'efax' package installed).
3
4 EXPECTED_ARGS=2
5 E_BADARGS=85
6 MODEM_PORT="/dev/ttyS2" # May be different on your machine.
7 # ^^^^^ PCMCIA modem card default port.
8
9 if [ $# -ne $EXPECTED_ARGS ]
10 # Check for proper number of command-line args.
11 then
12 echo "Usage: `basename $0` phone# text-file"
13 exit $E_BADARGS
14 fi
15
16
17 if [ ! -f "$2" ]
18 then
19 echo "File $2 is not a text file."
20 # File is not a regular file, or does not exist.
21 exit $E_BADARGS
22 fi
23
24
25 fax make $2 # Create fax-formatted files from text files.
26
27 for file in $(ls $2.0*) # Concatenate the converted files.
28 # Uses wild card (filename "globbing")
29 #+ in variable list.
30 do
31 fil="$fil $file"
32 done
33
34 efax -d "$MODEM_PORT" -t "T$1" $fil # Finally, do the work.
35 # Try adding -o1 if the above line fails.
36
37
38 # As S.C. points out, the for-loop can be eliminated with
39 # efax -d /dev/ttyS2 -o1 -t "T$1" $2.0*
40 #+ but it's not quite as instructive [grin].
41
42 exit $? # Also, efax sends diagnostic messages to stdout.
while
This construct tests for a condition at the top of a loop, and keeps looping as long as that condition is
true (returns a 0 exit status). In contrast to a for loop, a while loop finds use in situations where the
number of loop repetitions is not known beforehand.
while [ condition ]
do
command(s)...
done
The bracket construct in a while loop is nothing more than our old friend, the test brackets used in an
if/then test. In fact, a while loop can legally use the more versatile double-brackets construct (while [[
condition ]]).
As is the case with for loops, placing the do on the same line as the condition test requires a
semicolon.
while [ condition ] ; do
Note that the test brackets are not mandatory in a while loop. See, for example, the getopts construct.
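For instance, the exit status of the getopts builtin itself can drive the loop (a minimal sketch; the option letters here are arbitrary):

while getopts ":ab:" Option   # No test brackets -- getopts supplies the exit status.
do
  case $Option in
    a ) echo "Option -a";;
    b ) echo "Option -b, with argument \"$OPTARG\"";;
  esac
done
shift $(($OPTIND - 1))        # Move past the options already processed.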
1 #!/bin/bash
2
3 var0=0
4 LIMIT=10
5
6 while [ "$var0" -lt "$LIMIT" ]
7 # ^ ^
8 # Spaces, because these are "test-brackets" . . .
9 do
10 echo -n "$var0 " # -n suppresses newline.
11 # ^ Space, to separate printed out numbers.
12
13 var0=`expr $var0 + 1` # var0=$(($var0+1)) also works.
14 # var0=$((var0 + 1)) also works.
15 # let "var0 += 1" also works.
16 done # Various other methods also work.
17
18 echo
19
20 exit 0
1 #!/bin/bash
2
3 echo
4 # Equivalent to:
5 while [ "$var1" != "end" ] # while test "$var1" != "end"
6 do
7 echo "Input variable #1 (end to exit) "
8 read var1 # Not 'read $var1' (why?).
9 echo "variable #1 = $var1" # Need quotes because of "#" . . .
10 # If input is 'end', echoes it here.
11 # Does not test for termination condition until top of loop.
12 echo
13 done
14
15 exit 0
A while loop may have multiple conditions. Only the final condition determines when the loop
terminates. This necessitates a slightly different loop syntax, however.
1 #!/bin/bash
2
3 var1=unset
4 previous=$var1
5
6 while echo "previous-variable = $previous"
7 echo
8 previous=$var1
9 [ "$var1" != end ] # Keeps track of what $var1 was previously.
10 # Four conditions on "while", but only last one controls loop.
11 # The *last* exit status is the one that counts.
12 do
13 echo "Input variable #1 (end to exit) "
14 read var1
15 echo "variable #1 = $var1"
16 done
17
18 # Try to figure out how this all works.
19 # It's a wee bit tricky.
20
21 exit 0
As with a for loop, a while loop may employ C-style syntax by using the double-parentheses construct
(see also Example 8-5).
1 #!/bin/bash
2 # wh-loopc.sh: Count to 10 in a "while" loop.
3
4 LIMIT=10
5 a=1
6
7 while [ "$a" -le $LIMIT ]
8 do
9 echo -n "$a "
10 let "a+=1"
11 done # No surprises, so far.
12
13 echo; echo
14
15 # +=================================================================+
16
17 # Now, repeat with C-like syntax.
18
19 ((a = 1)) # a=1
20 # Double parentheses permit space when setting a variable, as in C.
21
22 while (( a <= LIMIT )) # Double parentheses, and no "$" preceding variables.
23 do
24 echo -n "$a "
25 ((a += 1)) # let "a+=1"
26 # Yes, indeed.
27 # Double parentheses permit incrementing a variable with C-like syntax.
28 done
29
30 echo
31
32 # C programmers can feel right at home in Bash.
33
34 exit 0
1 t=0
2
3 condition ()
4 {
5 ((t++))
6
7 if [ $t -lt 5 ]
8 then
9 return 0 # true
10 else
11 return 1 # false
12 fi
13 }
14
15 while condition
16 # ^^^^^^^^^
17 # Function call -- four loop iterations.
18 do
19 echo "Still going: t = $t"
20 done
21
22 # Still going: t = 1
23 # Still going: t = 2
24 # Still going: t = 3
25 # Still going: t = 4
Similar to the if-test construct, a while loop can omit the test brackets.
1 while condition
2 do
3 command(s) ...
4 done
By coupling the power of the read command with a while loop, we get the handy while read construct,
useful for reading and parsing files.
A while loop may have its stdin redirected to a file by a < at its end.
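A minimal sketch combining the two ("$datafile" is just a placeholder name):

while read line               # As long as there is another line to read ...
do
  echo "$line"
done <"$datafile"             # Redirect stdin from the file, using < at the end of the loop.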
until
This construct tests for a condition at the top of a loop, and keeps looping as long as that
condition is false (the opposite of a while loop).
until [ condition-is-true ]
do
command(s)...
done
Note that an until loop tests for the terminating condition at the top of the loop, differing from a
similar construct in some programming languages.
As is the case with for loops, placing the do on the same line as the condition test requires a
semicolon.
until [ condition-is-true ] ; do
1 #!/bin/bash
2
3 END_CONDITION=end
4
5 until [ "$var1" = "$END_CONDITION" ]
6 # Tests condition here, at top of loop.
7 do
8 echo "Input variable #1 "
9 echo "($END_CONDITION to exit)"
10 read var1
11 echo "variable #1 = $var1"
12 echo
13 done
14
15 # ------------------------------------------- #
16
17 # As with "for" and "while" loops,
18 #+ an "until" loop permits C-like test constructs.
19
20 LIMIT=10
21 var=0
22
23 until (( var > LIMIT ))
24 do # ^^ ^ ^ ^^ No brackets, no $ prefixing variables.
25 echo -n "$var "
26 (( var++ ))
27 done # 0 1 2 3 4 5 6 7 8 9 10
28
29
30 exit 0
How to choose between a for loop or a while loop or until loop? In C, you would typically use a for loop
when the number of loop iterations is known beforehand. With Bash, however, the situation is fuzzier. The
Bash for loop is more loosely structured and more flexible than its equivalent in other languages. Therefore,
feel free to use whatever type of loop gets the job done in the simplest way.
Notes
[1] Iteration: Repeated execution of a command or group of commands, usually -- but not always --
while a given condition holds, or until a given condition is met.
1 #!/bin/bash
2 # nested-loop.sh: Nested "for" loops.
3
4 outer=1 # Set outer loop counter.
5
6 # Beginning of outer loop.
7 for a in 1 2 3 4 5
8 do
9 echo "Pass $outer in outer loop."
10 echo "---------------------"
11 inner=1 # Reset inner loop counter.
12
13 # ===============================================
14 # Beginning of inner loop.
15 for b in 1 2 3 4 5
16 do
17 echo "Pass $inner in inner loop."
18 let "inner+=1" # Increment inner loop counter.
19 done
20 # End of inner loop.
21 # ===============================================
22
23 let "outer+=1" # Increment outer loop counter.
24 echo # Space between output blocks in pass of outer loop.
25 done
26 # End of outer loop.
27
28 exit 0
See Example 27-11 for an illustration of nested while loops, and Example 27-13 to see a while loop nested
inside an until loop.
break, continue
The break and continue loop control commands [1] correspond exactly to their counterparts in other
programming languages. The break command terminates the loop (breaks out of it), while continue
causes a jump to the next iteration of the loop, skipping all the remaining commands in that particular
loop cycle.
1 #!/bin/bash
2
3 LIMIT=19 # Upper limit
4
5 echo
6 echo "Printing Numbers 1 through 20 (but not 3 and 11)."
7
8 a=0
9
10 while [ $a -le "$LIMIT" ]
11 do
12 a=$(($a+1))
13
14 if [ "$a" -eq 3 ] || [ "$a" -eq 11 ] # Excludes 3 and 11.
15 then
16 continue # Skip rest of this particular loop iteration.
17 fi
18
19 echo -n "$a " # This will not execute for 3 and 11.
20 done
21
22 # Exercise:
23 # Why does the loop print up to 20?
24
25 echo; echo
26
27 echo Printing Numbers 1 through 20, but something happens after 2.
28
29 ##################################################################
30
31 # Same loop, but substituting 'break' for 'continue'.
32
33 a=0
34
35 while [ "$a" -le "$LIMIT" ]
36 do
37 a=$(($a+1))
38
39 if [ "$a" -gt 2 ]
40 then
41 break # Skip entire rest of loop.
42 fi
43
44 echo -n "$a "
45 done
46
47 echo; echo; echo
48
49 exit 0
The break command may optionally take a parameter. A plain break terminates only the innermost
loop in which it is embedded, but a break N breaks out of N levels of loop.
1 #!/bin/bash
2 # break-levels.sh: Breaking out of loops.
3
4 # "break N" breaks out of N level loops.
5
6 for outerloop in 1 2 3 4 5
7 do
8 echo -n "Group $outerloop: "
9
10 # --------------------------------------------------------
11 for innerloop in 1 2 3 4 5
12 do
13 echo -n "$innerloop "
14
15 if [ "$innerloop" -eq 3 ]
16 then
17 break # Try break 2 to see what happens.
18 # ("Breaks" out of both inner and outer loops.)
19 fi
20 done
21 # --------------------------------------------------------
22
23 echo
24 done
25
26 echo
27
28 exit 0
The continue command, similar to break, optionally takes a parameter. A plain continue cuts short
the current iteration within its loop and begins the next. A continue N terminates all remaining
iterations at its loop level and continues with the next iteration at the loop N levels above.
1 #!/bin/bash
2 # The "continue N" command, continuing at the Nth level loop.
3
4 for outer in I II III IV V # outer loop
5 do
6 echo; echo -n "Group $outer: "
7
8 # --------------------------------------------------------------------
9 for inner in 1 2 3 4 5 6 7 8 9 10 # inner loop
10 do
11
12 if [[ "$inner" -eq 7 && "$outer" = "III" ]]
13 then
14 continue 2 # Continue at loop on 2nd level, that is "outer loop".
15 # Replace above line with a simple "continue"
16 # to see normal loop behavior.
17 fi
18
19 echo -n "$inner " # 7 8 9 10 will not echo on "Group III."
20 done
21 # --------------------------------------------------------------------
22
23 done
24
25 echo; echo
26
27 # Exercise:
28 # Come up with a meaningful use for "continue N" in a script.
29
30 exit 0
Notes
[1] These are shell builtins, whereas other loop commands, such as while and case, are keywords.
case "$variable" in
"$condition1" )
command...
;;
"$condition2" )
command...
;;
esac
◊ Quoting the variables is not mandatory, since word splitting does not take
place.
◊ Each test line ends with a right paren ).
◊ Each condition block ends with a double semicolon ;;.
◊ If a condition tests true, then the associated commands execute and the case
block terminates.
◊ The entire case block ends with an esac (case spelled backwards).
1 #!/bin/bash
2 # Testing ranges of characters.
3
4 echo; echo "Hit a key, then hit return."
5 read Keypress
6
7 case "$Keypress" in
8 [[:lower:]] ) echo "Lowercase letter";;
9 [[:upper:]] ) echo "Uppercase letter";;
10 [0-9] ) echo "Digit";;
11 * ) echo "Punctuation, whitespace, or other";;
12 esac # Allows ranges of characters in [square brackets],
13 #+ or POSIX ranges in [[double square brackets]].
14
15 # In the first version of this example,
16 #+ the tests for lowercase and uppercase characters were
17 #+ [a-z] and [A-Z].
18 # This no longer works in certain locales and/or Linux distros.
19 # POSIX is more portable.
20 # Thanks to Frank Wang for pointing this out.
21
22 # Exercise:
23 # --------
24 # As the script stands, it accepts a single keystroke, then terminates.
25 # Change the script so it accepts repeated input,
26 #+ reports on each keystroke, and terminates only when "X" is hit.
27 # Hint: enclose everything in a "while" loop.
28
29 exit 0
1 #!/bin/bash
2
3 # Crude address database
4
5 clear # Clear the screen.
6
7 echo " Contact List"
8 echo " ------- ----"
9 echo "Choose one of the following persons:"
10 echo
11 echo "[E]vans, Roland"
12 echo "[J]ones, Mildred"
13 echo "[S]mith, Julie"
14 echo "[Z]ane, Morris"
15 echo
16
17 read person
18
19 case "$person" in
20 # Note variable is quoted.
21
22 "E" | "e" )
23 # Accept upper or lowercase input.
24 echo
25 echo "Roland Evans"
26 echo "4321 Flash Dr."
27 echo "Hardscrabble, CO 80753"
28 echo "(303) 734-9874"
29 echo "(303) 734-9892 fax"
30 echo "[email protected]"
31 echo "Business partner & old friend"
32 ;;
33 # Note double semicolon to terminate each option.
34
35 "J" | "j" )
36 echo
37 echo "Mildred Jones"
38 echo "249 E. 7th St., Apt. 19"
39 echo "New York, NY 10009"
40 echo "(212) 533-2814"
41 echo "(212) 533-9972 fax"
42 echo "[email protected]"
43 echo "Ex-girlfriend"
44 echo "Birthday: Feb. 11"
45 ;;
46
47 # Add info for Smith & Zane later.
48
49 * )
50 # Default option.
51 # Empty input (hitting RETURN) fits here, too.
52 echo
53 echo "Not yet in database."
54 ;;
55
56 esac
57
58 echo
59
60 # Exercise:
61 # --------
62 # Change the script so it accepts multiple inputs,
63 #+ instead of terminating after displaying just one address.
64
65 exit 0
1 #! /bin/bash
2
3 case "$1" in
4 "") echo "Usage: ${0##*/} <filename>"; exit $E_PARAM;;
5 # No command-line parameters,
6 # or first parameter empty.
7 # Note that ${0##*/} is ${var##pattern} param substitution.
8 # Net result is the same as basename $0.
9
10 -*) FILENAME=./$1;; # If filename passed as argument ($1)
11 #+ starts with a dash,
12 #+ replace it with ./$1
13 #+ so further commands don't interpret it
14 #+ as an option.
15
16 * ) FILENAME=$1;; # Otherwise, $1.
17 esac
Here is a more straightforward example of command-line parameter handling:
1 #! /bin/bash
2
3
4 while [ $# -gt 0 ]; do # Until you run out of parameters . . .
5 case "$1" in
6 -d|--debug)
7 # "-d" or "--debug" parameter?
8 DEBUG=1
9 ;;
10 -c|--conf)
11 CONFFILE="$2"
12 shift
13 if [ ! -f $CONFFILE ]; then
14 echo "Error: Supplied file doesn't exist!"
15 exit $E_CONFFILE # File not found error.
16 fi
17 ;;
18 esac
19 shift # Check next set of parameters.
20 done
21
22 # From Stefano Falsetto's "Log2Rot" script,
23 #+ part of his "rottlog" package.
24 # Used with permission.
Example 11-26. Using command substitution to generate the case variable
1 #!/bin/bash
2 # case-cmd.sh: Using command substitution to generate a "case" variable.
3
4 case $( arch ) in # "arch" returns machine architecture.
5 # Equivalent to 'uname -m' ...
6 i386 ) echo "80386-based machine";;
7 i486 ) echo "80486-based machine";;
8 i586 ) echo "Pentium-based machine";;
9 i686 ) echo "Pentium2+-based machine";;
10 * ) echo "Other type of machine";;
11 esac
12
13 exit 0
1 #!/bin/bash
2 # match-string.sh: Simple string matching.
3
4 match_string ()
5 { # Exact string match.
6 MATCH=0
7 E_NOMATCH=90
8 PARAMS=2 # Function requires 2 arguments.
9 E_BAD_PARAMS=91
10
11 [ $# -eq $PARAMS ] || return $E_BAD_PARAMS
12
13 case "$1" in
14 "$2") return $MATCH;;
15 * ) return $E_NOMATCH;;
16 esac
17
18 }
19
20
21 a=one
22 b=two
23 c=three
24 d=two
25
26
27 match_string $a # wrong number of parameters
28 echo $? # 91
29
30 match_string $a $b # no match
31 echo $? # 90
32
33 match_string $b $d # match
34 echo $? # 0
35
36
37 exit 0
Example 11-28. Checking for alphabetic input
1 #!/bin/bash
2 # isalpha.sh: Using a "case" structure to filter a string.
3
4 SUCCESS=0
5 FAILURE=-1
6
7 isalpha () # Tests whether *first character* of input string is alphabetic.
8 {
9 if [ -z "$1" ] # No argument passed?
10 then
11 return $FAILURE
12 fi
13
14 case "$1" in
15 [a-zA-Z]*) return $SUCCESS;; # Begins with a letter?
16 * ) return $FAILURE;;
17 esac
18 } # Compare this with "isalpha ()" function in C.
19
20
21 isalpha2 () # Tests whether *entire string* is alphabetic.
22 {
23 [ $# -eq 1 ] || return $FAILURE
24
25 case $1 in
26 *[!a-zA-Z]*|"") return $FAILURE;;
27 *) return $SUCCESS;;
28 esac
29 }
30
31 isdigit () # Tests whether *entire string* is numerical.
32 { # In other words, tests for integer variable.
33 [ $# -eq 1 ] || return $FAILURE
34
35 case $1 in
36 *[!0-9]*|"") return $FAILURE;;
37 *) return $SUCCESS;;
38 esac
39 }
40
41
42
43 check_var () # Front-end to isalpha ().
44 {
45 if isalpha "$@"
46 then
47 echo "\"$*\" begins with an alpha character."
48 if isalpha2 "$@"
49 then # No point in testing if first char is non-alpha.
50 echo "\"$*\" contains only alpha characters."
51 else
52 echo "\"$*\" contains at least one non-alpha character."
53 fi
54 else
55 echo "\"$*\" begins with a non-alpha character."
56 # Also "non-alpha" if no argument passed.
57 fi
58
59 echo
60
61 }
62
63 digit_check () # Front-end to isdigit ().
64 {
65 if isdigit "$@"
66 then
67 echo "\"$*\" contains only digits [0 - 9]."
68 else
69 echo "\"$*\" has at least one non-digit character."
70 fi
71
72 echo
73
74 }
75
76 a=23skidoo
77 b=H3llo
78 c=-What?
79 d=What?
80 e=`echo $b` # Command substitution.
81 f=AbcDef
82 g=27234
83 h=27a34
84 i=27.34
85
86 check_var $a
87 check_var $b
88 check_var $c
89 check_var $d
90 check_var $e
91 check_var $f
92 check_var # No argument passed, so what happens?
93 #
94 digit_check $g
95 digit_check $h
96 digit_check $i
97
98
99 exit 0 # Script improved by S.C.
100
101 # Exercise:
102 # --------
103 # Write an 'isfloat ()' function that tests for floating point numbers.
104 # Hint: The function duplicates 'isdigit ()',
105 #+ but adds a test for a mandatory decimal point.
select
The select construct, adopted from the Korn Shell, is yet another tool for building menus.
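Its general form resembles the other loop constructs:
select variable [in list]
do
 command...
 break
done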
This prompts the user to enter one of the choices presented in the variable list. Note that select uses
the $PS3 prompt (#? ) by default, but this may be changed.
1 #!/bin/bash
2
3 PS3='Choose your favorite vegetable: ' # Sets the prompt string.
4 # Otherwise it defaults to #? .
5
6 echo
7
8 select vegetable in "beans" "carrots" "potatoes" "onions" "rutabagas"
9 do
10 echo
11 echo "Your favorite veggie is $vegetable."
12 echo "Yuck!"
13 echo
14 break # What happens if there is no 'break' here?
15 done
16
17 exit
18
19 # Exercise:
20 # --------
21 # Fix this script to accept user input not specified in
22 #+ the "select" statement.
23 # For example, if the user inputs "peas,"
24 #+ the script would respond "Sorry. That is not on the menu."
If in list is omitted, then select uses the list of command line arguments ($@) passed to the script
or the function containing the select construct.
1 #!/bin/bash
2
3 PS3='Choose your favorite vegetable: '
4
5 echo
6
7 choice_of()
8 {
9 select vegetable
10 # [in list] omitted, so 'select' uses arguments passed to function.
11 do
12 echo
13 echo "Your favorite veggie is $vegetable."
14 echo "Yuck!"
15 echo
16 break
17 done
18 }
19
20 choice_of beans rice carrots radishes tomatoes spinach
21 # $1 $2 $3 $4 $5 $6
22 # passed to choice_of() function
23
24 exit 0
The classic form of command substitution uses backquotes (`...`). Commands within backquotes (backticks)
generate command-line text.
1 script_name=`basename $0`
2 echo "The name of this script is $script_name."
The output of commands can be used as arguments to another command, to set a variable, and even for
generating the argument list in a for loop.
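For example (a rough sketch; "filelist" is merely a placeholder name for a file holding a list of filenames):

rm `cat filelist`             # Delete the files named in "filelist".
                              # (May give an "arg list too long" error for very long lists.)
textfile_listing=`ls *.txt`   # Variable set to the names of the *.txt files in the current directory.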
Even when there is no word splitting, command substitution can remove trailing newlines.
1 dir_listing=`ls -l`
2 echo $dir_listing # unquoted
3
4 # Expecting a nicely ordered directory listing.
5
6 # However, what you get is:
7 # total 3 -rw-rw-r-- 1 bozo bozo 30 May 13 17:15 1.txt -rw-rw-r-- 1 bozo
8 # bozo 51 May 15 20:57 t2.sh -rwxr-xr-x 1 bozo bozo 217 Mar 5 21:13 wi.sh
9
10 # The newlines disappeared.
11
12
13 echo "$dir_listing" # quoted
14 # -rw-rw-r-- 1 bozo 30 May 13 17:15 1.txt
15 # -rw-rw-r-- 1 bozo 51 May 15 20:57 t2.sh
16 # -rwxr-xr-x 1 bozo 217 Mar 5 21:13 wi.sh
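Even with quoting, the trailing newline of the command output is stripped, as this small sketch shows:

var=`echo "line of text"`     # 'echo' appends a newline to its output ...
echo -n "$var" | wc -c        # 12 -- the trailing newline vanished in the command substitution.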
Command substitution even permits setting a variable to the contents of a file, using either redirection or the
cat command.
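In sketch form ("file1" and "file2" are placeholder names):

variable1=`<file1`      # Set "variable1" to contents of "file1", using redirection.
variable2=`cat file2`   # Set "variable2" to contents of "file2" with cat.
                        # This fires up a new process, so it runs slower than the line above.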
1 #!/bin/bash
2 # stupid-script-tricks.sh: Don't try this at home, folks.
3 # From "Stupid Script Tricks," Volume I.
4
5
6 dangerous_variable=`cat /boot/vmlinuz` # The compressed Linux kernel itself.
7
8 echo "string-length of \$dangerous_variable = ${#dangerous_variable}"
9 # string-length of $dangerous_variable = 794151
10 # (Does not give same count as 'wc -c /boot/vmlinuz'.)
11
12 # echo "$dangerous_variable"
13 # Don't try this! It would hang the script.
14
15
16 # The document author is aware of no useful applications for
17 #+ setting a variable to the contents of a binary file.
18
19 exit 0
Notice that a buffer overrun does not occur. This is one instance where an interpreted language, such as
Bash, provides more protection from programmer mistakes than a compiled language.
Command substitution permits setting a variable to the output of a loop. The key to this is grabbing the output
of an echo command within the loop.
1 #!/bin/bash
2 # csubloop.sh: Setting a variable to the output of a loop.
3
4 variable1=`for i in 1 2 3 4 5
5 do
6 echo -n "$i" # The 'echo' command is critical
7 done` #+ to command substitution here.
8
9 echo "variable1 = $variable1" # variable1 = 12345
10
11
12 i=0
13 variable2=`while [ "$i" -lt 10 ]
14 do
15 echo -n "$i" # Again, the necessary 'echo'.
16 let "i += 1" # Increment.
17 done`
18
19 echo "variable2 = $variable2" # variable2 = 0123456789
20
21 # Demonstrates that it's possible to embed a loop
22 #+ within a variable declaration.
23
24 exit 0
Command substitution makes it possible to extend the toolset available to Bash. It is simply a matter of
writing a program or script that outputs to stdout (like a well-behaved UNIX tool should) and assigning
that output to a variable.
1 #include <stdio.h>
2
3 /* "Hello, world." C program */
4
5 int main()
6 {
7 printf( "Hello, world.\n" );
8 return (0);
9 }
bash$ gcc -o hello hello.c
1 #!/bin/bash
2 # hello.sh
3
4 greeting=`./hello`
5 echo $greeting
bash$ sh hello.sh
Hello, world.
1 #!/bin/bash
2 # agram2.sh
3 # Example of nested command substitution.
4
5 # Uses "anagram" utility
6 #+ that is part of the author's "yawl" word list package.
7 # http://ibiblio.org/pub/Linux/libs/yawl-0.3.2.tar.gz
8 # http://bash.neuralshortcircuit.com/yawl-0.3.2.tar.gz
9
10 E_NOARGS=66
11 E_BADARG=67
12 MINLEN=7
13
14 if [ -z "$1" ]
15 then
16 echo "Usage $0 LETTERSET"
17 exit $E_NOARGS # Script needs a command-line argument.
18 elif [ ${#1} -lt $MINLEN ]
19 then
20 echo "Argument must have at least $MINLEN letters."
21 exit $E_BADARG
22 fi
23
24
25
26 FILTER='.......' # Must have at least 7 letters.
27 # 1234567
28 Anagrams=( $(echo $(anagram $1 | grep $FILTER) ) )
29 # $( $( nested command sub. ) )
30 # ( array assignment )
31
32 echo
33 echo "${#Anagrams[*]} 7+ letter anagrams found"
34 echo
35 echo ${Anagrams[0]} # First anagram.
36 echo ${Anagrams[1]} # Second anagram.
37 # Etc.
38
39 # echo "${Anagrams[*]}" # To list all the anagrams in a single line . . .
40
41 # Look ahead to the "Arrays" chapter for enlightenment on
42 #+ what's going on here.
43
44 # See also the agram.sh script for an example of anagram finding.
45
46 exit $?
Examples of command substitution in scripts:
1. Example 11-7
2. Example 11-26
3. Example 9-16
4. Example 16-3
5. Example 16-22
6. Example 16-17
7. Example 16-54
8. Example 11-13
9. Example 11-10
10. Example 16-32
11. Example 20-8
12. Example A-16
13. Example 29-3
14. Example 16-47
15. Example 16-48
16. Example 16-49
Notes
[1] For purposes of command substitution, a command may be an external system command, an internal
scripting builtin, or even a script function.
[2] In a more technically correct sense, command substitution extracts the stdout of a command, then
assigns it to a variable using the = operator.
[3] In fact, nesting with backticks is also possible, but only by escaping the inner backticks, as John Default
points out.
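For instance (a minimal illustration):

directory_name=`basename \`pwd\``   # The inner backticks are escaped with backslashes.
echo "$directory_name"              # Name of the current directory, without its path.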
Variations
1 z=$(($z+3))
2 z=$((z+3)) # Also correct.
3 # Within double parentheses,
4 #+ parameter dereferencing
5 #+ is optional.
6
7 # $((EXPRESSION)) is arithmetic expansion. # Not to be confused with
8 #+ command substitution.
9
10
11
12 # You may also use operations within double parentheses without assignment.
13
14 n=0
15 echo "n = $n" # n = 0
16
17 (( n += 1 )) # Increment.
18 # (( $n += 1 )) is incorrect!
19 echo "n = $n" # n = 1
20
21
22 let z=z+3
23 let "z += 3" # Quotes permit the use of spaces in variable assignment.
24 # The 'let' operator actually performs arithmetic evaluation,
25 #+ rather than expansion.
Examples of arithmetic expansion in scripts:
1. Example 16-9
2. Example 11-14
3. Example 27-1
4. Example 27-11
5. Example A-16
Don't break the chain! Send out your ten copies today!
Courtesy 'NIX "fortune cookies", with some alterations and many apologies
Mastering the commands on your Linux machine is an indispensable prelude to writing effective shell scripts.
Table of Contents
15. Internal Commands and Builtins
16. External Filters, Programs and Commands
17. System and Administrative Commands
When a command or the shell itself initiates (or spawns) a new subprocess to carry out a task, this is called
forking. This new process is the child, and the process that forked it off is the parent. While the child
process is doing its work, the parent process is still executing.
Note that while a parent process gets the process ID of the child process, and can thus pass arguments to it,
the reverse is not true. This can create problems that are subtle and hard to track down.
1 #!/bin/bash
2 # spawn.sh
3
4
5 PIDS=$(pidof sh $0) # Process IDs of the various instances of this script.
6 P_array=( $PIDS ) # Put them in an array (why?).
7 echo $PIDS # Show process IDs of parent and child processes.
8 let "instances = ${#P_array[*]} - 1" # Count elements, less 1.
9 # Why subtract 1?
10 echo "$instances instance(s) of this script running."
11 echo "[Hit Ctl-C to exit.]"; echo
12
13
14 sleep 1 # Wait.
15 sh $0 # Play it again, Sam.
16
17 exit 0 # Not necessary; script will never get to here.
18 # Why not?
19
20 # After exiting with a Ctl-C,
21 #+ do all the spawned instances of the script die?
22 # If so, why?
23
24 # Note:
25 # ----
26 # Be careful not to run this script too long.
27 # It will eventually eat up too many system resources.
28
29 # Is having a script spawn multiple instances of itself
30 #+ an advisable scripting technique?
31 # Why or why not?
Generally, a Bash builtin does not fork a subprocess when it executes within a script. An external system
command or filter in a script usually will fork a subprocess.
A builtin may be a synonym to a system command of the same name, but Bash reimplements it internally. For
example, the Bash echo command is not the same as /bin/echo, although their behavior is almost
identical.
1 #!/bin/bash
2
3 echo "This line uses the \"echo\" builtin."
4 /bin/echo "This line uses the /bin/echo system command."
A keyword is a reserved word, token or operator. Keywords have a special meaning to the shell, and indeed
are the building blocks of the shell's syntax. As examples, for, while, do, and ! are keywords. Similar to a
builtin, a keyword is hard-coded into Bash, but unlike a builtin, a keyword is not in itself a command, but a
subunit of a command construct. [2]
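The type builtin reports which category a given name belongs to (the exact wording of its output may vary slightly between Bash versions):

type while    # while is a shell keyword
type echo     # echo is a shell builtin
type rm       # rm is /bin/rm   (an external command; the path may differ)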
I/O
echo
prints (to stdout) an expression or variable (see Example 4-1).
1 echo Hello
2 echo $a
An echo requires the -e option to print escaped characters. See Example 5-2.
Normally, each echo command prints a terminal newline, but the -n option suppresses this.
See also Example 16-22, Example 16-3, Example 16-47, and Example 16-48.
Be aware that echo `command` deletes any linefeeds that the output of command generates.
The $IFS (internal field separator) variable normally contains \n (linefeed) as one of its set of
whitespace characters. Bash therefore splits the output of command at linefeeds into arguments to
echo. Then echo outputs these arguments, separated by spaces.
bash$ ls -l /usr/share/apps/kjezz/sounds
-rw-r--r-- 1 root root 1407 Nov 7 2000 reflect.au
-rw-r--r-- 1 root root 362 Nov 7 2000 seconds.au
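Feeding that same listing through echo with command substitution collapses it onto a single line (a sketch of the effect; the exact fields will vary):

bash$ echo `ls -l /usr/share/apps/kjezz/sounds`
-rw-r--r-- 1 root root 1407 Nov 7 2000 reflect.au -rw-r--r-- 1 root root 362 Nov 7 2000 seconds.au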
1 # Embedding a linefeed?
2 echo "Why doesn't this string \n split on two lines?"
3 # Doesn't split.
4
5 # Let's try something else.
6
7 echo
8
9 echo $"A line of text containing
10 a linefeed."
11 # Prints as two distinct lines (embedded linefeed).
12 # But, is the "$" variable prefix really necessary?
13
14 echo
15
16 echo "This string splits
17 on two lines."
18 # No, the "$" is not needed.
19
20 echo
21 echo "---------------"
22 echo
23
24 echo -n $"Another line of text containing
25 a linefeed."
26 # Prints as two distinct lines (embedded linefeed).
27 # Even the -n option fails to suppress the linefeed here.
28
29 echo
30 echo
31 echo "---------------"
32 echo
33 echo
34
35 # However, the following doesn't work as expected.
36 # Why not? Hint: Assignment to a variable.
37 string1=$"Yet another line of text containing
38 a linefeed (maybe)."
39
40 echo $string1
41 # Yet another line of text containing a linefeed (maybe).
42 # ^
43 # Linefeed becomes a space.
44
45 # Thanks, Steve Parker, for pointing this out.
This command is a shell builtin, and not the same as /bin/echo, although its
behavior is similar.
printf
The printf, formatted print, command is an enhanced echo. It is a limited variant of the C language
printf() library function, and its syntax is somewhat different.
This is the Bash builtin version of the /bin/printf or /usr/bin/printf command. See the
printf manpage (of the system command) for in-depth coverage.
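A quick sketch of the format-string syntax (the string and number here are arbitrary):

printf "%s has %d entries.\n" "The list" 5    # The list has 5 entries.

Formatting error messages, as the following snippet shows, is a typical application.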
1 E_BADDIR=85
2
3 var=nonexistent_directory
4
5 error()
6 {
7 printf "$@" >&2
8 # Formats positional params passed, and sends them to stderr.
9 echo
10 exit $E_BADDIR
11 }
12
13 cd $var || error $"Can't cd to %s." "$var"
14
15 # Thanks, S.C.
See also Example 35-15.
read
"Reads" the value of a variable from stdin, that is, interactively fetches input from the keyboard.
The -a option lets read get array variables (see Example 27-6).
Example 15-3. Variable assignment, using read
1 #!/bin/bash
2 # "Reading" variables.
3
4 echo -n "Enter the value of variable 'var1': "
5 # The -n option to echo suppresses newline.
6
7 read var1
8 # Note no '$' in front of var1, since it is being set.
9
10 echo "var1 = $var1"
11
12
13 echo
14
15 # A single 'read' statement can set multiple variables.
16 echo -n "Enter the values of variables 'var2' and 'var3' "
17 echo -n "(separated by a space or tab): "
18 read var2 var3
19 echo "var2 = $var2 var3 = $var3"
20 # If you input only one value,
21 #+ the other variable(s) will remain unset (null).
22
23 exit 0
A read without an associated variable assigns its input to the dedicated variable $REPLY.
1 #!/bin/bash
2 # read-novar.sh
3
4 echo
5
6 # -------------------------- #
7 echo -n "Enter a value: "
8 read var
9 echo "\"var\" = "$var""
10 # Everything as expected here.
11 # -------------------------- #
12
13 echo
14
15 # ------------------------------------------------------------------- #
16 echo -n "Enter another value: "
17 read # No variable supplied for 'read', therefore...
18 #+ Input to 'read' assigned to default variable, $REPLY.
19 var="$REPLY"
20 echo "\"var\" = "$var""
21 # This is equivalent to the first code block.
22 # ------------------------------------------------------------------- #
23
24 echo
25 echo "========================="
26 echo
27
28
29 # This example is similar to the "reply.sh" script.
30 # However, this one shows that $REPLY is available
31 #+ even after a 'read' to a variable in the conventional way.
32
33
34 # ================================================================= #
35
36 # In some instances, you might wish to discard the first value read.
37 # In such cases, simply ignore the $REPLY variable.
38
39 { # Code block.
40 read # Line 1, to be discarded.
41 read line2 # Line 2, saved in variable.
42 } <$0
43 echo "Line 2 of this script is:"
44 echo "$line2" # # read-novar.sh
45 echo # #!/bin/bash line discarded.
46
47 # See also the soundcard-on.sh script.
48
49 exit 0
Normally, inputting a \ suppresses a newline during input to a read. The -r option causes an
inputted \ to be interpreted literally.
1 #!/bin/bash
2
3 echo
4
5 echo "Enter a string terminated by a \\, then press <ENTER>."
6 echo "Then, enter a second string (no \\ this time), and again press <ENTER>."
7
8 read var1 # The "\" suppresses the newline, when reading $var1.
9 # first line \
10 # second line
11
12 echo "var1 = $var1"
13 # var1 = first line second line
14
15 # For each line terminated by a "\"
16 #+ you get a prompt on the next line to continue feeding characters into var1.
17
18 echo; echo
19
20 echo "Enter another string terminated by a \\ , then press <ENTER>."
21 read -r var2 # The -r option causes the "\" to be read literally.
22 # first line \
23
24 echo "var2 = $var2"
25 # var2 = first line \
26
27 # Data entry terminates with the first <ENTER>.
28
29 echo
30
31 exit 0
The read command has some interesting options that permit echoing a prompt and even reading
keystrokes without hitting ENTER.
1 # Read a keypress without hitting ENTER.
2
3 read -s -n1 -p "Hit a key " keypress
4 echo; echo "Keypress was "\"$keypress\""."
5
6 # -s option means do not echo input.
7 # -n N option means accept only N characters of input.
8 # -p option means echo the following prompt before reading input.
9
10 # Using these options is tricky, since they need to be in the correct order.
The -n option to read also allows detection of the arrow keys and certain of the other unusual keys.
1 #!/bin/bash
2 # arrow-detect.sh: Detects the arrow keys, and a few more.
3 # Thank you, Sandro Magi, for showing me how.
4
5 # --------------------------------------------
6 # Character codes generated by the keypresses.
7 arrowup='\[A'
8 arrowdown='\[B'
9 arrowrt='\[C'
10 arrowleft='\[D'
11 insert='\[2'
12 delete='\[3'
13 # --------------------------------------------
14
15 SUCCESS=0
16 OTHER=65
17
18 echo -n "Press a key... "
19 # May need to also press ENTER if a key not listed above is pressed.
20 read -n3 key # Read 3 characters.
21
22 echo -n "$key" | grep "$arrowup" #Check if character code detected.
23 if [ "$?" -eq $SUCCESS ]
24 then
25 echo "Up-arrow key pressed."
26 exit $SUCCESS
27 fi
28
29 echo -n "$key" | grep "$arrowdown"
30 if [ "$?" -eq $SUCCESS ]
31 then
32 echo "Down-arrow key pressed."
33 exit $SUCCESS
34 fi
35
36 echo -n "$key" | grep "$arrowrt"
37 if [ "$?" -eq $SUCCESS ]
38 then
39 echo "Right-arrow key pressed."
40 exit $SUCCESS
41 fi
42
43 echo -n "$key" | grep "$arrowleft"
44 if [ "$?" -eq $SUCCESS ]
45 then
46 echo "Left-arrow key pressed."
47 exit $SUCCESS
48 fi
49
50 echo -n "$key" | grep "$insert"
51 if [ "$?" -eq $SUCCESS ]
52 then
53 echo "\"Insert\" key pressed."
54 exit $SUCCESS
55 fi
56
57 echo -n "$key" | grep "$delete"
58 if [ "$?" -eq $SUCCESS ]
59 then
60 echo "\"Delete\" key pressed."
61 exit $SUCCESS
62 fi
63
64
65 echo " Some other key pressed."
66
67 exit $OTHER
68
69 # ========================================= #
70
71 # Mark Alexander came up with a simplified
72 #+ version of the above script (Thank you!).
73 # It eliminates the need for grep.
74
75 #!/bin/bash
76
77 uparrow=$'\x1b[A'
78 downarrow=$'\x1b[B'
79 leftarrow=$'\x1b[D'
80 rightarrow=$'\x1b[C'
81
82 read -s -n3 -p "Hit an arrow key: " x
83
84 case "$x" in
85 $uparrow)
86 echo "You pressed up-arrow"
87 ;;
88 $downarrow)
89 echo "You pressed down-arrow"
90 ;;
91 $leftarrow)
92 echo "You pressed left-arrow"
93 ;;
94 $rightarrow)
95 echo "You pressed right-arrow"
96 ;;
97 esac
98
99 exit $?
100
101 # ========================================= #
102
103 # Antonio Macchi has a simpler alternative.
104
105 #!/bin/bash
106
107 while true
108 do
109 read -sn1 a
110 test "$a" == `echo -en "\e"` || continue
111 read -sn1 a
112 test "$a" == "[" || continue
113 read -sn1 a
114 case "$a" in
115 A) echo "up";;
116 B) echo "down";;
117 C) echo "right";;
118 D) echo "left";;
119 esac
120 done
121
122 # ========================================= #
123
124 # Exercise:
125 # --------
126 # 1) Add detection of the "Home," "End," "PgUp," and "PgDn" keys.
The -n option to read will not detect the ENTER (newline) key.
The -t option to read permits timed input (see Example 9-4 and Example A-41).
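As a rough sketch (the five-second timeout and the prompt text are arbitrary), -t makes read give up if no input arrives in time:

# Give the user 5 seconds to answer, then fall back to a default.
if read -t 5 -p "Proceed? (y/n) " answer
then
  echo "You answered: $answer"
else
  echo                   # 'read' timed out and returned a non-zero status.
  echo "No answer within 5 seconds -- assuming \"n\"."
fi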
The read command may also "read" its variable value from a file redirected to stdin. If the file
contains more than one line, only the first line is assigned to the variable. If read has more than one
parameter, then each of these variables gets assigned a successive whitespace-delineated string.
Caution!
1 #!/bin/bash
2
3 read var1 <data-file
4 echo "var1 = $var1"
5 # var1 set to the entire first line of the input file "data-file"
6
7 read var2 var3 <data-file
8 echo "var2 = $var2 var3 = $var3"
9 # Note non-intuitive behavior of "read" here.
10 # 1) Rewinds back to the beginning of input file.
11 # 2) Each variable is now set to a corresponding string,
12 # separated by whitespace, rather than to an entire line of text.
13 # 3) The final variable gets the remainder of the line.
14 # 4) If there are more variables to be set than whitespace-terminated strings
15 # on the first line of the file, then the excess variables remain empty.
16
17 echo "------------------------------------------------"
18
19 # How to resolve the above problem with a loop:
20 while read line
21 do
22 echo "$line"
23 done <data-file
24 # Thanks, Heiner Steven for pointing this out.
25
26 echo "------------------------------------------------"
27
28 # Use $IFS (Internal Field Separator variable) to split a line of input to
29 # "read", if you do not want the default to be whitespace.
30
31 echo "List of all users:"
32 OIFS=$IFS; IFS=: # /etc/passwd uses ":" for field separator.
33 while read name passwd uid gid fullname ignore
34 do
35 echo "$name ($fullname)"
36 done </etc/passwd # I/O redirection.
37 IFS=$OIFS # Restore original $IFS.
38 # This code snippet also by Heiner Steven.
39
40
41
42 # Setting the $IFS variable within the loop itself
43 #+ eliminates the need for storing the original $IFS
44 #+ in a temporary variable.
45 # Thanks, Dim Segebart, for pointing this out.
46 echo "------------------------------------------------"
47 echo "List of all users:"
48
49 while IFS=: read name passwd uid gid fullname ignore
50 do
51 echo "$name ($fullname)"
52 done </etc/passwd # I/O redirection.
53
54 echo
55 echo "\$IFS still $IFS"
56
57 exit 0
1 #!/bin/sh
2 # readpipe.sh
3 # This example contributed by Bjon Eriksson.
4
5 last="(null)"
6 cat $0 |
7 while read line
8 do
9 echo "{$line}"
10 last=$line
11 done
12
13 echo
14 echo "++++++++++++++++++++++"
15 printf "\nAll done, last: $last\n"
16
17 exit 0 # End of code.
18 # (Partial) output of script follows.
19 # The 'echo' supplies extra brackets.
20
21 #############################################
22
23 ./readpipe.sh
24
25 {#!/bin/sh}
26 {last="(null)"}
27 {cat $0 |}
28 {while read line}
29 {do}
30 {echo "{$line}"}
31 {last=$line}
32 {done}
33 {printf "nAll done, last: $lastn"}
34
35
36 All done, last: (null)
37
38 The variable (last) is set within the loop/subshell
39 but its value does not persist outside the loop.
The gendiff script, usually found in /usr/bin on many Linux distros, pipes the
output of find to a while read construct.
Filesystem
cd
The familiar cd change directory command finds use in scripts where execution of a command
requires being in a specified directory.
The cd command does not function as expected when presented with two forward
slashes.
bash$ cd //
bash$ pwd
//
The output should, of course, be /. This is a problem both from the command-line and
in a script.
pwd
Print Working Directory. This gives the user's (or script's) current directory (see Example 15-9). The
effect is identical to reading the value of the builtin variable $PWD.
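A trivial illustration:

cd /usr/local
pwd                      # /usr/local
echo "$PWD"              # /usr/local -- identical output.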
pushd, popd, dirs
This command set is a mechanism for bookmarking working directories, a means of moving back and
forth through directories in an orderly manner. A pushdown stack is used to keep track of directory
names. Options allow various manipulations of the directory stack.
pushd dir-name pushes the path dir-name onto the directory stack and simultaneously
changes the current working directory to dir-name
popd removes (pops) the top directory path name off the directory stack and simultaneously changes
the current working directory to that directory popped from the stack.
dirs lists the contents of the directory stack (compare this with the $DIRSTACK variable). A
successful pushd or popd will automatically invoke dirs.
Scripts that require various changes to the current working directory without hard-coding the
directory name changes can make good use of these commands. Note that the implicit $DIRSTACK
array variable, accessible from within a script, holds the contents of the directory stack.
1 #!/bin/bash
2
3 dir1=/usr/local
4 dir2=/var/spool
5
6 pushd $dir1
7 # Will do an automatic 'dirs' (list directory stack to stdout).
8 echo "Now in directory `pwd`." # Uses back-quoted 'pwd'.
9
10 # Now, do some stuff in directory 'dir1'.
11 pushd $dir2
12 echo "Now in directory `pwd`."
13
14 # Now, do some stuff in directory 'dir2'.
15 echo "The top entry in the DIRSTACK array is $DIRSTACK."
16 popd
17 echo "Now back in directory `pwd`."
18
19 # Now, do some more stuff in directory 'dir1'.
20 popd
21 echo "Now back in original working directory `pwd`."
22
23 exit 0
24
25 # What happens if you don't 'popd' -- then exit the script?
26 # Which directory do you end up in? Why?
Variables
let
The let command carries out arithmetic operations on variables. [3] In many cases, it functions as a
less complex version of expr.
1 #!/bin/bash
2
3 echo
4
5 let a=11 # Same as 'a=11'
6 let a=a+5 # Equivalent to let "a = a + 5"
7 # (Double quotes and spaces make it more readable.)
8 echo "11 + 5 = $a" # 16
9
10 let "a <<= 3" # Equivalent to let "a = a << 3"
11 echo "\"\$a\" (=16) left-shifted 3 places = $a"
12 # 128
13
14 let "a /= 4" # Equivalent to let "a = a / 4"
15 echo "128 / 4 = $a" # 32
16
17 let "a -= 5" # Equivalent to let "a = a - 5"
18 echo "32 - 5 = $a" # 27
19
20 let "a *= 10" # Equivalent to let "a = a * 10"
21 echo "27 * 10 = $a" # 270
22
23 let "a %= 8" # Equivalent to let "a = a % 8"
24 echo "270 modulo 8 = $a (270 / 8 = 33, remainder $a)"
25 # 6
26
27
28 # Does "let" permit C-style operators?
29 # Yes, just as the (( ... )) double-parentheses construct does.
30
31 let a++ # C-style (post) increment.
32 echo "6++ = $a" # 6++ = 7
33 let a-- # C-style decrement.
34 echo "7-- = $a" # 7-- = 6
35 # Of course, ++a, etc., also allowed . . .
36 echo
37
38
39 # Trinary operator.
40
41 # Note that $a is 6, see above.
42 let "t = a<7?7:11" # True
43 echo $t # 7
44
45 let a++
46 let "t = a<7?7:11" # False
47 echo $t # 11
48
49 exit
eval
eval arg1 [arg2] ... [argN]
Combines the arguments in an expression or list of expressions and evaluates them. Any variables
within the expression are expanded. The net result is to convert a string into a command.
The eval command can be used for code generation from the command-line or within
a script.
1 #!/bin/bash
2 # Exercising "eval" ...
3
4 y=`eval ls -l` # Similar to y=`ls -l`
5 echo $y #+ but linefeeds removed because "echoed" variable is unquoted.
6 echo
7 echo "$y" # Linefeeds preserved when variable is quoted.
8
9 echo; echo
10
11 y=`eval df` # Similar to y=`df`
12 echo $y #+ but linefeeds removed.
13
14 # When LF's not preserved, it may make it easier to parse output,
15 #+ using utilities such as "awk".
16
17 echo
18 echo "==========================================================="
19 echo
20
21 eval "`seq 3 | sed -e 's/.*/echo var&=ABCDEFGHIJ/'`"
22 # var1=ABCDEFGHIJ
23 # var2=ABCDEFGHIJ
24 # var3=ABCDEFGHIJ
25
26 echo
27 echo "==========================================================="
28 echo
29
30
31 # Now, showing how to do something useful with "eval" . . .
32 # (Thank you, E. Choroba!)
33
34 version=3.4 # Can we split the version into major and minor
35 #+ part in one command?
36 echo "version = $version"
37 eval major=${version/./;minor=} # Replaces '.' in version by ';minor='
38 # The substitution yields '3; minor=4'
39 #+ so eval does minor=4, major=3
40 echo Major: $major, minor: $minor # Major: 3, minor: 4
1 #!/bin/bash
2 # arr-choice.sh
3
4 # Passing arguments to a function to select
5 #+ one particular variable out of a group.
6
7 arr0=( 10 11 12 13 14 15 )
8 arr1=( 20 21 22 23 24 25 )
9 arr2=( 30 31 32 33 34 35 )
10 # 0 1 2 3 4 5 Element number (zero-indexed)
11
12
13 choose_array ()
14 {
15 eval array_member=\${arr${array_number}[element_number]}
16 # ^ ^^^^^^^^^^^^
17 # Using eval to construct the name of a variable,
18 #+ in this particular case, an array name.
19
20 echo "Element $element_number of array $array_number is $array_member"
21 } # Function can be rewritten to take parameters.
22
23 array_number=0 # First array.
24 element_number=3
25 choose_array # 13
26
27 array_number=2 # Third array.
28 element_number=4
29 choose_array # 34
30
31 array_number=3 # Null array (arr3 not allocated).
32 element_number=4
33 choose_array # (null)
34
35 # Thank you, Antonio Macchi, for pointing this out.
1 #!/bin/bash
2 # echo-params.sh
3
4 # Call this script with a few command-line parameters.
5 # For example:
6 # sh echo-params.sh first second third fourth fifth
7
8 params=$# # Number of command-line parameters.
9 param=1 # Start at first command-line param.
10
11 while [ "$param" -le "$params" ]
12 do
13 echo -n "Command-line parameter "
14 echo -n \$$param # Gives only the *name* of variable.
15 # ^^^ # $1, $2, $3, etc.
16 # Why?
17 # \$ escapes the first "$"
18 #+ so it echoes literally,
19 #+ and $param dereferences "$param" . . .
20 #+ . . . as expected.
21 echo -n " = "
22 eval echo \$$param # Gives the *value* of variable.
23 # ^^^^ ^^^ # The "eval" forces the *evaluation*
24 #+ of \$$
25 #+ as an indirect variable reference.
26
27 (( param ++ )) # On to the next.
28 done
29
30 exit $?
31
32 # =================================================
33
34 $ sh echo-params.sh first second third fourth fifth
35 Command-line parameter $1 = first
36 Command-line parameter $2 = second
37 Command-line parameter $3 = third
38 Command-line parameter $4 = fourth
39 Command-line parameter $5 = fifth
1 #!/bin/bash
2 # Killing ppp to force a log-off.
3 # For dialup connection, of course.
4
5 # Script should be run as root user.
6
7 SERPORT=ttyS3
8 # Depending on the hardware and even the kernel version,
9 #+ the modem port on your machine may be different --
10 #+ /dev/ttyS1 or /dev/ttyS2.
11
12
13 killppp="eval kill -9 `ps ax | awk '/ppp/ { print $1 }'`"
14 # -------- process ID of ppp -------
15
16 $killppp # This variable is now a command.
17
18
19 # The following operations must be done as root user.
20
21 chmod 666 /dev/$SERPORT # Restore r+w permissions, or else what?
22 # Since doing a SIGKILL on ppp changed the permissions on the serial port,
23 #+ we restore permissions to previous state.
24
25 rm /var/lock/LCK..$SERPORT # Remove the serial port lock file. Why?
26
27 exit $?
28
29 # Exercises:
30 # ---------
31 # 1) Have script check whether root user is invoking it.
32 # 2) Do a check on whether the process to be killed
33 #+ is actually running before attempting to kill it.
34 # 3) Write an alternate version of this script based on 'fuser':
35 #+ if [ fuser -s /dev/modem ]; then . . .
1 #!/bin/bash
2 # A version of "rot13" using 'eval'.
3 # Compare to "rot13.sh" example.
4
5 setvar_rot_13() # "rot13" scrambling
6 {
7 local varname=$1 varvalue=$2
8 eval $varname='$(echo "$varvalue" | tr a-z n-za-m)'
9 }
10
11
12 setvar_rot_13 var "foobar" # Run "foobar" through rot13.
13 echo $var # sbbone
14
15 setvar_rot_13 var "$var" # Run "sbbone" through rot13.
16 # Back to original variable.
17 echo $var # foobar
18
19 # This example by Stephane Chazelas.
20 # Modified by document author.
21
22 exit 0
1 eval var=\$$var
The eval command can be risky, and normally should be avoided when there exists a
reasonable alternative. An eval $COMMANDS executes the contents of COMMANDS,
which may contain such unpleasant surprises as rm -rf *. Running an eval on
unfamiliar code written by persons unknown is living dangerously.
set
The set command changes the value of internal script variables/options. One use for this is to toggle
option flags which help determine the behavior of the script. Another application for it is to reset the
positional parameters that a script sees as the result of a command (set `command`). The script
can then parse the fields of the command output.
1 #!/bin/bash
2 # ex34.sh
3 # Script "set-test"
4
5 # Invoke this script with three command-line parameters,
6 # for example, "sh ex34.sh one two three".
7
8 echo
9 echo "Positional parameters before set \`uname -a\` :"
10 echo "Command-line argument #1 = $1"
11 echo "Command-line argument #2 = $2"
12 echo "Command-line argument #3 = $3"
13
14
15 set `uname -a` # Sets the positional parameters to the output
16 # of the command `uname -a`
17
18 echo
19 echo +++++
20 echo $_ # +++++
21 # Flags set in script.
22 echo $- # hB
23 # Anomalous behavior?
24 echo
25
26 echo "Positional parameters after set \`uname -a\` :"
27 # $1, $2, $3, etc. reinitialized to result of `uname -a`
28 echo "Field #1 of 'uname -a' = $1"
29 echo "Field #2 of 'uname -a' = $2"
30 echo "Field #3 of 'uname -a' = $3"
31 echo \#\#\#
32 echo $_ # ###
33 echo
34
35 exit 0
1 #!/bin/bash
2 # revposparams.sh: Reverse positional parameters.
3 # Script by Dan Jacobson, with stylistic revisions by document author.
4
5
6 set a\ b c d\ e;
7 # ^ ^ Spaces escaped
8 # ^ ^ Spaces not escaped
9 OIFS=$IFS; IFS=:;
10 # ^ Saving old IFS and setting new one.
11
12 echo
13
14 until [ $# -eq 0 ]
15 do # Step through positional parameters.
16 echo "### k0 = "$k"" # Before
17 k=$1:$k; # Append each pos param to loop variable.
18 # ^
19 echo "### k = "$k"" # After
20 echo
21 shift;
22 done
23
24 set $k # Set new positional parameters.
25 echo -
26 echo $# # Count of positional parameters.
27 echo -
28 echo
29
30 for i # Omitting the "in list" sets the variable -- i --
31 #+ to the positional parameters.
32 do
33 echo $i # Display new positional parameters.
34 done
35
36 IFS=$OIFS # Restore IFS.
37
38 # Question:
39 # Is it necessary to set a new IFS, internal field separator,
40 #+ in order for this script to work properly?
41 # What happens if you don't? Try it.
42 # And, why use the new IFS -- a colon -- in line 17,
43 #+ to append to the loop variable?
44 # What is the purpose of this?
45
46 exit 0
47
48 $ ./revposparams.sh
49
50 ### k0 =
51 ### k = a b
52
53 ### k0 = a b
54 ### k = c a b
55
56 ### k0 = c a b
57 ### k = d e c a b
58
59 -
60 3
61 -
62
63 d e
64 c
65 a b
Invoking set without any options or arguments simply lists all the environmental and other variables
that have been initialized.
bash$ set
AUTHORCOPY=/home/bozo/posts
BASH=/bin/bash
BASH_VERSION=$'2.05.8(1)-release'
...
XAUTHORITY=/home/bozo/.Xauthority
_=/etc/bashrc
variable22=abc
variable23=xzy
Using set with the -- option explicitly assigns the contents of a variable to the positional parameters.
If no variable follows the -- it unsets the positional parameters.
1 #!/bin/bash
2
3 variable="one two three four five"
4
5 set -- $variable
6 # Sets positional parameters to the contents of "$variable".
7
8 first_param=$1
9 second_param=$2
10 shift; shift # Shift past first two positional params.
11 # shift 2 also works.
12 remaining_params="$*"
13
14 echo
15 echo "first parameter = $first_param" # one
16 echo "second parameter = $second_param" # two
17 echo "remaining parameters = $remaining_params" # three four five
18
19 echo; echo
20
21 # Again.
22 set -- $variable
23 first_param=$1
24 second_param=$2
25 echo "first parameter = $first_param" # one
26 echo "second parameter = $second_param" # two
27
28 # ======================================================
29
30 set --
31 # Unsets positional parameters if no variable specified.
32
33 first_param=$1
34 second_param=$2
35 echo "first parameter = $first_param" # (null value)
36 echo "second parameter = $second_param" # (null value)
37
38 exit 0
unset
The unset command deletes a shell variable, effectively setting it to null.
1 #!/bin/bash
2 # unset.sh: Unsetting a variable.
3
4 variable=hello # Initialized.
5 echo "variable = $variable"
6
7 unset variable # Unset.
8 # In this particular context,
9 #+ same effect as: variable=
10 echo "(unset) variable = $variable" # $variable is null.
11
12 if [ -z "$variable" ] # Try a string-length test.
13 then
14 echo "\$variable has zero length."
15 fi
16
17 exit 0
In most contexts, an undeclared variable and one that has been unset are equivalent.
However, the ${parameter:-default} parameter substitution construct can distinguish
between the two.
export
The export [4] command makes variables available to all child processes of the running script or
shell. One important use of the export command is in startup files, to initialize environmental
variables and make them accessible to subsequent user processes.
Unfortunately, there is no way to export variables back to the parent process, to the
process that called or invoked the script or shell.
It is possible to initialize and export a variable in the same operation, as in export var1=xxx.
However, as Greg Keraunen points out, in certain situations this may have a different
effect than setting a variable, then exporting it.
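A minimal sketch of the usual patterns (the variable names and values here are arbitrary):

MY_DATA_DIR=/var/tmp/mydata    # Set in the current shell . . .
export MY_DATA_DIR             #+ . . . then exported to child processes.

export MY_EDITOR=vim           # Assignment and export in a single statement.

bash -c 'echo $MY_DATA_DIR'    # A child shell sees the exported value.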
declare, typeset
The declare and typeset commands specify and/or restrict properties of variables.
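A few representative declarations (a sketch of common options):

declare -i number=5      # Integer: assignments are evaluated arithmetically.
number=number+1          # 'number' is now 6.

declare -r CONSTANT=42   # Read-only, equivalent to 'readonly CONSTANT=42'.
declare -a my_array      # Indexed array.
declare -f               # With no argument, lists all functions defined so far.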
readonly
Same as declare -r, sets a variable as read-only, or, in effect, as a constant. Attempts to change the
variable fail with an error message. This is the shell analog of the C language const type qualifier.
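For example (a sketch):

readonly PI=3.14159
PI=3                     # Error: "PI: readonly variable" -- the assignment fails.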
getopts
This powerful tool parses command-line arguments passed to the script. This is the Bash analog of the
getopt external command and the getopt library function familiar to C programmers. It permits
passing and concatenating multiple options [5] and associated arguments to a script (for example
scriptname -abc -e /usr/local).
The getopts construct uses two implicit variables. $OPTIND is the argument pointer (OPTion INDex)
and $OPTARG (OPTion ARGument) the (optional) argument attached to an option. A colon following
the option name in the declaration tags that option as having an associated argument.
A getopts construct usually comes packaged in a while loop, which processes the options and
arguments one at a time, then increments the implicit $OPTIND variable to point to the next.
1. The arguments passed from the command-line to the script must be preceded
by a dash (-). It is the prefixed - that lets getopts recognize command-line
arguments as options. In fact, getopts will not process arguments without the
prefixed -, and will terminate option processing at the first argument
encountered lacking them.
2. The getopts template differs slightly from the standard while loop, in that it
lacks condition brackets.
3. The getopts construct is a highly functional replacement for the traditional
getopt external command.
1 #!/bin/bash
2 # ex33.sh: Exercising getopts and OPTIND
3 # Script modified 10/09/03 at the suggestion of Bill Gradwohl.
4
5
6 # Here we observe how 'getopts' processes command-line arguments to script.
7 # The arguments are parsed as "options" (flags) and associated arguments.
8
9 # Try invoking this script with:
10 # 'scriptname -mn'
11 # 'scriptname -oq qOption' (qOption can be some arbitrary string.)
12 # 'scriptname -qXXX -r'
13 #
14 # 'scriptname -qr'
15 #+ - Unexpected result, takes "r" as the argument to option "q"
16 # 'scriptname -q -r'
17 #+ - Unexpected result, same as above
18 # 'scriptname -mnop -mnop' - Unexpected result
19 # (OPTIND is unreliable at stating where an option came from.)
20 #
21 # If an option expects an argument ("flag:"), then it will grab
22 #+ whatever is next on the command-line.
23
24 NO_ARGS=0
25 E_OPTERROR=85
26
27 if [ $# -eq "$NO_ARGS" ] # Script invoked with no command-line args?
28 then
29 echo "Usage: `basename $0` options (-mnopqrs)"
30 exit $E_OPTERROR # Exit and explain usage.
31 # Usage: scriptname -options
32 # Note: dash (-) necessary
33 fi
34
35
36 while getopts ":mnopq:rs" Option
37 do
38 case $Option in
39 m ) echo "Scenario #1: option -m- [OPTIND=${OPTIND}]";;
40 n | o ) echo "Scenario #2: option -$Option- [OPTIND=${OPTIND}]";;
41 p ) echo "Scenario #3: option -p- [OPTIND=${OPTIND}]";;
42 q ) echo "Scenario #4: option -q-\
43 with argument \"$OPTARG\" [OPTIND=${OPTIND}]";;
44 # Note that option 'q' must have an associated argument,
45 #+ otherwise it falls through to the default.
46 r | s ) echo "Scenario #5: option -$Option-";;
47 * ) echo "Unimplemented option chosen.";; # Default.
48 esac
49 done
50
51 shift $(($OPTIND - 1))
52 # Decrements the argument pointer so it points to next argument.
53 # $1 now references the first non-option item supplied on the command-line
54 #+ if one exists.
55
56 exit $?
57
58 # As Bill Gradwohl states,
59 # "The getopts mechanism allows one to specify: scriptname -mnop -mnop
60 #+ but there is no reliable way to differentiate what came
61 #+ from where by using OPTIND."
62 # There are, however, workarounds.
Script Behavior
1 #!/bin/bash
2
3 . data-file # Load a data file.
4 # Same effect as "source data-file", but more portable.
5
6 # The file "data-file" must be present in current working directory,
7 #+ since it is referred to by its 'basename'.
8
9 # Now, reference some data from that file.
10
11 echo "variable1 (from data-file) = $variable1"
12 echo "variable3 (from data-file) = $variable3"
13
14 let "sum = $variable2 + $variable4"
15 echo "Sum of variable2 + variable4 (from data-file) = $sum"
16 echo "message1 (from data-file) is \"$message1\""
17 # Note: escaped quotes
18
19 print_message This is the message-print function in the data-file.
20
21
22 exit 0
File data-file for Example 15-22, above. Must be present in same directory.
1 #!/bin/bash
2 # self-source.sh: a script sourcing itself "recursively."
3 # From "Stupid Script Tricks," Volume II.
4
5 MAXPASSCNT=100 # Maximum number of execution passes.
6
7 echo -n "$pass_count "
8 # At first execution pass, this just echoes two blank spaces,
9 #+ since $pass_count still uninitialized.
10
11 let "pass_count += 1"
12 # Assumes the uninitialized variable $pass_count
13 #+ can be incremented the first time around.
14 # This works with Bash and pdksh, but
15 #+ it relies on non-portable (and possibly dangerous) behavior.
16 # Better would be to initialize $pass_count to 0 before incrementing.
17
18 while [ "$pass_count" -le $MAXPASSCNT ]
19 do
20 . $0 # Script "sources" itself, rather than calling itself.
21 # ./$0 (which would be true recursion) doesn't work here. Why?
22 done
23
24 # What occurs here is not actually recursion,
25 #+ since the script effectively "expands" itself, i.e.,
26 #+ generates a new section of code
27 #+ with each pass through the 'while' loop,
28 # with each 'source' in line 20.
29 #
30 # Of course, the script interprets each newly 'sourced' "#!" line
31 #+ as a comment, and not as the start of a new script.
32
33 echo
34
35 exit 0 # The net effect is counting from 1 to 100.
36 # Very impressive.
37
38 # Exercise:
39 # --------
40 # Write a script that uses this trick to actually do something useful.
exit
Unconditionally terminates a script. [6] The exit command may optionally take an integer argument,
which is returned to the shell as the exit status of the script. It is good practice to end all but the
simplest scripts with an exit 0, indicating a successful run.
If a script terminates with an exit lacking an argument, the exit status of the script is
the exit status of the last command executed in the script, not counting the exit. This is
equivalent to an exit $?.
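A short sketch of how this works in practice (the script name is arbitrary):

#!/bin/bash
# last-status.sh

ls /nonexistent_directory 2>/dev/null   # Fails, returning a non-zero status.
exit                                    # Same as 'exit $?' --
                                        #+ the script returns the status of 'ls'.

# From the command line:
#   bash last-status.sh; echo $?        # Non-zero (the exit status of 'ls').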
An exit command may also be used to terminate a subshell.
exec
This shell builtin replaces the current process with a specified command. Normally, when the shell
encounters a command, it forks off a child process to actually execute the command. Using the exec
builtin, the shell does not fork, and the command exec'ed replaces the shell. When used in a script,
therefore, it forces an exit from the script when the exec'ed command terminates. [7]
1 #!/bin/bash
2
3 exec echo "Exiting \"$0\"." # Exit from script here.
4
5 # ----------------------------------
6 # The following lines never execute.
7
8 echo "This echo will never echo."
9
10 exit 99 # This script will not exit here.
11 # Check exit value after script terminates
12 #+ with an 'echo $?'.
13 # It will *not* be 99.
1 #!/bin/bash
2 # self-exec.sh
3
4 # Note: Set permissions on this script to 555 or 755,
5 # then call it with ./self-exec.sh or sh ./self-exec.sh.
6
7 echo
8
9 echo "This line appears ONCE in the script, yet it keeps echoing."
10 echo "The PID of this instance of the script is still $$."
11 # Demonstrates that a subshell is not forked off.
12
13 echo "==================== Hit Ctl-C to exit ===================="
14
15 sleep 1
16
17 exec $0 # Spawns another instance of this same script
18 #+ that replaces the previous one.
19
20 echo "This line will never echo!" # Why not?
21
22 exit 99 # Will not exit here!
23 # Exit code will not be 99!
An exec also serves to reassign file descriptors. For example, exec <zzz-file replaces stdin
with the file zzz-file.
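A brief sketch (reading from /etc/fstab is arbitrary -- any readable file will do):

#!/bin/bash
exec < /etc/fstab        # From here on, stdin comes from /etc/fstab.

read first_line          # Reads the first line of /etc/fstab.
echo "First line of /etc/fstab:"
echo "$first_line"

exit 0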
The -exec option to find is not the same as the exec shell builtin.
shopt
This command permits changing shell options on the fly (see Example 25-1 and Example 25-2). It
often appears in the Bash startup files, but also has its uses in scripts. Needs version 2 or later of Bash.
1 shopt -s cdspell
2 # Allows minor misspelling of directory names with 'cd'
3
4 cd /hpme # Oops! Mistyped '/home'.
5 pwd # /home
6 # The shell corrected the misspelling.
caller
Putting a caller command inside a function echoes to stdout information about the caller of that
function.
1 #!/bin/bash
2
3 function1 ()
4 {
5 # Inside function1 ().
6 caller 0 # Tell me about it.
7 }
8
9 function1 # Line 9 of script.
10
11 # 9 main test.sh
12 # ^ Line number that the function was called from.
13 # ^^^^ Invoked from "main" part of script.
14 # ^^^^^^^ Name of calling script.
15
16 caller 0 # Has no effect because it's not inside a function.
A caller command can also return caller information from a script sourced within another script.
Analogous to a function, this is a "subroutine call."
Commands
true
A command that returns a successful (zero) exit status, but does nothing else.
bash$ true
bash$ echo $?
0
1 # Endless loop
2 while true # alias for ":"
3 do
4 operation-1
5 operation-2
6 ...
7 operation-n
8 # Need a way to break out of loop or script will hang.
9 done
false
A command that returns an unsuccessful exit status, but does nothing else.
bash$ false
bash$ echo $?
1
1 # Testing "false"
2 if false
3 then
4 echo "false evaluates \"true\""
5 else
6 echo "false evaluates \"false\""
7 fi
8 # false evaluates "false"
9
10
11 # Looping while "false" (null loop)
12 while false
13 do
14 # The following code will not execute.
15 operation-1
16 operation-2
17 ...
18 operation-n
19 # Nothing happens!
20 done
type [cmd]
Similar to the which external command, type cmd identifies "cmd." Unlike which, type is a Bash
builtin. The useful -a option to type identifies keywords and builtins, and also locates system
commands with identical names.
The type command can be useful for testing whether a certain command exists.
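For instance (a sketch; 'gawk' stands in for any command to test for):

if type -P gawk >/dev/null 2>&1    # -P forces a $PATH search.
then
  echo "gawk is installed at $(type -P gawk)."
else
  echo "gawk not found -- falling back to plain awk."
fi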
hash [cmds]
Records the path name of specified commands -- in the shell hash table [8] -- so the shell or script
will not need to search the $PATH on subsequent calls to those commands. When hash is called with
no arguments, it simply lists the commands that have been hashed. The -r option resets the hash
table.
bind
The bind builtin displays or modifies readline [9] key bindings.
help
Gets a short usage summary of a shell builtin. This is the counterpart to whatis, but for builtins. The
display of help information got a much-needed update in the version 4 release of Bash.
jobs
Lists the jobs running in the background, giving the job number. Not as useful as ps.
It is all too easy to confuse jobs and processes. Certain builtins, such as kill, disown,
and wait accept either a job number or a process number as an argument. The fg, bg
and jobs commands accept only a job number.
bash$ sleep 100 &
[1] 1384

bash$ jobs
[1]+  Running                 sleep 100 &
"1" is the job number (jobs are maintained by the current shell). "1384" is the PID or
process ID number (processes are maintained by the system). To kill this job/process,
either a kill %1 or a kill 1384 works.
Thanks, S.C.
disown
Remove job(s) from the shell's table of active jobs.
fg, bg
The fg command switches a job running in the background into the foreground. The bg command
restarts a suspended job, and runs it in the background. If no job number is specified, then the fg or bg
command acts upon the currently running job.
wait
Suspend script execution until all jobs running in background have terminated, or until the job number
or process ID specified as an option terminates. Returns the exit status of waited-for command.
You may use the wait command to prevent a script from exiting before a background job finishes
executing (this would create a dreaded orphan process).
1 #!/bin/bash
2
3 ROOT_UID=0 # Only users with $UID 0 have root privileges.
4 E_NOTROOT=65
5 E_NOPARAMS=66
6
7 if [ "$UID" -ne "$ROOT_UID" ]
8 then
9 echo "Must be root to run this script."
10 # "Run along kid, it's past your bedtime."
11 exit $E_NOTROOT
12 fi
13
14 if [ -z "$1" ]
15 then
16 echo "Usage: `basename $0` find-string"
17 exit $E_NOPARAMS
18 fi
19
20
21 echo "Updating 'locate' database..."
22 echo "This may take a while."
23 updatedb /usr & # Must be run as root.
24
25 wait
26 # Don't run the rest of the script until 'updatedb' finished.
27 # You want the database updated before looking up the file name.
28
29 locate $1
30
31 # Without the 'wait' command, in the worst-case scenario,
32 #+ the script would exit while 'updatedb' was still running,
33 #+ leaving it as an orphan process.
34
35 exit 0
Optionally, wait can take a job identifier as an argument, for example, wait %1 or wait $PPID.
[10] See the job id table.
Within a script, running a command in the background with an ampersand (&) may
cause the script to hang until ENTER is hit. This seems to occur with commands that
write to stdout. It can be a major annoyance.
1 #!/bin/bash
2 # test.sh
3
4 ls -l &
5 echo "Done."
bash$ ./test.sh
Done.
[bozo@localhost test-scripts]$ total 1
-rwxr-xr-x 1 bozo bozo 34 Oct 11 15:09 test.sh
_
1 #!/bin/bash
2 # test.sh
3
4 ls -l &
5 echo "Done."
6 wait
bash$ ./test.sh
Done.
[bozo@localhost test-scripts]$ total 1
-rwxr-xr-x 1 bozo bozo 34 Oct 11 15:09 test.sh
Redirecting the output of the command to a file or even to /dev/null also takes
care of this problem.
suspend
This has a similar effect to Control-Z, but it suspends the shell (the shell's parent process should
resume it at an appropriate time).
logout
Exit a login shell, optionally specifying an exit status.
times
Gives statistics on the system time elapsed when executing commands, in the following form:
0m0.020s 0m0.020s
This capability is of relatively limited value, since it is not common to profile and benchmark shell
scripts.
kill
Forcibly terminate a process by sending it an appropriate terminate signal (see Example 17-6).
1 #!/bin/bash
2 # self-destruct.sh
3
4 kill $$ # Script kills its own process here.
5 # Recall that "$$" is the script's PID.
6
7 echo "This line will not echo."
8 # Instead, the shell sends a "Terminated" message to stdout.
9
10 exit 0 # Normal exit? No!
11
12 # After this script terminates prematurely,
13 #+ what exit status does it return?
14 #
15 # sh self-destruct.sh
16 # echo $?
17 # 143
18 #
19 # 143 = 128 + 15
20 # TERM signal
kill -l lists all the signals (as does the file /usr/include/asm/signal.h).
A kill -9 is a sure kill, which will usually terminate a process that stubbornly
refuses to die with a plain kill. Sometimes, a kill -15 works. A zombie process,
that is, a child process that has terminated, but that the parent process has not (yet)
killed, cannot be killed by a logged-on user -- you can't kill something that is already
dead -- but init will generally clean it up sooner or later.
killall
The killall command kills a running process by name, rather than by process ID. If there are multiple
instances of a particular command running, then doing a killall on that command will terminate them
all.
This refers to the killall command in /usr/bin, not the killall script in
/etc/rc.d/init.d.
command
The command directive disables aliases and functions for the command immediately following it.
bash$ command ls
This is one of three shell directives that affect script command processing. The others
are builtin and enable.
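A sketch of why this matters (the function below deliberately shadows the real ls):

ls ()                    # A function that shadows the system 'ls'.
{
  echo "Bogus ls!"
}

ls                       # Bogus ls!  (The function wins.)
command ls               # Runs the real 'ls', ignoring the function.
unset -f ls              # Remove the shadowing function.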
builtin
Invoking builtin BUILTIN_COMMAND runs the command BUILTIN_COMMAND as a shell
builtin, temporarily disabling both functions and external system commands with the same name.
enable
This either enables or disables a shell builtin command. As an example, enable -n kill disables
the shell builtin kill, so that when Bash subsequently encounters kill, it invokes the external command
/bin/kill.
The -a option to enable lists all the shell builtins, indicating whether or not they are enabled. The -f
filename option lets enable load a builtin as a shared library (DLL) module from a properly
compiled object file. [11].
autoload
This is a port to Bash of the ksh autoloader. With autoload in place, a function with an autoload
declaration will load from an external file at its first invocation. [12] This saves system resources.
Note that autoload is not a part of the core Bash installation. It needs to be loaded in with enable
-f (see above).
Notation Meaning
%N Job number [N]
%S Invocation (command-line) of job begins with string S
%?S Invocation (command-line) of job contains within it string S
%% "current" job (last job stopped in foreground or started in background)
%+ "current" job (last job stopped in foreground or started in background)
%- Last job
$! Last background process
Notes
[1] As Nathan Coulter points out, "while forking a process is a low-cost operation, executing a new
program in the newly-forked child process adds more overhead."
[2] An exception to this is the time command, listed in the official Bash documentation as a keyword
("reserved word").
[3] Note that let cannot be used for setting string variables.
[4] To Export information is to make it available in a more general context. See also scope.
[5] An option is an argument that acts as a flag, switching script behaviors on or off. The argument
associated with a particular option indicates the behavior that the option (flag) switches on or off.
[6] Technically, an exit only terminates the process (or shell) in which it is running, not the parent process.
[7] Unless the exec is used to reassign file descriptors.
[8]
Hashing is a method of creating lookup keys for data stored in a table. The data items themselves are
"scrambled" to create keys, using one of a number of simple mathematical algorithms (methods, or
recipes).
An advantage of hashing is that it is fast. A disadvantage is that collisions -- where a single key maps to
more than one data item -- are possible.
Standard UNIX commands make shell scripts more versatile. The power of scripts comes from coupling
system commands and shell directives with simple programming constructs.
16.1. Basic Commands
The first commands a novice learns
ls
The basic file "list" command. It is all too easy to underestimate the power of this humble command.
For example, using the -R, recursive option, ls provides a tree-like listing of a directory structure.
Other useful options are -S, sort listing by file size, -t, sort by file modification time, -b, show
escape characters, and -i, show file inodes (see Example 16-4).
The ls command returns a non-zero exit status when attempting to list a non-existent
file.
bash$ ls abc
ls: abc: No such file or directory
bash$ echo $?
2
Example 16-1. Using ls to create a table of contents for burning a CDR disk
1 #!/bin/bash
2 # ex40.sh (burn-cd.sh)
3 # Script to automate burning a CDR.
4
5
6 SPEED=10 # May use higher speed if your hardware supports it.
7 IMAGEFILE=cdimage.iso
8 CONTENTSFILE=contents
9 # DEVICE=/dev/cdrom For older versions of cdrecord
10 DEVICE="1,0,0"
11 DEFAULTDIR=/opt # This is the directory containing the data to be burned.
12 # Make sure it exists.
13 # Exercise: Add a test for this.
14
15 # Uses Joerg Schilling's "cdrecord" package:
16 # http://www.fokus.fhg.de/usr/schilling/cdrecord.html
17
18 # If this script invoked as an ordinary user, may need to suid cdrecord
19 #+ chmod u+s /usr/bin/cdrecord, as root.
20 # Of course, this creates a security hole, though a relatively minor one.
21
22 if [ -z "$1" ]
23 then
24 IMAGE_DIRECTORY=$DEFAULTDIR
25 # Default directory, if not specified on command-line.
26 else
27 IMAGE_DIRECTORY=$1
28 fi
29
30 # Create a "table of contents" file.
31 ls -lRF $IMAGE_DIRECTORY > $IMAGE_DIRECTORY/$CONTENTSFILE
32 # The "l" option gives a "long" file listing.
33 # The "R" option makes the listing recursive.
34 # The "F" option marks the file types (directories get a trailing /).
35 echo "Creating table of contents."
36
37 # Create an image file preparatory to burning it onto the CDR.
38 mkisofs -r -o $IMAGEFILE $IMAGE_DIRECTORY
39 echo "Creating ISO9660 file system image ($IMAGEFILE)."
40
41 # Burn the CDR.
42 echo "Burning the disk."
43 echo "Please be patient, this will take a while."
44 wodim -v -isosize dev=$DEVICE $IMAGEFILE
45 # In newer Linux distros, the "wodim" utility assumes the
46 #+ functionality of "cdrecord."
47 exitcode=$?
48 echo "Exit code = $exitcode"
49
50 exit $exitcode
cat, tac
cat, an acronym for concatenate, lists a file to stdout. When combined with redirection (> or >>), it
is commonly used to concatenate files.
1 # Uses of 'cat'
2 cat filename # Lists the file.
3
4 cat file.1 file.2 file.3 > file.123 # Combines three files into one.
The -n option to cat inserts consecutive numbers before all lines of the target file(s). The -b option
numbers only the non-blank lines. The -v option echoes nonprintable characters, using ^ notation.
The -s option squeezes multiple consecutive blank lines into a single blank line.
In a pipe, it may be more efficient to redirect stdin from a file than to cat the
file into the pipe.
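For example, both of the following uppercase a file, but the second avoids spawning an extra cat process:

cat filename | tr a-z A-Z     # Works, but wastes a process . . .
tr a-z A-Z < filename         #+ same result, stdin redirected from the file.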
cp
This is the file copy command. cp file1 file2 copies file1 to file2, overwriting file2 if
it already exists (see Example 16-6).
Particularly useful are the -a archive flag (for copying an entire directory tree), the
-u update flag (which prevents overwriting identically-named newer files), and the
-r and -R recursive flags.
1 cp -u source_dir/* dest_dir
2 # "Synchronize" dest_dir to source_dir
3 #+ by copying over all newer and not previously existing files.
mv
This is the file move command. It is equivalent to a combination of cp and rm. It may be used to
move multiple files to a directory, or even to rename a directory. For some examples of using mv in a
script, see Example 10-11 and Example A-2.
When used in a non-interactive script, mv takes the -f (force) option to bypass user
input.
rm
Delete (remove) a file or files. The -f option forces removal of even readonly files, and is useful for
bypassing user input in a script.
The rm command will, by itself, fail to remove filenames beginning with a dash.
Why? Because rm sees a dash-prefixed filename as an option.
bash$ rm -badname
rm: invalid option -- b
Try `rm --help' for more information.
One clever workaround is to precede the filename with a " -- " (the end-of-options
flag).
bash$ rm -- -badname
Another method is to preface the filename to be removed with a dot-slash.
bash$ rm ./-badname
When used with the recursive flag -r, this command removes files all the way down
the directory tree from the current directory. A careless rm -rf * can wipe out a big
chunk of a directory structure.
rmdir
Remove directory. The directory must be empty of all files -- including "invisible" dotfiles [1] -- for
this command to succeed.
mkdir
Make directory, creates a new directory. For example, mkdir -p
project/programs/December creates the named directory. The -p option automatically
creates any necessary parent directories.
chmod
Changes the attributes of an existing file or directory (see Example 15-14).
1 chmod +x filename
2 # Makes "filename" executable for all users.
3
4 chmod u+s filename
5 # Sets "suid" bit on "filename" permissions.
6 # An ordinary user may execute "filename" with same privileges as the file's owner.
7 # (This does not apply to shell scripts.)
chattr
Change file attributes. This is analogous to chmod above, but with different options and a different
invocation syntax, and it works only on an ext2/ext3 filesystem.
One particularly interesting chattr option is i. A chattr +i filename marks the file as immutable.
The file cannot be modified, linked to, or deleted, not even by root. This file attribute can be set or
removed only by root. In a similar fashion, the a option marks the file as append only.
root# chattr +i file1.txt
root# rm file1.txt

rm: remove write-protected regular file `file1.txt'? y
rm: cannot remove `file1.txt': Operation not permitted
If a file has the s (secure) attribute set, then when it is deleted its block is overwritten with binary
zeroes. [2]
If a file has the u (undelete) attribute set, then when it is deleted, its contents can still be retrieved
(undeleted).
If a file has the c (compress) attribute set, then it will automatically be compressed on writes to disk,
and uncompressed on reads.
The file attributes set with chattr do not show in a file listing (ls -l).
ln
Creates links to pre-existing files. A "link" is a reference to a file, an alternate name for it. The ln
command permits referencing the linked file by more than one name and is a superior alternative to
aliasing (see Example 4-6).
The ln command creates only a reference, a pointer to the file, only a few bytes in size.
The ln command is most often used with the -s, symbolic or "soft" link flag. Advantages of using the
-s flag are that it permits linking across file systems or to directories.
The syntax of the command is a bit tricky. For example: ln -s oldfile newfile links the
previously existing oldfile to the newly created link, newfile.
If a file named newfile has previously existed, an error message will result.
Both of these [types of links] provide a certain measure of dual reference -- if you edit the contents
of the file using any name, your changes will affect both the original name and either a hard or soft
new name. The differences between them occurs when you work at a higher level. The advantage of
a hard link is that the new name is totally independent of the old name -- if you remove or rename
the old name, that does not affect the hard link, which continues to point to the data while it would
leave a soft link hanging pointing to the old name which is no longer there. The advantage of a soft
link is that it can refer to a different file system (since it is just a reference to a file name, not to
actual data). And, unlike a hard link, a symbolic link can refer to a directory.
Links give the ability to invoke a script (or any other type of executable) with multiple names, and
having that script behave according to how it was invoked.
1 #!/bin/bash
2 # hello.sh: Saying "hello" or "goodbye"
3 #+ depending on how script is invoked.
4
5 # Make a link in current working directory ($PWD) to this script:
6 # ln -s hello.sh goodbye
7 # Now, try invoking this script both ways:
8 # ./hello.sh
9 # ./goodbye
10
11
12 HELLO_CALL=65
13 GOODBYE_CALL=66
14
15 if [ $0 = "./goodbye" ]
16 then
17 echo "Good-bye!"
18 # Some other goodbye-type commands, as appropriate.
19 exit $GOODBYE_CALL
20 fi
21
22 echo "Hello!"
23 # Some other hello-type commands, as appropriate.
24 exit $HELLO_CALL
man, info
These commands access the manual and information pages on system commands and installed
utilities. When available, the info pages usually contain more detailed descriptions than do the man
pages.
There have been various attempts at "automating" the writing of man pages. For a script that makes a
tentative first step in that direction, see Example A-39.
Notes
[1]
Dotfiles are files whose names begin with a dot, such as ~/.Xdefaults. Such filenames do not
appear in a normal ls listing (although an ls -a will show them), and they cannot be deleted by an
accidental rm -rf *. Dotfiles are generally used as setup and configuration files in a user's home
directory.
[2] This particular feature may not yet be implemented in the version of the ext2/ext3 filesystem installed
on your system. Check the documentation for your Linux distro.
find
-exec COMMAND \;
Carries out COMMAND on each file that find matches. The command sequence terminates with ; (the
";" is escaped to make certain the shell passes it to find literally, without interpreting it as a special
character).
If COMMAND contains {}, then find substitutes the full path name of the selected file for "{}".
Example 16-3. Badname, eliminate file names in current directory containing bad characters
and whitespace.
1 #!/bin/bash
2 # badname.sh
3 # Delete filenames in current directory containing bad characters.
4
5 for filename in *
6 do
7 badname=`echo "$filename" | sed -n /[\+\{\;\"\\\=\?~\(\)\<\>\&\*\|\$]/p`
8 # badname=`echo "$filename" | sed -n '/[+{;"\=?~()<>&*|$]/p'` also works.
9 # Deletes files containing these nasties: + { ; " \ = ? ~ ( ) < > & * | $
10 #
11 rm $badname 2>/dev/null
12 # ^^^^^^^^^^^ Error messages deep-sixed.
13 done
14
15 # Now, take care of files containing all manner of whitespace.
16 find . -name "* *" -exec rm -f {} \;
17 # The path name of the file that _find_ finds replaces the "{}".
18 # The '\' ensures that the ';' is interpreted literally, as end of command.
19
20 exit 0
21
22 #---------------------------------------------------------------------
23 # Commands below this line will not execute because of _exit_ command.
24
25 # An alternative to the above script:
26 find . -name '*[+{;"\\=?~()<>&*|$ ]*' -maxdepth 0 \
27 -exec rm -f '{}' \;
28 # The "-maxdepth 0" option ensures that _find_ will not search
29 #+ subdirectories below $PWD.
30
31 # (Thanks, S.C.)
1 #!/bin/bash
2 # idelete.sh: Deleting a file by its inode number.
3
4 # This is useful when a filename starts with an illegal character,
5 #+ such as ? or -.
6
7 ARGCOUNT=1 # Filename arg must be passed to script.
8 E_WRONGARGS=70
9 E_FILE_NOT_EXIST=71
10 E_CHANGED_MIND=72
11
12 if [ $# -ne "$ARGCOUNT" ]
13 then
14 echo "Usage: `basename $0` filename"
15 exit $E_WRONGARGS
16 fi
17
18 if [ ! -e "$1" ]
19 then
20 echo "File \""$1"\" does not exist."
21 exit $E_FILE_NOT_EXIST
22 fi
23
24 inum=`ls -i | grep "$1" | awk '{print $1}'`
25 # inum = inode (index node) number of file
26 # -----------------------------------------------------------------------
27 # Every file has an inode, a record that holds its physical address info.
28 # -----------------------------------------------------------------------
29
30 echo; echo -n "Are you absolutely sure you want to delete \"$1\" (y/n)? "
31 # The '-v' option to 'rm' also asks this.
32 read answer
33 case "$answer" in
34 [nN]) echo "Changed your mind, huh?"
35 exit $E_CHANGED_MIND
36 ;;
37 *) echo "Deleting file \"$1\".";;
38 esac
39
40 find . -inum $inum -exec rm {} \;
41 # ^^
42 # Curly brackets are placeholder
43 #+ for text output by "find."
44 echo "File "\"$1"\" deleted!"
45
46 exit 0
1 #!/bin/bash
2 # Find suid root files.
3 # A strange suid file might indicate a security hole,
4 #+ or even a system intrusion.
5
6 directory="/usr/sbin"
7 # Might also try /sbin, /bin, /usr/bin, /usr/local/bin, etc.
8 permissions="+4000" # suid root (dangerous!)
9
10
11 for file in $( find "$directory" -perm "$permissions" )
12 do
13 ls -ltF --author "$file"
14 done
See Example 16-30, Example 3-4, and Example 11-9 for scripts using find. Its manpage provides
more detail on this complex and powerful command.
xargs
A filter for feeding arguments to a command, and also a tool for assembling the commands
themselves. It breaks a data stream into small enough chunks for filters and commands to process.
Consider it as a powerful replacement for backquotes. In situations where command substitution fails
with a too many arguments error, substituting xargs often works. [1] Normally, xargs reads from
stdin or from a pipe, but it can also be fed the contents of a file.
The default command for xargs is echo. This means that input piped to xargs may have linefeeds and
other whitespace characters stripped out.
bash$ ls -l
total 0
-rw-rw-r-- 1 bozo bozo 0 Jan 29 23:58 file1
-rw-rw-r-- 1 bozo bozo 0 Jan 29 23:58 file2
bash$ ls -l | xargs
total 0 -rw-rw-r-- 1 bozo bozo 0 Jan 29 23:58 file1 -rw-rw-r-- 1 bozo bozo 0 Jan...
ls | xargs -p -l gzip gzips every file in current directory, one at a time, prompting before
each operation.
Note that xargs processes the arguments passed to it sequentially, one at a time.
Another useful option is -0, in combination with find -print0 or grep -lZ.
This allows handling arguments containing whitespace or quotes.
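For instance (a sketch -- the exact grep options shown here are assumptions), the following track down files containing the string "GUI" and delete them, handling whitespace in filenames safely:

# Dangerous! These delete files wherever the search is run.
find / -type f -print0 | xargs -0 grep -liwZ GUI | xargs -0 rm -f

grep -rliwZ GUI / | xargs -0 rm -f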
Either of the above will remove any file containing "GUI". (Thanks, S.C.)
1 #!/bin/bash
2
3 # Generates a log file in current directory
4 # from the tail end of /var/log/messages.
5
6 # Note: /var/log/messages must be world readable
7 # if this script invoked by an ordinary user.
8 # #root chmod 644 /var/log/messages
9
10 LINES=5
11
12 ( date; uname -a ) >>logfile
13 # Time and machine name
14 echo ---------------------------------------------------------- >>logfile
15 tail -n $LINES /var/log/messages | xargs | fmt -s >>logfile
16 echo >>logfile
17 echo >>logfile
18
19 exit 0
20
21 # Note:
22 # ----
23 # As Frank Wang points out,
24 #+ unmatched quotes (either single or double quotes) in the source file
25 #+ may give xargs indigestion.
26 #
27 # He suggests the following substitution for line 15:
28 # tail -n $LINES /var/log/messages | tr -d "\"'" | xargs | fmt -s >>logfile
29
30
31
32 # Exercise:
33 # --------
34 # Modify this script to track changes in /var/log/messages at intervals
35 #+ of 20 minutes.
36 # Hint: Use the "watch" command.
1 #!/bin/bash
2 # copydir.sh
3
4 # Copy (verbose) all files in current directory ($PWD)
5 #+ to directory specified on command-line.
6
7 E_NOARGS=85
8
9 if [ -z "$1" ] # Exit if no argument given.
10 then
11 echo "Usage: `basename $0` directory-to-copy-to"
12 exit $E_NOARGS
13 fi
14
15 ls . | xargs -i -t cp ./{} $1
16 # ^^ ^^ ^^
17 # -t is "verbose" (output command-line to stderr) option.
18 # -i is "replace strings" option.
19 # {} is a placeholder for output text.
20 # This is similar to the use of a curly-bracket pair in "find."
21 #
22 # List the files in current directory (ls .),
23 #+ pass the output of "ls" as arguments to "xargs" (-i -t options),
24 #+ then copy (cp) these arguments ({}) to new directory ($1).
25 #
26 # The net result is the exact equivalent of
27 #+ cp * $1
28 #+ unless any of the filenames has embedded "whitespace" characters.
29
30 exit 0
1 #!/bin/bash
2 # kill-byname.sh: Killing processes by name.
3 # Compare this script with kill-process.sh.
4
5 # For instance,
6 #+ try "./kill-byname.sh xterm" --
7 #+ and watch all the xterms on your desktop disappear.
8
9 # Warning:
10 # -------
11 # This is a fairly dangerous script.
12 # Running it carelessly (especially as root)
13 #+ can cause data loss and other undesirable effects.
14
15 E_BADARGS=66
16
17 if test -z "$1" # No command-line arg supplied?
18 then
19 echo "Usage: `basename $0` Process(es)_to_kill"
20 exit $E_BADARGS
21 fi
22
23
24 PROCESS_NAME="$1"
25 ps ax | grep "$PROCESS_NAME" | awk '{print $1}' | xargs -i kill {} 2>/dev/null
26 # ^^ ^^
27
28 # ---------------------------------------------------------------
29 # Notes:
30 # -i is the "replace strings" option to xargs.
31 # The curly brackets are the placeholder for the replacement.
32 # 2>/dev/null suppresses unwanted error messages.
33 #
34 # Can grep "$PROCESS_NAME" be replaced by pidof "$PROCESS_NAME"?
35 # ---------------------------------------------------------------
36
37 exit $?
38
39 # The "killall" command has the same effect as this script,
40 #+ but using it is not quite as educational.
1 #!/bin/bash
2 # wf2.sh: Crude word frequency analysis on a text file.
3
4 # Uses 'xargs' to decompose lines of text into single words.
5 # Compare this example to the "wf.sh" script later on.
6
7
8 # Check for input file on command-line.
9 ARGS=1
10 E_BADARGS=85
11 E_NOFILE=86
12
13 if [ $# -ne "$ARGS" ]
14 # Correct number of arguments passed to script?
15 then
16 echo "Usage: `basename $0` filename"
17 exit $E_BADARGS
18 fi
19
20 if [ ! -f "$1" ] # Check if file exists.
21 then
22 echo "File \"$1\" does not exist."
23 exit $E_NOFILE
24 fi
25
26
27
28 #####################################################
29 cat "$1" | xargs -n1 | \
30 # List the file, one word per line.
31 tr A-Z a-z | \
32 # Shift characters to lowercase.
33 sed -e 's/\.//g' -e 's/\,//g' -e 's/ /\
34 /g' | \
35 # Filter out periods and commas, and
36 #+ change space between words to linefeed,
37 sort | uniq -c | sort -nr
38 # Finally remove duplicates, prefix occurrence count
39 #+ and sort numerically.
40 #####################################################
41
42 # This does the same job as the "wf.sh" example,
43 #+ but a bit more ponderously, and it runs more slowly (why?).
44
45 exit $?
expr
All-purpose expression evaluator: Concatenates and evaluates the arguments according to the
operation given (arguments must be separated by spaces). Operations may be arithmetic, comparison,
string, or logical.
expr 3 + 5
returns 8
expr 5 % 3
returns 2
expr 1 / 0
returns the error message, expr: division by zero
The multiplication operator must be escaped when used in an arithmetic expression with
expr.
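For instance, a quick sketch of the escaping requirement:
 z=`expr 5 \* 3`     # The backslash keeps the shell from treating "*" as a wildcard.
 echo "z = $z"       # z = 15
Without the backslash, the shell may expand "*" into a list of filenames before expr ever sees it.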
y=`expr $y + 1`
Increment a variable, with the same effect as let y=y+1 and y=$(($y+1)). This is an
example of arithmetic expansion.
z=`expr substr $string $position $length`
Extract substring of $length characters, starting at $position.
1 #!/bin/bash
2
3 # Demonstrating some of the uses of 'expr'
4 # =======================================
5
6 echo
7
8 # Arithmetic Operators
9 # ---------- ---------
10
11 echo "Arithmetic Operators"
12 echo
13 a=`expr 5 + 3`
14 echo "5 + 3 = $a"
15
16 a=`expr $a + 1`
17 echo
18 echo "a + 1 = $a"
19 echo "(incrementing a variable)"
20
21 a=`expr 5 % 3`
22 # modulo
23 echo
24 echo "5 mod 3 = $a"
25
26 echo
27 echo
28
29 # Logical Operators
30 # ------- ---------
31
32 # Returns 1 if true, 0 if false,
33 #+ opposite of normal Bash convention.
34
35 echo "Logical Operators"
36 echo
37
38 x=24
39 y=25
40 b=`expr $x = $y` # Test equality.
41 echo "b = $b" # 0 ( $x -ne $y )
42 echo
43
44 a=3
45 b=`expr $a \> 10`
46 echo 'b=`expr $a \> 10`, therefore...'
47 echo "If a > 10, b = 0 (false)"
48 echo "b = $b" # 0 ( 3 ! -gt 10 )
49 echo
50
51 b=`expr $a \< 10`
52 echo "If a < 10, b = 1 (true)"
53 echo "b = $b" # 1 ( 3 -lt 10 )
54 echo
55 # Note escaping of operators.
56
57 b=`expr $a \<= 3`
58 echo "If a <= 3, b = 1 (true)"
59 echo "b = $b" # 1 ( 3 -le 3 )
60 # There is also a "\>=" operator (greater than or equal to).
61
62
63 echo
64 echo
65
66
67
68 # String Operators
69 # ------ ---------
70
71 echo "String Operators"
72 echo
73
74 a=1234zipper43231
75 echo "The string being operated upon is \"$a\"."
76
77 # length: length of string
78 b=`expr length $a`
79 echo "Length of \"$a\" is $b."
80
81 # index: position of first character in substring
82 # that matches a character in string
83 b=`expr index $a 23`
84 echo "Numerical position of first \"2\" in \"$a\" is \"$b\"."
85
86 # substr: extract substring, starting position & length specified
87 b=`expr substr $a 2 6`
88 echo "Substring of \"$a\", starting at position 2,\
89 and 6 chars long is \"$b\"."
90
91
92 # The default behavior of the 'match' operations is to
93 #+ search for the specified match at the BEGINNING of the string.
94 #
95 # Using Regular Expressions ...
96 b=`expr match "$a" '[0-9]*'` # Numerical count.
97 echo Number of digits at the beginning of \"$a\" is $b.
98 b=`expr match "$a" '\([0-9]*\)'` # Note that escaped parentheses
99 # == == #+ trigger substring match.
100 echo "The digits at the beginning of \"$a\" are \"$b\"."
101
102 echo
103
104 exit 0
The : (null) operator can substitute for match. For example, b=`expr $a : [0-9]*` is the
exact equivalent of b=`expr match $a [0-9]*` in the above listing.
1 #!/bin/bash
2
3 echo
4 echo "String operations using \"expr \$string : \" construct"
5 echo "==================================================="
6 echo
7
8 a=1234zipper5FLIPPER43231
9
10 echo "The string being operated upon is \"`expr "$a" : '\(.*\)'`\"."
11 # Escaped parentheses grouping operator. == ==
12
13 # ***************************
14 #+ Escaped parentheses
15 #+ match a substring
16 # ***************************
17
18
19 # If no escaped parentheses...
20 #+ then 'expr' converts the string operand to an integer.
21
22 echo "Length of \"$a\" is `expr "$a" : '.*'`." # Length of string
23
24 echo "Number of digits at the beginning of \"$a\" is `expr "$a" : '[0-9]*'`."
25
26 # ------------------------------------------------------------------------- #
27
28 echo
29
30 echo "The digits at the beginning of \"$a\" are `expr "$a" : '\([0-9]*\)'`."
31 # == ==
32 echo "The first 7 characters of \"$a\" are `expr "$a" : '\(.......\)'`."
33 # ===== == ==
34 # Again, escaped parentheses force a substring match.
35 #
36 echo "The last 7 characters of \"$a\" are `expr "$a" : '.*\(.......\)'`."
37 # ==== end of string operator ^^
38 # (actually means skip over one or more of any characters until specified
39 #+ substring)
40
41 echo
42
43 exit 0
The above script illustrates how expr uses the escaped parentheses -- \( ... \) -- grouping operator in tandem
with regular expression parsing to match a substring. Here is another example, this time from "real life."
Notes
[1] And even when xargs is not strictly necessary, it can speed up execution of a command involving
batch-processing of multiple files.
date
Simply invoked, date prints the date and time to stdout. Where this command gets interesting is in
its formatting and parsing options.
1 #!/bin/bash
2 # Exercising the 'date' command
3
4 echo "The number of days since the year's beginning is `date +%j`."
5 # Needs a leading '+' to invoke formatting.
6 # %j gives day of year.
7
8 echo "The number of seconds elapsed since 01/01/1970 is `date +%s`."
9 # %s yields number of seconds since "UNIX epoch" began,
10 #+ but how is this useful?
11
12 prefix=temp
13 suffix=$(date +%s) # The "+%s" option to 'date' is GNU-specific.
14 filename=$prefix.$suffix
15 echo "Temporary filename = $filename"
16 # It's great for creating "unique and random" temp filenames,
17 #+ even better than using $$.
18
19 # Read the 'date' man page for more formatting options.
20
21 exit 0
bash$ date
Fri Mar 29 21:07:39 MST 2002
bash$ date -u
Sat Mar 30 04:07:42 UTC 2002
1 #!/bin/bash
2 # date-calc.sh
3 # Author: Nathan Coulter
4 # Used in ABS Guide with permission (thanks!).
5
6 MPHR=60 # Minutes per hour.
7 HPD=24 # Hours per day.
8
9 diff () {
10 printf '%s' $(( $(date -u -d"$TARGET" +%s) -
11 $(date -u -d"$CURRENT" +%s)))
12 # %d = day of month.
13 }
14
15
16 CURRENT=$(date -u -d '2007-09-01 17:30:24' '+%F %T.%N %Z')
17 TARGET=$(date -u -d'2007-12-25 12:30:00' '+%F %T.%N %Z')
18 # %F = full date, %T = %H:%M:%S, %N = nanoseconds, %Z = time zone.
19
20 printf '\nIn 2007, %s ' \
21 "$(date -d"$CURRENT +
22 $(( $(diff) /$MPHR /$MPHR /$HPD / 2 )) days" '+%d %B')"
23 # %B = name of month ^ halfway
24 printf 'was halfway between %s ' "$(date -d"$CURRENT" '+%d %B')"
25 printf 'and %s\n' "$(date -d"$TARGET" '+%d %B')"
26
27 printf '\nOn %s at %s, there were\n' \
28 $(date -u -d"$CURRENT" +%F) $(date -u -d"$CURRENT" +%T)
29 DAYS=$(( $(diff) / $MPHR / $MPHR / $HPD ))
30 CURRENT=$(date -d"$CURRENT +$DAYS days" '+%F %T.%N %Z')
31 HOURS=$(( $(diff) / $MPHR / $MPHR ))
32 CURRENT=$(date -d"$CURRENT +$HOURS hours" '+%F %T.%N %Z')
33 MINUTES=$(( $(diff) / $MPHR ))
34 CURRENT=$(date -d"$CURRENT +$MINUTES minutes" '+%F %T.%N %Z')
35 printf '%s days, %s hours, ' "$DAYS" "$HOURS"
36 printf '%s minutes, and %s seconds ' "$MINUTES" "$(diff)"
37 printf 'until Christmas Dinner!\n\n'
38
39 # Exercise:
40 # --------
41 # Rewrite the diff () function to accept passed parameters,
42 #+ rather than using global variables.
The date command has quite a number of output options. For example, %N gives the nanosecond
portion of the current time. One interesting use for this is to generate random integers.
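One rough way to do that, sketched here on the assumption of a GNU date supporting %N:
 # Take the nanoseconds field and strip leading zeros to get a crude pseudorandom integer.
 RNUM=$(date +%N | sed -e 's/^0*//')
 echo "Pseudorandom integer: $RNUM"
This is no substitute for a real random-number generator, but it is often good enough for
tie-breaking or temporary names.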
1 date +%j
2 # Echoes day of the year (days elapsed since January 1).
3
4 date +%k%M
5 # Echoes hour and minute in 24-hour format, as a single digit string.
6
7
8
9 # The 'TZ' parameter permits overriding the default time zone.
10 date # Mon Mar 28 21:42:16 MST 2005
11 TZ=EST date # Mon Mar 28 23:42:16 EST 2005
12 # Thanks, Frank Kannemann and Pete Sjoberg, for the tip.
13
14
15 SixDaysAgo=$(date --date='6 days ago')
16 OneMonthAgo=$(date --date='1 month ago') # Four weeks back (not a month!)
17 OneYearAgo=$(date --date='1 year ago')
See also Example 3-4 and Example A-43.
zdump
Time zone dump: echoes the time in a specified time zone.
time
Outputs verbose timing statistics for executing a command.
For example, timing a command produces output like this:
real 0m0.067s
user 0m0.004s
sys 0m0.005s
See also the very similar times command in the previous section.
As of version 2.0 of Bash, time became a shell reserved word, with slightly altered
behavior in a pipeline.
touch
Utility for updating access/modification times of a file to current system time or other specified time,
but also useful for creating a new file. The command touch zzz will create a new file of zero
length, named zzz, assuming that zzz did not previously exist. Time-stamping empty files in this
way is useful for storing date information, for example in keeping track of modification times on a
project.
at
The at job control command executes a given set of commands at a specified time. For example,
at 2pm January 15 prompts for a set of commands to execute at that time. These commands
should be shell-script compatible, since, for all practical purposes, the user is typing in an executable
shell script a line at a time. Input terminates with a Ctl-D.
Using either the -f option or input redirection (<), at reads a command list from a file. This file is an
executable shell script, though it should, of course, be non-interactive. Particularly clever is including
the run-parts command in the file to execute a different set of scripts.
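A minimal sketch of the -f form (the command-list filename is just a placeholder):
 at -f backup-commands.sh 2:30am tomorrow
 # Queues the non-interactive commands in backup-commands.sh to run at 2:30 AM tomorrow.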
batch
The batch job control command is similar to at, but it runs a command list when the system load
drops below .8. Like at, it can read commands from a file with the -f option.
The concept of batch processing dates back to the era of mainframe computers. It means running a
set of commands without user intervention.
cal
Prints a neatly formatted monthly calendar to stdout. Will do current year or a large range of past
and future years.
sleep
This is the shell equivalent of a wait loop. It pauses for a specified number of seconds, doing nothing.
It can be useful for timing or in processes running in the background, checking for a specific event
every so often (polling), as in Example 31-6.
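A bare-bones polling sketch (the flag-file name is hypothetical):
 while [ ! -e /tmp/job.done ]   # Loop until some other process creates the flag file.
 do
   sleep 5                      # Check every 5 seconds.
 done
 echo "Job finished."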
usleep
Microsleep: the same as sleep, above, but "sleeping" in microsecond intervals. The usleep command
does not provide particularly accurate timing, and is therefore unsuitable for critical timing loops.
hwclock, clock
The hwclock command accesses or adjusts the machine's hardware clock. Some options require root
privileges. The /etc/rc.d/rc.sysinit startup file uses hwclock to set the system time from
the hardware clock at bootup.
sort
File sort utility, often used as a filter in a pipe. This command sorts a text stream or file forwards or
backwards, or according to various keys or character positions. Using the -m option, it merges
presorted input files. The info page lists its many capabilities and options. See Example 11-9,
Example 11-10, and Example A-8.
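A couple of typical invocations, as a sketch (wordlist.txt is a placeholder filename):
 sort -t: -k3 -n /etc/passwd    # Sort the password file numerically on field 3 (the UID),
                                #+ using ":" as the field delimiter.
 sort -r wordlist.txt           # Reverse (descending) sort.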
tsort
Topological sort, reading in pairs of whitespace-separated strings and sorting according to input
patterns. The original purpose of tsort was to sort a list of dependencies for an obsolete version of the
ld linker in an "ancient" version of UNIX.
The results of a tsort will usually differ markedly from those of the standard sort command, above.
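A minimal sketch (the dependency names are made up): each whitespace-separated pair means
"the left item must come before the right item."
 echo "libc app
 kernel libc" | tsort
 # One valid topological ordering:
 #   kernel
 #   libc
 #   app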
uniq
This filter removes duplicate lines from a sorted file. It is often seen in a pipe coupled with sort.
The sort INPUTFILE | uniq -c | sort -nr command string produces a frequency of
occurrence listing on the INPUTFILE file (the -nr options to sort cause a reverse numerical sort).
This template finds use in analysis of log files and dictionary lists, and wherever the lexical structure
of a document needs to be examined.
1 #!/bin/bash
2 # wf.sh: Crude word frequency analysis on a text file.
3 # This is a more efficient version of the "wf2.sh" script.
4
5
6 # Check for input file on command-line.
7 ARGS=1
8 E_BADARGS=85
9 E_NOFILE=86
10
11 if [ $# -ne "$ARGS" ] # Correct number of arguments passed to script?
12 then
13 echo "Usage: `basename $0` filename"
14 exit $E_BADARGS
15 fi
16
17 if [ ! -f "$1" ] # Check if file exists.
18 then
19 echo "File \"$1\" does not exist."
20 exit $E_NOFILE
21 fi
22
23
24
25 ########################################################
26 # main ()
27 sed -e 's/\.//g' -e 's/\,//g' -e 's/ /\
28 /g' "$1" | tr 'A-Z' 'a-z' | sort | uniq -c | sort -nr
29 # =========================
30 # Frequency of occurrence
31
32 # Filter out periods and commas, and
33 #+ change space between words to linefeed,
34 #+ then shift characters to lowercase, and
35 #+ finally prefix occurrence count and sort numerically.
36
37 # Arun Giridhar suggests modifying the above to:
38 # . . . | sort | uniq -c | sort +1 [-f] | sort +0 -nr
39 # This adds a secondary sort key, so instances of
40 #+ equal occurrence are sorted alphabetically.
41 # As he explains it:
42 # "This is effectively a radix sort, first on the
43 #+ least significant column
44 #+ (word or string, optionally case-insensitive)
45 #+ and last on the most significant column (frequency)."
46 #
47 # As Frank Wang explains, the above is equivalent to
48 #+ . . . | sort | uniq -c | sort +0 -nr
49 #+ and the following also works:
50 #+ . . . | sort | uniq -c | sort -k1nr -k
51 ########################################################
52
53 exit 0
54
55 # Exercises:
56 # ---------
57 # 1) Add 'sed' commands to filter out other punctuation,
58 #+ such as semicolons.
59 # 2) Modify the script to also filter out multiple spaces and
60 #+ other whitespace.
expand, unexpand
The expand filter converts tabs to spaces. It is often used in a pipe.
The unexpand filter converts spaces to tabs. This reverses the effect of expand.
cut
A tool for extracting fields from files. It is similar to the print $N command set in awk, but more
limited. It may be simpler to use cut in a script than awk. Particularly important are the -d (delimiter)
and -f (field specifier) options.
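For example (the field numbers assume the standard /etc/passwd layout; logfile.txt is a placeholder):
 cut -d: -f1,7 /etc/passwd      # Login names and default shells: colon-delimited fields 1 and 7.
 cut -c1-8 logfile.txt          # The first eight characters of each line.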
join
The join command operates on exactly two files, but pastes together only those lines with a common
tagged field (usually a numerical label), and writes the result to stdout. The files to be joined
should be sorted according to the tagged field for the matchups to work properly.
1 File: 1.data
2
3 100 Shoes
4 200 Laces
5 300 Socks
1 File: 2.data
2
3 100 $40.00
4 200 $1.00
5 300 $2.00
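Joining the two data files above on their common first field would then look something like this
(a sketch of standard join output):
 bash$ join 1.data 2.data
 100 Shoes $40.00
 200 Laces $1.00
 300 Socks $2.00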
1 #!/bin/bash
2 # script-detector.sh: Detects scripts within a directory.
3
4 TESTCHARS=2 # Test first 2 characters.
5 SHABANG='#!' # Scripts begin with a "sha-bang."
6
7 for file in * # Traverse all the files in current directory.
8 do
9 if [[ `head -c$TESTCHARS "$file"` = "$SHABANG" ]]
10 # head -c2 #!
11 # The '-c' option to "head" outputs a specified
12 #+ number of characters, rather than lines (the default).
13 then
14 echo "File \"$file\" is a script."
15 else
16 echo "File \"$file\" is *not* a script."
17 fi
18 done
19
20 exit 0
21
22 # Exercises:
23 # ---------
24 # 1) Modify this script to take as an optional argument
25 #+ the directory to scan for scripts
26 #+ (rather than just the current working directory).
27 #
28 # 2) As it stands, this script gives "false positives" for
29 #+ Perl, awk, and other scripting language scripts.
30 # Correct this.
1 #!/bin/bash
2 # rnd.sh: Outputs a 10-digit random number
3
4 # Script by Stephane Chazelas.
5
6 head -c4 /dev/urandom | od -N4 -tu4 | sed -ne '1s/.* //p'
7
8
9 # =================================================================== #
10
11 # Analysis
12 # --------
13
14 # head:
15 # -c4 option takes first 4 bytes.
16
17 # od:
18 # -N4 option limits output to 4 bytes.
19 # -tu4 option selects unsigned decimal format for output.
20
21 # sed:
22 # -n option, in combination with "p" flag to the "s" command,
23 # outputs only matched lines.
24
25
26
27 # The author of this script explains the action of 'sed', as follows.
28
29 # head -c4 /dev/urandom | od -N4 -tu4 | sed -ne '1s/.* //p'
30 # ----------------------------------> |
31
32 # Assume output up to "sed" --------> |
33 # is 0000000 1198195154\n
34
35 # sed begins reading characters: 0000000 1198195154\n.
36 # Here it finds a newline character,
37 #+ so it is ready to process the first line (0000000 1198195154).
38 # It looks at its <range><action>s. The first and only one is
39
40 # range action
41 # 1 s/.* //p
42
43 # The line number is in the range, so it executes the action:
44 #+ tries to substitute the longest string ending with a space in the line
45 # ("0000000 ") with nothing (//), and if it succeeds, prints the result
46 # ("p" is a flag to the "s" command here, this is different
47 #+ from the "p" command).
48
49 # sed is now ready to continue reading its input. (Note that before
50 #+ continuing, if -n option had not been passed, sed would have printed
51 #+ the line once again).
52
53 # Now, sed reads the remainder of the characters, and finds the
54 #+ end of the file.
55 # It is now ready to process its 2nd line (which is also numbered '$' as
56 #+ it's the last one).
57 # It sees it is not matched by any <range>, so its job is done.
58
59 # In a few words, this sed command means:
60 # "On the first line only, remove any character up to the right-most space,
61 #+ then print it."
62
63 # A better way to do this would have been:
64 # sed -e 's/.* //;q'
65
66 # Here, two <range><action>s (could have been written
67 # sed -e 's/.* //' -e q):
68
69 # range action
70 # nothing (matches line) s/.* //
71 # nothing (matches line) q (quit)
72
73 # Here, sed only reads its first line of input.
74 # It performs both actions, and prints the line (substituted) before
75 #+ quitting (because of the "q" action) since the "-n" option is not passed.
76
77 # =================================================================== #
78
79 # An even simpler alternative to the above one-line script would be:
80 # head -c4 /dev/urandom| od -An -tu4
81
82 exit
1 #!/bin/bash
2
3 filename=sys.log
4
5 cat /dev/null > $filename; echo "Creating / cleaning out file."
6 # Creates file if it does not already exist,
7 #+ and truncates it to zero length if it does.
8 # : > filename and > filename also work.
9
10 tail /var/log/messages > $filename
11 # /var/log/messages must have world read permission for this to work.
12
13 echo "$filename contains tail end of system log."
14
15 exit 0
To list a specific line of a text file, pipe the output of head to tail -n 1. For example
head -n 8 database.txt | tail -n 1 lists the 8th line of the file
database.txt.
To set a variable to a given block of a text file:
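A sketch (the line numbers are arbitrary): capture lines 5 through 8 of database.txt.
 block=$(head -n 8 database.txt | tail -n 4)   # First 8 lines, then the last 4 of those.
 echo "$block"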
grep
A multi-purpose file search tool that uses Regular Expressions. Invoked as grep pattern [file...], it
searches the target file(s) for occurrences of pattern, where pattern may be literal text or a
Regular Expression.
The -l option lists only the files in which matches were found, but not the matching lines.
The -r (recursive) option searches files in the current working directory and all subdirectories below
it.
The -n option lists the matching lines, together with line numbers.
Example 16-16. Printing out the From lines in stored e-mail messages
1 #!/bin/bash
2 # from.sh
3
4 # Emulates the useful "from" utility in Solaris, BSD, etc.
5 # Echoes the "From" header line in all messages
6 #+ in your e-mail directory.
7
8
9 MAILDIR=~/mail/* # No quoting of variable. Why?
10 GREP_OPTS="-H -A 5 --color" # Show file, plus extra context lines
11 #+ and display "From" in color.
12 TARGETSTR="^From" # "From" at beginning of line.
13
14 for file in $MAILDIR # No quoting of variable.
15 do
16 grep $GREP_OPTS "$TARGETSTR" "$file"
17 # ^^^^^^^^^^ # Again, do not quote this variable.
18 echo
19 done
20
21 exit $?
22
23 # Might wish to pipe the output of this script to 'more' or
24 #+ redirect it to a file . . .
When invoked with more than one target file, grep specifies which file contains matches.
To force grep to show the filename when searching only one target file, simply give
/dev/null as the second file.
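For example (osinfo.txt is a stand-in filename):
 grep Linux osinfo.txt /dev/null
 # With /dev/null as a second target, grep prefixes each matching line with "osinfo.txt:",
 #+ just as it would when searching multiple real files.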
1 #!/bin/bash
2 # grp.sh: Rudimentary reimplementation of grep.
3
4 E_BADARGS=85
5
6 if [ -z "$1" ] # Check for argument to script.
7 then
8 echo "Usage: `basename $0` pattern"
9 exit $E_BADARGS
10 fi
11
12 echo
13
14 for file in * # Traverse all files in $PWD.
15 do
16 output=$(sed -n /"$1"/p $file) # Command substitution.
17
18 if [ ! -z "$output" ] # What happens if "$output" is not quoted?
19 then
20 echo -n "$file: "
21 echo "$output"
22 fi # sed -ne "/$1/s|^|${file}: |p" is equivalent to above.
23
24 echo
25 done
26
27 echo
28
29 exit 0
30
31 # Exercises:
32 # ---------
33 # 1) Add newlines to output, if more than one match in any given file.
34 # 2) Add features.
How can grep search for two (or more) separate patterns? What if you want grep to display all lines
in a file or files that contain both "pattern1" and "pattern2"?
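One common approach, sketched below, is simply to chain two greps in a pipe, so that only lines
matching both patterns survive:
 grep "pattern1" target-file | grep "pattern2"
 # The first grep keeps lines containing pattern1; the second keeps, of those,
 #+ only the ones that also contain pattern2.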
1 #!/bin/bash
2 # cw-solver.sh
3 # This is actually a wrapper around a one-liner (line 46).
4
5 # Crossword puzzle and anagramming word game solver.
6 # You know *some* of the letters in the word you're looking for,
7 #+ so you need a list of all valid words
8 #+ with the known letters in given positions.
9 # For example: w...i....n
10 # 1???5????10
11 # w in position 1, 3 unknowns, i in the 5th, 4 unknowns, n at the end.
12 # (See comments at end of script.)
13
14
15 E_NOPATT=71
16 DICT=/usr/share/dict/word.lst
17 # ^^^^^^^^ Looks for word list here.
18 # ASCII word list, one word per line.
19 # If you happen to need an appropriate list,
20 #+ download the author's "yawl" word list package.
21 # http://ibiblio.org/pub/Linux/libs/yawl-0.3.2.tar.gz
22 # or
23 # http://bash.neuralshortcircuit.com/yawl-0.3.2.tar.gz
24
25
26 if [ -z "$1" ] # If no word pattern specified
27 then #+ as a command-line argument . . .
28 echo #+ . . . then . . .
29 echo "Usage:" #+ Usage message.
30 echo
31 echo ""$0" \"pattern,\""
32 echo "where \"pattern\" is in the form"
33 echo "xxx..x.x..."
34 echo
35 echo "The x's represent known letters,"
36 echo "and the periods are unknown letters (blanks)."
37 echo "Letters and periods can be in any position."
38 echo "For example, try: sh cw-solver.sh w...i....n"
39 echo
40 exit $E_NOPATT
41 fi
42
43 echo
44 # ===============================================
45 # This is where all the work gets done.
46 grep ^"$1"$ "$DICT" # Yes, only one line!
47 # | |
48 # ^ is start-of-word regex anchor.
49 # $ is end-of-word regex anchor.
50
51 # From _Stupid Grep Tricks_, vol. 1,
52 #+ a book the ABS Guide author may yet get around
53 #+ to writing . . . one of these days . . .
54 # ===============================================
55 echo
56
57
58 exit $? # Script terminates here.
59 # If there are too many words generated,
60 #+ redirect the output to a file.
61
62 $ sh cw-solver.sh w...i....n
63
64 wellington
65 workingman
66 workingmen
egrep -- extended grep -- is the same as grep -E. This uses a somewhat different, extended set of
Regular Expressions, which can make the search a bit more flexible. It also allows the boolean | (or)
operator.
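For instance (petlist.txt is a hypothetical file):
 egrep 'cat|dog' petlist.txt    # Matches lines containing either "cat" or "dog" (or both).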
fgrep -- fast grep -- is the same as grep -F. It does a literal string search (no Regular Expressions),
which generally speeds things up a bit.
On some Linux distros, egrep and fgrep are symbolic links to, or aliases for
grep, but invoked with the -E and -F options, respectively.
1 #!/bin/bash
2 # dict-lookup.sh
3
4 # This script looks up definitions in the 1913 Webster's Dictionary.
5 # This Public Domain dictionary is available for download
6 #+ from various sites, including
7 #+ Project Gutenberg (http://www.gutenberg.org/etext/247).
8 #
9 # Convert it from DOS to UNIX format (with only LF at end of line)
10 #+ before using it with this script.
11 # Store the file in plain, uncompressed ASCII text.
12 # Set DEFAULT_DICTFILE variable below to path/filename.
13
14
15 E_BADARGS=85
16 MAXCONTEXTLINES=50 # Maximum number of lines to show.
17 DEFAULT_DICTFILE="/usr/share/dict/webster1913-dict.txt"
18 # Default dictionary file pathname.
19 # Change this as necessary.
20 # Note:
21 # ----
22 # This particular edition of the 1913 Webster's
23 #+ begins each entry with an uppercase letter
24 #+ (lowercase for the remaining characters).
25 # Only the *very first line* of an entry begins this way,
26 #+ and that's why the search algorithm below works.
27
28
29
30 if [[ -z $(echo "$1" | sed -n '/^[A-Z]/p') ]]
31 # Must at least specify word to look up, and
32 #+ it must start with an uppercase letter.
33 then
34 echo "Usage: `basename $0` Word-to-define [dictionary-file]"
35 echo
36 echo "Note: Word to look up must start with capital letter,"
37 echo "with the rest of the word in lowercase."
38 echo "--------------------------------------------"
39 echo "Examples: Abandon, Dictionary, Marking, etc."
40 exit $E_BADARGS
41 fi
42
43
44 if [ -z "$2" ] # May specify different dictionary
45 #+ as an argument to this script.
46 then
47 dictfile=$DEFAULT_DICTFILE
48 else
49 dictfile="$2"
50 fi
51
52 # ---------------------------------------------------------
53 Definition=$(fgrep -A $MAXCONTEXTLINES "$1 \\" "$dictfile")
54 # Definitions in form "Word \..."
55 #
56 # And, yes, "fgrep" is fast enough
57 #+ to search even a very large text file.
58
59
60 # Now, snip out just the definition block.
61
62 echo "$Definition" |
63 sed -n '1,/^[A-Z]/p' |
64 # Print from first line of output
65 #+ to the first line of the next entry.
66 sed '$d' | sed '$d'
67 # Delete last two lines of output
68 #+ (blank line and first line of next entry).
69 # ---------------------------------------------------------
70
71 exit $?
72
73 # Exercises:
74 # ---------
75 # 1) Modify the script to accept any type of alphabetic input
76 # + (uppercase, lowercase, mixed case), and convert it
77 # + to an acceptable format for processing.
78 #
79 # 2) Convert the script to a GUI application,
80 # + using something like 'gdialog' or 'zenity' . . .
81 # The script will then no longer take its argument(s)
82 # + from the command-line.
83 #
84 # 3) Modify the script to parse one of the other available
85 # + Public Domain Dictionaries, such as the U.S. Census Bureau Gazetteer.
See also Example A-41 for an example of speedy fgrep lookup on a large text file.
agrep (approximate grep) extends the capabilities of grep to approximate matching. The search string
may differ by a specified number of characters from the resulting matches. This utility is not part of
the core Linux distribution.
To search compressed files, use zgrep, zegrep, or zfgrep. These also work on
non-compressed files, though slower than plain grep, egrep, fgrep. They are handy
for searching through a mixed set of files, some compressed, some not.
1 #!/bin/bash
2 # lookup: Does a dictionary lookup on each word in a data file.
3
4 file=words.data # Data file from which to read words to test.
5
6 echo
7
8 while [ "$word" != end ] # Last word in data file.
9 do # ^^^
10 read word # From data file, because of redirection at end of loop.
11 look $word > /dev/null # Don't want to display lines in dictionary file.
12 lookup=$? # Exit status of 'look' command.
13
14 if [ "$lookup" -eq 0 ]
15 then
16 echo "\"$word\" is valid."
17 else
18 echo "\"$word\" is invalid."
19 fi
20
21 done <"$file" # Redirects stdin to $file, so "reads" come from there.
22
23 echo
24
25 exit 0
26
27 # ----------------------------------------------------------------
28 # Code below line will not execute because of "exit" command above.
29
30
31 # Stephane Chazelas proposes the following, more concise alternative:
32
33 while read word && [[ $word != end ]]
34 do if look "$word" > /dev/null
35 then echo "\"$word\" is valid."
36 else echo "\"$word\" is invalid."
37 fi
38 done <"$file"
39
40 exit 0
sed, awk
Scripting languages especially suited for parsing text files and command output. May be embedded
singly or in combination in pipes and shell scripts.
sed
Non-interactive "stream editor", permits using many ex commands in batch mode. It finds many uses
in shell scripts.
awk
Programmable file extractor and formatter, good for manipulating and/or extracting fields (columns)
in structured text files. Its syntax is similar to C.
wc
wc gives a "word count" on a file or I/O stream:
bash$ wc /usr/share/doc/sed-4.1.2/README
13 70 447 README
[13 lines 70 words 447 characters]
wc -w gives only the word count.
Using wc to count how many .txt files are in current working directory:
1 $ ls *.txt | wc -l
2 # Will work as long as none of the "*.txt" files
3 #+ have a linefeed embedded in their name.
4
5 # Alternative ways of doing this are:
6 # find . -maxdepth 1 -name \*.txt -print0 | grep -cz .
7 # (shopt -s nullglob; set -- *.txt; echo $#)
8
9 # Thanks, S.C.
Using wc to total up the size of all the files whose names begin with letters in the range d - h
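One way to do it, as a sketch:
 wc -c [d-h]* | grep total | awk '{print $1}'
 # wc -c reports byte counts; the final "total" line holds the grand total,
 #+ and awk picks off its first field.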
Using wc to count the instances of the word "Linux" in the main source file for this book.
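A sketch (the source-file name here is only an assumption):
 grep -o "Linux" abs-guide.sgml | wc -l
 # grep -o prints each match on its own line, so wc -l yields the total number of occurrences.
 # Dropping the -o would instead count the lines containing at least one match.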
tr
Character translation filter.
Must use quoting and/or brackets, as appropriate. Quotes prevent the shell from
reinterpreting the special characters in tr command sequences. Brackets should be
quoted to prevent expansion by the shell.
Either tr "A-Z" "*" <filename or tr A-Z \* <filename changes all the uppercase
letters in filename to asterisks (writes to stdout). On some systems this may not work, but tr
A-Z '[**]' will.
1 #!/bin/bash
2 # Changes a file to all uppercase.
3
4 E_BADARGS=85
5
6 if [ -z "$1" ] # Standard check for command-line arg.
7 then
8 echo "Usage: `basename $0` filename"
9 exit $E_BADARGS
10 fi
11
12 tr a-z A-Z <"$1"
13
14 # Same effect as above, but using POSIX character set notation:
15 # tr '[:lower:]' '[:upper:]' <"$1"
16 # Thanks, S.C.
17
18 exit 0
19
20 # Exercise:
21 # Rewrite this script to give the option of changing a file
22 #+ to *either* upper or lowercase.
1 #!/bin/bash
2 #
3 # Changes every filename in working directory to all lowercase.
4 #
5 # Inspired by a script of John Dubois,
6 #+ which was translated into Bash by Chet Ramey,
7 #+ and considerably simplified by the author of the ABS Guide.
8
9
10 for filename in * # Traverse all files in directory.
11 do
12 fname=`basename $filename`
13 n=`echo $fname | tr A-Z a-z` # Change name to lowercase.
14 if [ "$fname" != "$n" ] # Rename only files not already lowercase.
15 then
16 mv $fname $n
17 fi
18 done
19
20 exit $?
21
22
23 # Code below this line will not execute because of "exit".
24 #--------------------------------------------------------#
25 # To run it, delete script above line.
26
27 # The above script will not work on filenames containing blanks or newlines.
28 # Stephane Chazelas therefore suggests the following alternative:
29
30
31 for filename in * # Not necessary to use basename,
32 # since "*" won't return any file containing "/".
33 do n=`echo "$filename/" | tr '[:upper:]' '[:lower:]'`
34 # POSIX char set notation.
35 # Slash added so that trailing newlines are not
36 # removed by command substitution.
37 # Variable substitution:
38 n=${n%/} # Removes trailing slash, added above, from filename.
39 [[ $filename == $n ]] || mv "$filename" "$n"
40 # Checks if filename already lowercase.
41 done
42
43 exit $?
1 #!/bin/bash
2 # Du.sh: DOS to UNIX text file converter.
3
4 E_WRONGARGS=65
5
6 if [ -z "$1" ]
7 then
8 echo "Usage: `basename $0` filename-to-convert"
9 exit $E_WRONGARGS
10 fi
11
12 NEWFILENAME=$1.unx
13
14 CR='\015' # Carriage return.
15 # 015 is octal ASCII code for CR.
16 # Lines in a DOS text file end in CR-LF.
17 # Lines in a UNIX text file end in LF only.
18
19 tr -d $CR < $1 > $NEWFILENAME
20 # Delete CR's and write to new file.
21
22 echo "Original DOS text file is \"$1\"."
23 echo "Converted UNIX text file is \"$NEWFILENAME\"."
24
25 exit 0
26
27 # Exercise:
28 # --------
29 # Change the above script to convert from UNIX to DOS.
1 #!/bin/bash
2 # rot13.sh: Classic rot13 algorithm,
3 # encryption that might fool a 3-year old.
4
5 # Usage: ./rot13.sh filename
6 # or ./rot13.sh <filename
7 # or ./rot13.sh and supply keyboard input (stdin)
8
9 cat "$@" | tr 'a-zA-Z' 'n-za-mN-ZA-M' # "a" goes to "n", "b" to "o", etc.
10 # The 'cat "$@"' construction
11 #+ permits getting input either from stdin or from files.
12
13 exit 0
1 #!/bin/bash
2 # crypto-quote.sh: Encrypt quotes
3
4 # Will encrypt famous quotes in a simple monoalphabetic substitution.
5 # The result is similar to the "Crypto Quote" puzzles
6 #+ seen in the Op Ed pages of the Sunday paper.
7
8
9 key=ETAOINSHRDLUBCFGJMQPVWZYXK
10 # The "key" is nothing more than a scrambled alphabet.
11 # Changing the "key" changes the encryption.
12
13 # The 'cat "$@"' construction gets input either from stdin or from files.
14 # If using stdin, terminate input with a Control-D.
15 # Otherwise, specify filename as command-line parameter.
16
17 cat "$@" | tr "a-z" "A-Z" | tr "A-Z" "$key"
18 # | to uppercase | encrypt
19 # Will work on lowercase, uppercase, or mixed-case quotes.
20 # Passes non-alphabetic characters through unchanged.
21
22
23 # Try this script with something like:
24 # "Nothing so needs reforming as other people's habits."
25 # --Mark Twain
26 #
27 # Output is:
28 # "CFPHRCS QF CIIOQ MINFMBRCS EQ FPHIM GIFGUI'Q HETRPQ."
29 # --BEML PZERC
30
31 # To reverse the encryption:
32 # cat "$@" | tr "$key" "A-Z"
33
34
35 # This simple-minded cipher can be broken by an average 12-year old
36 #+ using only pencil and paper.
37
38 exit 0
39
40 # Exercise:
41 # --------
42 # Modify the script so that it will either encrypt or decrypt,
43 #+ depending on command-line argument(s).
tr variants
The tr utility has two historic variants. The BSD version does not use brackets (tr a-z A-Z), but
the SysV one does (tr '[a-z]' '[A-Z]'). The GNU version of tr resembles the BSD one.
fold
A filter that wraps lines of input to a specified width. This is especially useful with the -s option,
which breaks lines at word spaces (see Example 16-26 and Example A-1).
fmt
Simple-minded file formatter, used as a filter in a pipe to "wrap" long lines of text output.
1 #!/bin/bash
2
3 WIDTH=40 # 40 columns wide.
4
5 b=`ls /usr/local/bin` # Get a file listing...
6
7 echo $b | fmt -w $WIDTH
8
9 # Could also have been done by
10 # echo $b | fold - -s -w $WIDTH
11
12 exit 0
column
Column formatter. This filter transforms list-type text output into a "pretty-printed" table.
1 #!/bin/bash
2 # colms.sh
3 # A minor modification of the example file in the "column" man page.
4
5
6 (printf "PERMISSIONS LINKS OWNER GROUP SIZE MONTH DAY HH:MM PROG-NAME\n" \
7 ; ls -l | sed 1d) | column -t
8 # ^^^^^^ ^^
9
10 # The "sed 1d" in the pipe deletes the first line of output,
11 #+ which would be "total N",
12 #+ where "N" is the total number of files found by "ls -l".
13
14 # The -t option to "column" pretty-prints a table.
15
16 exit 0
colrm
Column removal filter. This removes columns (characters) from a file and writes the file, lacking the
range of specified columns, back to stdout. colrm 2 4 <filename removes the second
through fourth characters from each line of the text file filename.
If the file contains tabs or nonprintable characters, this may cause unpredictable
behavior. In such cases, consider using expand and unexpand in a pipe preceding
colrm.
nl
Line numbering filter: nl filename lists filename to stdout, but inserts consecutive numbers
at the beginning of each non-blank line. If filename omitted, operates on stdin.
The output of nl is very similar to cat -b, since, by default, nl does not list blank lines.
1 #!/bin/bash
2 # line-number.sh
3
4 # This script echoes itself twice to stdout with its lines numbered.
5
6 # 'nl' sees this as line 4 since it does not number blank lines.
7 # 'cat -n' sees the above line as number 6.
8
9 nl `basename $0`
10
11 echo; echo # Now, let's try it with 'cat -n'
12
13 cat -n `basename $0`
14 # The difference is that 'cat -n' numbers the blank lines.
15 # Note that 'nl -ba' will also do so.
16
17 exit 0
18 # -----------------------------------------------------------------
pr
Print formatting filter. This will paginate files (or stdout) into sections suitable for hard copy
printing or viewing on screen. Various options permit row and column manipulation, joining lines,
setting margins, numbering lines, adding page headers, and merging files, among other things. The pr
command combines much of the functionality of nl, paste, fold, column, and expand.
A particularly useful option is -d, forcing double-spacing (same effect as sed -G).
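For instance, a quick sketch (file.txt is a placeholder):
 pr -o 5 -n file.txt | less     # Paginate with a 5-space left margin and numbered lines.
 pr -d file.txt                 # Double-spaced output.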
gettext
The GNU gettext package is a set of utilities for localizing and translating the text output of programs
into foreign languages. While originally intended for C programs, it now supports quite a number of
programming and scripting languages.
The gettext program works on shell scripts. See the info page.
msgfmt
A program for generating binary message catalogs. It is used for localization.
iconv
A utility for converting file(s) to a different encoding (character set). Its chief use is for localization.
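Typical usage, as a sketch (the filenames are placeholders):
 iconv -f ISO-8859-1 -t UTF-8 oldfile.txt > newfile.txt
 # Convert a Latin-1 text file to UTF-8, writing the result to a new file.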
TeX is Donald Knuth's elaborate typesetting system. It is often convenient to write a shell script
encapsulating all the options and arguments passed to one of these markup languages.
For example, the enscript text-to-PostScript converter can be invoked as enscript filename.txt -p
filename.ps to produce the PostScript output file filename.ps.
groff, tbl, eqn
Yet another text markup and display formatting language is groff. This is the enhanced GNU version
of the venerable UNIX roff/troff display and typesetting package. Manpages use groff.
The tbl table processing utility is considered part of groff, as its function is to convert table markup
into groff commands.
The eqn equation processing utility is likewise part of groff, and its function is to convert equation
markup into groff commands.
1 #!/bin/bash
2 # manview.sh: Formats the source of a man page for viewing.
3
4 # This script is useful when writing man page source.
5 # It lets you look at the intermediate results on the fly
6 #+ while working on it.
7
8 E_WRONGARGS=85
9
10 if [ -z "$1" ]
11 then
12 echo "Usage: `basename $0` filename"
13 exit $E_WRONGARGS
14 fi
15
16 # ---------------------------
17 groff -Tascii -man $1 | less
18 # From the man page for groff.
19 # ---------------------------
20
21 # If the man page includes tables and/or equations,
22 #+ then the above code will barf.
23 # The following line can handle such cases.
24 #
25 # gtbl < "$1" | geqn -Tlatin1 | groff -Tlatin1 -mtty-char -man
26 #
27 # Thanks, S.C.
28
29 exit $? # See also the "maned.sh" script.
lex, yacc
The lex lexical analyzer produces programs for pattern matching. This has been replaced by the
nonproprietary flex on Linux systems.
The yacc utility creates a parser based on a set of specifications. This has been replaced by the
nonproprietary bison on Linux systems.
Notes
[1] This is only true of the GNU version of tr, not the generic version often found on commercial UNIX
systems.
tar
The standard UNIX archiving utility. [1] Originally a Tape ARchiving program, it has developed into
a general purpose package that can handle all manner of archiving with all types of destination
devices, ranging from tape drives to regular files to even stdout (see Example 3-4). GNU tar has
been patched to accept various compression filters, for example: tar czvf archive_name.tar.gz *,
which recursively archives and gzips all files in a directory tree except dotfiles in the current working
directory ($PWD). [2]
shar
Shell archiving utility. A shell archive is essentially a shell script containing the commands
necessary to unpack the files bundled within it.
The mailshar command is a Bash script that uses shar to concatenate multiple files into a single one
for e-mailing. This script supports compression and uuencoding.
ar
Creation and manipulation utility for archives, mainly used for binary object file libraries.
rpm
The Red Hat Package Manager, or rpm utility provides a wrapper for source or binary archives. It
includes commands for installing and checking the integrity of packages, among other things.
A simple rpm -i package_name.rpm usually suffices to install a package, though there are many
more options available.
cpio
This specialized archiving copy command (copy input and output) is rarely seen any more, having
been supplanted by tar/gzip. It still has its uses, such as moving a directory tree. With an appropriate
block size (for copying) specified, it can be appreciably faster than tar.
1 #!/bin/bash
2
3 # Copying a directory tree using cpio.
4
5 # Advantages of using 'cpio':
6 # Speed of copying. It's faster than 'tar' with pipes.
7 # Well suited for copying special files (named pipes, etc.)
8 #+ that 'cp' may choke on.
9
10 ARGS=2
11 E_BADARGS=65
12
13 if [ $# -ne "$ARGS" ]
14 then
15 echo "Usage: `basename $0` source destination"
16 exit $E_BADARGS
17 fi
18
19 source="$1"
20 destination="$2"
21
22 ###################################################################
23 find "$source" -depth | cpio -admvp "$destination"
24 # ^^^^^ ^^^^^
25 # Read the 'find' and 'cpio' info pages to decipher these options.
26 # The above works only relative to $PWD (current directory) . . .
27 #+ full pathnames are specified.
28 ###################################################################
29
30
31 # Exercise:
32 # --------
33
34 # Add code to check the exit status ($?) of the 'find | cpio' pipe
35 #+ and output appropriate error messages if anything went wrong.
36
37 exit $?
rpm2cpio
This command extracts a cpio archive from an rpm one.
1 #!/bin/bash
2 # de-rpm.sh: Unpack an 'rpm' archive
3
4 : ${1?"Usage: `basename $0` target-file"}
5 # Must specify 'rpm' archive name as an argument.
6
7
8 TEMPFILE=$$.cpio # Tempfile with "unique" name.
9 # $$ is process ID of script.
10
11 rpm2cpio < $1 > $TEMPFILE # Converts rpm archive into
12 #+ cpio archive.
13 cpio --make-directories -F $TEMPFILE -i # Unpacks cpio archive.
14 rm -f $TEMPFILE # Deletes cpio archive.
15
16 exit 0
17
18 # Exercise:
19 # Add check for whether 1) "target-file" exists and
20 #+ 2) it is an rpm archive.
21 # Hint: Parse output of 'file' command.
pax
The pax portable archive exchange toolkit facilitates periodic file backups and is designed to be
cross-compatible between various flavors of UNIX. It was ported from BSD to Linux.
Compression
gzip
The standard GNU/UNIX compression utility, replacing the inferior and proprietary compress. The
corresponding decompression command is gunzip, which is the equivalent of gzip -d.
The -c option sends the output of gzip to stdout. This is useful when piping to
other commands.
The zcat filter decompresses a gzipped file to stdout, as possible input to a pipe or redirection. This
is, in effect, a cat command that works on compressed files (including files processed with the older
compress utility). The zcat command is equivalent to gzip -dc.
On some commercial UNIX systems, zcat is a synonym for uncompress -c, and will
not work on gzipped files.
See also Example 7-7.
bzip2
An alternate compression utility, usually more efficient (but slower) than gzip, especially on large
files. The corresponding decompression command is bunzip2.
File Information
file
A utility for identifying file types. The command file file-name will return a file specification
for file-name, such as ascii text or data. It references the magic numbers found in
/usr/share/magic, /etc/magic, or /usr/lib/magic, depending on the Linux/UNIX
distribution.
The -f option causes file to run in batch mode, to read from a designated file a list of filenames to
analyze. The -z option, when used on a compressed target file, forces an attempt to analyze the
uncompressed file type.
1 #!/bin/bash
2 # strip-comment.sh: Strips out the comments (/* COMMENT */) in a C program.
3
4 E_NOARGS=0
5 E_ARGERROR=66
6 E_WRONG_FILE_TYPE=67
7
8 if [ $# -eq "$E_NOARGS" ]
9 then
10 echo "Usage: `basename $0` C-program-file" >&2 # Error message to stderr.
11 exit $E_ARGERROR
12 fi
13
14 # Test for correct file type.
15 type=`file $1 | awk '{ print $2, $3, $4, $5 }'`
16 # "file $1" echoes file type . . .
17 # Then awk removes the first field, the filename . . .
18 # Then the result is fed into the variable "type."
19 correct_type="ASCII C program text"
20
21 if [ "$type" != "$correct_type" ]
22 then
23 echo
24 echo "This script works on C program files only."
25 echo
26 exit $E_WRONG_FILE_TYPE
27 fi
28
29
30 # Rather cryptic sed script:
31 #--------
32 sed '
33 /^\/\*/d
34 /.*\*\//d
35 ' $1
36 #--------
37 # Easy to understand if you take several hours to learn sed fundamentals.
38
39
40 # Need to add one more line to the sed script to deal with
41 #+ case where line of code has a comment following it on same line.
42 # This is left as a non-trivial exercise.
43
44 # Also, the above code deletes non-comment lines with a "*/" . . .
45 #+ not a desirable result.
46
47 exit 0
48
49
50 # ----------------------------------------------------------------
51 # Code below this line will not execute because of 'exit 0' above.
52
53 # Stephane Chazelas suggests the following alternative:
54
55 usage() {
56 echo "Usage: `basename $0` C-program-file" >&2
57 exit 1
58 }
59
60 WEIRD=`echo -n -e '\377'` # or WEIRD=$'\377'
61 [[ $# -eq 1 ]] || usage
62 case `file "$1"` in
63 *"C program text"*) sed -e "s%/\*%${WEIRD}%g;s%\*/%${WEIRD}%g" "$1" \
64 | tr '\377\n' '\n\377' \
65 | sed -ne 'p;n' \
66 | tr -d '\n' | tr '\377' '\n';;
67 *) usage;;
68 esac
69
70 # This is still fooled by things like:
71 # printf("/*");
72 # or
73 # /* /* buggy embedded comment */
74 #
75 # To handle all special cases (comments in strings, comments in string
76 #+ where there is a \", \\" ...),
77 #+ the only way is to write a C parser (using lex or yacc perhaps?).
78
79 exit 0
which
which command gives the full path to "command." This is useful for finding out whether a particular
command or utility is installed on the system.
bash$ which rm
/usr/bin/rm
For an interesting use of this command, see Example 35-14.
whereis
Similar to which, above, whereis command gives the full path to "command," but also to its
manpage.
bash$ whereis rm
rm: /bin/rm /usr/share/man/man1/rm.1.bz2
whatis
whatis command looks up "command" in the whatis database. This is useful for identifying system
commands and important configuration files. Consider it a simplified man command.
1 #!/bin/bash
2
3 # What are all those mysterious binaries in /usr/X11R6/bin?
4
5 DIRECTORY="/usr/X11R6/bin"
6 # Try also "/bin", "/usr/bin", "/usr/local/bin", etc.
7
8 for file in $DIRECTORY/*
9 do
10 whatis `basename $file` # Echoes info about the binary.
11 done
12
13 exit 0
14
15 # You may wish to redirect output of this script, like so:
16 # ./what.sh >>whatis.db
17 # or view it a page at a time on stdout,
18 # ./what.sh | less
vdir
Show a detailed directory listing. The effect is similar to ls -l.
bash$ vdir
total 10
-rw-r--r-- 1 bozo bozo 4034 Jul 18 22:04 data1.xrolo
-rw-r--r-- 1 bozo bozo 4602 May 25 13:58 data1.xrolo.bak
-rw-r--r-- 1 bozo bozo 877 Dec 17 2000 employment.xrolo
bash$ ls -l
total 10
-rw-r--r-- 1 bozo bozo 4034 Jul 18 22:04 data1.xrolo
-rw-r--r-- 1 bozo bozo 4602 May 25 13:58 data1.xrolo.bak
-rw-r--r-- 1 bozo bozo 877 Dec 17 2000 employment.xrolo
locate, slocate
The locate command searches for files using a database stored for just that purpose. The slocate
command is the secure version of locate (which may be aliased to slocate).
bash$ locate hickson
/usr/lib/xephem/catalogs/hickson.edb
getfacl, setfacl
These commands retrieve or set the file access control list -- the owner, group, and file permissions.
bash$ getfacl *
# file: test1.txt
# owner: bozo
# group: bozgrp
user::rw-
group::rw-
other::r--
# file: test2.txt
# owner: bozo
# group: bozgrp
user::rw-
group::rw-
other::r--
readlink
Disclose the file that a symbolic link points to.
strings
Use the strings command to find printable strings in a binary or data file. It will list sequences of
printable characters found in the target file. This might be handy for a quick 'n dirty examination of a
core dump or for looking at an unknown graphic image file (strings image-file | more
might show something like JFIF, which would identify the file as a jpeg graphic). In a script, you
would probably parse the output of strings with grep or sed. See Example 11-7 and Example 11-9.
1 #!/bin/bash
2 # wstrings.sh: "word-strings" (enhanced "strings" command)
3 #
4 # This script filters the output of "strings" by checking it
5 #+ against a standard word list file.
6 # This effectively eliminates gibberish and noise,
7 #+ and outputs only recognized words.
8
9 # ===========================================================
10 # Standard Check for Script Argument(s)
11 ARGS=1
12 E_BADARGS=85
13 E_NOFILE=86
14
15 if [ $# -ne $ARGS ]
16 then
17 echo "Usage: `basename $0` filename"
18 exit $E_BADARGS
19 fi
20
21 if [ ! -f "$1" ] # Check if file exists.
22 then
23 echo "File \"$1\" does not exist."
24 exit $E_NOFILE
25 fi
26 # ===========================================================
27
28
29 MINSTRLEN=3 # Minimum string length.
30 WORDFILE=/usr/share/dict/linux.words # Dictionary file.
31 # May specify a different word list file
32 #+ of one-word-per-line format.
33 # For example, the "yawl" word-list package,
34 # http://bash.neuralshortcircuit.com/yawl-0.3.2.tar.gz
35
36
37 wlist=`strings "$1" | tr A-Z a-z | tr '[:space:]' Z | \
38 tr -cs '[:alpha:]' Z | tr -s '\173-\377' Z | tr Z ' '`
39
40 # Translate output of 'strings' command with multiple passes of 'tr'.
41 # "tr A-Z a-z" converts to lowercase.
42 # "tr '[:space:]'" converts whitespace characters to Z's.
43 # "tr -cs '[:alpha:]' Z" converts non-alphabetic characters to Z's,
44 #+ and squeezes multiple consecutive Z's.
45 # "tr -s '\173-\377' Z" converts all characters past 'z' to Z's
46 #+ and squeezes multiple consecutive Z's,
47 #+ which gets rid of all the weird characters that the previous
48 #+ translation failed to deal with.
49 # Finally, "tr Z ' '" converts all those Z's to whitespace,
50 #+ which will be seen as word separators in the loop below.
51
52 # ****************************************************************
53 # Note the technique of feeding the output of 'tr' back to itself,
54 #+ but with different arguments and/or options on each pass.
55 # ****************************************************************
56
57
58 for word in $wlist # Important:
59 # $wlist must not be quoted here.
60 # "$wlist" does not work.
61 # Why not?
62 do
63
64 strlen=${#word} # String length.
65 if [ "$strlen" -lt "$MINSTRLEN" ] # Skip over short strings.
66 then
67 continue
68 fi
69
70 grep -Fw $word "$WORDFILE" # Match whole words only.
71 # ^^^ # "Fixed strings" and
72 #+ "whole words" options.
73
74 done
75
76
77 exit $?
Comparison
diff, patch
diff: flexible file comparison utility. It compares the target files line-by-line sequentially. In some
applications, such as comparing word dictionaries, it may be helpful to filter the files through sort and
uniq before piping them to diff. diff file-1 file-2 outputs the lines in the files that differ,
with carets showing which file each particular line belongs to.
The --side-by-side option to diff outputs each compared file, line by line, in separate columns,
with non-matching lines marked. The -c and -u options likewise make the output of the command
easier to interpret.
There are available various fancy frontends for diff, such as sdiff, wdiff, xdiff, and mgdiff.
The diff command returns an exit status of 0 if the compared files are identical, and 1
if they differ. This permits use of diff in a test construct within a shell script (see
below).
A common use for diff is generating difference files to be used with patch. The -e option outputs
files suitable for ed or ex scripts.
patch: flexible versioning utility. Given a difference file generated by diff, patch can upgrade a
previous version of a package to a newer version. It is much more convenient to distribute a relatively
small "diff" file than the entire body of a newly revised package. Kernel "patches" have become the
preferred method of distributing the frequent releases of the Linux kernel.
1 cd /usr/src
2 gzip -cd patchXX.gz | patch -p0
3 # Upgrading kernel source using 'patch'.
4 # From the Linux kernel docs "README",
5 # by anonymous author (Alan Cox?).
The diff command can also recursively compare directories (for the filenames
present).
The merge (3-way file merge) command is an interesting adjunct to diff3. Its syntax is merge
Mergefile file1 file2. The result is to output to Mergefile the changes that lead from
file1 to file2. Consider this command a stripped-down version of patch.
sdiff
Compare and/or edit two files in order to merge them into an output file. Because of its interactive
nature, this command would find little use in a script.
cmp
The cmp command is a simpler version of diff, above. Whereas diff reports the differences between
two files, cmp merely shows at what point they differ.
Like diff, cmp returns an exit status of 0 if the compared files are identical, and 1 if
they differ. This permits use in a test construct within a shell script.
1 #!/bin/bash
2
3 ARGS=2 # Two args to script expected.
4 E_BADARGS=65
5 E_UNREADABLE=66
6
7 if [ $# -ne "$ARGS" ]
8 then
9 echo "Usage: `basename $0` file1 file2"
10 exit $E_BADARGS
11 fi
12
13 if [[ ! -r "$1" || ! -r "$2" ]]
14 then
15 echo "Both files to be compared must exist and be readable."
16 exit $E_UNREADABLE
17 fi
18
19 cmp $1 $2 &> /dev/null # /dev/null buries the output of the "cmp" command.
20 # cmp -s $1 $2 has same result ("-s" silent flag to "cmp")
21 # Thank you Anders Gustavsson for pointing this out.
22 #
23 # Also works with 'diff', i.e., diff $1 $2 &> /dev/null
24
25 if [ $? -eq 0 ] # Test exit status of "cmp" command.
26 then
27 echo "File \"$1\" is identical to file \"$2\"."
28 else
29 echo "File \"$1\" differs from file \"$2\"."
30 fi
31
32 exit 0
comm
Versatile file comparison utility; the files must first be sorted for it to be useful. comm file-1
file-2 outputs three columns: lines unique to file-1, lines unique to file-2, and lines common to
both. Options suppress output of one or more columns:
◊ -1 suppresses column 1
◊ -2 suppresses column 2
◊ -3 suppresses column 3
◊ -12 suppresses both columns 1 and 2, etc.
This command is useful for comparing "dictionaries" or word lists -- sorted text files with one word
per line.
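For example, given two sorted word lists (hypothetical filenames):
 comm -12 wordlist1.sorted wordlist2.sorted   # Only the words present in both lists.
 comm -3 wordlist1.sorted wordlist2.sorted    # Only the words not shared by both lists.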
Utilities
basename
Strips the path information from a file name, printing only the file name. The construction
basename $0 lets the script know its name, that is, the name it was invoked by. This can be used
for "usage" messages if, for example, a script is called with missing arguments:
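For instance, a minimal sketch:
 echo "Usage: `basename $0` arg1 arg2"
 exit $E_WRONGARGS   # E_WRONGARGS is assumed to be set earlier in the script.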
dirname
Strips the basename from a filename, printing only the path information.
basename and dirname can operate on any arbitrary string. The argument does not
need to refer to an existing file, or even be a filename for that matter (see Example
A-7).
1 #!/bin/bash
2
3 a=/home/bozo/daily-journal.txt
4
5 echo "Basename of /home/bozo/daily-journal.txt = `basename $a`"
6 echo "Dirname of /home/bozo/daily-journal.txt = `dirname $a`"
7 echo
8 echo "My own home is `basename ~/`." # `basename ~` also works.
9 echo "The home of my home is `dirname ~/`." # `dirname ~` also works.
10
11 exit 0
split, csplit
These are utilities for splitting a file into smaller chunks. Their usual use is for splitting up large files
in order to back them up on floppies, or in preparation for e-mailing or uploading them.
The csplit command splits a file according to context, the split occurring where patterns are matched.
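For example, a sketch using GNU csplit syntax (server.log is a placeholder):
 csplit server.log '/^ERROR/' '{*}'
 # Splits server.log into a new piece at every line beginning with "ERROR";
 #+ '{*}' repeats the pattern match as many times as possible.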
1 #!/bin/bash
2 # splitcopy.sh
3
4 # A script that splits itself into chunks,
5 #+ then reassembles the chunks into an exact copy
6 #+ of the original script.
7
8 CHUNKSIZE=4 # Size of first chunk of split files.
9 OUTPREFIX=xx # csplit prefixes, by default,
10 #+ files with "xx" ...
11
12 csplit "$0" "$CHUNKSIZE"
13
14 # Some comment lines for padding . . .
15 # Line 15
16 # Line 16
17 # Line 17
18 # Line 18
19 # Line 19
20 # Line 20
21
22 cat "$OUTPREFIX"* > "$0.copy" # Concatenate the chunks.
23 rm "$OUTPREFIX"* # Get rid of the chunks.
24
25 exit $?
cksum, md5sum, sha1sum
These are utilities for generating checksums. The cksum command shows the size, in bytes, of its
target, whether a file or stdin.
The md5sum and sha1sum commands display a dash instead of a filename when they receive their
input from stdin.
1 #!/bin/bash
2 # file-integrity.sh: Checking whether files in a given directory
3 # have been tampered with.
4
5 E_DIR_NOMATCH=70
6 E_BAD_DBFILE=71
7
8 dbfile=File_record.md5
9 # Filename for storing records (database file).
10
11
12 set_up_database ()
13 {
14 echo ""$directory"" > "$dbfile"
15 # Write directory name to first line of file.
16 md5sum "$directory"/* >> "$dbfile"
17 # Append md5 checksums and filenames.
18 }
19
20 check_database ()
21 {
22 local n=0
23 local filename
24 local checksum
25
26 # ------------------------------------------- #
27 # This file check should be unnecessary,
28 #+ but better safe than sorry.
29
30 if [ ! -r "$dbfile" ]
31 then
32 echo "Unable to read checksum database file!"
33 exit $E_BAD_DBFILE
34 fi
35 # ------------------------------------------- #
36
37 while read record[n]
38 do
39
40 directory_checked="${record[0]}"
41 if [ "$directory_checked" != "$directory" ]
42 then
43 echo "Directories do not match up!"
44 # Tried to use file for a different directory.
45 exit $E_DIR_NOMATCH
46 fi
47
48 if [ "$n" -gt 0 ] # Not directory name.
49 then
50 filename[n]=$( echo ${record[$n]} | awk '{ print $2 }' )
51 # md5sum writes records backwards,
52 #+ checksum first, then filename.
53 checksum[n]=$( md5sum "${filename[n]}" )
54
55
56 if [ "${record[n]}" = "${checksum[n]}" ]
57 then
58 echo "${filename[n]} unchanged."
59
60 elif [ "`basename ${filename[n]}`" != "$dbfile" ]
61 # Skip over checksum database file,
62 #+ as it will change with each invocation of script.
63 # ---
64 # This unfortunately means that when running
65 #+ this script on $PWD, tampering with the
66 #+ checksum database file will not be detected.
67 # Exercise: Fix this.
68 then
69 echo "${filename[n]} : CHECKSUM ERROR!"
70 # File has been changed since last checked.
71 fi
72
73 fi
74
75
76
77 let "n+=1"
78 done <"$dbfile" # Read from checksum database file.
79
80 }
81
82 # =================================================== #
83 # main ()
84
85 if [ -z "$1" ]
86 then
87 directory="$PWD" # If not specified,
88 else #+ use current working directory.
89 directory="$1"
90 fi
91
92 clear # Clear screen.
93 echo " Running file integrity check on $directory"
94 echo
95
96 # ------------------------------------------------------------------ #
97 if [ ! -r "$dbfile" ] # Need to create database file?
98 then
99 echo "Setting up database file, \""$directory"/"$dbfile"\"."; echo
100 set_up_database
101 fi
102 # ------------------------------------------------------------------ #
103
104 check_database # Do the actual work.
105
106 echo
107
108 # You may wish to redirect the stdout of this script to a file,
109 #+ especially if the directory checked has many files in it.
110
111 exit 0
112
113 # For a much more thorough file integrity check,
114 #+ consider the "Tripwire" package,
115 #+ http://sourceforge.net/projects/tripwire/.
116
Also see Example A-19, Example 35-14, and Example 10-2 for creative uses of the md5sum
command.
There have been reports that the 128-bit md5sum can be cracked, so the more secure
160-bit sha1sum is a welcome new addition to the checksum toolkit.
Security consultants have demonstrated that even sha1sum can be compromised. Fortunately, newer
Linux distros include longer bit-length sha224sum, sha256sum, sha384sum, and sha512sum
commands.
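As a rough sketch of how these might be used (the filenames are hypothetical), checksums can be recorded once and then verified later with the -c option:

sha256sum *.tar.gz > CHECKSUMS.sha256    # Record a checksum for each tarball.
# . . . some time later . . .
sha256sum -c CHECKSUMS.sha256            # Re-check; reports OK or FAILED for each file.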
uuencode
This utility encodes binary files (images, sound files, compressed files, etc.) into ASCII characters,
making them suitable for transmission in the body of an e-mail message or in a newsgroup posting.
This is especially useful where MIME (multimedia) encoding is not available.
uudecode
This reverses the encoding, decoding uuencoded files back into the original binaries.
1 #!/bin/bash
2 # Uudecodes all uuencoded files in current working directory.
3
4 lines=35 # Allow 35 lines for the header (very generous).
5
6 for File in * # Test all the files in $PWD.
7 do
8 search1=`head -n $lines $File | grep begin | wc -w`
9 search2=`tail -n $lines $File | grep end | wc -w`
10 # Uuencoded files have a "begin" near the beginning,
11 #+ and an "end" near the end.
12 if [ "$search1" -gt 0 ]
13 then
14 if [ "$search2" -gt 0 ]
15 then
16 echo "uudecoding - $File -"
17 uudecode $File
18 fi
19 fi
20 done
21
22 # Note that running this script upon itself fools it
23 #+ into thinking it is a uuencoded file,
24 #+ because it contains both "begin" and "end".
25
26 # Exercise:
27 # --------
28 # Modify this script to check each file for a newsgroup header,
29 #+ and skip to next if not found.
30
31 exit 0
The fold -s command may be useful (possibly in a pipe) to process long uudecoded
text messages downloaded from Usenet newsgroups.
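For instance, a minimal sketch (the filename is hypothetical):

fold -s -w 72 message.txt > message-wrapped.txt
# -s breaks lines at spaces rather than mid-word; -w 72 sets the output width.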
mimencode, mmencode
The mimencode and mmencode commands process multimedia-encoded e-mail attachments.
Although mail user agents (such as pine or kmail) normally handle this automatically, these particular
utilities permit manipulating such attachments manually from the command-line or in batch
processing mode by means of a shell script.
crypt
At one time, this was the standard UNIX file encryption utility. [5] Politically-motivated government
regulations prohibiting the export of encryption software resulted in the disappearance of crypt from
much of the UNIX world, and it is still missing from most Linux distributions. Fortunately,
programmers have come up with a number of decent alternatives to it, among them the author's very
own cruft (see Example A-4).
openssl
This is an Open Source implementation of Secure Sockets Layer encryption.
1 # To encrypt a file:
2 openssl aes-128-ecb -salt -in file.txt -out file.encrypted \
3 -pass pass:my_password
4 # ^^^^^^^^^^^ User-selected password.
5 # aes-128-ecb is the encryption method chosen.
6
7 # To decrypt an openssl-encrypted file:
8 openssl aes-128-ecb -d -salt -in file.encrypted -out file.txt \
9 -pass pass:my_password
10 # ^^^^^^^^^^^ User-selected password.
Piping openssl to/from tar makes it possible to encrypt an entire directory tree.
1 # To encrypt a directory:
2
3 sourcedir="/home/bozo/testfiles"
4 encrfile="encr-dir.tar.gz"
5 password=my_secret_password
6
7 tar czvf - "$sourcedir" |
8 openssl des3 -salt -out "$encrfile" -pass pass:"$password"
9 # ^^^^ Uses des3 encryption.
10 # Writes encrypted file "encr-dir.tar.gz" in current working directory.
11
12 # To decrypt the resulting tarball:
13 openssl des3 -d -salt -in "$encrfile" -pass pass:"$password" |
14 tar -xzv
15 # Decrypts and unpacks into current working directory.
Of course, openssl has many other uses, such as obtaining signed certificates for Web sites. See the
info page.
shred
Securely erase a file by overwriting it multiple times with random bit patterns before deleting it. This
command has the same effect as Example 16-60, but does it in a more thorough and elegant manner.
Advanced forensic technology may still be able to recover the contents of a file, even
after application of shred.
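A minimal sketch of a typical invocation (the filename is hypothetical):

shred -v -n 3 -z -u secret.txt
# -v    show progress
# -n 3  overwrite the file contents 3 times with random data
# -z    finish with a pass of zeroes, to hide the shredding
# -u    truncate and remove the file when done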
Miscellaneous
mktemp
Create a temporary file [6] with a "unique" filename. When invoked from the command-line without
additional arguments, it creates a zero-length file in the /tmp directory.
bash$ mktemp
/tmp/tmp.zzsvql3154
1 PREFIX=filename
2 tempfile=`mktemp $PREFIX.XXXXXX`
3 # ^^^^^^ Need at least 6 placeholders
4 #+ in the filename template.
5 # If no filename template supplied,
6 #+ "tmp.XXXXXXXXXX" is the default.
7
8 echo "tempfile name = $tempfile"
9 # tempfile name = filename.QA2ZpY
10 # or something similar...
11
12 # Creates a file of that name in the current working directory
13 #+ with 600 file permissions.
14 # A "umask 177" is therefore unnecessary,
15 #+ but it's good programming practice anyhow.
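A common pattern, sketched here with a hypothetical template name, is to pair mktemp with a trap so the temporary file is removed however the script exits:

TMPFILE=`mktemp /tmp/myscript.XXXXXX` || exit 1
trap 'rm -f "$TMPFILE"' EXIT    # Clean up the temp file on any exit.

echo "scratch data" > "$TMPFILE"
# . . . work with "$TMPFILE" . . .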
make
Utility for building and compiling binary packages. This can also be used for any set of operations
triggered by incremental changes in source files.
The make command checks a Makefile, a list of file dependencies and operations to be carried out.
The make utility is, in effect, a powerful scripting language similar in many ways to Bash, but with
the capability of recognizing dependencies. For in-depth coverage of this useful tool set, see the GNU
software documentation site.
install
Special purpose file copying command, similar to cp, but capable of setting permissions and attributes
of the copied files. This command seems tailor-made for installing software packages, and as such it
shows up frequently in Makefiles (in the make install: section). It could likewise prove
useful in installation scripts.
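A minimal sketch (the paths and script name are hypothetical; setting owner and group normally requires root):

install -d -m 755 /usr/local/bin                       # Create the target directory, if needed.
install -m 755 -o root -g root myscript.sh /usr/local/bin/myscript
# Copies the file and sets its mode, owner, and group in one step.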
dos2unix
This utility, written by Benjamin Lin and collaborators, converts DOS-formatted text files (lines
terminated by CR-LF) to UNIX format (lines terminated by LF only), and vice-versa.
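Where dos2unix is not installed, tr can strip the carriage returns instead, as in this minimal sketch (the filenames are hypothetical):

tr -d '\r' < dosfile.txt > unixfile.txt
# Deletes every CR character, leaving LF-terminated lines.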
ptx
The ptx [targetfile] command outputs a permuted index (cross-reference list) of the targetfile. This
may be further filtered and formatted in a pipe, if necessary.
more, less
Pagers that display a text file or stream to stdout, one screenful at a time. These may be used to
filter the output of stdout . . . or of a script.
Notes
[1] An archive, in the sense discussed here, is simply a set of related files stored in a single location.
[2] A tar czvf ArchiveName.tar.gz * will include dotfiles in subdirectories below the current
working directory. This is an undocumented GNU tar "feature."
[3] The checksum may be expressed as a hexadecimal number, or in some other base.
[4] For even better security, use the sha256sum, sha512sum, and sha1pass commands.
[5] This is a symmetric block cipher, used to encrypt files on a single system or local network, as opposed
to the public key cipher class, of which pgp is a well-known example.
[6] Creates a temporary directory when invoked with the -d option.
host
Searches for information about an Internet host by name or IP address, using DNS.
ipcalc
Displays IP information for a host. With the -h option, ipcalc does a reverse DNS lookup, finding the
name of the host (server) from the IP address.
nslookup
Do an Internet "name server lookup" on a host by IP address. This is essentially equivalent to ipcalc
-h or dig -x . The command may be run either interactively or noninteractively, i.e., from within a
script.
The nslookup command has allegedly been "deprecated," but it is still useful.
Partial output of a typical lookup:
 Non-authoritative answer:
 Name:   kuhleersparnis.ch
dig
Domain Information Groper. Similar to nslookup, dig does an Internet name server lookup on a host.
May be run from the command-line or from within a script.
Some interesting options to dig are +time=N for setting a query timeout to N seconds, +nofail for
continuing to query servers until a reply is received, and -x for doing a reverse address lookup.
Partial output of a reverse address lookup (dig -x):
 ;; QUESTION SECTION:
 ;2.6.9.81.in-addr.arpa.            IN      PTR

 ;; AUTHORITY SECTION:
 6.9.81.in-addr.arpa.    3600    IN      SOA     ns.eltel.net. noc.eltel.net.
                         2002031705 900 600 86400 3600
1 #!/bin/bash
2 # spam-lookup.sh: Look up abuse contact to report a spammer.
3 # Thanks, Michael Zick.
4
5 # Check for command-line arg.
6 ARGCOUNT=1
7 E_WRONGARGS=65
8 if [ $# -ne "$ARGCOUNT" ]
9 then
10 echo "Usage: `basename $0` domain-name"
11 exit $E_WRONGARGS
12 fi
13
14
15 dig +short $1.contacts.abuse.net -c in -t txt
16 # Also try:
17 # dig +nssearch $1
18 # Tries to find "authoritative name servers" and display SOA records.
19
20 # The following also works:
21 # whois -h whois.abuse.net $1
22 # ^^ ^^^^^^^^^^^^^^^ Specify host.
23 # Can even look up multiple spammers with this, e.g.:
24 # whois -h whois.abuse.net $spamdomain1 $spamdomain2 . . .
25
26
27 # Exercise:
28 # --------
29 # Expand the functionality of this script
30 #+ so that it automatically e-mails a notification
31 #+ to the responsible ISP's contact address(es).
32 # Hint: use the "mail" command.
33
34 exit $?
35
36 # spam-lookup.sh chinatietong.com
37 # A known spam domain.
38
39 # "[email protected]"
40 # "[email protected]"
41 # "[email protected]"
42
43
44 # For a more elaborate version of this script,
45 #+ see the SpamViz home page, http://www.spamviz.net/index.html.
1 #! /bin/bash
2 # is-spammer.sh: Identifying spam domains
3
4 # $Id: is-spammer, v 1.4 2004/09/01 19:37:52 mszick Exp $
5 # Above line is RCS ID info.
6 #
7 # This is a simplified version of the "is_spammer.bash"
8 #+ script in the Contributed Scripts appendix.
9
10 # is-spammer <domain.name>
11
12 # Uses an external program: 'dig'
13 # Tested with version: 9.2.4rc5
14
15 # Uses functions.
16 # Uses IFS to parse strings by assignment into arrays.
17 # And even does something useful: checks e-mail blacklists.
18
19 # Use the domain.name(s) from the text body:
20 # http://www.good_stuff.spammer.biz/just_ignore_everything_else
21 # ^^^^^^^^^^^
22 # Or the domain.name(s) from any e-mail address:
23 # [email protected]
24 #
25 # as the only argument to this script.
26 #(PS: have your Inet connection running)
27 #
28 # So, to invoke this script in the above two instances:
29 # is-spammer.sh spammer.biz
30
31
32 # Whitespace == :Space:Tab:Line Feed:Carriage Return:
33 WSP_IFS=$'\x20'$'\x09'$'\x0A'$'\x0D'
34
35 # No Whitespace == Line Feed:Carriage Return
36 No_WSP=$'\x0A'$'\x0D'
37
38 # Field separator for dotted decimal ip addresses
39 ADR_IFS=${No_WSP}'.'
40
41 # Get the dns text resource record.
42 # get_txt <error_code> <list_query>
43 get_txt() {
44
45 # Parse $1 by assignment at the dots.
46 local -a dns
47 IFS=$ADR_IFS
48 dns=( $1 )
49 IFS=$WSP_IFS
50 if [ "${dns[0]}" == '127' ]
51 then
52 # See if there is a reason.
53 echo $(dig +short $2 -t txt)
54 fi
55 }
56
57 # Get the dns address resource record.
58 # chk_adr <rev_dns> <list_server>
59 chk_adr() {
60 local reply
61 local server
62 local reason
63
64 server=${1}${2}
65 reply=$( dig +short ${server} )
66
67 # If reply might be an error code . . .
68 if [ ${#reply} -gt 6 ]
69 then
70 reason=$(get_txt ${reply} ${server} )
71 reason=${reason:-${reply}}
72 fi
73 echo ${reason:-' not blacklisted.'}
74 }
75
76 # Need to get the IP address from the name.
77 echo 'Get address of: '$1
78 ip_adr=$(dig +short $1)
79 dns_reply=${ip_adr:-' no answer '}
80 echo ' Found address: '${dns_reply}
81
82 # A valid reply is at least 4 digits plus 3 dots.
83 if [ ${#ip_adr} -gt 6 ]
84 then
85 echo
86 declare query
87
88 # Parse by assignment at the dots.
89 declare -a dns
90 IFS=$ADR_IFS
91 dns=( ${ip_adr} )
92 IFS=$WSP_IFS
93
94 # Reorder octets into dns query order.
95 rev_dns="${dns[3]}"'.'"${dns[2]}"'.'"${dns[1]}"'.'"${dns[0]}"'.'
96
97 # See: http://www.spamhaus.org (Conservative, well maintained)
98 echo -n 'spamhaus.org says: '
99 echo $(chk_adr ${rev_dns} 'sbl-xbl.spamhaus.org')
100
101 # See: http://ordb.org (Open mail relays)
102 echo -n ' ordb.org says: '
103 echo $(chk_adr ${rev_dns} 'relays.ordb.org')
104
105 # See: http://www.spamcop.net/ (You can report spammers here)
106 echo -n ' spamcop.net says: '
107 echo $(chk_adr ${rev_dns} 'bl.spamcop.net')
108
109 # # # other blacklist operations # # #
110
111 # See: http://cbl.abuseat.org.
112 echo -n ' abuseat.org says: '
113 echo $(chk_adr ${rev_dns} 'cbl.abuseat.org')
114
115 # See: http://dsbl.org/usage (Various mail relays)
116 echo
117 echo 'Distributed Server Listings'
118 echo -n ' list.dsbl.org says: '
119 echo $(chk_adr ${rev_dns} 'list.dsbl.org')
120
121 echo -n ' multihop.dsbl.org says: '
122 echo $(chk_adr ${rev_dns} 'multihop.dsbl.org')
123
124 echo -n 'unconfirmed.dsbl.org says: '
125 echo $(chk_adr ${rev_dns} 'unconfirmed.dsbl.org')
126
127 else
128 echo
129 echo 'Could not use that address.'
130 fi
131
132 exit 0
133
134 # Exercises:
135 # --------
136
137 # 1) Check arguments to script,
138 # and exit with appropriate error message if necessary.
139
140 # 2) Check if on-line at invocation of script,
141 # and exit with appropriate error message if necessary.
142
143 # 3) Substitute generic variables for "hard-coded" BHL domains.
144
145 # 4) Set a time-out for the script using the "+time=" option
146 #+ to the 'dig' command.
For a much more elaborate version of the above script, see Example A-28.
traceroute
Trace the route taken by packets sent to a remote host. This command works within a LAN, WAN, or
over the Internet. The remote host may be specified by an IP address. The output of this command
may be filtered by grep or sed in a pipe.
ping
Broadcast an ICMP ECHO_REQUEST packet to another machine, either on a local or remote
network. This is a diagnostic tool for testing network connections, and it should be used with caution.
A successful ping returns an exit status of 0. This can be tested for in a script.
1 HNAME=nastyspammer.com
2 # HNAME=$HOSTNAME # Debug: test for localhost.
3 count=2 # Send only two pings.
4
5 if ping -c $count "$HNAME" &>/dev/null # Test the exit status of ping directly.
6 then
7 echo "$HNAME still up and broadcasting spam your way."
8 else
9 echo "$HNAME seems to be down. Pity."
10 fi
whois
Look up the registration records for a domain name or IP address by querying a whois server. The -h
option permits specifying which particular whois server to query. See Example 4-6 and Example 16-40.
finger
Retrieve information about users on a network. Optionally, this command can display a user's
~/.plan, ~/.project, and ~/.forward files, if present.
bash$ finger
Login Name Tty Idle Login Time Office Office Phone
bozo Bozo Bozeman tty1 8 Jun 25 16:59 (:0)
bozo Bozo Bozeman ttyp0 Jun 25 16:59 (:0.0)
bozo Bozo Bozeman ttyp1 Jun 25 17:07 (:0.0)
bash$ finger bozo
Login: bozo Name: Bozo Bozeman
Directory: /home/bozo Shell: /bin/bash
Office: 2355 Clown St., 543-1234
On since Fri Aug 31 20:13 (MST) on tty1 1 hour 38 minutes idle
On since Fri Aug 31 20:13 (MST) on pts/0 12 seconds idle
On since Fri Aug 31 20:13 (MST) on pts/1
On since Fri Aug 31 20:31 (MST) on pts/2 1 hour 16 minutes idle
Mail last read Tue Jul 3 10:08 2007 (MST)
No Plan.
Out of security considerations, many networks disable finger and its associated daemon. [1]
chfn
Change information disclosed by the finger command.
vrfy
Verify an Internet e-mail address.
sx, rx
The sx and rx command set serves to transfer files to and from a remote host using the xmodem
protocol. These are generally part of a communications package, such as minicom.
sz, rz
The sz and rz command set serves to transfer files to and from a remote host using the zmodem
protocol. Zmodem has certain advantages over xmodem, such as faster transmission rate and
resumption of interrupted file transfers. Like sx and rx, these are generally part of a communications
package.
ftp
Utility and protocol for uploading / downloading files to or from a remote host. An ftp session can be
automated in a script (see Example 19-6 and Example A-4).
uucp, uux, cu
uucp: UNIX to UNIX copy. This is a communications package for transferring files between UNIX
servers. A shell script is an effective way to handle a uucp command sequence.
Since the advent of the Internet and e-mail, uucp seems to have faded into obscurity, but it still exists
and remains perfectly workable in situations where an Internet connection is not available or
appropriate. The advantage of uucp is that it is fault-tolerant, so even if there is a service interruption
the copy operation will resume where it left off when the connection is restored.
---
uux: UNIX to UNIX execute. Execute a command on a remote system. This command is part of the
uucp package.
---
cu: Call Up a remote system and connect as a simple terminal. It is a sort of dumbed-down version of
telnet. This command is part of the uucp package.
telnet
Utility and protocol for connecting to a remote host.
The telnet protocol contains security holes and should therefore probably be avoided.
Its use within a shell script is not recommended.
wget
The wget utility noninteractively retrieves or downloads files from a Web or ftp site. It works well in
a script.
1 wget -p http://www.xyz23.com/file01.html
2 # The -p or --page-requisite option causes wget to fetch all files
3 #+ required to display the specified page.
4
5 wget -r ftp://ftp.xyz24.net/~bozo/project_files/ -O $SAVEFILE
6 # The -r option recursively follows and retrieves all links
7 #+ on the specified site.
8
9 wget -c ftp://ftp.xyz25.net/bozofiles/filename.tar.bz2
10 # The -c option lets wget resume an interrupted download.
11 # This works with ftp servers and many HTTP sites.
1 #!/bin/bash
2 # quote-fetch.sh: Download a stock quote.
3
4
5 E_NOPARAMS=86
6
7 if [ -z "$1" ] # Must specify a stock (symbol) to fetch.
8 then echo "Usage: `basename $0` stock-symbol"
9 exit $E_NOPARAMS
10 fi
11
12 stock_symbol=$1
13
14 file_suffix=.html
15 # Fetches an HTML file, so name it appropriately.
16 URL='http://finance.yahoo.com/q?s='
17 # Yahoo finance board, with stock query suffix.
18
19 # -----------------------------------------------------------
20 wget -O ${stock_symbol}${file_suffix} "${URL}${stock_symbol}"
21 # -----------------------------------------------------------
22
23
24 # To look up stuff on http://search.yahoo.com:
25 # -----------------------------------------------------------
26 # URL="http://search.yahoo.com/search?fr=ush-news&p=${query}"
27 # wget -O "$savefilename" "${URL}"
28 # -----------------------------------------------------------
29 # Saves a list of relevant URLs.
30
31 exit $?
32
33 # Exercises:
34 # ---------
35 #
36 # 1) Add a test to ensure the user running the script is on-line.
37 # (Hint: parse the output of 'ps -ax' for "ppp" or "connect.")
38 #
39 # 2) Modify this script to fetch the local weather report,
40 #+ taking the user's zip code as an argument.
See also Example A-30 and Example A-31.
lynx
The lynx Web and file browser can be used inside a script (with the -dump option) to retrieve a file
from a Web or ftp site noninteractively.
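A minimal sketch (the URL is hypothetical, reusing the placeholder site from the wget examples above):

lynx -dump http://www.xyz23.com/file01.html > file01.txt
# -dump renders the page as plain text on stdout, suitable for saving or further filtering in a pipe.

The lengthy example that follows, fc4upd.sh, uses rsync rather than lynx to mirror a set of distribution updates.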
1 #!/bin/bash
2 # fc4upd.sh
3
4 # Script author: Frank Wang.
5 # Slight stylistic modifications by ABS Guide author.
6 # Used in ABS Guide with permission.
7
8
9 # Download Fedora Core 4 update from mirror site using rsync.
10 # Should also work for newer Fedora Cores -- 5, 6, . . .
11 # Only download latest package if multiple versions exist,
12 #+ to save space.
13
14 URL=rsync://distro.ibiblio.org/fedora-linux-core/updates/
15 # URL=rsync://ftp.kddilabs.jp/fedora/core/updates/
16 # URL=rsync://rsync.planetmirror.com/fedora-linux-core/updates/
17
18 DEST=${1:-/var/www/html/fedora/updates/}
19 LOG=/tmp/repo-update-$(/bin/date +%Y-%m-%d).txt
20 PID_FILE=/var/run/${0##*/}.pid
21
22 E_RETURN=85 # Something unexpected happened.
23
24
25 # General rsync options
26 # -r: recursive download
27 # -t: preserve timestamps
28 # -v: verbose
29
30 OPTS="-rtv --delete-excluded --delete-after --partial"
31
32 # rsync include pattern
33 # Leading slash causes absolute path name match.
34 INCLUDE=(
35 "/4/i386/kde-i18n-Chinese*"
36 # ^ ^
37 # Quoting is necessary to prevent globbing.
38 )
39
40
41 # rsync exclude pattern
42 # Temporarily comment out unwanted pkgs using "#" . . .
43 EXCLUDE=(
44 /1
45 /2
46 /3
47 /testing
48 /4/SRPMS
49 /4/ppc
50 /4/x86_64
51 /4/i386/debug
52 "/4/i386/kde-i18n-*"
53 "/4/i386/openoffice.org-langpack-*"
54 "/4/i386/*i586.rpm"
55 "/4/i386/GFS-*"
56 "/4/i386/cman-*"
57 "/4/i386/dlm-*"
58 "/4/i386/gnbd-*"
59 "/4/i386/kernel-smp*"
60 # "/4/i386/kernel-xen*"
61 # "/4/i386/xen-*"
62 )
63
64
65 init () {
66 # Let pipe command return possible rsync error, e.g., stalled network.
67 set -o pipefail # Newly introduced in Bash, version 3.
68
69 TMP=${TMPDIR:-/tmp}/${0##*/}.$$ # Store refined download list.
70 trap "{
71 rm -f $TMP 2>/dev/null
72 }" EXIT # Clear temporary file on exit.
73 }
74
75
76 check_pid () {
77 # Check if process exists.
78 if [ -s "$PID_FILE" ]; then
79 echo "PID file exists. Checking ..."
80 PID=$(/bin/egrep -o "^[[:digit:]]+" $PID_FILE)
81 if /bin/ps --pid $PID &>/dev/null; then
82 echo "Process $PID found. ${0##*/} seems to be running!"
83 /usr/bin/logger -t ${0##*/} \
84 "Process $PID found. ${0##*/} seems to be running!"
85 exit $E_RETURN
86 fi
87 echo "Process $PID not found. Start new process . . ."
88 fi
89 }
90
91
92 # Set overall file update range starting from root or $URL,
93 #+ according to above patterns.
94 set_range () {
95 include=
96 exclude=
97 for p in "${INCLUDE[@]}"; do
98 include="$include --include \"$p\""
99 done
100
101 for p in "${EXCLUDE[@]}"; do
102 exclude="$exclude --exclude \"$p\""
103 done
104 }
105
106
107 # Retrieve and refine rsync update list.
108 get_list () {
109 echo $$ > $PID_FILE || {
110 echo "Can't write to pid file $PID_FILE"
111 exit $E_RETURN
112 }
113
114 echo -n "Retrieving and refining update list . . ."
115
116 # Retrieve list -- 'eval' is needed to run rsync as a single command.
117 # $3 and $4 are the date and time of file creation.
118 # $5 is the full package name.
119 previous=
120 pre_file=
121 pre_date=0
122 eval /bin/nice /usr/bin/rsync \
123 -r $include $exclude $URL | \
124 egrep '^dr.x|^-r' | \
125 awk '{print $3, $4, $5}' | \
126 sort -k3 | \
127 { while read line; do
128 # Get seconds since epoch, to filter out obsolete pkgs.
129 cur_date=$(date -d "$(echo $line | awk '{print $1, $2}')" +%s)
130 # echo $cur_date
131
132 # Get file name.
133 cur_file=$(echo $line | awk '{print $3}')
134 # echo $cur_file
135
136 # Get rpm pkg name from file name, if possible.
137 if [[ $cur_file == *rpm ]]; then
138 pkg_name=$(echo $cur_file | sed -r -e \
139 's/(^([^_-]+[_-])+)[[:digit:]]+\..*[_-].*$/\1/')
140 else
141 pkg_name=
142 fi
143 # echo $pkg_name
144
145 if [ -z "$pkg_name" ]; then # If not a rpm file,
146 echo $cur_file >> $TMP #+ then append to download list.
147 elif [ "$pkg_name" != "$previous" ]; then # A new pkg found.
148 echo $pre_file >> $TMP # Output latest file.
149 previous=$pkg_name # Save current.
150 pre_date=$cur_date
151 pre_file=$cur_file
152 elif [ "$cur_date" -gt "$pre_date" ]; then
153 # If same pkg, but newer,
154 pre_date=$cur_date #+ then update latest pointer.
155 pre_file=$cur_file
156 fi
157 done
158 echo $pre_file >> $TMP # TMP contains ALL
159 #+ of refined list now.
160 # echo "subshell=$BASH_SUBSHELL"
161
162 } # Bracket required here to let the final "echo $pre_file >> $TMP"
163 #+ remain in the same subshell ( 1 ) as the entire loop.
164
165 RET=$? # Get return code of the pipe command.
166
167 [ "$RET" -ne 0 ] && {
168 echo "List retrieving failed with code $RET"
169 exit $E_RETURN
170 }
171
172 echo "done"; echo
173 }
174
175 # Real rsync download part.
176 get_file () {
177
178 echo "Downloading..."
179 /bin/nice /usr/bin/rsync \
180 $OPTS \
181 --filter "merge,+/ $TMP" \
182 --exclude '*' \
183 $URL $DEST \
184 | /usr/bin/tee $LOG
185
186 RET=$?
187
188 # The "--filter merge,+/ $TMP" option is crucial here.
189 # The + modifier means include and / means absolute path.
190 # The sorted list in $TMP will then contain ascending directory names,
191 #+ preventing the following --exclude '*' from short-circuiting the transfer.
192
193 echo "Done"
194
195 rm -f $PID_FILE 2>/dev/null
196
197 return $RET
198 }
199
200 # -------
201 # Main
202 init
203 check_pid
204 set_range
205 get_list
206 get_file
207 RET=$?
208 # -------
209
210 if [ "$RET" -eq 0 ]; then
211 /usr/bin/logger -t ${0##*/} "Fedora update mirrored successfully."
212 else
213 /usr/bin/logger -t ${0##*/} \
214 "Fedora update mirrored with failure code: $RET"
215 fi
216
217 exit $RET
Using rcp, rsync, and similar utilities that have security implications in a shell script may
not be advisable. Consider, instead, using ssh, scp, or an expect script.
ssh
Secure shell, logs onto a remote host and executes commands there. This secure replacement for
telnet, rlogin, rcp, and rsh uses identity authentication and encryption. See its manpage for details.
Within a loop, ssh may cause unexpected behavior. According to a Usenet post in the
comp.unix.shell archives, ssh inherits the loop's stdin. To remedy this, pass ssh
either the -n or -f option.
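A minimal sketch of the workaround (the host-list filename is hypothetical, and key-based authentication to each host is assumed):

while read host
do
  ssh -n "$host" uptime    # -n keeps ssh from swallowing the loop's stdin.
done < hostlist.txt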
Local Network
write
This is a utility for terminal-to-terminal communication. It allows sending lines from your terminal
(console or xterm) to that of another user. The mesg command may, of course, be used to disable
write access to a terminal.
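A minimal sketch (the user name and tty are hypothetical):

mesg y                                    # Permit messages to your own terminal.
echo "Lunch in five minutes." | write bozo tty3
# write reads the message from stdin and displays it on the target user's terminal.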
mail
Send or read e-mail messages.
This stripped-down command-line mail client works fine as a command embedded in a script.
1 #!/bin/sh
2 # self-mailer.sh: Self-mailing script
3
4 adr=${1:-`whoami`} # Default to current user, if not specified.
5 # Typing 'self-mailer.sh [email protected]'
6 #+ sends this script to that addressee.
7 # Just 'self-mailer.sh' (no argument) sends the script
8 #+ to the person invoking it, for example, [email protected].
9 #
10 # For more on the ${parameter:-default} construct,
11 #+ see the "Parameter Substitution" section
12 #+ of the "Variables Revisited" chapter.
13
14 # ============================================================================
15 cat $0 | mail -s "Script \"`basename $0`\" has mailed itself to you." "$adr"
16 # ============================================================================
17
18 # --------------------------------------------
19 # Greetings from the self-mailing script.
20 # A mischievous person has run this script,
21 #+ which has caused it to mail itself to you.
22 # Apparently, some people have nothing better
23 #+ to do with their time.
24 # --------------------------------------------
25
26 echo "At `date`, script \"`basename $0`\" mailed to "$adr"."
27
28 exit 0
29
30 # Note that the "mailx" command (in "send" mode) may be substituted
31 #+ for "mail" ... but with somewhat different options.
mailto
Similar to the mail command, mailto sends e-mail messages from the command-line or in a script.
However, mailto also permits sending MIME (multimedia) messages.
mailstats
Show mail statistics. This command may be invoked only by root.
root# mailstats
Statistics from Tue Jan 1 20:32:08 2008
M msgsfr bytes_from msgsto bytes_to msgsrej msgsdis msgsqur Mailer
4 1682 24118K 0 0K 0 0 0 esmtp
9 212 640K 1894 25131K 0 0 0 local
=====================================================================
T 1894 24758K 1894 25131K 0 0 0
C 414 0
vacation
This utility automatically replies to incoming e-mail, notifying the sender that the intended recipient is on
vacation and temporarily unavailable. It runs on a network, in conjunction with sendmail, and is not
applicable to a dial-up POPmail account.
Notes
[1]
A daemon is a background process not attached to a terminal session. Daemons perform designated
services either at specified times or explicitly triggered by certain events.
The word "daemon" means ghost in Greek, and there is certainly something mysterious, almost
supernatural, about the way UNIX daemons wander about behind the scenes, silently carrying out their
appointed tasks.
tput
Initialize terminal and/or fetch information about it from terminfo data. Various options permit certain
terminal operations: tput clear is the equivalent of clear; tput reset is the equivalent of reset.
Issuing a tput cup ROW COL moves the cursor to the given row and column of the current terminal. A clear to
erase the terminal screen would normally precede this.
See also Example 35-13, Example 35-11, Example A-44, Example A-42, and Example 27-2.
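A minimal sketch of simple cursor control with tput:

tput clear                            # Clear the screen (equivalent to 'clear').
tput cup 5 10                         # Move the cursor to row 5, column 10.
echo "Hello from row 5, column 10."
tput cup $(( $(tput lines) - 1 )) 0   # Park the cursor on the bottom row.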
Note that stty offers a more powerful command set for controlling a terminal.
infocmp
This command prints out extensive information about the current terminal. It references the terminfo
database.
bash$ infocmp
# Reconstructed via infocmp from file:
/usr/share/terminfo/r/rxvt
rxvt|rxvt terminal emulator (X Window System),
am, bce, eo, km, mir, msgr, xenl, xon,
colors#8, cols#80, it#8, lines#24, pairs#64,
acsc=``aaffggjjkkllmmnnooppqqrrssttuuvvwwxxyyzz{{||}}~~,
bel=^G, blink=\E[5m, bold=\E[1m,
civis=\E[?25l,
clear=\E[H\E[2J, cnorm=\E[?25h, cr=^M,
...
reset
Reset terminal parameters and clear text screen. As with clear, the cursor and prompt reappear in the
upper lefthand corner of the terminal.
clear
The clear command simply clears the text screen at the console or in an xterm. The prompt and cursor
reappear at the upper lefthand corner of the screen.