Shell Tips and Tricks
Nobody really knows what the Bourne shell's grammar is. Even examination of the source code is little help.
-- Tom Duff
Most shell scripts are quick-and-dirty solutions to simple problems, so the first and most important tip I can give is: don't be overzealous about optimization. Optimizing scripts for speed is usually a bad idea. If a script performs an important task but runs too slowly, convert it to a scripting language such as Perl. That is especially prudent if the script has nested loops, because time consumed by repetitive operations adds up quickly. Use the time and times tools to profile computation-intensive commands.
Bash is not particularly efficient at handling files, so consider using more appropriate tools for this within the script, such as awk or Perl. Unless you know, or are willing to learn, Perl, awk is a much underappreciated utility that can and should be used more widely in shell scripts. That's probably the second most important tip I can give.
Try to write your scripts in a structured, coherent form so they can be reorganized and reused as necessary. Borrow a standard header from somebody, use it to explain the purpose of the script, and document the changes you make. Above all, use common sense.
The problem with bash is that it is pretty baroque: there are just too many features to remember, this command, that command, and so on ad infinitum. It is difficult to remember the details of each, but bash has an on-line help feature that provides basic information about most of its built-in commands. To see the help description for a particular command, enter help command (for example, help alias) at the bash prompt. To see a list of bash commands for which help is available, type help by itself. You can access the manual page for bash by entering man bash at a UNIX prompt, but beware: it is about 60 pages long and not very readable.
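A quick sketch of the help facility in action (the exact wording of the output varies across bash versions; "alias" and "cd" are just example names):

```shell
# Ask bash about one of its builtins
bash -c 'help alias' | head -n 3

# "type" is another quick way to find out what a name refers to
bash -c 'type cd'
```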
With bash 3.x, you can reissue commands, as in the C shell, using the arrow keys, and use Ctrl-R to search the command history incrementally. (In emacs mode you can also use Ctrl-P and Ctrl-N.)
Bash also supports "file name completion" which, if not abused, can save some typing.
Like any decent shell, bash also lets you define aliases, and you should avoid retyping the same command twice, not only by browsing the history but also by defining aliases from it. But remember that too many aliases are counterproductive; limit your repertoire to a dozen or so. For things like browsing /var/log/messages or /var/adm/messages it's better to define functions, which are a more powerful tool than aliases.
Keep your functions and aliases in a separate dot file, for example an .aliases file sourced from .bashrc.
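A minimal sketch of that setup; the file name, function name, and log path are illustrative, not prescribed by the tip:

```shell
# Create a small .aliases file holding a log-browsing function...
cat > .aliases <<'EOF'
# logtail N FILE: show the last N lines (default 20) of a log file
logtail() { tail -n "${1:-20}" "${2:-/var/log/messages}"; }
EOF

# ...and source it, as ~/.bashrc would with:  . ~/.aliases
. ./.aliases

printf 'a\nb\nc\n' > messages
logtail 2 messages     # prints the last two lines: b, c
```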
Old News ;-)
Please visit Heiner Steven's
SHELLdorado, the best shell scripting site on the Internet
The eval Command
This section describes another of the more unusual commands in the shell: eval. Its format is as follows:
eval command-line

where command-line is a normal command line that you would type at the terminal. When you put eval in front of it, however, the net effect is that the shell scans the command line twice before executing it.[1] For the simple case, this really has no effect:

$ eval echo hello
hello
$

[1] Actually, what happens is that eval simply executes the command passed to it as arguments; so the shell processes the command line when passing the arguments to eval, and then once again when eval executes the command. The net result is that the command line is scanned twice by the shell.

But consider the following example without the use of eval:
$ pipe="|"
$ ls $pipe wc -l
|: No such file or directory
wc: No such file or directory
-l: No such file or directory
$

Those errors come from ls. The shell takes care of pipes and I/O redirection before variable substitution, so it never recognizes the pipe symbol inside pipe. The result is that the three arguments |, wc, and -l are passed to ls as arguments.
Putting eval in front of the command sequence gives the desired results:
$ eval ls $pipe wc -l
16
$

The first time the shell scans the command line, it substitutes | as the value of pipe. Then eval causes it to rescan the line, at which point the | is recognized by the shell as the pipe symbol.
The eval command is frequently used in shell programs that build up command lines inside one or more variables. If the variables contain any characters that must be seen by the shell directly on the command line (that is, not as the result of substitution), eval can be useful. Command terminators (;, |, &), I/O redirection (<, >), and quote characters are among the characters that must appear directly on the command line to have any special meaning to the shell.
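As a sketch of that point, an I/O redirection stored in a variable behaves just like the pipe example above (the file name out.txt is made up for illustration):

```shell
redir="> out.txt"

# Without eval the shell passes ">" and "out.txt" to echo as plain words
echo hello $redir      # prints: hello > out.txt

# With eval the line is rescanned, so the redirection takes effect
eval echo hello $redir
cat out.txt            # hello
```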
For the next example, consider writing a program last whose sole purpose is to display the last argument passed to it. You needed to get at the last argument in the mycp program in Chapter 10, "Reading and Printing Data." There you did so by shifting all the arguments until the last one was left. You can also use eval to get at it as shown:
$ cat last
eval echo \$$#
$ last one two three four
four
$ last *                    Get the last file
zoo_report
$

The first time the shell scans

echo \$$#

the backslash tells it to ignore the $ that immediately follows. After that, it encounters the special parameter $#, so it substitutes its value on the command line. The command now looks like this:

echo $4

(the backslash is removed by the shell after the first scan). When the shell rescans this line, it substitutes the value of $4 and then executes echo.
This same technique could be used if you had a variable called arg that contained a digit, for example, and you wanted to display the positional parameter referenced by arg. You could simply write
eval echo \$$arg

The only problem is that just the first nine positional parameters can be accessed this way; to access positional parameters 10 and greater, you must use the ${n} construct:

eval echo \${$arg}

Here's how the eval command can be used to effectively create "pointers" to variables:

$ x=100
$ ptrx=x
$ eval echo \$$ptrx         Dereference ptrx
100
$ eval $ptrx=50             Store 50 in var that ptrx points to
$ echo $x                   See what happened
50
$
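In later versions of bash, the dereference half of this can be done without eval, using indirect expansion; this is a bash-specific feature, not something available in the Bourne shell the text describes:

```shell
#!/bin/bash
x=100
ptrx=x

# ${!ptrx} expands to the value of the variable whose name is in ptrx,
# the same result as: eval echo \$$ptrx
echo "${!ptrx}"     # 100
```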
Sys Admin Miscellaneous Unix Tips: Answering Novice Shell Questions
A common eval use is to build a dynamic string containing valid Unix commands and then use eval to execute the string. Why do we need eval? Often, you can build a command that doesn't require eval:
evalstr="myexecutable"
$evalstr     # execute the command string

However, chances are the above command won't work if "myexecutable" requires command-line arguments. That's where eval comes in.
Our man page says that the arguments to the eval command are "read as input to the shell and the resulting commands executed". What does that mean? Think of it as the eval command forcing a second pass so the string's arguments become the arguments of the spawned child shell.
In a previous column, we built a dynamic sed command that skipped 3 header lines, printed 5 lines, and skipped 3 more lines until the end of the file:
evalstr="sed -n '4,\${p;n;p;n;p;n;p;n;p;n;n;n;}' data.file"
eval $evalstr     # execute the command string

This command fails without eval. When the sed command executes in the child shell, eval forces the remainder of the string to become arguments to the child.
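A reduced version of the same pattern can be tried directly; the data.file contents here are made up just to show the effect of the embedded quotes:

```shell
# Build a tiny data file and a sed command string with embedded quotes
printf 'h1\nh2\nh3\na\nb\n' > data.file
evalstr="sed -n '4,\$p' data.file"

# eval re-parses the string, so the single quotes are honored;
# plain $evalstr would hand sed the literal quote characters and fail
eval $evalstr        # prints lines 4 onward: a, b
```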
Possibly the coolest eval use is building dynamic Unix shell variables. The following stub script dynamically creates shell variables user1 and user2 setting them equal to the strings John and Ed, respectively:
COUNT=1
eval user${COUNT}=John
echo $user1

COUNT=2
eval user${COUNT}=Ed
echo $user2

Pasting Files with paste

Another novice asked how to line up three files line by line, sending the output to another file. Given the following:
file1:
1
2
3

file2:
a
b
c

file3:
7
8
9

the output file should look like this:

1a7
2b8
3c9

The paste command is a ready-made solution:
paste file1 file2 file3

By default, the delimiter between the columns is a tab character. The paste command provides a -d delimiter option; everything after -d is treated as a list. For example, this paste rendition uses the pipe symbol and ampersand characters as a list:
paste -d"|&" file1 file2 file3

The command produces this output:

1|a&7
2|b&8
3|c&9

The pipe symbol character, |, is used between columns 1 and 2, while the ampersand, &, separates columns 2 and 3. If the list is completely used, and the paste command contains more file arguments, paste starts over at the beginning of the list.
To satisfy our original requirement, paste provides a null character, \0, signifying no character. To prevent the shell from interpreting the character, it must also be quoted:
paste -d"\0" file1 file2 file3

Process a String One Character at a Time
Still another user asked how to process a string in a shell script one character at a time. Certainly, advanced scripting languages such as Perl and Ruby can solve this problem, but the cut command's -b option, which specifies the byte position, is a simple alternative:
#!/bin/ksh
mystring="teststring"
length=${#mystring}
count=0
until [ $count -eq $length ]
do
  ((count+=1))
  char=$(echo $mystring|cut -b"$count")
  echo $char
done

In the stub above, string mystring's length is determined using the advanced pattern-matching capabilities of the bash and ksh shells. Any number of external Unix commands can provide a string length, but probably the command with the smallest footprint is expr:
length=$(expr "$mystring" : '.*')

Also, the bash shell contains a substring expansion parameter:

${parameter:offset:length}

According to the bash man page, the substring expansion expands "up to length characters of parameter starting at the character specified by offset". Note that the offset starts counting from zero:

#!/bin/bash
mystring="teststring"
length=${#mystring}
ol=1
offset=0
until [ $offset -eq $length ]
do
  echo "${mystring:${offset}:${ol}}"
  ((offset+=1))
done
# end script

Deleting a File Named dash
Finally, a novice inadvertently created a file named with the single character dash, and asked us how to delete the file. No matter how he escaped the dash in the rm command, it still was considered an rm option.
It's easy enough to create the file using the touch command:
touch -

To remove it, use a path to the file -- either full or relative. Assuming the dash file exists in the mydir directory, provide a full path to the file:

rm /pathto/mydir/-

Or if the file exists in the current directory, provide a relative path:

rm ./-

Of course, our old friend find can clobber that file everywhere:

find . -name "-" | xargs rm
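One more option worth knowing, though it is not from the original column: the POSIX "--" end-of-options marker, which most modern rm implementations honor, makes everything after it a file name:

```shell
touch -- -     # create the awkward file again
rm -- -        # "--" ends option parsing, so "-" is just a file name
```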
[Jul 30, 2011] Advanced Techniques
developer.apple.com
Shell scripts can be powerful tools for writing software. Graphical interfaces notwithstanding, they are capable of performing nearly any task that could be performed with a more traditional language. This chapter describes several techniques that will help you write more complex software using shell scripts.
- “Using the eval Builtin for Data Structures, Arrays, and Indirection” describes how to create complex data structures in shell scripts.
- “Shell Text Formatting” tells how to do tabular layouts and use ANSI escape sequences to add color and styles to your terminal output.
- “Trapping Signals” tells how to write signal handlers in shell scripts.
- “Nonblocking I/O” and “Timing Loops” show one way to write complex interactive scripts such as games.
- “Background Jobs and Job Control” explains how to do complex tasks in the background while your script continues to execute, including how to perform some basic parallel computation. It also explains how to obtain the result codes from these jobs after they exit.
- “Application Scripting With osascript” describes how your script can interact with Mac OS X applications using AppleScript.
- “Scripting Interactive Tools Using File Descriptors” describes how you can make bidirectional connections to command-line tools.
- “Networking With Shell Scripts” describes how to use the nc tool (otherwise known as netcat) to write shell scripts that take advantage of TCP/IP sockets.
Tips on good shell programming practices
Once upon a time, Unix had only one shell, the Bourne shell, and when a script was written, the shell read the script and executed the commands. Then another shell appeared, and another. Each shell had its own syntax and some, like the C shell, were very different from the original. This meant that if a script took advantage of the features of one shell or another, it had to be run using that shell. Instead of typing:
doit
The user had to know to type:
/bin/ksh doit
or:
/bin/csh doit
To remedy this, a clever change was made to the Unix kernel -- now a script can be written beginning with a hash-bang (#!) combination on the first line, followed by the shell that should execute the script. As an example, take a look at the following script, named doit:
#! /bin/ksh
#
# do some script here
#
In this example, the kernel reads in the script doit, sees the hash-bang, and continues reading the rest of the line, where it finds /bin/ksh. The kernel then starts the Korn shell with doit as an argument and feeds it the script, as if the following command had been issued:

/bin/ksh doit

When /bin/ksh begins reading in the script, it sees the hash-bang in the first line as a comment (because it starts with a hash) and ignores it. To be run, the full path to the shell is required, as the kernel does not search your PATH variable. The hash-bang handler in the kernel does more than just run an alternate shell; it actually takes the argument following the hash-bang and uses it as a command, then adds the name of the file as an argument to that command.
You could start a Perl script named doperl by using the hash-bang:
#! /bin/perl
# do some perl script here
If you begin by typing doperl, the kernel spots the hash-bang, extracts the /bin/perl command, then runs it as if you had typed:
/bin/perl doperl
There are two mechanisms in play that allow this to work. The first is the kernel interpretation of the hash-bang; the second is that Perl sees the first line as a comment and ignores it. This technique will not work for scripting languages that fail to treat lines starting with a hash as a comment; in those cases, it will most likely cause an error. You needn't limit your use of this method to running scripts either, although that is where it's most useful.
The following script, named helpme, types itself to the terminal when you enter the command helpme:
#! /bin/cat
vi unix editor
man manual pages
sh Bourne Shell
ksh Korn Shell
csh C Shell
bash Bourne Again Shell
This kernel trick will execute one argument after the name of the command. To hide the first line, change the file to use more, starting at line 2, but be sure to use the correct path:
#! /bin/more +2
vi unix editor
man manual pages
sh Bourne Shell
ksh Korn Shell
csh C Shell
bash Bourne Again Shell
Typing helpme as a command causes the kernel to convert this to:
/bin/more +2 helpme
Everything from line 2 onward is displayed:
helpme
vi unix editor
man manual pages
sh Bourne Shell
ksh Korn Shell
csh C Shell
bash Bourne Again Shell
etc.
You can also use this technique to create apparently useless scripts, such as a file that removes itself:
#! /bin/rm
If you named this file flagged, running it would cause the command to be issued as if you had typed:
/bin/rm flagged
You could use this in a script to indicate that you are running something, then execute the script to remove it:
#! /bin/ksh
# first refuse to run if the flagged file exists
if [ -f flagged ]
then
exit
fi
# create the flag file
echo "#! /bin/rm" >flagged
chmod a+x flagged
# do some logic here
# unflag the process by executing the flag file
flagged
Before you begin building long commands with this technique, keep in mind that systems often have an upper limit (typically 32 characters) on the length of the code in the #! line.
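The self-typing helpme trick described above can be reproduced in a couple of lines, on systems where /bin/cat exists and the filesystem permits executing scripts:

```shell
# Build a minimal helpme script whose "interpreter" is cat
cat > helpme <<'EOF'
#! /bin/cat
vi      unix editor
man     manual pages
EOF
chmod +x helpme

# The kernel turns this into: /bin/cat helpme
./helpme
```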
Testing command line arguments and usage
When you write a shell script, arguments are commonly needed for it to function properly. In order to ensure that those arguments make sense, it's often necessary to validate them.
Testing for enough arguments is the easiest method of validation. For example, if you've created a shell script that requires two file names to operate, test for at least two arguments on the command line. To do this in the Bourne and Korn shells, check the value of $# -- a variable that contains the count of arguments, other than the command itself. It is also good practice to include a message detailing the reasons why the command failed; this is usually created in a usage function.
The script twofiles below tests for two arguments on the command line:
#! /bin/ksh
# twofile script handles two files named on the command line
# a usage function to display help for the hapless user
usage ()
{
echo "twofiles"
echo "usage: twofiles file1 file2"
echo "Processes two files"
}
# test if we have two arguments on the command line
if [ $# != 2 ]
then
usage
exit
fi
# we are ok at this point so continue processing here
A safer practice is to validate as much as you can before running your execution. The following version of twofiles checks the argument count and tests both files. If file 1 doesn't exist (if [ ! -f $1 ]), an error message is set up, a usage is displayed, and the program exits. The same is done for file 2:
#! /bin/ksh
# twofile script handles two files named on the command line
# a usage function to display help for the hapless user
# plus an additional error message if it has been filled in
usage ()
{
echo "twofiles"
echo "usage: twofiles file1 file2"
echo "Processes two files"
echo " "
echo $errmsg
}
# test if we have two arguments on the command line
if [ $# != 2 ]
then
usage
exit
fi
# test if file one exists and send an additional error message
# to usage if not found
if [ ! -f $1 ]
then
errmsg=${1}":File Not Found"
usage
exit
fi
# same for file two
if [ ! -f $2 ]
then
errmsg=${2}":File Not Found"
usage
exit
fi
# we are ok at this point so continue processing here
Note that in the Korn shell you can also use the double bracket test syntax, which is faster. The single bracket test actually calls a program named test to test the values, while the double bracket test is built into the Korn shell and does not have to call a separate program.
The double bracket test will not work in the Bourne shell:
if [[ $# != 2 ]]
or
if [[ ! -f $1 ]]
or
if [[ ! -f $2 ]]
This thorough validation can prevent later errors in the program logic when a file is suddenly found missing. Consider it good programming practice.
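For comparison, here is a compact bash/ksh sketch of the same validation using the double bracket test; the function name and messages are mine, not from the original script:

```shell
# check_args file1 file2: validate argument count and file existence
check_args() {
    if [[ $# -ne 2 ]]; then
        echo "usage: twofiles file1 file2" >&2
        return 1
    fi
    for f in "$1" "$2"; do
        if [[ ! -f $f ]]; then
            echo "$f: File Not Found" >&2
            return 1
        fi
    done
}

touch file1 file2
check_args file1 file2 && echo "arguments ok"
```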
[Mar 17, 2010] Power Shell Usage Bash Tips & Tricks
Searching the Past
- There are several bad ways of finding previous lines from history
- Many people go for pressing Up lots (and lots)
  - A tad inefficient, perhaps
- Many people go for pressing Ctrl+R, which searches previous lines
  - But Ctrl+R zip Esc doesn't find the last zip command; it also matches any line that copied, deleted, unzipped, or did anything else with a zip file
- Those of a gambling bent can chance ! and a command name
  - Irritating when !gv opens gvim instead of gv
- Sane Incremental Searching
- Bash can cycle through lines starting in a particular way
  - Just type in a few characters then press Up
  - Don't need to press Up so many times
  - Don't see lines that merely contain those letters
  - Don't have to chance executing the wrong line
- Incremental searching with Up and Down is configured in .inputrc:
"\e[A": history-search-backward "\e[B": history-search-forward
- Old behavior still available with Ctrl+P and Ctrl+N
- If that prevents Left and Right from working, fix them like this:

"\e[C": forward-char
"\e[D": backward-char
- Repeating Command Arguments
- Commonly want to repeat just bits of commands
- Very often the previous command’s last argument
- Meta+. retrieves the last argument. Press repeatedly to cycle through the final argument from earlier commands
- Magic Space
- A magic space inserts a space character as normal
- And also performs history expansion in the line
- See what you type before you commit to it
- Press Space before Enter if necessary
- Magic Space Set-Up
- Magic space is configured in .inputrc
- Redefine what Space does
- There are other readline-based programs without this feature, so make it only apply in Bash:

$if Bash
Space: magic-space
$endif
- Forgetting Options
- Common to forget an option from a command
- Want to rerun the command with the option
- Go to the previous history line, then move just after the command name to type the option
- Can set up a keyboard macro to do this
- Insert-Option Macro
- Meta+O can be made to load the previous command and position the cursor for typing an option
- Defined in .inputrc:

"\M-o": "\C-p\C-a\M-f "

  - Ctrl+P: previous line
  - Ctrl+A: start of line
  - Meta+F: forward a word, past the command
  - ␣: insert a space
- 17 unused keystrokes remain with just Ctrl or Meta modifiers
[Aug 11, 2009] All about Linux Input-Output redirection made simple in Linux
I can give one practical purpose for this error redirection, which I use on a regular basis. When I am searching for a file across the whole hard disk as a normal user, I get a lot of errors such as:

find: /file/path: Permission denied

In such situations I use error redirection to weed out these messages, as follows:

$ find / -iname \* 2> /dev/null
Now all the error messages are redirected to /dev/null device and I get only the actual find results on the screen.
Note: /dev/null is a special kind of file in that its size is always zero; whatever you write to it just disappears. Its opposite is /dev/zero, which acts as an infinite source of zero bytes. For example, you can use /dev/zero to create a file of any size, such as when creating a swap file.
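A quick sketch of the /dev/zero use; the file name and size here are arbitrary:

```shell
# Create a 1 MB file filled with zero bytes; the same approach, with a
# larger count plus mkswap/swapon, is how a swap file is prepared
dd if=/dev/zero of=zerofile bs=1024 count=1024 2>/dev/null
ls -l zerofile
```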
[Aug 4, 2009] Tech Tip View Config Files Without Comments Linux Journal
I've been using this grep invocation for years to trim comments out of config files. Comments are great but can get in your way if you just want to see the currently running configuration. I've found files hundreds of lines long with fewer than ten active configuration lines; it's really hard to get an overview of what's going on when you have to wade through hundreds of lines of comments.
$ grep ^[^#] /etc/ntp.conf

The regex ^[^#] matches the first character of any line, as long as that character is not a #. Because blank lines don't have a first character, they're not matched either, resulting in a nice compact output of just the active configuration lines.
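A slightly broader variant, my extension of the tip rather than part of it, also skips lines whose first non-blank character is '#'; shown here against a made-up sample file instead of /etc/ntp.conf:

```shell
printf '# comment\n   # indented comment\n\nserver 1.2.3.4\n' > sample.conf

# Skip lines that are blank or whose first non-blank character is '#'
grep -Ev '^[[:space:]]*(#|$)' sample.conf    # prints: server 1.2.3.4
```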
csh.startup
INDEX 6) Small tricks, aliases and other bit 'n' pieces
This is a list of small ``tricks'' that can be incorporated into your own
.cshrc/.login startup files.
i) Show only new MOTD (message of the day) on login
if (-f /etc/motd ) then
cmp -s /etc/motd ~/.hushlogin
if ($status) tee ~/.hushlogin < /etc/motd
endif
ii) Changing the prompt to reflect the current directory
alias setprompt 'set prompt = "`pwd` > "'
alias cd 'chdir \!* && setprompt'
alias pushd 'pushd \!* && setprompt'
alias popd 'popd \!* && setprompt'
setprompt
iii) Searching for a particular process (given as argument)
WARNING this is for a SunOS environment and may be different for
other OS's.
alias pf 'ps auxgww|awk '\''/(^| |\(|\/)\!:1( |\)|$)/'\''|cut -c1-15,36-99'
iv) Multiline prompt
alias setprompt 'set prompt="\\
${hostname:h}:${cwd}\\
\! % "'
v) Log remote (rsh) non-interactive commands executed in this account.
add something like the following to your .cshrc (non-interactive part)
if ( ! $?prompt ) then
# Record the remote command in a log file
set column = "`ps ww1 | head -1`" # figure out column from ps header
set column = `expr "$column" : '\(.*\)COMMAND' : '.*' + 1`
ps ww$$ | tail -1 | cut -c${column}- >> ~/command.log
exit
endif
vi) Csh Function Scripts.
Scripts which are executed by the current shell as if they were internal. This allows more complex setprompt scripts, and lets scripts change the prompt, set environment variables, or change the current directory.
# Csh function scripts
alias function 'set argv=(\!*); shift; source \!:1'
# Specific Csh function
alias setprompt function ~/bin/scripts/setprompt
# Directory of Csh functions (initialization)
foreach i (~/bin/csh.functions/*)
alias $i:t function $i
end
vii) File/Directory mailing Aliases
Mail files, binaries, and directories to other people easily
Usage: mailfile address file
alias a alias
a mailfile 'cat ~/lib/line-cut \!:2 ~/lib/line-cut |\\
/usr/ucb/mail -s "file \!:2" \!:1'
a mailuu 'uuencode \!:2 \!:2 | cat ~/lib/line-cut - ~/lib/line-cut |\\
/usr/ucb/mail -s "file \!:2.uu" \!:1'
a maildir 'tar cvf - "\!:2" | compress | uuencode "\!:2.tar.Z" |\\
cat ~/lib/line-cut - ~/lib/line-cut |\\
/usr/ucb/mail -s "file \!:2.tar.Z.uu" \!:1'
# Multiple file "tar mail"
# Usage: tarmail address "subject" file...
a tarmail 'tar cvf - \!:3* | compress | uuencode tarmail.tar.Z |\\
cat ~anthony/lib/line-cut - ~anthony/lib/line-cut |\\
/usr/ucb/mail -s "\!:2" \!:1'
-- miscellaneous sources
------------------------------------------------------------------------------
INDEX 7) Disclaimer: Csh Script Programming Considered Harmful
There are plenty of reasons not to use csh for script writing.
See Csh Programming Considered Harmful
ftp://convex.com/pub/csh.whynot
http://www.cit.gu.edu.au/~anthony/info/shell/csh.whynot-1.4
also
http://www.cit.gu.edu.au/~anthony/info/shell/csh.whynot.extra
This file is an attempt to explain how to make csh easier and more convenient
to use interactively. It does NOT provide a guide to using csh as a
general script-writing language, and the authors recommend that it not be
used for that purpose.
But why use csh interactively?
The aliases and history list alone make it worthwhile; extra features such
as file completion, tilde expansion, and job control make it even more useful.
The tcsh command line editing and other interactive enhancements make it one
of the best interactive shells around.
There are arguably `better' shells available that can be used, but I have
found many of them lacking in some important aspect or, more often, not
installed on most systems. A delivered vanilla machine, however, is almost
certain to have csh. A .cshrc and .login setup can then be easily copied
and is available immediately.
Faced with the choice between plain sh and bad old csh I'll take csh any
day.
-- Paul Davey (pd@x.co.uk)
-- Anthony Thyssen (anthony@cit.gu.edu.au)
UNIX tips and tricks for a new user, Part 4: Some nifty shell tricks
When writing a shell program, you often come across some special situation that you'd like to handle automatically. This tutorial includes examples of such situations from small Bourne shell scripts. These situations include base conversion from one base to another (decimal to hex, hex to decimal, decimal to octal, and so on), reading the keyboard while in a piped loop, subshell execution, inline input, executing a command once for each file in a directory, and multiple ways to construct a continuous loop.
Part 4 of this series wraps up with a collection of shell one-liners that perform useful functions.
bash Cookbook Reader - Contributions browse
The dirs command output isn't that readable with many long pathnames. To make it more readable,
you could just use the -p option on dirs:
alias dirs="dirs -p"
[Apr 4, 2008] bash Cookbook Reader - Contributions browse
the last argument
The arguments to a script (or function) are $1, $2, ... and can be referred to as a group by $* (or $@). But is there an easy way to refer to the last argument in the list? Try ${!#}, as in:

echo ${!#}
LAST=${!#}
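A quick check of the expansion (bash-specific):

```shell
# Set four positional parameters, then expand the last one
set -- one two three four
echo "${!#}"     # four -- $# is 4, so ${!#} expands $4
```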
Bash, version 3
- The += operator is now permitted in places where previously only the = assignment operator was recognized.

a=1
echo $a    # 1
a+=5       # Won't work under versions of Bash earlier than 3.1.
echo $a    # 15
a+=Hello
echo $a    # 15Hello
Bash, version 3
- The =~ regular expression matching operator within a double brackets test expression. (Perl has a similar operator.)

#!/bin/bash
variable="This is a fine mess."
echo "$variable"
if [[ "$variable" =~ "T*fin*es*" ]]
# Regex matching with =~ operator within [[ double brackets ]].
then
  echo "match found"    # match found
fi
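One caveat worth adding to the example above: from bash 3.2 on, a quoted right-hand side of =~ is matched as a literal string, so the pattern generally needs to be unquoted to behave as a regex. A small sketch:

```shell
#!/bin/bash
variable="This is a fine mess."

# Unquoted pattern: treated as an extended regular expression
if [[ $variable =~ fine[[:space:]]+mess ]]; then
    echo "match found"
fi
```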
Interactive and non-interactive scripts
Alternatively, the script can test for the presence of i in the $- flag.
case $- in
*i*)   # interactive script
   ;;
*)     # non-interactive script
   ;;
esac
# (Thanks to "UNIX F.A.Q.", 1993)
Scripts may be forced to run in interactive mode with the -i option or with a #!/bin/bash -i header. Be aware that this may cause erratic script behavior or show error messages where no error is present.
Assorted Tips
- To keep a record of which user scripts have run during a particular
session or over a number of sessions, add the following lines to each
script you want to keep track of. This will keep a continuing file record
of the script names and invocation times.
# Append (>>) the following to the end of the save file.
date >> $SAVE_FILE        # Date and time.
echo $0 >> $SAVE_FILE     # Script name.
echo >> $SAVE_FILE        # Blank line as separator.
# Of course, SAVE_FILE is defined and exported as an environment variable in ~/.bashrc
# (something like ~/.scripts-run)
- A shell script may act as an embedded command inside another shell script, a Tcl or wish script, or even a Makefile. It can be invoked as an external shell command in a C program using the system() call, i.e., system("script_name");.
- Put together a file of your favorite and most useful definitions and functions, then "include" this file in scripts as necessary with either the "dot" (.) or source command (see Section 3.2).
- It would be nice to be able to invoke X-Windows widgets from a shell script. There do, in fact, exist a couple of packages that purport to do so, namely Xscript and Xmenu, but these seem to be pretty much defunct. If you dream of a script that can create widgets, try wish (a Tcl derivative), PerlTk (Perl with Tk extensions), or tksh (ksh with Tk extensions).
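The "include" tip above can be sketched like this; the file and function names are illustrative:

```shell
# A file of favorite definitions...
cat > mydefs.sh <<'EOF'
greet() { echo "hello $1"; }
EOF

# ...included with the "dot" command (or equivalently: source ./mydefs.sh)
. ./mydefs.sh
greet world        # hello world
```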
[Jun 25, 2007] IFS variable and Field splitting in the Korn shell
IFS Specifies internal field separators (normally space, tab, and new line) used to separate command words that result from command or parameter substitution and for separating words with the regular built-in command read. The first character of the IFS parameter is used to separate arguments for the $* substitution.
... ... ...
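A small illustration of IFS-driven field splitting with read; the passwd-style record is made up:

```shell
line="root:x:0:0"

# Setting IFS just for the read command splits the record on colons
IFS=: read -r user pass uid gid <<EOF
$line
EOF
echo "$user $uid"      # root 0
```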
[May 7, 2007] basename and dirname
basename strips off the path leaving only the final component of the name, which is assumed to be the file name. If you specify suffix and the remaining portion of name contains a suffix which matches suffix, basename removes that suffix. For example
basename src/dos/printf.c .c

produces

printf

dirname returns the directory part of the full path+name combination.
Also can be done directly in bash
basename=${file##*/}
dirname=${file%/*}
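The two expansions can be checked quickly:

```shell
file=/usr/local/src/printf.c
echo "${file##*/}"     # printf.c        (like basename)
echo "${file%/*}"      # /usr/local/src  (like dirname)
```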
[May 7, 2007] To strip file extensions in bash, like
this.rbl --> this
fname=${file%.rbl}
More 2 Cent Tips & Tricks LG #37
Date: Tue, 12 Jan 1999 19:18:15 +0200
From: Reuben Sumner, rasumner@iname.com
Here is a two cent tip that I have been meaning to submit for a long long time now.
If you have a large stack of CD-ROMS, finding where a particular file lies can be a time consuming task. My solution uses the locate program and associated utilities to build up a database of the CDs' contents that allows for rapid searching.
First we need to create the database, the following script does the trick nicely.
#!/bin/bash
onedisk() {
    mount /mnt/cdrom
    find /mnt/cdrom -maxdepth 7 -print | sed "s;^/mnt/cdrom;$1;" > $1.find
    eject -u cdrom
}

echo Enter name of disk in device:
read diskname
while [ -n "$diskname" ]; do
    onedisk $diskname
    echo Enter name of next disk or Enter if done:
    read diskname
done
echo OK, preparing cds.db
cat *.find | sort -f | /usr/lib/findutils/frcode > cds.db
echo Done...

Start with no CD mounted. Run the script. It will ask for a label for the CD; a short name like "sunsite1" is best. It will then quickly scan the CD, eject it, and prompt for another. When you have exhausted your collection, just hit Enter at the prompt. A file called cds.db will be created. To make it simple to use, copy cds.db to /var/lib (or anywhere else; that is where locatedb is on my system). Now create an alias like
alias cdlocate="locate -d /var/lib/cds.db"

Now if I type "cdlocate lyx" I get
debian20_contrib/debian/hamm/contrib/binary-i386/text/lyx_0.12.0.final-0.1.deb
debian20_contrib/debian/hamm/contrib/binary-m68k/text/lyx_0.12.0.final-0.1.deb
debian20_contrib/debian/hamm/contrib/source/text/lyx_0.12.0.final-0.1.diff.gz
debian20_contrib/debian/hamm/contrib/source/text/lyx_0.12.0.final-0.1.dsc
debian20_contrib/debian/hamm/contrib/source/text/lyx_0.12.0.final.orig.tar.gz
lsa3/apps/wp/lyx-0.12.0-linux-elf-x86-libc5-bin.tar.gz
lsa3/apps/wp/lyx-0.12.0.lsm
lsa3/apps/wp/lyx-0.12.0.tar.gz
lsa4/docs/french/www.linux-france.com/lgazette/issue-28/gx/lyx
lsa4/powertools/i386/lyx-0.12.0-1.i386.rpm
lsa4/powertools/SRPMS/lyx-0.12.0-1.src.rpm
openlinux12/col/install/RPMS/lyx-0.11.32-1.i386.rpm
openlinux12/col/sources/SRPMS/lyx-0.11.32-1.src.rpm
suse53/suse/contents/lyx

In order to prevent locate from warning you that the database is old, set the modification date far in the future (here January 1, 2020):

touch -t 010100002020 /var/lib/cds.db
--
Reuben
Finding text strings in binary files
Ever wondered what's inside some of those binary files on your system (binary executables or binary data)? Several times I've gotten error messages from some command in the Solaris system, but I couldn't tell where the error was coming from because it was buried in some binary executable file.
The Solaris "strings" command lets you look at the ASCII text buried inside of executable files, and can often help you troubleshoot problems. For instance, one time I was seeing error messages like this when a user was trying to log in:
Could not set ULIMIT
I finally traced the problem down to the /bin/login command by running the "strings" command like this:
root> strings /bin/login | more
The strings command lists ASCII character sequences in binary files, and helped me determine that the "Could not set ULIMIT" error was coming from this file. Once I determined that the error message I was seeing was coming from this file, solving the problem became a simple matter.
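If you don't have a convenient binary at hand, a self-contained sketch (the file name and contents are made up) shows how strings recovers text buried between non-printable bytes:

```shell
#!/bin/bash
# Build a small "binary" file with text buried between NUL and control
# bytes, then recover it with strings (default minimum length is 4).
printf 'ELF\0\1\2Could not set ULIMIT\0\3\4' > /tmp/demo.bin
strings /tmp/demo.bin
# prints: Could not set ULIMIT
rm -f /tmp/demo.bin
```

In practice you would pipe strings through grep to narrow the output, e.g. strings /bin/login | grep -i ulimit.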
Use CDPATH to traverse filesystems faster
If you're like many Solaris users and administrators, you spend a lot of time moving back and forth between directories in similar locations. For instance, you might often work in your home directory (such as "/home/al"), the /usr/local directories, web page directories, or other users' home directories in /home.
If you're often moving back-and-forth between the same directories, and you use the Bourne shell (sh) or Korn shell (ksh) as your login shell, you can use the CDPATH shell variable to save yourself a lot of typing, and quickly move between directories.
Here's a quick demo. First move to the root directory:
cd /
Next, if it's not set already, set your CDPATH shell variable as follows:
CDPATH=/usr/spool
Then, type this cd command:
cd cron
What happens? Type this and see what happened:
pwd
The result should be "/usr/spool/cron".
When you typed "cd cron", the shell looked in your local directory for a sub-directory named "cron". When it didn't find one, it searched the CDPATH variable, and looked for a "cron" sub-directory. When it found a sub-directory named cron in the /usr/spool directory, it moved you there.
You can set your CDPATH variable just like your normal PATH variable:
CDPATH=/home/al:/usr/local:/usr/spool:/home
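A throwaway demo you can run anywhere (the /tmp paths are made up for illustration):

```shell
#!/bin/bash
# Put "." first so local subdirectories still take priority.
mkdir -p /tmp/cdpdemo/spool/cron
CDPATH=.:/tmp/cdpdemo/spool
export CDPATH
cd /
cd cron > /dev/null   # the shell prints the resolved path; discard it
pwd                   # /tmp/cdpdemo/spool/cron
```

To make the setting permanent, put the CDPATH assignment and export in your ~/.profile.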
Group commands together with parentheses
Have you ever needed to run a series of commands, and pipe the output of all of those commands into yet another command?
For instance, what if you wanted to run the "sar", "date", "who", and "ps -ef" commands, and wanted to pipe the output of all four of those commands into the "more" command? If you try this:
sar -u 1 5; date; who; ps -ef | more
you'll quickly find that it doesn't work: only the output of the "ps -ef" command gets piped through the "more" command, and the rest of the output scrolls off the screen.
Instead, group the commands together with a pair of parentheses (and throw in a few echo statements for readability) to get the output of all these commands to pipe into the more command:
(sar -u 1 5; echo; who; echo; ps -ef; echo; date; echo) | more
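If you want the group to run in the current shell instead of a subshell (for example, so that variable assignments survive the group), curly braces provide the same grouping; a small sketch:

```shell
#!/bin/bash
# Braces group commands without forking a subshell.
# Note the required spaces around the braces and the trailing semicolon.
{ echo one; echo two; } | wc -l   # both lines go through the pipe

n=0
( n=5 )       # parentheses: subshell, the assignment is discarded
echo "$n"     # 0
{ n=5; }      # braces: current shell, the assignment sticks
echo "$n"     # 5
```

(Anything on the left side of a pipe still runs in a subshell regardless of which grouping you use.)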
Use the "at" command to run jobs some other time
Many times it's necessary to schedule programs to run at a later time. For instance, if your computer system is very busy during the day, you may need
to run jobs late at night when nobody is logged on the system.
Solaris makes this very easy with the "at" command. You can use the "at" command to run a job at almost any time--later today, early tomorrow...whenever.
Suppose you want to run the program "my_2_hour_program" at ten o'clock tonight. Simply tell the at command to run the job at 10 p.m. (2200):
/home/al> at 2200
at> my_2_hour_program > /tmp/2hour.out
at> <CTRL><D>
warning: commands will be executed using /bin/ksh
job 890193600.a at Tue Mar 17 22:00:00 1998
Or suppose you'd like to run a find command at five o'clock tomorrow morning:
/home/al> at 0500 tomorrow
at> find /home > /tmp/find.out
at> <CTRL><D>
warning: commands will be executed using /bin/ksh
job 890215200.a at Wed Mar 18 05:00:00 1998
When you're at the "at" prompt, just type the command you want to run. Try a few tests with the at command until you become comfortable with the way
it works.
Create a directory and move into it at the same time
Question: How often do you create a new directory and then move into that directory in your next command? Answer: Almost always.
I realized this trend in my own work habits, so I created a simple shell function to do the hard work for me.
md () {
    mkdir -p "$1" && cd "$1"
}
This is a Bourne shell function named "md" that works for Bourne and Korn shell users. It can be easily adapted for C shell users.
Taking advantage of the -p option of the mkdir command, the function easily creates multi-level subdirectories, and moves you into the lowest level of the directory structure. You can use the command to create one subdirectory like this:
/home/al> md docs
/home/al/docs> _
or you can create an entire directory tree and move right into the new directory like this:
/home/al> md docs/memos/internal/solaris8
/home/al/docs/memos/internal/solaris8>
Searching Multiple CD-ROMs
Date: Fri, 15 Jan 1999 19:55:51 +0100 (CET)
From: JL Hopital, cdti94@magic.fr
My English is terrible, so feel free to correct it if you decide to publish...
Hello, I am a French Linux user and here is my two-cent tip. If you have many CD-ROMs and want to retrieve this_file_I'm_sure_i_have_but_can't_remember_where, it can help.
It consists of two small scripts using the GNU utilities updatedb and locate. Normally updatedb runs every night, creating a database for all the mounted file systems, and locate is used to query this system-wide database. But you can tell them where the files to index are and where to put the database. That's what my scripts do:
The first script (addcd.sh) creates a database for the CD currently mounted. You must run it once for every CD-ROM.
The second (cdlocate.sh) searches the databases created by addcd.sh and displays the CD name and full path of the files matching the pattern you give as a parameter. So you can search unmounted files!
To use:
Beware that locate's regular expressions have some peculiarities, 'man locate' will explain.
- create a directory and copy the two scripts into it:
mkdir /home/cdroms
cp addcd.sh cdlocate.sh /home/cdroms
- mount the first CD-ROM you want to index (if your mount point is different, you must adapt the script):
mount /mnt/cdrom
- run addcd.sh with a fully descriptive name for this CD-ROM as parameter (this description will be used as part of the database name; don't use spaces):
./addcd.sh Linux.Toolkit.Disk1.Oct.1996
It will take some time for updatedb to create the database, especially if the CD-ROM contains many files.
- umount the CD-ROM and go back to step 2 for every CD-ROM you want, or every time you get a new one (I have more than 70 databases created this way).
- you can now use cdlocate.sh to retrieve files:
./cdlocate.sh '*gimp*rpm'
Hope this helps and happy linuxing!
---Cut here------------------------------
# addcd.sh
# Author: Jose-Luc.Hopital@ac-creteil.fr
# Create a filename's database in $DATABASEHOME for the cd mounted
# at $MOUNTPOINT
# Example usage: addcd.sh Linux.Toolkit.Disk3.Oct.1996
# to search the databases use cdlocate.sh

CDNAME=$1
test "$CDNAME" = "" && { echo Usage:$0 name_of_cdrom ; exit 1 ; }

# the mount point for the cd-ROM
MOUNTPOINT=/mnt/cdrom
# where to put the database
DATABASEHOME=/home/cdroms

updatedb --localpaths=$MOUNTPOINT --output=$DATABASEHOME/$CDNAME.updatedb && \
echo Database added for $CDNAME
---Cut here--------------------------------
# cdlocate.sh
# Author : Jose-Luc.Hopital@ac-creteil.fr
# Usage $0 pattern
# search the regular expression in $1 in the databases found in $DATABASEHOME
# to add a database for a new cd-rom , use addcd.sh

test "$*" = "" && { echo Usage:$0 pattern ; exit 1 ; }

DATABASEHOME=/home/cdroms
cd $DATABASEHOME
# get rid of locate warning: more than 8 days old
touch *.updatedb
CDROMLIST=`ls *.updatedb`
for CDROM in $CDROMLIST
do
    CDROMNAME=`basename $CDROM .updatedb`
    locate --database=$DATABASEHOME/$CDROM $@ | sed 's/^/'$CDROMNAME:'/'
done
Recommended Links
In case of broken links, please try a Google search. If you find the page, please notify us of its new location.
Please visit Heiner Steven's SHELLdorado, the best shell scripting site on the Internet
SHELLdorado - Newsletter Archive -- a lot of very useful tips
bash Cookbook Reader - Contributions browse
10 Essential UNIX-Linux Command Cheat Sheets TECH SOURCE FROM BOHOL
Top 10 Best Cheat Sheets and Tutorials for Linux - UNIX Commands
My 10 UNIX Command Line Mistakes
Solaris IAOQ (INFREQUENTLY ASKED AND OBSCURE QUESTIONS )
Etc
String expansion
#!/bin/bash

# String expansion.
# Introduced in version 2 of bash.

# Strings of the form $'xxx'
# have the standard escaped characters interpreted.

echo $'Ringing bell 3 times \a \a \a'
echo $'Three form feeds \f \f \f'
echo $'10 newlines \n\n\n\n\n\n\n\n\n\n'

exit
Indirect variable references - the new way

#!/bin/bash

# Indirect variable referencing.
# This has a few of the attributes of references in C++.

a=letter_of_alphabet
letter_of_alphabet=z

# Direct reference.
echo "a = $a"

# Indirect reference.
echo "Now a = ${!a}"
# The ${!variable} notation is greatly superior to the old "eval var1=\$$var2"

echo

t=table_cell_3
table_cell_3=24
echo "t = ${!t}"
table_cell_3=387
echo "Value of t changed to ${!t}"
# Useful for referencing members
# of an array or table,
# or for simulating a multi-dimensional array.
# An indexing option would have been nice (sigh).

exit 0

Using arrays and other miscellaneous trickery to deal four random hands from a deck of cards
#!/bin/bash2
# Must specify version 2 of bash, else might not work.

# Cards:
# deals four random hands from a deck of cards.

UNPICKED=0
PICKED=1

DUPE_CARD=99

LOWER_LIMIT=0
UPPER_LIMIT=51
CARDS_IN_SUITE=13
CARDS=52

declare -a Deck
declare -a Suites
declare -a Cards
# It would have been easier and more intuitive
# with a single, 3-dimensional array. Maybe
# a future version of bash will support
# multidimensional arrays.

initialize_Deck ()
{
    i=$LOWER_LIMIT
    until [ $i -gt $UPPER_LIMIT ]
    do
        Deck[i]=$UNPICKED
        let "i += 1"
    done
    # Set each card of "Deck" as unpicked.
    echo
}

initialize_Suites ()
{
    Suites[0]=C #Clubs
    Suites[1]=D #Diamonds
    Suites[2]=H #Hearts
    Suites[3]=S #Spades
}

initialize_Cards ()
{
    Cards=(2 3 4 5 6 7 8 9 10 J Q K A)
    # Alternate method of initializing array.
}

pick_a_card ()
{
    card_number=$RANDOM
    let "card_number %= $CARDS"
    if [ ${Deck[card_number]} -eq $UNPICKED ]
    then
        Deck[card_number]=$PICKED
        return $card_number
    else
        return $DUPE_CARD
    fi
}

parse_card ()
{
    number=$1
    let "suite_number = number / CARDS_IN_SUITE"
    suite=${Suites[suite_number]}
    echo -n "$suite-"
    let "card_no = number % CARDS_IN_SUITE"
    Card=${Cards[card_no]}
    printf %-4s $Card
    # Print cards in neat columns.
}

seed_random ()
{
    # Seed random number generator.
    seed=`eval date +%s`
    let "seed %= 32766"
    RANDOM=$seed
}

deal_cards ()
{
    echo

    cards_picked=0
    while [ $cards_picked -le $UPPER_LIMIT ]
    do
        pick_a_card
        t=$?

        if [ $t -ne $DUPE_CARD ]
        then
            parse_card $t

            u=$cards_picked+1
            # Change back to 1-based indexing (temporarily).
            let "u %= $CARDS_IN_SUITE"
            if [ $u -eq 0 ]
            then
                echo
                echo
            fi
            # Separate hands.

            let "cards_picked += 1"
        fi
    done

    echo

    return 0
}

# Structured programming:
# entire program logic modularized in functions.

#================
seed_random
initialize_Deck
initialize_Suites
initialize_Cards
deal_cards

exit 0
#================

# Exercise 1:
# Add comments to thoroughly document this script.

# Exercise 2:
# Revise the script to print out each hand sorted in suites.
# You may add other bells and whistles if you like.

# Exercise 3:
# Simplify