Shell scripting is full of secrets and hidden tricks, so it pays to have a few patterns up your sleeve. For inspiration, try these scripts from real projects, including Homebrew, BashBlog, and nvm. By learning from these examples, you can improve your own shell scripts and master new techniques.
Locate a default config location
To keep your configuration files neat
When you want to specify a config directory for your project, use this pattern:
"${XDG_CONFIG_HOME:-$HOME/.config}/myproject/config"
This value will expand to a directory that your project can use to store per-user configuration files.
A very common pattern, this takes advantage of shell parameter expansion, one of the many expansion types that Bash supports. It also uses the XDG Base Directory Specification, which helps to keep your directory layout clean and standard.
The :- syntax expands to $XDG_CONFIG_HOME if it’s set and non-empty, or to $HOME/.config otherwise. The XDG spec recommends $HOME/.config as the default when XDG_CONFIG_HOME is unset; this pattern respects that.
Todo.txt, a shell script that manages a to-do list file, uses this pattern to define one of the locations it searches for its config file.
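Here’s a small runnable sketch of how the default-value expansion behaves; the “myproject” directory name is just a placeholder:

```shell
#!/usr/bin/env bash
# With XDG_CONFIG_HOME unset, the fallback after :- is used:
unset XDG_CONFIG_HOME
echo "${XDG_CONFIG_HOME:-$HOME/.config}/myproject"   # $HOME/.config/myproject

# With it set and non-empty, the custom location wins:
XDG_CONFIG_HOME=/tmp/xdg
echo "${XDG_CONFIG_HOME:-$HOME/.config}/myproject"   # /tmp/xdg/myproject
```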
Locate an executable program
Ensure your script’s dependencies are met appropriately
To locate a binary from a set of possible alternatives, use a pattern like this:
[[ -f Markdown.pl ]] && markdown_bin=./Markdown.pl \
|| markdown_bin=$(which Markdown.pl 2>/dev/null \
|| which markdown 2>/dev/null)
This pattern uses boolean logic, short-circuiting, and test operators to check a condition (the Markdown.pl file’s existence) to set a variable using either the default file or an alternative. The alternative uses the which command to locate one of two possible backups in the current user’s path.
The resulting variable, markdown_bin, will contain either a path to the appropriate executable or the empty string (which you can test for using -z, or -n for a non-empty result). BashBlog, a simple blog system written in a single bash script, uses this pattern to provide optional support for Markdown using the standard program.
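As a runnable sketch of the same idea, substituting sort (present on virtually every system) for the Markdown programs so the fallback actually resolves:

```shell
#!/usr/bin/env bash
# Locate a binary from a set of alternatives; Markdown.pl stands in for a
# project-local script, and `sort` stands in for a system-wide fallback.
[[ -f ./Markdown.pl ]] && markdown_bin=./Markdown.pl \
    || markdown_bin=$(which Markdown.pl 2>/dev/null \
    || which sort 2>/dev/null)

if [[ -n $markdown_bin ]]; then
    echo "Using $markdown_bin"
else
    echo "No suitable program found" >&2
fi
```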
Generate a random file name
When you need to write to a new file, but you don’t care about its name, try this pattern:
while [[ -f $out ]]; do out=${out%.html}.$RANDOM.html; done
This while loop uses the -f operator again, also taking advantage of the special RANDOM variable. The clever bit is ensuring that the file doesn’t already exist, although it’s a brute force solution: just generate random file names until one of them doesn’t already exist.
$RANDOM isn’t the best approach if you need genuinely random numbers, but it’s fine for this type of use. In this specific case, the final filename will be something like mypost.6592.html, mypost.26005.html, and so on.
BashBlog makes use of this common pattern to send generated output to a temporary file.
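A self-contained sketch, working in a throwaway directory so nothing real gets clobbered; the mypost.html name is invented:

```shell
#!/usr/bin/env bash
# Keep appending random components until we find a name that's free.
cd "$(mktemp -d)" || exit 1
out=mypost.html
touch "$out"   # simulate an existing file so the loop has work to do

while [[ -f $out ]]; do out=${out%.html}.$RANDOM.html; done

echo "Safe to write to: $out"
```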
Require a variable
Defensive programming is good practice
To ensure a variable is defined before doing anything else, use this pattern:
do_stuff() {
    [[ -z $global_variable ]] && return
}
The combination of -z to test for an empty string and the short-circuit && to return early is highly flexible and applicable to a wide range of situations. In this specific case, it’s used to check for a global variable and terminate a function early. This is good practice if you have a function that depends on a value and cannot take any reasonable action other than failing gracefully.
You can apply the same check to positional parameters, effectively making them required by bailing out early when one is missing:
[[ -z $1 ]] && return
Again, BashBlog uses this pattern extensively. Homebrew also uses it in this brew script, which checks common variables like BASH_VERSION, PWD, and HOME.
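Here’s the guard in a runnable sketch; the greet function and its parameter are invented for illustration:

```shell
#!/usr/bin/env bash
greet() {
    [[ -z $1 ]] && return 1   # bail out early if no name was given
    echo "Hello, $1"
}

greet            # prints nothing; returns a failure status
greet "world"    # prints "Hello, world"
```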
Assign parameters to local variables
For anyone who has to maintain your code, including you
Using clear, sensible names for your variables makes your code easier to read and less risky to edit. Here’s an example of a pattern that can help enormously:
foo() {
    local username=$1
    local name=$2
    …
}
Unlike most programming languages, Bash scripting doesn’t support named parameters, either for programs or functions. In their place are positional parameters, like $1 to refer to the first, $2 for the second, and so on. But inside a script or function, especially those on the longer side, $1 and $2 soon become awkward to work with.
This pattern addresses that problem by assigning positional parameters to local variables with readable names at the earliest opportunity.
Reassigning parameters also helps to avoid problems when they are processed in a loop or altered using the set builtin. If your function receives a username in $1, reassigning it will ensure it’s available later, no matter what happens to $1. It also helps in case you ever need to rearrange your function’s parameters; if so, you just need to change the initial assignment right at the top of the function, not every use of $1 within it.
The nvm script uses this pattern, and it’s widely used in many shell scripts.
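A complete sketch of the pattern; create_user and its parameters are hypothetical names:

```shell
#!/usr/bin/env bash
create_user() {
    local username=$1
    local full_name=$2

    # From here on, readable names are used instead of $1 and $2.
    echo "Creating account '$username' for $full_name"
}

create_user jdoe "Jane Doe"
```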
Redirect output from several commands to a file
This pattern can remove a lot of redundancy
You probably already know how to redirect like a pro, but it can still be awkward, especially with multiple commands. Thankfully, there’s a shortcut, and this pattern makes full use of it:
{
command1
command2
} > filename
This pattern will run command1, then command2, redirecting each one’s output to the file named filename. The more commands you need to run, the more you win by not having to repeat the filename and the redirection operator each time, and it’s a lot easier to use a different filename if you need to.
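A quick runnable sketch, writing a small report to a temporary file (the file name and report contents are arbitrary):

```shell
#!/usr/bin/env bash
report=$(mktemp)

{
    echo "Generated: $(date)"
    echo "Working directory: $PWD"
    ls
} > "$report"

# All three commands wrote to the same file, with a single redirection.
cat "$report"
rm -f "$report"
```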
It’s important to note that a very similar grouping syntax exists, using parentheses instead of curly braces:
(
command1
command2
) > filename
The difference is that these commands now run in a subshell, meaning—for example—that any variable assignments are not available outside the grouping.
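You can see the difference with a two-line experiment: the assignment inside parentheses doesn’t survive, while the one inside braces does:

```shell
#!/usr/bin/env bash
x=original

( x=subshell )   # runs in a subshell; the assignment is lost
echo "$x"        # still "original"

{ x=braces; }    # runs in the current shell; the assignment sticks
echo "$x"        # now "braces"
```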
There are additional syntax nuances, but if you use this multiline format, you shouldn’t run into them.
Process a file line by line
Take things one step at a time
This pattern will help you process config files, Markdown text, and other types of plain text files. It’s most useful when each line of a file is mostly independent of the rest:
while IFS= read -r line; do
    …
    command "$line"
    …
done < "$filename"
There’s a lot going on here, so I’ll explain it bit by bit. The read builtin reads a line from stdin and stores it in a variable—line, in this case. It returns false when there’s nothing left to read, so, combined with input redirection from a named file, the loop continues until the whole file has been consumed. Setting IFS (the internal field separator) to the empty string, for the read command only, disables word splitting and preserves leading and trailing whitespace on each line. The -r option stops read from treating backslashes as escape characters.
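Here’s the loop in a self-contained sketch, reading a throwaway config file (the keys and values are invented):

```shell
#!/usr/bin/env bash
filename=$(mktemp)
printf 'name=demo\nversion=1.0\n' > "$filename"

# Each iteration sees exactly one raw line, whitespace intact.
while IFS= read -r line; do
    echo "Read: $line"
done < "$filename"

rm -f "$filename"
```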
Use heredocs to write to a file
This specific syntax looks a lot cleaner
The heredoc syntax is one that you’ll find reason to use again and again, once you’ve picked it up. But it’s even more powerful than you might realize, as this pattern demonstrates:
bar() {
	cat <<- EOF > "$filename"
		Enter lines
		of text here
	EOF
}
Here documents let you feed multiple lines of input to a command without awkward escape characters or repeated syntax. The one in this pattern is even more special, however, with two interesting features.
First, it uses the syntax with a hyphen after the double less-than signs to trim leading tabs from each line between the delimiters. This lets you indent everything to line up nicely, which is a particular concern inside a function—or any other block construct.
Second, it uses cat with an output redirect to send the contents of the heredoc to a file. This makes it trivial to populate a file from a script, whether it’s a README, config file, or something else.
If you want to use variable expansion inside a here doc, make sure you leave the delimiter in the first line unquoted.
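Here’s a runnable sketch of expansion inside a heredoc, using the plain << form so no tab indentation is needed; the file name and config keys are invented:

```shell
#!/usr/bin/env bash
filename=$(mktemp)
project=myproject

# The unquoted EOF delimiter lets $project expand inside the body.
cat << EOF > "$filename"
# Configuration for $project
verbose=true
EOF

cat "$filename"
rm -f "$filename"
```

Quoting the delimiter instead (cat << "EOF") would write $project to the file literally, with no expansion.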
Stand on the shoulders of giants
Try exploring popular shell projects on GitHub for inspiration and to learn new techniques. If you find a certain syntax or command you don’t understand, look it up to understand why it’s used the way it is. Learn from other shell programmers, and your scripting will improve as you go.