This is the standard modern shell. You should always script in Bash for portability.
Needing to script in the original Bourne shell is a rare exception (bootstrapping Alpine is one of the few use cases for that).
- Core Reading Material
- Bash vs Other Languages
- Advanced Library of Scripts
- Perl, Awk, Sed
- JSON
- Binaries Debugging
- Commands
- Tips & Tricks
- Debugging
- Other Cool Resources
- Style Guide
You may occasionally see the following in DevOps job specs:
10 lines of Bash is better than 100 lines of Java
This is true, but also, if you're only writing 100 lines of Java, you probably don't have enough error handling.
All scripts should have the following set at the top of them.
If you haven't accounted for every exit code and variable in the script, it should crash for safety, like other languages do, until you fix your code.
set -euo pipefail
See the Debugging section further down for details.
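Here's a minimal sketch of what the top of a script looks like with this applied (the greeting logic is purely illustrative):

```bash
#!/usr/bin/env bash
set -euo pipefail

# referencing an unset variable aborts the script (set -u),
# so default optional arguments explicitly
name="${1:-world}"

# a failure anywhere in this pipeline aborts the script
# (set -e combined with set -o pipefail)
echo "Hello, $name" | tr '[:lower:]' '[:upper:]'
```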
1000+ DevOps Bash Scripts
AWS, GCP, Kubernetes, Docker, CI/CD, APIs, SQL, PostgreSQL, MySQL, Hive, Impala, Kafka, Hadoop, Jenkins, GitHub, GitLab, BitBucket, Azure DevOps, TeamCity, Spotify, MP3, LDAP, Code/Build Linting, pkg mgmt for Linux, Mac, Python, Perl, Ruby, NodeJS, Golang etc.
Also contains advanced configs, eg: .bashrc, .vimrc, .gitconfig, .screenrc, .tmux.conf etc.
https://github.com/HariSekhon/DevOps-Bash-tools
This is more than the manuals above: you could study this repo for years, or just run its scripts today to save yourself the time.
You need to learn at least some basic one-liners of Perl, Awk and Sed to be proficient in shell scripting.
You also need to learn Regex to use these tools effectively.
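A few classic one-liners of the kind worth memorizing (access.log is just a placeholder filename):

```bash
# print the first whitespace-delimited column of each line
awk '{print $1}' access.log

# sum the values in the third column
awk '{sum += $3} END {print sum}' access.log

# replace all occurrences of foo with bar in-place (GNU sed; BSD/Mac sed needs -i '')
sed -i 's/foo/bar/g' access.log

# the same in-place edit in Perl, which behaves the same on Linux and Mac
perl -pi -e 's/foo/bar/g' access.log
```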
See the JSON doc for commands to help with processing JSON, which is often output by modern REST APIs.
See the Binaries Debugging doc for commands to examine and work with binaries.
See Also:
- Disk Management commands
Some less well known commands to remember:
Command | Description |
---|---|
cmp | compare whether two files differ. Shorter than doing an md5sum on both (md5 on Mac) |
comm | print or omit lines that are common or unique between two files |
expand | expands tabs to spaces |
unexpand | converts spaces to tabs |
fmt | simple text formatter, reflows paragraph text to fit a given line width |
join | join matching records from multiple files |
logger | sends messages to the system log, ie. syslog /var/log/messages |
mail | send email from the command line |
whatis / apropos / man -k | search for man pages containing the argument string |
command | execute a binary from the path instead of a function of the same name |
builtin | execute a shell builtin instead of a function of the same name |
dialog | create an interactive curses menu |
say | Mac command that speaks the words piped in. I use this to impress my kids by making the computers talk |
type | Tells you what a command is, the path to a binary or a shell builtin. -P returns only the binary |
xev | Prints keystroke events from the Linux X server GUI |
tail --follow=name --retry <filename> | GNU tail can retry and continue following a file even if it's renamed or recreated |
split | Split a text file into smaller parts by lines or bytes. Useful for parallel data processing |
csplit | Split a file by context, splitting on a pattern such as a regex |
pr -m | Prints files into columns |
column -t | Prints stdin into aligned columns to tidy up output (used in various scripts in DevOps-Bash-tools) |
paste | merges lines of text files to stdout, eg. paste -s -d+ \| bc |
open | Mac command to open a URL in the default web browser |
xmessage | Linux X GUI pop-up message command |
xclip | Copy stdin to the Linux X UI clipboard, or the clipboard to stdout |
pbcopy | Copy stdin to the Mac UI clipboard |
pbpaste | Prints the Mac UI clipboard to stdout |
gprof | Profiles an executable's time spent in each function call. Compile with gcc -pg first |
expect | Controls interactive programs by sending them timed input from an 'expect' script |
autoexpect | Generates an expect script automatically from an interactive run |
flock | File lock, useful for advanced scripting. I'd previously written flock.pl to allow shell scripts to use programmatic file locking |
script | Records everything you type in the shell to a file, giving you an outline for a script |
tee | Pipes output to one or more file arguments as well as stdout for further piping. The -a switch appends instead of overwriting the file |
fold | Wraps text to the width specified by -w N. Use -s to only fold on spaces, not mid-word |
pandoc | Universal document converter (eg. generate_repos_markdown_table.sh) |
iconv | Convert between character encodings |
hexyl | Hex terminal viewer https://github.com/sharkdp/hexyl |
file | Determines file type |
pigz | Parallel implementation of gzip. Call it from tar using the -I option: tar -I 'pigz -9' -cvf myfile.tar.gz * |
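A couple of the above in action (filenames are placeholders); note that comm requires its inputs to be sorted:

```bash
# print only the lines common to both files (suppress columns 1 and 2,
# which hold the lines unique to each file)
comm -12 <(sort file1.txt) <(sort file2.txt)

# tidy ragged output into aligned columns
printf 'name size\nreport.csv 1024\nbackup.tar.gz 534288\n' | column -t
```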
Environment variables to keep in mind:
Variable | Description |
---|---|
EDITOR | Set the editor to open automatically in unix commands like visudo |
TMOUT | Times out the shell after N seconds of inactivity (also the default timeout for the read builtin) |
RANDOM | A random number |
CDPATH | List of directories that a cd command will search when given only a directory basename |
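For example (the values and paths here are just illustrative):

```bash
export EDITOR=vim         # visudo, crontab -e etc. will now open in vim
export TMOUT=600          # log an idle shell out after 10 minutes
echo "$RANDOM"            # prints a pseudo-random number between 0 and 32767
export CDPATH=.:~/github  # 'cd some-repo' now works from any directory if ~/github/some-repo exists
```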
Treat a process as a file handle to read from:
<(somecommand)
Treat a process as a file handle to write in to:
>(somecommand)
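For example, to compare the output of two commands without creating temporary files (the filenames are placeholders):

```bash
diff <(sort file1.txt) <(sort file2.txt)
```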
A neat trick is to tee /dev/stderr
to have the output appear on your screen while also sending it onwards for further
processing in a shell pipeline.
echo stuff | tee /dev/stderr | xargs echo processing
Tee to two programs:
echo stuff | tee >(cat) | cat
Something that seemed cool in the 2000s was FIFO pipes (first in first out). These are special files that one process can write into and another process can read from:
mkfifo /tmp/test.fifo
This hangs if there isn't another process reading from the fifo pseudo-file:
echo stuff > /tmp/test.fifo
in another shell:
cat /tmp/test.fifo
In practice, I can't recall finding a need for this since the 2000s. There are usually better solutions.
FIFOs have no real security though, relying only on file permissions to stop somebody or some other program writing unexpected input into the listening program, which may not be coded defensively enough. In practice, people just use temporary files between processes not started in the same shell if they really have to. Situations requiring long-running IPC would probably be better handled in a real programming language.
Show line numbers:
- cat -n - numbers all output lines
- less -N - shows line numbers in the pager
- nl - numbers the lines of a file
- !n - re-runs command number n from the history
- !$ - the last argument of the previous command, usually a filename. Useful to run another command on the file you just operated on
- !:n* - takes the args from the Nth to the last of the previous command
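For example (the log path is just illustrative):

```bash
wc -l /var/log/syslog   # count the lines in a log
vim !$                  # !$ expands to /var/log/syslog, opening the same file
```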
Add readline support (command history - the ability to press Up then Enter to re-execute previous commands) to tools that lack it, like Oracle's SQL*Plus.
Prefix any command with rlwrap
:
rlwrap <command>
This works by intercepting user input, storing it, and replaying it when you press Up or Down, essentially giving you command history.
This is usually available in the rlwrap package on RHEL and Debian-based Linux systems, and via brew on Mac.
Prints commands as a script runs so you can see what command generated an error:
set -x
Fail if any command returns an unhandled non-zero exit code.
If your script has unhandled errors it should die, so that you know what it is doing at all times and it doesn't cause unintended consequences:
set -e
In Bash, but not available in old Bourne shell:
set -o pipefail
This fails on non-zero exit codes from commands within pipes, which would otherwise be masked by the exit code of the last command in the pipeline.
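A quick demonstration of the difference:

```bash
false | true
echo $?   # prints 0 - the failure of 'false' is masked by the last command

set -o pipefail
false | true
echo $?   # prints 1 - the pipeline now reports the failed command
```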
This prevents commands running on typo'd or otherwise unset variables, which would silently expand to an empty string, as that can have disastrous consequences (note it doesn't catch a variable that is set but empty, eg. one set to the output of a command that returned nothing):
set -u
Imagine rm -fr "/apps/$empty_variable", which would delete all the apps instead of just the expected one, or worse, run against / it would delete the entire operating system and all data on it.
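With set -u, referencing a variable that was never set aborts the script before any damage is done:

```bash
set -u
echo "/apps/$empty_variable"   # bash: empty_variable: unbound variable - the script exits here
```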
Start a clean shell without any functions, aliases or other settings to help in debugging:
env - bash --norc --noprofile
In DevOps-Bash-tools this is a function called cleanshell.
- Greg's Wiki - Wooledge.org - the grumpy old greycat guy on IRC in the 2000s would often send noobs to his classic resource
- Shelldorado
- explainshell.com - explains a bash shell statement
- Reddit - r/bash
- ShellCheck - online version of the popular shellcheck command line tool to find bugs and improvements to make in shell code
Google Shell Guide - I don't always agree with everything in here but here it is if you're interested
Points I disagree with the Google style guide on:
- 2 space indentation - Python already set the standard with 4 space indentation for ease of readability 20+ years ago
- 80 character width is also antiquated. 100 or 120 char width is probably fine for most people; you are unlikely to be editing scripts on an old 80 character console
- shell pipes should not all be on one line unless they're a trivial couple of commands
  - any change will have a larger blast radius when trying to scan which part of the line changed
  - split command pipes one command per line for easier git diffing, showing just the command that changed (see the sketch after this list)
- ${var} - surrounding variables with braces is only needed for variables that touch other strings and would otherwise be misinterpreted. You don't get paid to put in extra characters everywhere
  - the Google guideline then tells you not to bother doing it for single character variables unless they touch an adjacent string, but doesn't follow this same logic for full word variables
- [[ is more advanced and less portable than [ - only use it when you need regex matching
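A sketch of the pipe formatting and [[ points (the commands and log file are just illustrative):

```bash
# one command per line - a git diff shows exactly which stage of the pipeline changed
grep -v '^#' access.log |
    awk '{print $1}' |
    sort |
    uniq -c |
    sort -rn |
    head -n 10

# [[ ]] when you genuinely need regex matching
if [[ $1 =~ ^[0-9]+$ ]]; then
    echo "numeric argument"
fi
```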
Partial port from private Knowledge Base page 2008+