Unit testing bash scripts the cloud-native way

23 March 2021

Joachim De Groot

Unit testing?

Unit testing is a type of software testing in which individual parts of the source code are run against a set of tests to determine whether they work and behave as intended. Unit tests usually run automatically and are written by the same developer who wrote the code. During unit tests, certain (external) dependencies are replaced through a technique called "mocking": faking or simulating the expected behaviour or response of a dependency. Testing frameworks are often used for this, but they are not strictly necessary. Mocking is done because writing tests that depend on real external systems is often impractical or even impossible.

Unit testing bash?

Now, bash scripts are mostly written to automate repetitive and relatively simple tasks. While bash scripts can be very powerful, they can also easily become complex and hard to read. More than once I have started a script that was small and simple, only to watch it grow larger and more complex as I accounted for different situations and added more logic and features. Many teams follow the philosophy of testing everything that could possibly break, and with that in mind we should also unit test our bash scripts. There are a handful of testing frameworks that work with bash scripts. Some have extra dependencies on Python or Lua, but if you want to keep things simple there are frameworks written natively in bash, such as shUnit2, assert.sh and bats-core.
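For comparison, a minimal test in such a framework could look something like this bats-core sketch (an illustration only; the test name and command are made up):

#!/usr/bin/env bats

# Each @test block is one test case; a non-zero exit status means failure
@test "wc counts two lines" {
  result="$(echo -e 'file1\nfile2' | wc -l)"
  [ "$result" -eq 2 ]
}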

The cloud-native way?

The method that I am going to show below is even simpler: I am going to use only functionality that is native to bash to test my script. At FlowFactor, we like to work in an automated and cloud-native way, which is why I am also going to use Docker and Jenkins. The example below is simplified, but it still shows the way of working. Imagine that your company has a Kubernetes cluster and that you are writing a new component: a container with a bash script that will run as a pod inside the cluster. The pod monitors a persistent volume and sends an e-mail when there are three or more crash dumps.

Example script

#!/bin/bash
AMOUNT=$(ls crashdump | wc -l)
if [[ $AMOUNT -ge 3 ]]; then
  mail -s "Alert: Too many files: $AMOUNT" alerts@toomanycrashes.com
fi

The first line counts the files in the crashdump directory and stores the result in AMOUNT. After that, a simple if-statement sends an e-mail only when AMOUNT is three or more.
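To get a feel for the counting logic, you could try it out locally along these lines (an illustrative sketch; the directory and file names are invented):

mkdir -p crashdump
touch crashdump/core.1 crashdump/core.2    # two fake crash dumps
ls crashdump | wc -l                       # prints 2: below the threshold, no mail
touch crashdump/core.3 crashdump/core.4
ls crashdump | wc -l                       # prints 4: the alert would fire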

Example Dockerfile

FROM ubuntu
RUN mkdir /scriptDir
COPY script.sh /scriptDir/script.sh
RUN chmod +x /scriptDir/script.sh
ENTRYPOINT [ "/scriptDir/script.sh" ]

We then create a Dockerfile that creates a directory, copies over our script, makes it executable and sets it as the entrypoint.
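Building and running the image locally could look roughly like this (assuming the Dockerfile and script.sh are in the current directory; the image tag matches the one used in the Jenkinsfile further down):

docker build -t flowfactor/script .
docker run --rm flowfactor/script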

test.sh:

As mentioned above, this is a very simple script, but it's a good example of how you can create a unit test with nothing but native bash functionality.

#!/bin/bash
shopt -s expand_aliases

# Before all tests: mock the mail command so no real e-mail is sent,
# and prepare the results file
alias mail="echo 'alert sent' > output;false"
echo 'Test results for our image/script' > results.txt
count=0

# Test 1 (fewer than three files)
# Before test 1: mock ls to return two files
alias ls="echo -e 'file1\nfile2'"
echo 'No output' > output
# Run the script
. /scriptDir/script.sh
# Check the result (assert)
((count=count+1))
if [[ $(< output) == 'alert sent' ]]; then
  echo "$count. FAIL: Alert mail sent while less than three files" >> results.txt
else
  echo "$count. PASS: Alert mail not sent" >> results.txt
fi

# Test 2 (three or more files)
# Before test 2: mock ls to return four files
alias ls="echo -e 'file1\nfile2\nfile3\nfile4'"
echo 'No output' > output
# Run the script
. /scriptDir/script.sh
# Check the result (assert)
((count=count+1))
if [[ $(< output) == 'alert sent' ]]; then
  echo "$count. PASS: Alert mail sent" >> results.txt
else
  echo "$count. FAIL: Did not send alert mail" >> results.txt
fi

# After all tests: clean up the mocks
unalias ls
unalias mail
# Display test results
cat results.txt
# Exit with code 1 if any test failed
if grep -q "FAIL" results.txt; then
  echo "exiting with exit 1"
  exit 1
fi

The test script consists of a couple of parts: before all, before, execute, assert, after and after all. These are the standard parts of a unit test.
The first command in our script enables the shell option expand_aliases. Bash does not expand aliases in non-interactive scripts by default, so this option makes aliases available inside our script, which is essential to this method of testing bash scripts: we use aliases to mock the commands that our script calls.
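As a small standalone illustration of why that option matters (not part of the actual test script; the mocked command is just an example):

#!/bin/bash
shopt -s expand_aliases
# Replace the real date command with a canned answer
alias date="echo 'mocked date'"
# From here on, 'date' expands to the alias instead of the real command
date   # prints: mocked date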

In the before all, we create an alias for the mail command that mocks away the real mail command. We don’t want to send an actual e-mail while testing our script.

In the before of test 1, we mock the ls command to simulate a directory containing two files. We define it here because, unlike the mocked mail command, the output of ls changes per test scenario.
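When the sourced script later runs ls crashdump, the alias rewrites the command, so the command substitution counts the lines of the canned output instead of looking at a real directory. Roughly:

# With alias ls="echo -e 'file1\nfile2'", this line from script.sh:
AMOUNT=$(ls crashdump | wc -l)
# effectively becomes:
AMOUNT=$(echo -e 'file1\nfile2' crashdump | wc -l)   # AMOUNT is 2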

Next, we run our script. The . before the script path means that we source the script instead of executing it directly, so it runs in the current shell where our aliases are defined; if we executed it as a separate process, a new bash instance would start without our mocks.

After the script ends, we check the output file and write PASS or FAIL to results.txt accordingly. For the first test scenario to pass, we expect the output file not to contain "alert sent".

For test scenario 2, we execute the same steps, except that our mocked ls now returns four files.
For this test to pass, we do expect the output file to contain "alert sent".

At the end of the test script, we unalias the ls and mail commands, print the results.txt and check if it contains any fails. If that is the case, the script will end with exit code 1.

Dockerfile

FROM flowfactor/script:latest
COPY test.sh /scriptDir/test.sh
RUN chmod +x /scriptDir/test.sh
ENTRYPOINT [ "/scriptDir/test.sh" ]

The most important thing in the Dockerfile for the unit test image is that we are starting from the main image and are simply adding our test script and changing the entrypoint.
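Outside of Jenkins, the unit tests can be run locally in much the same way (assuming the test Dockerfile lives in a ./unittest directory, as the Jenkinsfile below implies):

docker build -t flowfactor/unittest ./unittest
docker run --rm flowfactor/unittest
echo $?    # 0 when all tests pass, 1 when a FAIL was recorded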

Jenkinsfile

pipeline {
  agent any
  stages {
    stage('build') {
      steps {
        script {
          image = docker.build("flowfactor/script")
        }
      }
    }
    stage('test') {
      steps {
        script {
          docker.build("flowfactor/unittest", "./unittest")
          sh "docker run flowfactor/unittest"
        }
      }
    }
  }
}

The Jenkinsfile contains two stages. Build performs a docker build of our main Dockerfile and tags the result as our main script image. Test performs a docker build of our unit test Dockerfile and then runs that image to execute the unit tests.

The Jenkins log will show the following output:

+ docker run flowfactor/unittest
Test results for our image/script
1. PASS: Alert mail not sent
2. PASS: Alert mail sent

As mentioned earlier, if the test script detects a failure, it exits with code 1 and the pipeline run fails.
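For example, if the script ever stopped sending the alert, the second scenario would fail and the log would look roughly like this (reconstructed from the test script, not captured output):

+ docker run flowfactor/unittest
Test results for our image/script
1. PASS: Alert mail not sent
2. FAIL: Did not send alert mail
exiting with exit 1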

Conclusion?

To be honest, the script and the test script are pretty silly examples, but they show an effective and easy way to automatically unit test bash scripts using native bash functionality, Docker and Jenkins. I have used this same method to successfully unit test larger and more complex scripts. The only difference is that those had more aliases returning more complex mock data and a more in-depth assert part to check the test results.
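For instance, the repeated assert logic can be pulled into a small helper and the mocks can return richer data. A rough sketch of what that could look like inside a test.sh (the helper and the mock data are made up for illustration):

# Hypothetical helper: compare actual vs expected and record the result
assert_equals() {
  local description=$1 expected=$2 actual=$3
  ((count=count+1))
  if [[ "$actual" == "$expected" ]]; then
    echo "$count. PASS: $description" >> results.txt
  else
    echo "$count. FAIL: $description (expected '$expected', got '$actual')" >> results.txt
  fi
}

# Mocks can return multi-line, more realistic data
alias ls="echo -e 'core.web-1.dmp\ncore.web-2.dmp\ncore.db-1.dmp'"

. /scriptDir/script.sh
assert_equals "Alert mail sent for three dumps" 'alert sent' "$(< output)"

The structure stays exactly the same as in the simple example; only the mocks and the assertions grow along with the script under test.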
