How Docker Simplifies Creating a Multi-distribution Linux Shell Script

Do you write Linux shell scripts? Do your scripts work on more than one distribution? If so, what does your development environment look like, and what tools do you use to develop, debug, and maintain your scripts both quickly and relatively hassle-free?

The reason that I ask is that I’ve been tearing through The Insider’s Guide to Technical Writing recently. As a result, I’ve gained a new lease on life as a technical writer.

This isn’t to say that I didn’t have a strong professional work ethic or solid experience prior to reading the book. It’s that since I’ve begun reading it, I’ve felt more confident in how I undertake the role than I ever have before.

Specifically, I only document and approve PRs about topics that I’ve personally tested. If I’m not sure they work, they don’t get my approval.

This might sound strange.

However, like unit testing or security best practices in software, deadline pressure can sometimes push you to get things done quicker than you should, effectively rushing them through without being 100% sure that they work.

That’s what happened recently when I was going through the manual installation section of the ownCloud administration documentation. In response to a new issue, I started a quick run-through of the steps outlined, only to find that:

  1. Some details were missing.
  2. Some were wrong (likely outdated).

These omissions and errors meant there was room for doubt and error — especially for newer ownCloud users.

What’s more, while code samples are very helpful (you don’t have to figure out what to type), they still leave a lot of work for the user, who has to copy and paste them into a terminal manually.

Let’s Make The User’s Life Easier with a Script

As we’re attempting to help them save time and effort, why not provide a script that they can use as part of a build process, one able to be scheduled via Cron?
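
For instance, once such a script exists, a user could schedule it with a crontab entry along these lines (the script path, log path, and schedule are purely illustrative):

```
# Run the (hypothetical) setup script at 02:00 every Sunday
0 2 * * 0 /usr/local/bin/owncloud-setup.sh >> /var/log/owncloud-setup.log 2>&1
```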

So, despite the factual errors and omissions (or perhaps because of them), I became excited at the prospect of revising the documentation.

Now I could have raised an issue with the core development team to create the script, detailing what it should do when it was ready.

However, I’m not only a technical writer; I’m also a software developer. And what do developers love doing more than almost anything else? Designing and writing code!

So it was that I started designing a shell script that would automate the process of installing all of ownCloud’s dependencies.

The Script Outline

The first thing to do was to assess what it needed to do and the environment in which it would run.

As I already knew what it had to do, using the documentation as my guide, I moved on to assessing the environment requirements.

Currently, ownCloud officially supports several Linux distributions. These are:

  • Ubuntu 16.04
  • Debian 7 and 8
  • SUSE Linux Enterprise Server 12 and 12 SP1
  • Red Hat Enterprise Linux/CentOS 6.5 and 7

So when the script was finished, it had to achieve the same outcome, regardless of which distribution it was run on.

A bash shell script seemed to make the most sense. I could have written it in Ruby, Python, or PHP. However, I’ve always associated shell scripts with SysAdmin and DevOps work. What’s more, I had an itch to scratch!

Here’s an admission: I’ve been a bash script hacker since 1999. But I’ve never actually developed my proficiency past a certain point.

So I saw this as an opportunity to grow my skills and learn more about bash while indulging one of my oldest technical passions: Linux. On top of that, I’ve long been curious about how the different distributions organise themselves.

And so the decision was made. Then came the next question:

How would I make it portable, yet not spend more time than was necessary provisioning the different Linux distributions?

VirtualBox, Vagrant, and a provisioner such as Ansible seemed out of the question. If I went down that path, I’d likely spend more time writing the provisioning scripts than writing the actual setup script — which was my main focus!

So I chose to go with Docker instead, as it demands only a limited amount of time and effort to get an environment up and going.

Given that, I created a custom Docker setup, based largely on an existing project, which you can find on GitHub. In it, you can see that it uses Docker-Compose to create a two-container setup.

There’s a web service that provides Apache 2 and PHP 7. And there’s a MySQL container that, surprise, provides MySQL.
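
Conceptually, the Docker Compose file looks something like this (a minimal sketch; the service names, image tags, and credentials are illustrative, not the project’s actual file):

```yaml
version: '2'

services:
  web:
    # Built from one of the three distribution-specific Dockerfiles
    build:
      context: .
      dockerfile: Dockerfile.ubuntu   # or Dockerfile.opensuse / Dockerfile.centos
    ports:
      - "8080:80"
    depends_on:
      - mysql

  mysql:
    image: mysql:5.7
    environment:
      MYSQL_ROOT_PASSWORD: secret
      MYSQL_DATABASE: owncloud
```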

The web service uses one of three Dockerfiles, each based on a different base image: one each for Ubuntu, openSUSE Leap 42.3, and CentOS 7.

Each of them installs a set of packages, sets up a user & group, and sets up some permissions on a required directory so that that user can access it.
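
As a rough sketch, each Dockerfile does something along these lines (the Ubuntu variant shown here; the user name, directory, and package list are my own illustration):

```dockerfile
FROM ubuntu:16.04

# Install a base set of packages
RUN apt-get update && apt-get install -y sudo wget unzip

# Create the user and group the setup script will run as
RUN groupadd library && useradd -g library -m library

# Give that user access to the required directory
RUN mkdir -p /var/www/owncloud && chown -R library:library /var/www/owncloud

USER library
WORKDIR /var/www/owncloud
```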

While they don’t do a lot, they’re still essential. By using them, I was able to write a shell script that achieved the same outcome across each distribution.

The script doesn’t actually do a lot, but here’s a quick summary:

  1. Check that the user isn’t root.
  2. Make sure the directory is readable and writable by the web user.
  3. Install the required packages.
  4. Run the installer.
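
Sketched in bash, the top-level flow looks something like this (the function and path names here are my own illustration, not necessarily those in the actual script):

```shell
#!/usr/bin/env bash
# A minimal sketch of the script's top-level flow.

check_not_root()
{
  # Refuse to run as root
  if [ "$(id -u)" -eq 0 ]; then
    echo "Please run this script as a regular user, not as root." >&2
    return 1
  fi
}

check_web_directory()
{
  # The web user must be able to read and write the install directory
  local dir="$1"
  [ -d "$dir" ] && [ -r "$dir" ] && [ -w "$dir" ]
}

main()
{
  check_not_root || exit 1
  check_web_directory "/var/www/owncloud" || exit 1
  install_required_packages   # shown later in the post
  run_installer               # hypothetical wrapper around ownCloud's installer
}
```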

Doing so taught me:

  • What could stay the same
  • What had to change
  • What else differed between the environments

Interestingly, the key step was determining which distribution was being used, so that the script knew which package manager to use to install the required packages, and what the packages were called.

This was done with two functions: which_distro and install_required_packages.

Function 1: which_distro

function which_distro()
{
  case "$( grep -Eoi 'Debian|SUSE|Ubuntu' /etc/issue )" in
    *Debian*) echo "Debian" ;;
    *SUSE*) echo "SUSE" ;;
    *Ubuntu*) echo "Ubuntu" ;;
  esac

  # Need to do a bit more work to detect RedHat-based distributions
  redhat_release_file="/etc/redhat-release"
  if [ -e "$redhat_release_file" ]; then
    case "$( grep -Eoi 'CentOS' "$redhat_release_file" )" in
      *CentOS*) echo "CentOS" ;;
    esac
  fi
}

The function first greps /etc/issue for one of Debian, SUSE, or Ubuntu. If it contains one of the three strings, then we know it’s that distribution. I did this because I’ve found that these distributions consistently identify themselves there.

Determining if the distribution was RedHat or CentOS was a bit harder, as these two don’t always store the identifying information consistently in /etc/issue.

They can store it there, but they can also store it in /proc/version as well as in /etc/redhat-release. Of these, /etc/redhat-release seems to be the most consistent.

Given that, if /etc/redhat-release is available, then we know that one of the two distributions is being used. From there, the script greps for which of them it is, similar to the previous approach.
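
To see the check in action without a CentOS box handy, you can run the same grep against a sample release string (the file contents below are illustrative of what a CentOS 7 box typically contains):

```shell
# Simulate /etc/redhat-release with a typical CentOS 7 release string
printf 'CentOS Linux release 7.4.1708 (Core)\n' > /tmp/redhat-release.sample

# The same grep the script uses; prints the matched text: CentOS
grep -Eoi 'CentOS' /tmp/redhat-release.sample
```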

Function 2: install_required_packages

function install_required_packages()
{
  case "$( which_distro )" in
    "SUSE") echo "Installing required packages on SUSE"; install_required_suse_packages ;;
    "Ubuntu"|"Debian") echo "Installing required packages on Ubuntu/Debian"; install_required_ubuntu_debian_packages ;;
    "CentOS") echo "Installing required packages on CentOS"; install_required_centos_packages ;;
  esac
}

The second function just uses which_distro to determine which distribution is being used, and then:

  1. Provides visual confirmation to the user about which distribution it’s going to install the packages for.
  2. Calls a distribution-specific function to install the required packages.

Now for the more interesting part, the distribution-specific installers.


I started off with Ubuntu/Debian. I’ve got the most experience with them, as I’ve been using them since about 2003. What’s heartening is that they required the least amount of effort, though I know a number of their idioms and quirks, so perhaps I’m biased.

function install_required_ubuntu_debian_packages()
{
  sudo apt-get -y -q update && \
    sudo apt-get install -y -q wget make npm \
      nodejs nodejs-legacy unzip git
}


I then refactored the script to work with openSUSE. To be honest, while I live in Nuremberg (the hometown of SUSE, I believe), I’ve barely used it.

What’s more, it took the most effort to code. While the changes largely amount to using Zypper instead of Apt, it took some experimenting both to get the base environment working and to find the right combination of packages and dependencies.

function install_required_suse_packages()
{
  sudo zypper --quiet --non-interactive install \
    wget make nodejs6 nodejs-common \
    unzip git npm6 phantomjs
}


Finally, I added an installer for CentOS. While it looks quite large by lines of code, it took less effort than openSUSE. To be fair, it took some time to figure out how to get PhantomJS up and going, but not all that much.

function install_required_centos_packages()
{
  sudo yum update -q -y
  sudo yum --enablerepo=cr -q -y install wget make nodejs unzip git npm bzip2 file

  # Install PhantomJS.
  # It's not in the official repos, so it needs to be installed independently.
  sudo yum install -q -y fontconfig freetype freetype-devel fontconfig-devel libstdc++
  sudo mkdir -p /opt/phantomjs
  # Assumes phantomjs-2.1.1-linux-x86_64.tar.bz2 has already been downloaded
  sudo tar -jxvf phantomjs-2.1.1-linux-x86_64.tar.bz2 --strip-components 1 --directory /opt/phantomjs/
  sudo ln -s /opt/phantomjs/bin/phantomjs /usr/bin/phantomjs
}

What About RHEL?

At this stage, I’ve not completed refactoring the script to work with RHEL. I expect to have that done later in the week.

What Was Learned

It’s been an interesting journey building a shell script that works across multiple Linux distributions. I learned some things, including:

  • How distributions don’t necessarily do the same things in the same way.
  • How some packages are readily available across each distribution, while others you have to find and install all on your own.
  • How using Docker meant I wasted almost no time provisioning each distribution. This cemented for me why it’s my preferred approach to setting up and deploying software.

While it’s frustrating that each distribution doesn’t provide the same packages as the others, it makes sense. They’re created by different people, to serve different audiences and needs.

It makes no sense for them to be identical. And it’s something to keep in mind when you’re writing shell scripts. It may save you a lot of confusion and frustration.

In Conclusion

And that’s been a whirlwind run through of both how Docker’s helped me create a shell script that works across multiple Linux distributions, as well as a bit of a step-through of the relevant sections of the script.

If you’re a bash expert (or more of an expert than I am), I’d love to know how you would improve or change the script. I’d love to know if there are better, quicker, or easier ways to do it.

Please leave your feedback in the comments or the discussion of the script on GitHub.

Do you need to get your head around Docker Compose quickly?

What about needing to dockerize existing applications to make them easier to deploy, reducing the time required for developers to get started on projects, or learning how to debug an existing Docker Compose-based app? Then this free book is for you!
