Anything to do with software and computers usually ends up getting tricky, time-consuming, and repetitive. As such, if you value your sanity, you'll want to regularly look for ways to automate these tasks away. One of the best ways to do that is by using Make, a veteran automation tool.
If you're under about 35 years of age, you can be forgiven for not having heard of Make. It first appeared in April of 1976 (a great year and month, btw), created by Stuart Feldman. If you're familiar with Ant, Rake, MSBuild, Phing, Capistrano, Maven, or tools of this nature, then you'll know, roughly, what to expect.
If not, or if you need a refresher, the manual describes it as:
GNU Make is a tool which controls the generation of executables and other non-source files of a program from the program’s source files. Make gets its knowledge of how to build your program from a file called the makefile, which lists each of the non-source files and how to compute it from other files.
While the description talks about software generation, Make's an automation tool. It gets its information from a Makefile, which contains a set of directives, known as "targets", each of which carries out a particular task.
As a working example, I’m going to step you through part of a Makefile that I wrote, earlier in the week, to help me improve the quality of the content I produce for the ownCloud documentation. The intent of the Makefile is to check the written prose, to ensure that it doesn’t use poor or weak words, and doesn’t contain any grammar mistakes. To do this, I’m making use of the NPM command-line tool write-good.
Let's get started by looking at the first target, called check_all_files_prose.
check_all_files_prose:
	@echo "Checking quality of the prose in all files"
	write-good --parse modules/{administration,developer,user}_manual/**/*.adoc
The first line is the name of the target, which is what you tell make to execute. The second line prints the string in inverted commas to the screen. And the third line calls the command-line utility write-good, passing it the argument --parse, along with a glob expression, which is the path to one or more AsciiDoc files.
Assuming that both write-good and GNU Make are already available, I can run this target with the following command:
make check_all_files_prose
I need to work on that target name, as it’s not the most intuitive. Right?!
Now, only being able to call existing commands and use simple glob expressions isn't the most helpful. Thankfully, there's more to GNU Make and Makefiles than this simple example suggests.
Let’s say that, instead of parsing every AsciiDoc file in an Antora installation, I want to check AsciiDoc files that are staged for commit. Here’s how I’d do that:
FILES=$(shell git diff --staged --name-only | grep -E '\.adoc$$')

check_staged_files_prose:
	@echo "Checking quality of the prose in the changed files"
	$(foreach file,$(FILES),write-good --parse $(file);)
The first line initialises a variable called FILES with the list of staged AsciiDoc files. The call to git retrieves all the staged files, and grep filters out all but the AsciiDoc files. Ideally, you'd check Markdown, reStructuredText, text, and any other file that contains human prose. But my needs are simple in this case. Note that the two shell commands are wrapped inside $(shell and ). This tells Make that the commands need to be executed in a sub-shell, and that we want to store the result in the variable we're initialising.
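If you did want to cover those other formats, a small variation on the FILES variable might look like the following sketch; the PROSE_FILES variable and show_staged_prose_files target are illustrative names, not part of the ownCloud Makefile.

# Hypothetical variant: also match Markdown, reStructuredText, and plain-text files.
# The $$ escapes the dollar sign so that Make passes a literal $ (the end-of-line
# anchor) through to grep. Remember that recipe lines must start with a tab.
PROSE_FILES=$(shell git diff --staged --name-only | grep -E '\.(adoc|md|rst|txt)$$')

show_staged_prose_files:
	@echo "Staged prose files: $(PROSE_FILES)"

As an aside, using := instead of = tells Make to run the shell command once, when the Makefile is read, rather than every time the variable is expanded.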
Now that FILES is initialised, we can use it in the new target. As before, I've used an @echo statement to let the user know what's happening. That's not so important, as we've seen that before.
What is important is the final line. This is the GNU Make form of a foreach loop. I'm stressing GNU Make, as different versions of Make implement the loop differently.
With that said, the command is, again, wrapped in $(), which indicates a Make function call. The first string, foreach, tells Make the function to run. The second string, file, provides the name of the loop variable. The third, $(FILES), is the list that's being looped over. The string after the second comma is the command to run for every iteration of the foreach, which passes the value of $(file) to a call to write-good.
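If it helps to see the syntax in isolation, here's a minimal, self-contained sketch of the general form, $(foreach var,list,text); the NAMES variable and greet target are invented purely for illustration.

# Invented example: the foreach expands to one long shell command:
# echo "Hello, alice"; echo "Hello, bob"; echo "Hello, carol";
NAMES = alice bob carol

greet:
	@$(foreach name,$(NAMES),echo "Hello, $(name)";)

Running make greet prints one greeting per name.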
OK, so you've seen how to call existing command-line tools from a target, initialise a variable from the output of a shell command, and loop over a list of files with foreach.
Let’s now finish up by seeing how to create user-defined functions.
You can see the syntax for defining a function in a Makefile in the example below.
define generate_pdf_manual
	asciidoctor-pdf $(1) \
	    -a pdf-stylesdir=$(STYLESDIR)/ \
	    -a pdf-style=$(STYLE) \
	    -a pdf-fontsdir=$(FONTSDIR) \
	    -a examplesdir=$(BASEDIR)/modules/$(3)/examples/ \
	    -a imagesdir=$(BASEDIR)/modules/$(3)/assets/images/ \
	    -a appversion=$(APPVERSION) \
	    --out-file $(2) \
	    --destination-dir $(BUILDDIR)
endef
It defines the function generate_pdf_manual, which calls the asciidoctor-pdf command. Asciidoctor-pdf generates a PDF file from a collection of AsciiDoc files. To further demonstrate how to use pre-defined variables, I've called the command with a number of arguments, passing a user-defined variable to each one. Note that you have to wrap each one in $(), similar to executing a command.
You can see that it’s not overly complex. It may appear so, as I’ve referenced so many command arguments. But I’ve tried to keep it readable by splitting the command across several lines.
To use the function in a target, we need to use Make's call function, as follows:
$(call generate_pdf_manual,book.admin.adoc,administration_manual.pdf,administration_manual)
Similar to the shell command that we've seen previously, call executes a user-defined function. The first argument is the name of the function to execute. Subsequent arguments (you can have as many as you like) are passed directly to the function, and are referenced by their numeric order, e.g., $(1) and $(2). You can see this in the previous, and in the following, example.
It takes a little getting used to, but it grows on you after a while.
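If you'd like to see the mechanics stripped right back before we move on, here's a tiny, hypothetical define/call pair; the copy_file function and backup target aren't from the documentation Makefile.

# Invented example: $(1) and $(2) are replaced with the first and second
# arguments passed to call.
define copy_file
cp $(1) $(2)
endef

backup:
	$(call copy_file,notes.adoc,notes.adoc.bak)

When make backup runs, the recipe expands to cp notes.adoc notes.adoc.bak.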
Now what if we want to get a bit fiddly, and execute a target conditionally? To do that, we’d do something similar to the following.
define optimise_pdf_manual
	[ -f $(BUILDDIR)/$(1) ] && \
	    cd $(BUILDDIR) \
	    && optimize-pdf $(1) \
	    && rm $(1) \
	    && rename 's/\-optimized//' * \
	    && cd -
endef
The first line looks similar to a Bash test construct. It first tests if the file exists in the build directory ([ -f $(BUILDDIR)/$(1) ]). If it does (&&), then we execute the commands that follow.
This command sequence changes into the build directory, runs optimize-pdf on the file, removes the original PDF, renames the optimised version to strip the -optimized suffix, and then changes back to the previous directory.
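To put the function to use, you'd invoke it from a target with call, just as before. Here's a sketch; the target name, and the assumption that the administration manual's PDF has already been generated, are mine.

optimise_admin_manual:
	$(call optimise_pdf_manual,administration_manual.pdf)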
While Make is over 42 years old as I write this, and other build tools likely have a simpler, cleaner syntax, you're not likely to find a build/automation tool that's as universally available, well understood, and well documented.
I’ve only scratched the surface in what I’ve covered in this post. If you take even a brief glance at GNU Make’s documentation, you’ll get a feel for just how capable and powerful it is.
It is, to use an Australian-ism, a bit “crusty”. Other, more recent build automation tools, at least to my mind, have far cleaner syntax. However, it’s such a powerful tool, one that I’ve loved learning and exploring in recent months.
If you’re looking to invest in a build tool, one that you know you can get support for, and that will be maintained well into the future, then GNU Make is the one for you.
Are you already a GNU Make legend? What are your pro/power tips? What are your tips for newcomers?
I’d love to hear your suggestions in the comments below.