Makefile is Still Cool in 2020
What we will discuss today is a simple Makefile pattern that does not require you to fully adopt C/C++ in your shop in order to take advantage of a couple of Make's time-saving perks.
Makefiles: Improving developer workflow
Quickly, what is a makefile?
A makefile is a written set of task instructions, originally intended to tell the make utility how to compile and link a program. These tasks are defined by a set of commands you might typically type in your terminal. They are, in fact, just commands you would type in the terminal. We want to focus on how these task definitions become a map of what is possible within a project.
Makefile as a common interface for multiple languages
What we will discuss today is a simple Makefile pattern that provides several time-saving perks without requiring a thorough indoctrination in C. This pattern also does not require you to abandon an existing collection of helper scripts and development tools; the changes will feel like a seamless improvement to the original code maintainers.
Far too often, contractors and new developers prefer to simply rewrite complex systems into more streamlined versions. I share this preference and have rewritten my fair share of legacy build-deploy systems. Despite this natural tendency, how can a new engineer improve an existing developer workflow with a small amount of time?
The Problem: Many steps, repos, languages, and minds
Makefiles showed up around forty years ago. Back then the computing landscape looked very different from today's twenty-two billion (or more) Internet-connected device count. Despite the many differences that arise over time, programmers then had very similar issues to contend with. Programmers still had to compose logic in written form, which is prone to errors and mistakes made by the author. Past coders also had to find a way to deliver their work into another environment, namely another computer aside from the device it was developed on. I am positive the old adage "it works on my machine" was coined early in software engineering history.
One of the reasons Make was originally developed was to cut down on the amount of cognitive load one has to juggle when creating software. Local compilation requirements can become very cumbersome, even if you are the author; even worse if you are not the author and are untangling a series of opaque deployment failures.
The problems we face now are more languages, larger teams, and more repositories (given the popularity of micro-service repositories and specialized languages). Without a clear contract to follow, engineers must do their best to remember all the distinct patterns and dependencies within a particular repository (because, try as they might, many engineers still struggle to keep READMEs and other documentation up-to-date).
Here is a small example scenario of such a developer-test workflow based on some popular tooling. In these examples, the two script snippets belong to two different repositories with different setup needs and are ultimately deploying to a lambda function.
Updating/testing a python flavored AWS lambda function
# install the "right" version of python
# install the aws cli
rm -rf __pycache__
cd src
mkdir -p python/lib/python3.7/site-packages
pip install -r requirements.txt -t python/lib/python3.7/site-packages
zip -r src.zip *
aws s3 cp src.zip s3://my-bucket/src.zip
aws lambda update-function-code --function-name myFunc --s3-bucket my-bucket --s3-key src.zip --publish
# then go to console.aws.amazon.com/lambda/functions/myFunc
# click the play button with contents {"test": 42}
Updating/testing a node flavored AWS lambda function
# install the "right" version of node
# install the aws cli
cd src
npm install
zip -r src.zip *
aws s3 cp src.zip s3://my-bucket/src.zip
aws lambda update-function-code --function-name myOtherFunc --s3-bucket my-bucket --s3-key src.zip --publish
# then go to console.aws.amazon.com/lambda/functions/myOtherFunc
# click the play button with contents {"test": 42}
Consider the above sets of commands that would be typed in order to populate an environment for user-testing code changes. Let's say developers in this scenario are required to run these commands before each change can be viewed. First we want to translate the steps out of code and into plain language, especially the few lines which are meant for a human to go and click through a GUI.
- Setup the local build environment
- Clean up local build environment
- Gather project dependencies
- Create build artifact
- Update/deploy project infrastructure
- Test project
We could convert them into equivalent make action verbs. Converting the original commands into plain language is useful because a common vocabulary across languages/repositories is one key advantage of this technique. Ideally, all the original commands are captured within the new vocabulary in a common place that makes sense to all project maintainers.
Makefile
.PHONY: setup clean build deploy test

setup:
	@echo "setup build environment"
	@sudo apt-get -y install python3.7 \
	&& curl "https://s3.amazonaws.com/aws-cli/awscli-bundle.zip" -o "awscli-bundle.zip" \
	&& unzip awscli-bundle.zip \
	&& sudo ./awscli-bundle/install -i /usr/local/aws -b /usr/local/bin/aws

clean:
	@echo "cleaning build environment"
	@rm -rf __pycache__
	@rm -rf src/python/lib/python3.7/site-packages

build:
	@echo "building artifacts"
	@cd src \
	&& mkdir -p python/lib/python3.7/site-packages \
	&& pip install -r requirements.txt -t python/lib/python3.7/site-packages \
	&& zip -r src.zip *

deploy:
	@echo "deploying"
	@aws s3 cp src/src.zip s3://my-bucket/src.zip \
	&& aws lambda update-function-code --function-name myFunc --s3-bucket my-bucket --s3-key src.zip --publish

test:
	@echo "smoke testing, expecting answer 42"
	@aws lambda invoke --function-name myFunc \
	--payload '{"test": 42}' \
	--invocation-type RequestResponse \
	--log-type Tail myFunc.out | jq '.LogResult' -r | base64 --decode
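For contrast, the node lambda from the first scenario can be given the identical verb interface. Here is a sketch reusing the names from that snippet (`myOtherFunc`, `my-bucket`); node version pinning is left out:

```makefile
.PHONY: setup clean build deploy test

setup:
	@echo "setup build environment"
	@curl "https://s3.amazonaws.com/aws-cli/awscli-bundle.zip" -o "awscli-bundle.zip" \
	&& unzip awscli-bundle.zip \
	&& sudo ./awscli-bundle/install -i /usr/local/aws -b /usr/local/bin/aws

clean:
	@echo "cleaning build environment"
	@rm -rf src/node_modules src/src.zip

build:
	@echo "building artifacts"
	@cd src \
	&& npm install \
	&& zip -r src.zip *

deploy:
	@echo "deploying"
	@aws s3 cp src/src.zip s3://my-bucket/src.zip \
	&& aws lambda update-function-code --function-name myOtherFunc --s3-bucket my-bucket --s3-key src.zip --publish

test:
	@echo "smoke testing, expecting answer 42"
	@aws lambda invoke --function-name myOtherFunc --payload '{"test": 42}' myOtherFunc.out \
	&& cat myOtherFunc.out
```

Two repositories, two runtimes, one vocabulary: `make setup build deploy test` now means the same thing in both.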
Executable Documentation
This process has created an on-boarding script for new developers who join the team later. Instead of taking hours or days to set up a new developer's environment, a new engineer can simply run the make commands (`make setup build deploy test`) to get up and running. If any issues are discovered, the developer can simply say "the make setup step doesn't work for me", making triage straightforward.
Discussions about specific points in the codebase are arguably more streamlined under this vocabulary than they would be after a rewrite. Not only is the clear vocabulary more useful than a broad refactor, it is _faster to implement_. These simplified verbs become more valuable the more repositories a team manages; by contrast, a refactor of N repositories tends to cost at least linearly more time as N grows.
This executable documentation is of a better caliber than _traditional_ on-boarding documents, which may or may not have executable snippets. If they do happen to have snippets, those snippets become outdated rapidly. Code that is not actually executed, and executed often, will inevitably rot. Makefile code is used every day by developers on the project and updated as needed, unlike a neglected on-boarding word document or wiki page of examples and snippets.
Leveling up after adoption
A frontend application, whether Vue, React, or Angular, has totally different steps when compared to a lambda. Another repository that acts as an open source library will have very different setup, testing, etc. Well-established verbs give the Makefile much more meaning as an executable document. What's more, these very same commands can be used verbatim by a CI/CD environment.
Circle CI Example
jobs:
  test:
    docker:
      - image: circleci/python
    steps:
      - checkout
      - run: make setup
      - run: make build
      - run: make deploy
      - run: make test
When a Makefile is developed and then consumed by a continuous integration solution, our scripts remain an up-to-date living document that explains how one `builds`, `deploys`, etc. If a developer needs to know how to `{verb}` _(build, deploy, test, setup)_, they have only to open the simple text file labelled Makefile. It becomes a map of what is possible within a specific project.
This kind of contract between the continuous integration, developers, and repositories creates a familiar knowledge base, clearing up any obscurity around new and old codebases. Knowledge that was previously siloed by the author(s) becomes exposed within the Makefile, or else the continuous integration fails on the spot. Using continuous integration tools to police your living documentation saves time that isn't even captured by sprint planning.
A Totally Different Example
Here is an example of how the same verbal contract will be used for a frontend project.
Makefile
TF_VER := 0.12.12

.PHONY: setup build test lint deploy create_dist clean terraform

setup:
	@make -s terraform \
	&& npm install ;\
	make -s clean

build:
	@terraform init \
	&& terraform get \
	&& make --no-print-directory create_dist

test:
	@npm run test \
	&& terraform validate \
	&& make lint

lint:
	@npm run lint \
	&& terraform fmt

deploy:
	@terraform apply ${AUTO_APPROVE}

create_dist:
	@npm run build

clean:
	@rm -f .terraform/terraform.tfstate

terraform:
	@echo "installing terraform ${TF_VER}"
	@wget https://releases.hashicorp.com/terraform/${TF_VER}/terraform_${TF_VER}_linux_amd64.zip > /dev/null 2>&1 \
	&& unzip ./terraform_${TF_VER}_linux_amd64.zip -d . \
	&& rm -f ./terraform_${TF_VER}_linux_amd64.zip \
	&& chmod +x ./terraform \
	&& sudo mv ./terraform /usr/bin/
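The `${AUTO_APPROVE}` in the deploy target deserves a note: make variables supplied on the command line are expanded inside recipes, which is how CI opts into unattended applies while developers keep terraform's interactive prompt. A toy sketch of the mechanism (`/tmp/demo.mk` is an arbitrary path; no terraform required):

```shell
# A make variable given on the command line is expanded in the recipe,
# so CI can run `make deploy AUTO_APPROVE=-auto-approve` while a plain
# `make deploy` leaves the interactive confirmation in place.
printf 'deploy:\n\t@echo terraform apply ${AUTO_APPROVE}\n' > /tmp/demo.mk

make -f /tmp/demo.mk deploy                              # prints: terraform apply
make -f /tmp/demo.mk deploy AUTO_APPROVE=-auto-approve   # prints: terraform apply -auto-approve
```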
We can see that it has a few more named actions, and those actions are called by the primary known verbs `setup, build, deploy, test`. This convention is crucial to the pattern: by ensuring that all repositories share the same Makefile interface, whether frontend, lambda microservice, library, monolith, or some other hybrid codebase, you ensure that everyone can just `git clone myRepo.git && make setup` to get up and running.
Counter Arguments
I have heard several reasonable arguments against using Makefiles for this purpose: "Bash is easier for us", "we only use the XYZ language", "our production machines run a different OS than our development machines".
It is true that there is no silver bullet, and these organizational recommendations are no exception. It will take time, effort, and likely some polite, healthy conflict between engineers to come to a consensus on common verbs. Even after successfully completing that, the scripts become something else to maintain. Still, the rewards outweigh the minimal costs, especially when the alternative is considered: repeated hours of conversation about the wrong things (because person A knows it as XYZ, and person B knows it as QRS), and days or weeks of development time lost to an obscure on-boarding process (because you are asking the developer to learn your team's unwritten rituals).
With these concepts in mind, there are still best cases and worst cases for this kind of solution. If you can't use makefiles, then apply the principles to some other tool like `bash` or `powershell`. The convention may not be as readily recognizable as the ole tried and true Makefile, but it is far better than nothing.
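If Make is off the table, the same verb contract can still be expressed in plain shell. A minimal, hypothetical `run.sh` sketch (the verb names mirror the Makefile pattern; the echo bodies stand in for real commands):

```shell
#!/bin/sh
# run.sh - hypothetical sketch of the same verb contract without make.
# Usage: ./run.sh setup|build|deploy|smoke
set -eu

setup()  { echo "setup build environment"; }
build()  { echo "building artifacts"; }
deploy() { echo "deploying"; }
smoke()  { echo "smoke testing, expecting answer 42"; }

# Dispatch a known verb, or fail loudly on an unknown one.
run() {
  case "$1" in
    setup|build|deploy|smoke) "$1" ;;
    *) echo "usage: run.sh {setup|build|deploy|smoke}" >&2; return 1 ;;
  esac
}

run "${1:-build}"   # default to build when no verb is given
```

The dispatcher keeps the vocabulary explicit: an unknown verb fails immediately instead of silently doing nothing, which is the same discoverability the Makefile gives you.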
An especially good use case if the shop:
- Deploys to a linux runtime.
- Team members develop on _(mostly)_ linux/apple.
- Where GNU make is installed by default; the principles should work with BSD make, however your mileage with syntax and examples may vary.
- Has multiple programming languages.
- Team members are not terminal shy.
Not the best use case if the shop:
- Is primarily a Microsoft shop.
- Exclusively uses powershell.
- Windows does not support normal makefiles out of the box.
- Team members develop on a Microsoft os.
- Has a single language for your entire shop.
- Ex: all our code is written in golang; ex: all our code is c#, etc.
Makefile Gotchas
Hopefully this blog has piqued your interest and you will give Make a try. Here are some common errors that trip people up, causing time to be spent on syntax rather than logic.
- Must use `tab` for whitespace in the makefile action scope.
- Whitespace placement is very specific.
- If you use spaces instead of a tab to the left of a command within an action, you will see `Makefile:{LineNumber}: *** missing separator. Stop.`
- Control statements `ifeq`, `endif`, etc, cannot have a tab in front of them (make would treat them as recipe lines)
- Must use either casing `makefile` or `Makefile`
- If you use camel-casing ex: `MakeFile`, then a confusing error will occur `make: *** No rule to make target 'clean'. Stop.`
- It's not a bash shell
- Not everything you type in a bash shell will execute the exact same way in the Makefile
- Every line is a new context!
- Local variables are unset
- The working directory is reset back to the directory containing the Makefile
- Access make variables (including those passed on the cli) with `${MY_VAR_NAME}`
- Access shell variables within a recipe with `$$MY_VAR_NAME`
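That last pair of gotchas is easy to see with a throwaway Makefile (a sketch; `/tmp/gotcha.mk` is an arbitrary path):

```shell
# Each recipe line runs in its own shell, so a variable set on one
# line is gone by the next, unless the lines are chained with ;\
# into a single shell invocation.
printf 'split:\n\t@VAR=hello\n\t@echo "split: [$$VAR]"\n' > /tmp/gotcha.mk
printf 'chained:\n\t@VAR=hello ;\\\n\techo "chained: [$$VAR]"\n' >> /tmp/gotcha.mk

make -f /tmp/gotcha.mk split    # prints: split: []
make -f /tmp/gotcha.mk chained  # prints: chained: [hello]
```

This is why the Makefiles above chain their commands with `\` and `&&` rather than listing one command per line.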
Conclusion
Make is an excellent tool for creating a build/deployment contract within a project, one that can be maintained by the engineers working on the code to produce reliable, clear, and tested environment setup procedures. Make has had decades to mature and has been used in a vast array of projects across different technology stacks and languages. The rationale for using Make, and the value it presents to a team and their project(s), is the same rationale behind the "keep it in the code" approach of Infrastructure-as-Code and the variety of build tools found across languages. The benefit of Make is its maturity, flexibility, and language agnosticism: if you can run it in a shell, you can run it in Make. This simple flexibility gives teams great power to define, maintain, and consistently test their projects' environments in a common lexicon that can be run across any modern CI/CD platform.