How I stopped worrying and loved Makefiles (gagor.pro)
frankwiles 3 days ago [-]
If you’re really looking for a tool to collect small steps/script I highly recommend you check out the ‘just’ cli tool. It’s completely replaced our use of Make and the syntax is much easier.
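
For anyone who hasn't seen it, a justfile reads like a stripped-down Makefile. A minimal sketch (the recipe names and commands here are just placeholders):

    # running bare `just` executes the first recipe
    default:
        just --list

    test:
        cargo test

    # recipes can depend on other recipes
    build: test
        cargo build --release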
Pesthuf 3 days ago [-]
I've always wanted to use one of those, but you have to convince the other developers on your team to use it as well. Make is a relatively easy sell because it's on pretty much every system or at least trivial to install and has decades of reputation.
alwaysbeconsing 3 days ago [-]
> it's on pretty much every system

Some systems have GNU make, some have BSD make.

> at least trivial to install

Then the same is true for `just`; package managers have it.

ErikBjare 3 days ago [-]
This is why I'm uninterested in just. I'd rather just use make than add another dependency.
pdimitar 18 hours ago [-]
It would have taken you less time to install it than it took you to write your comment.
ErikBjare 3 hours ago [-]
You're not accounting for updating my hundreds of existing Makefiles and CI config, and asking users to install yet another build dependency.
metaltyphoon 3 days ago [-]
Heresy, I know, but Make is a dependency in Windows land.
ErikBjare 3 hours ago [-]
Luckily few of my users build from source on Windows.
srid 3 days ago [-]
I've been using this on almost all of my projects, and am really pleased with it. Shell autocompletion is a nice bonus. If you also use Nix, check out `just-flake`:

https://github.com/juspay/just-flake

MrDresden 3 days ago [-]
Have moved all my 'frontend' Makefiles to Just and couldn't be happier.
foobarqux 3 days ago [-]
It doesn't do conditional processing of only out of date dependencies though, which is something that is often needed.
kstrauser 3 days ago [-]
Seconded. For all the million things we used Makefiles for besides compiling software, Just is much more ergonomic.
___timor___ 3 days ago [-]
Looks interesting.
devsda 3 days ago [-]
Agree.

I should specifically mention their docs. The docs are easily approachable with plenty of clear and concise examples. I even have a pdf copy of the doc book for quick reference.

ricardobeat 3 days ago [-]
Came here to say the same. Make works fine, until it evolves into a web of commands and associated shell scripts, and then most users give up on trying to understand what is happening. Justfiles are much more manageable.
AceJohnny2 3 days ago [-]
lol that's a purple link for me: https://github.com/casey/just
sam_bristow 3 days ago [-]
If you're going to use Makefiles as a top-level build wrapper you might be interested in self-documenting targets.

https://marmelab.com/blog/2016/02/29/auto-documented-makefil...

JonChesterfield 3 days ago [-]
The punchline here is a help target that digs through your makefiles looking for comments:

  help:
        @grep -E '^[a-zA-Z_-]+:.*?## .*$$' $(MAKEFILE_LIST) | sort | awk 'BEGIN {FS = ":.*?## "}; {printf "\033[36m%-30s\033[0m %s\n", $$1, $$2}'
I stumbled over .*? since ? usually means optional, but it turns out it means lazy (non-greedy) in this context. The $$ is makefile escaping. Dropping the grep and changing the regex slightly, I just appended this to an overengineered makefile:

  # Help idea derived from https://marmelab.com/blog/2016/02/29/auto-documented-makefile.html
  # Prints the help-text from `target: ## help-text`, slightly reformatted and sorted
  .PHONY: help
  help: ## Write this help
   awk 'BEGIN {FS = ":.*#+"}; /^[a-zA-Z_*.-]+:.*## .*$$/ {printf "%-30s %s\n", $$1, $$2}' $(MAKEFILE_LIST) | sort

Very nice idea, thanks for sharing it!
ceving 3 days ago [-]
If you can give up regexp greediness sed+column can do it:

    sed -rn 's/^([^:]+):.*[ ]##[ ](.+)/\1:\2/p' $(MAKEFILE_LIST) | column -ts: -l2 | sort
JonChesterfield 3 days ago [-]
Sed confuses me more than awk but you're right. That would also remove the only use of awk in my makefile (sed is there already for hacking around spaces in filenames).

Whitespace-padding output in sed is probably horrible; column looks simpler than printf via bash or trying to use make's $(info).

defrost 3 days ago [-]
Whilst acknowledging that "Confused by SED" has overlap with "have used SED for 40 years and have three books purely on SED" I can recommend

https://www.grymoire.com/Unix/Sed.html

as a reference some might occasionally swear by.

    Anyhow, sed is a marvelous utility. Unfortunately, most people never learn its real power. The language is very simple, but the documentation is terrible. The Solaris on-line manual pages for sed are five pages long, and two of those pages describe the 34 different errors you can get. A program that spends as much space documenting the errors as it does documenting the language has a serious learning curve.
JonChesterfield 3 days ago [-]
Appreciated, reading through it. I suspect the majority of the confusing sed experience can be attributed to it using POSIX basic regular expressions by default. It was about a decade after first discovering sed that I realised passing -E was really important.

It is difficult for newcomers to guess that "extended regular expressions" refers to a barely-usable subset of "regular expressions", and that the default basic regular expressions are terrible in comparison to either.

edit: alright, yes, one can program in that. Sed can recurse.

  .PHONY: help3
  help3:
   sed -nE 's/^([a-zA-Z_*.-]+):.*## (.*)$$/\1 :\2/ p' \
   $(MAKEFILE_LIST) | \
   sed -E -e ':again s/^([^:]{1,16})[:]([^:]+)$$/\1 :\2/ ' -e 't again '  |\
   sed -E 's/^([^ ]*)([ ]*):(.*)$$/\1:\2\3/' |\
   sort
The first invocation filters out the lines of interest; the second space-pads to 16. That works by putting the colon before the help text and repeatedly inserting a space before the colon until there are at least sixteen non-colon characters in the first group.

Composing the -n/p combination with commands that act on every line is a stumbling block for merging the multiple invocations together, but I expect it to be solvable.

JonChesterfield 3 days ago [-]
After a slightly dubious use of time, I can confirm that column is not necessary. Also noticed that the original version missed double_colon:: style targets. I fear sort is not necessary either but one has to draw the line somewhere.

  HELP_PADDING := 30
  
  .PHONY: awkhelp
  awkhelp: ## Write this help using awk
   @echo "awkhelp:"
   @awk 'BEGIN {FS = ":.*#+"}; /^[a-zA-Z_*.-]+:.*## .*$$/ {printf "  %-'$(HELP_PADDING)'s %s\n", $$1, $$2}' \
   $(MAKEFILE_LIST) | \
   sort
  
  .PHONY: sedhelp
  sedhelp: ## Write this help using sed
   @echo "sedhelp:"
   @sed -E \
   -e '/^([a-zA-Z_*.-]+::?[ ]*)##[ ]*([^#]*)$$/ !d # grep' \
   -e 's/([a-zA-Z_*.-]+:):?(.*)/  \1\2/ # drop :: and prefix pad' \
   -e ':again s/^([^#]{1,'$(HELP_PADDING)'})##[ ]*([^#]*)$$/\1 ##\2/ # insert a space' \
   -e 't again # do it again (termination is via {1, HELP_PADDING})' \
   -e 's/^([^#]*)##([^#]*)$$/\1\2/ # remove the ##' \
   $(MAKEFILE_LIST) | \
   sort
konfekt 3 days ago [-]
Which make version does the latter command use? For GNU make 4.2.1 on Linux, I had more luck with something along the lines of

    @awk '/^[a-zA-Z_-]+:[^#]*## .*$$/ {match($$0, /## (.*)$$/, arr); printf "\033[36m%-30s\033[0m %s\n", $$1, substr($$0, RSTART+3)}' $(MAKEFILE_LIST)
JonChesterfield 3 days ago [-]
Looks like 4.3 but I don't think it matters - awk vs gawk/nawk might be significant though, gawk 5.2 on the machine I ran this on.

The match with substr is interesting. It's more complicated than setting the field separator to something like :|#+ but should mean : in the help text works. For something one only writes and debugs once, probably better to do the complicated thing that always works.

gawk will write the groups to an array, that's possibly more legible (and slower? should be slower than the leading non-capture //)

  @gawk 'match($$0, /^([a-zA-Z_*.-]+):.*## (.*)$$/, arr) {printf "  %-30s %s\n", arr[1], arr[2]}' $(MAKEFILE_LIST) | sort
rcarmo 3 days ago [-]
I have been doing this for a long while now, and it's great
Groxx 3 days ago [-]
I've been using this for a while now - it's fantastic, highly recommended for non-core-Unix-build-stuff. Just make `help` your default recipe and voila, a massive drop in people asking "how do I run tests in make" -> they probably tried to `make` already and it told them how.
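
A minimal sketch of that setup, using the `##` comment convention from the article linked above:

    .DEFAULT_GOAL := help

    .PHONY: help
    help: ## Show this help
        @awk 'BEGIN {FS = ":.*## "} /^[a-zA-Z_-]+:.*## / {printf "%-20s %s\n", $$1, $$2}' $(MAKEFILE_LIST)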
___timor___ 3 days ago [-]
Great idea, added to my todo :)
_benj 3 days ago [-]
Great tip! Thanks for sharing :-)
unmdplyr 4 days ago [-]
Even the slightest attempt at guessing the host system or searching for tools present on a system will quickly convert this into a blog article earnestly begging for deep societal changes and analysing the inevitable march of time.
jimbokun 3 days ago [-]
The real reason container systems like Docker became so popular.
banish-m4 3 days ago [-]
Hahaha. Yeap. And project dependency analysis and library feature detection quickly demand the GNU variant, autotools (gasp!), clunky scripts, or something else. Use make for simple things and simple things only.
patrickmay 3 days ago [-]
Eric S. Raymond has a new project to eliminate the need for autotools: https://gitlab.com/esr/autodafe

It appears to give the benefits of autotools with the simplicity of make.

tomjen3 3 days ago [-]
Yeah. Which is why I wouldn't implement that in the first place - I come down hard on the idea that you should ship your dependencies, have them in a standard place, or use a third-party resolution system (like every vaguely modern setup does: C#, Python, Ruby, Node, Rust, even Java all understand that this is not optional anymore).

It's a code smell to me when your build system starts to become too complicated and no longer fits on a screen.

ReleaseCandidat 3 days ago [-]
You do know that there is software that runs on more than one version of Linux (or even on a *BSD, MacOS or Windows), works with more than a single version of a compiler,...?
tomjen3 3 days ago [-]
That's only really a problem for C/C++, which I have had the misfortune to make build systems for.

The solution is not to ship complicated and bug infested build systems, it is to fix the dependency problem in the same way any other language has done so far and until we do this, ship the dependencies with the program.

And if you aren't making a GUI and don't need to target Windows, just wrap it in Docker, which enforces a functional dependency system and means you can ship your dependencies.

ReleaseCandidat 3 days ago [-]
> That's only really a problem for C/C++

Even if that were true, every language that isn't C (or JS in the browser) needs to link or build against some C (or Fortran), which results in more or less working solutions for integrating with or building C sources. Of course you may not see that (as long as it works), because somebody else has wrapped that up in a package (or whatever) for your language of choice, but somebody has to do that.

> ship the dependencies with the program.

My post was meant as an answer to this argument: this is not possible, for example, when "shipping" the source for a cross-platform library. Or a cross-platform end-user program/app. Or just about anything which isn't "just" some web-backend or a server of some kind.

Does that mean _you_ need a complicated (I'd call that "working for anything but the most basic stuff") build-system? No.

I am a bit puzzled by this phrase:

> The solution is not to ship complicated and bug infested build systems, it is to fix the dependency problem in the same way any other language has done so far

But any other language than C or C++ (sadly, _way_ less than "any other") solves that by using a complicated and bug infested build system(s) and package manager(s) or a combination of both.

tomjen3 3 days ago [-]
>But any other language than C or C++ (sadly, _way_ less than "any other") solves that by using a complicated and bug infested build system(s) and package manager(s) or a combination of both.

Which would go away if C++ had a working package manager that worked the same way as every other language's, was written only once, and was about as bug-free as the compiler. This would also allow you to ship the source code for the library with a simple file that lists the dependencies you need.

I guess my final issue boils down to this: CMake does too much and too little. It is too hard to get it to reliably pull down the libraries I want to use, and it can do too much as part of the build, to the point that it becomes too complicated.

Do one thing, and do it well.

ReleaseCandidat 3 days ago [-]
Oh, please don't get me wrong, I hate CMake and Autotools (I'm not sure which is worse). And both Vcpkg and Conan have their own problems.
lelanthran 3 days ago [-]
> The solution is not to ship complicated and bug infested build systems, it is to fix the dependency problem in the same way any other language has done so far

With maybe one (or two) exceptions, those other languages' build systems are incredibly susceptible to supply-chain attacks.

And, to be honest, unless you have a burning need for autoconf's main value proposition (cross-compiling for a different target system), plain gnu-make and storing your dependencies in your repo is probably a lot safer than many other build systems.

I've built software with dependencies on libpng, libcurl, libsodium and more and was confident in the security of the resulting binary. I've also done one or two node.js projects, and had much less confidence that it won't be supply-chain attacked on the next build.

chriswarbo 3 days ago [-]
> if you aren't making a GUI and don't need to target Windows, just wrap it in Docker

In other words, if you also don't need to target MacOS or *BSD, which were the parent's stated requirements. MacOS can't run Docker, and BSDs seem to be unstable/unsupported targets, so the only stable way to get Docker on such a machine is to run Linux in a VM. Which isn't really a solution for cross-platform development, as it is a denial of it.

(Also, if you were to go down that route, why not just ship the VM, rather than dragging in all the crap that Docker entails?)

keyle 3 days ago [-]
I use makefiles a lot as a make-do solution, but I must admit the syntax does my head in at times. If anyone has a good resource that teaches makefiles progressively, I'd be interested.

The main issue I have is that it goes from dead simple to a pit of what-the-hell-is-happening, with various things interacting with each other.

redact207 3 days ago [-]
I just use Copilot for this and for bash scripting, both of which I do very little of and forget the syntax for.
throwaway290232 3 days ago [-]
> If anyone has a good resource that teaches makefile progressively

https://www.gnu.org/software/make/manual/make.html

da-x 3 days ago [-]
This pattern of usage always seems like abuse.

When `make` is used as a glorified front-end to `bash` scriptlets, why not use `bash` directly instead of having two levels of scripting?

See: https://blog.aloni.org/posts/bash-functional-command-relay/

mickeyp 3 days ago [-]
Because PHONY targets can do that, too, and without the needless manual work. Because a Makefile can still do Makefile things: PHONY targets depending on other PHONY targets, which happen to depend on that one OpenAPI JSON export you also create, which in turn depends on ...

You can do that in Bash. And now you've reinvented Make, but poorly.
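
A rough sketch of what that chain looks like (the file and script names here are made up): the phony targets stay cheap, and the generated file is only rebuilt when its sources change.

    .PHONY: dev docs

    dev: docs
        ./run-dev-server.sh

    docs: openapi.json
        ./build-docs.sh openapi.json

    # a real file target: only regenerated when the API sources change
    openapi.json: $(wildcard src/api/*.py)
        ./export-openapi.sh > $@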

dalore 3 days ago [-]
Because the Makefile also becomes a central place for everything you can run in a project, without having dozens of different shell scripts. You can put comments on targets and have them depend on others: Makefile targets to restore, to build i18n files, etc.

I made a bash script that takes your Makefile and gives you a nice dialog menu for the targets that have comments. Works nicely as a self-documenting project command menu.

https://gist.github.com/oppianmatt/dcc6f19542b080973e6164c71...

https://private-user-images.githubusercontent.com/48596/3262...

nrclark 3 days ago [-]
GNU Make gives me:

   - tab completion of targets
   - automatic dependency execution
   - automatic entry points between every task
   - result caching
   - parallel execution
Yes, it’s possible to do all of this by hand in shell scripts. But why would I, when Make is ubiquitous and battle-tested?
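
A toy example of the last two points (file names made up): each rule reruns only when its inputs are newer than its output, and `make -j` builds independent branches in parallel.

    all: report.pdf

    data.csv: fetch.sh
        ./fetch.sh > data.csv

    summary.txt: data.csv summarize.py
        python summarize.py data.csv > summary.txt

    plot.png: data.csv plot.py
        python plot.py data.csv plot.png

    # summary.txt and plot.png can be built in parallel with `make -j`
    report.pdf: summary.txt plot.png report.md
        ./render-report.sh > report.pdf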
penguin_booze 3 days ago [-]
Tab completion of targets is usually done by a separate package, not by GNU make itself: bash-completion.
nrclark 3 days ago [-]
Yes, that's true. But it is "on-by-default" for 95% of desktop Linux machines, with no special action needed.
Am4TIfIsER0ppos 3 days ago [-]
It's on until you try to tab-complete a filename that is created through make but that the completion script can't detect, at which point the entire bash-completion package gets uninstalled.
nrclark 3 days ago [-]
What?
kergonath 3 days ago [-]
Making sure a dependency is up to date before doing something is annoying. Building a representation of dependencies to figure out what can be done in parallel is a bit more complex. Doing it for dozens of targets is a major pain in the backside.

Sure, you can do it in bash, or python, or whatever. But then you have a cumbersome, not particularly interesting piece of code full of boilerplate. Of course, you can design it a bit, organise things neatly, and then use a config file because fiddling with the code in each project is unsustainable in the long run. At this point, you've just made a poor copy of make and thrown away all the good bits that result from decades of experience and weird corner cases.

The syntax of Makefiles is terrible, but make itself is very useful and versatile.

And that pattern is not abuse, it's the sort of thing Make was designed for. It's just that we're used to thinking of make as this old thing that just runs a compiler and that's such a pain to deal with that we need Makefile generators to do it properly. And certainly that's true for complex software compilation, but make is more versatile than that.

otabdeveloper4 3 days ago [-]
`make` does dependency resolution. That's its original job, by the way, and calling out the dependency resolution steps to bash was the original intention.
lfmunoz4 2 days ago [-]
Also, there are plugins for all editors that let you click on a target and run it. This would be significantly harder with bash.
lijok 3 days ago [-]
It is abuse, but people love ergonomics more than they love reducing dependencies
toast0 3 days ago [-]
If you use Make to build, and you use Make to deploy, it's good for ergonomics, and it's good for reducing dependencies.

Certainly abuse, but hey.

kristjansson 3 days ago [-]
Turns out software is for humans, mostly :)
Mister_Snuggles 3 days ago [-]
My favourite (ab)use of `make` is as a batch runner: https://news.ycombinator.com/item?id=32441602

This (ab)use of `make` runs multiple times a day, every day, and works perfectly every time.

The inspiration for this was an (ab)use of `make` as a way to parallelize Linux system startup: https://web.archive.org/web/20110606144530/http://www.ibm.co...

matheusmoreira 3 days ago [-]
I (ab)use make to manage my dotfiles.

https://github.com/matheusmoreira/.files/blob/master/GNUmake...

I'm surprised at how well this thing works every time I use it. I even blogged about it.

https://www.matheusmoreira.com/articles/managing-dotfiles-wi...

Recently made a tool that processes the make database and prints the phony targets and their dependencies:

https://github.com/matheusmoreira/.files/blob/master/~/.loca...

I use it as a sort of makefile help command. Works surprisingly well too.

dissent 3 days ago [-]
One issue I don't hear mentioned often is reuse.

A task runner's tasks could be arbitrarily complicated, pulling in all sorts of dependencies of their own. This is less true for the traditional compile targets make was designed for.

Because the things we do in a Makefile are pretty much always project-local and don't get reused, it limits how much heavy lifting these tasks are likely to do for us. Whereas if you built your own CLI in Python with Click or something, you would be able to make it a development dependency of your project. You can afford to invest in those tasks more because they'll be reused.

The Just command runner has the same problem, but at least it's designed to be a task runner.

kristjansson 3 days ago [-]
Build a CLI / complex task as part of your project, then invoke it via make. This pattern is much more about documenting and composing steps than implementing them
dissent 3 days ago [-]
Why invoke it via make when it can invoke itself? It's just another dependency that's not needed in this scenario.
nrclark 3 days ago [-]
Make is leaner and more agile than a custom CLI. It takes no time to get started, and there is no boilerplate. Removing or adding steps is trivial, running shell commands is trivial, and hooking into the dependency graph is trivial. Parallelism is built-in, as is dependency resolution. Tab-completion is standard on most Linux distros.

It's also better from an architectural separation perspective. Your custom CLI will have custom commands and flags, and probably will need to be called in some standardized way. Make is very good at calling your toolchain / build-system (whatever that might be) with the exact arguments that you want. And things like "make all" or "make clean" are muscle-memory for hundreds of thousands of developers.

Why mix your custom tooling with the task of standardizing an entry point?

I've had really great success over the years with the pattern of "some build system or another (cmake, bazel, autotools, etc) orchestrated by a top-level Makefile." It's simple, portable, and flexible. What's not to like, other than ugly syntax?

dissent 2 days ago [-]
I know a lot of developers who agree with you.

But how is make dependency free? You need to install make. Which version? GNU make, or FreeBSD make? What platform are you installing it on? What version? In my team we had to get all our devs to manually upgrade from 3 to 4 as we were using modern make features to make it a nicer task runner.

These are all things you've already had to deal with in the custom CLI, which is also a perfectly good entry point. You also have a lot more control of command line arguments, rather than just make targets ("just" has also added this as a feature)

matheusmoreira 3 days ago [-]
GNU Make is a surprisingly Lisp-like language. At some point I realized it had an eval function which I could use to metaprogram it. This actually ruined one of my projects: I got so lost trying to create the perfect makefile system that nothing short of a rewrite would fix it. As a side effect, I finally understood what autoconf's purpose was.
jdougan 3 days ago [-]
You can build GNU Make so it can work with Guile Scheme.
yboris 3 days ago [-]
I don't like makefiles, but I've been enjoying justfiles: https://github.com/casey/just
lifthrasiir 3 days ago [-]
Ninja is simple and fast, but intentionally limited in order to not be programmable. Make is powerful and versatile (especially the GNU variant) but has an arcane syntax and lots of pitfalls. I feel that there is a niche between make and ninja for the task runner.
dragontamer 3 days ago [-]
> I feel that there is a niche between make and ninja for the task runner.

I think that's called CMake.

ryukoposting 3 days ago [-]
As one of the unfortunate souls damned to using CMake regularly, I can confidently say that it is slower, less maintainable, and less intelligible than make.
lifthrasiir 3 days ago [-]
No, CMake is a compatibility layer on top of existing task runners like make and ninja. I don't want a compatibility layer, and also CMake has even more features than make.
qart 3 days ago [-]
CMake is a (not the) correct answer according to the Ninja manual[1]. "Some explicit non-goals: convenient syntax for writing build files by hand. You should generate your ninja files using another program. This is how we can sidestep many policy decisions."

The other options are here: https://github.com/ninja-build/ninja/wiki/List-of-generators...

[1] https://ninja-build.org/manual.html#_design_goals

lifthrasiir 3 days ago [-]
It is entirely correct that Ninja is technically designed for a related but different problem. But you can write Ninja by hand with some restrictions (I have done so, for example), so bringing back some of those sidestepped decisions may still be worthwhile.
a1o 3 days ago [-]
CMake is a very cool build system that can do a lot: you can use it to run tests, fetch sources, and everything else. The documentation is the only thing that I find quite suboptimal; they could really add examples in there and explain things better, or at least have a list of projects that they believe are using CMake correctly so one can have some guidance. It took me five years before I was comfortable writing CMake from scratch.
saghm 3 days ago [-]
I had a boss who once quipped that CMake became much easier for him to understand and write once he realized it was just a really shitty version of BASIC with only global variables. (He later added "but two separate namespaces for them" because of the prevalent use of environment variables as well as CMake-specific variables.)
greenavocado 3 days ago [-]
We need something for CMake like what Elixir is to Erlang. I hate the syntax, fit, and finish but I am very thankful it exists.
ris 3 days ago [-]
Unfortunately make's behaviour around dynamically setting variables/environment variables is insane and quickly leads you towards hairy eval commands with extremely tricky quoting & escaping.
john-tells-all 3 days ago [-]
I suggest using := as in `APP_URL := http://localhost` vs raw "=". The colon-equals format means "set value now", so it's easier to understand.
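
A quick toy illustration of the difference:

    # `=` is recursive: the right-hand side is expanded each time the variable is used
    B = $(A)
    A = hello
    # `:=` is simple: the right-hand side is expanded immediately, so D ends up empty
    D := $(C)
    C = world

    .PHONY: demo
    demo:
        @echo 'B is "$(B)", D is "$(D)"'
    # prints: B is "hello", D is ""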

I've never used eval.

Make's use of Bash can lead to hairy quoting/escaping. I use Make as a "dumb high-level runner" and put any sort of intelligence, like conditionals or loops or networking, in lower-level scripts or programs.

Make is an orchestrator.

Fire-Dragon-DoL 3 days ago [-]
What's the rationale for using makefiles as script runners over just having a directory with scripts inside?

Not for compiling, just as script runner.

I see this practice often and I haven't found a good reason

blipvert 3 days ago [-]
If scripts need particular arguments, the Makefile is a good place to record them.

I use it quite a lot for automating deployments - if you want to Terraform up a VM:

  make colo1-fooserver01.vm
Then if you want to run Ansible against it:

  make colo1-fooserver01
You don't have to remember or type all of the flags or arguments - just type make and hit tab for a list of the targets that you can build.
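
The Makefile side of that is roughly this shape (paths and flags below are hypothetical, just to show the idea of recording the arguments):

    .PHONY: colo1-fooserver01.vm colo1-fooserver01

    colo1-fooserver01.vm:
        terraform -chdir=terraform/colo1 apply -target=module.fooserver01 -auto-approve

    colo1-fooserver01:
        ansible-playbook -i inventory/colo1 -l fooserver01 site.yml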
globular-toast 3 days ago [-]
Most shells will tab complete after `./scripts/` too. In fact that's probably more common than make completion.

I think the real reason is you have it all in one file rather than multiple scripts which makes it easier to edit and maintain.

blipvert 3 days ago [-]
Quite - one makefile rather than dozens of scripts which all do practically the same things.
wizhi 3 days ago [-]
But this can literally just be done in a simple shell script as well. The makefile ends up just being a redundant way to run a shell script.
lelanthran 3 days ago [-]
> But this can literally just be done in a simple shell script as well.

Only if there are no dependencies. It's unusual for GP's type of usage to have no dependencies.

chuckadams 3 days ago [-]
When my shell scripts depend on another script ... they run the other script. Make definitely has its place, especially when dependencies get complex and parallel, but it's hardly necessary for simple cases. Once Make is needed, it's trivial to drop in and have it wrap the standalone scripts.
lelanthran 3 days ago [-]
> When my shell scripts depend on another script ... they run the other script.

I hear you, but you're running the other script unconditionally. If it downloads something, it will download it every time you run the first script.

In this simple case, make runs the other script conditionally, so it need not run every time.

blipvert 3 days ago [-]
I build .tf files from parameters for each host in the Makefile (and a script which knows the vSphere topology) for one-shot execution (it only creates the VM, it doesn't manage the lifecycle), and also template config that needs to be done before deployment - there are plenty of dependencies.
Groxx 3 days ago [-]
Dependency management, definitely. Loads of scripts don't work until X has been done, and X, Y, Z, and sometimes QWERTY have to be done first, and they take minutes and a ton of bandwidth so you don't want to do them unless you have to...

... and if your scripts do all that, they've basically rebuilt make, but it's undocumented and worse.

(I say this as someone with LOTS of experience with make, and am not really a fan because I know too much and it's horrifying. But I dislike custom crippled versions even more.)

mark_story 3 days ago [-]
It can help abstract the differences you may have across projects. If you're on a team with many projects/repositories, having conventions across them all helps improve onboarding and cross-team work, and promotes better dev UX. A really simple way to do this is make. It lets you have the same common targets everywhere and translate them into the relevant commands for each project. This can become more useful as you write automation for CI and deployments for all your projects.
shmerl 3 days ago [-]
Likely a declarative way to specify dependencies. But not sure if make as a tool for that is the best option in general.
omoikane 3 days ago [-]
Dependency management and automatic parallelization (via `make -j`).
nesarkvechnep 3 days ago [-]
I’m tired of seeing articles full of examples of .PHONY targets. Make works with the file system!
liampulles 3 days ago [-]
I've gotten so used to using Makefiles with Go dev that for my other side projects with node, python, ruby, etc. I wrap the tools and commands I can't remember in a Makefile. A quick squizz after a few months away reminds me of how that environment works with building, testing, etc.
zokier 3 days ago [-]
The "Simple Makefile for Python projects" exemplifies why I dislike (ab)using make. It doesn't actually track deps properly, so venv doesn't get updated if requirements.txt changes, and nor does the dependency change tracking work properly for test target. To make it more correct you'd need bunch of .stamp files and/or globs, and even then it might be iffy. For lots of uses the simple file mtime based change/dep tracking is just too crude, and phony targets are largely an antipattern.

For script-running just is great, for full dep tracking build tool something like buck2 is an improvement.

The one place where make shines is when your workflow is truly file-based, so all steps can really be described as some variations of "transform file A to file B".

nrclark 3 days ago [-]
My team uses Make to handle the top-level scripting for a Python development project, and it works great. It was pretty easy to set up the correct dependency relationships.

Make is a powerful tool. You just have to understand how it thinks about the world, and adjust your own thinking a bit if needed.

If you just want to have tasks that depend on other tasks, you don't need stamps, phony, or anything else.

But what happens when you want to say "only rebuild my venv if requirements.txt changed"? that's a file dependency that you can reasonably express between requirements.txt and venv/bin/activate. And then all of a sudden, you're squarely in Make's wheelhouse.
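
A sketch of that rule (the pytest target is just an example of something hanging off it):

    # only rebuild the venv when requirements.txt is newer than the venv
    venv/bin/activate: requirements.txt
        python3 -m venv venv
        venv/bin/pip install -r requirements.txt
        touch venv/bin/activate

    .PHONY: test
    test: venv/bin/activate
        venv/bin/pytest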

1oooqooq 3 days ago [-]
I always hated make. Until I read the manual...
pizzafeelsright 3 days ago [-]
Now this sounds like wisdom. I'm going to read the French manual.
jaza 3 days ago [-]
Thanks! That article demystified quite a bit of the magic of Makefiles for me.

However, even after 10+ years of professional dev work during which I've regularly crossed paths with Makefiles, they still scare me, and I've still never written one from scratch myself. I cling to bash scripts, which I'm also a rookie at (or, for more complex cases, I write Python scripts, which I'm much more comfortable with).

I guess one day I'll read the manual, and digest some tutorials, and actually learn make. But I've made it this far...

lelanthran 3 days ago [-]
> However, even after 10+ years of professional dev work during which I've regularly crossed paths with Makefiles, they still scare me, and I've still never written one from scratch myself.

Make has its warts (look at when and what it was designed for, after all!), but I've found it much easier to write Makefiles than YAML-for-github-runners.

If you're used to YAML, you get pleasantly surprised when using Make, which does execution of dependency trees in a more readable manner than most CI/CD YAML files do.

JohnFen 2 days ago [-]
There is quite a lot to love about make! I still haven't seen an alternative that is substantially better, and most are worse in one way or another.

My opinion may be a bit skewed by the fact that I write code that gets built on a variety of different platforms, though, and make is essentially universal. It lets me have a consistent build process regardless of platform.

It's also very useful for automation that isn't related to building code.

kevincox 1 days ago [-]
One minor tip is that you can define .PHONY multiple times. I find this much easier to manage because the .PHONY definition is right next to the target itself.

    .PHONY: foo
    foo:
        echo foo

    bar: barin
        cp barin bar

    .PHONY: baz
    baz: bar
        echo baz
akdor1154 3 days ago [-]
I'd love some minor tweak of Make to compare/cache with hashes instead of mtime. A worse-is-better Bazel if you will.
alextingle 3 days ago [-]
This is better achieved with the compiler level tooling, rather than in Make. It's pretty easy to replace `cc` with a short script that runs the pre-compiler and compares the result with a cache.

If the source code is kept tight, it doesn't yield much of a speed-up, though - precompiling hundreds of unnecessary headers can take a lot of time. Better to put some effort into moving unnecessary #includes out of header files. In fact, you can use your log of cache-hits to guide that work.

legobmw99 2 days ago [-]
I also wish I could make certain variables/flags part of the dependencies of a file, like if I have something with a lot of #defines that I really want to rebuild any time CPPFLAGS changes
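
One workaround is a stamp file that only gets rewritten when the flags actually change, so anything depending on it rebuilds. A sketch:

    FORCE:

    # rewrite the stamp only if CPPFLAGS differs from what it currently contains
    cppflags.stamp: FORCE
        @echo '$(CPPFLAGS)' | cmp -s - $@ || echo '$(CPPFLAGS)' > $@

    %.o: %.c cppflags.stamp
        $(CC) $(CPPFLAGS) $(CFLAGS) -c $< -o $@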
JonChesterfield 3 days ago [-]
This might be worth building into the makefile interpreter. Doing it in the makefiles themselves is quite difficult to get right and very messy.
ceving 3 days ago [-]
Most "modern" build tools do ruin make. Recently I did this for Go: https://gist.github.com/ceving/edeb6f58429d552e8828a70640db3... But it does not feel right.
lfmunoz4 2 days ago [-]
For people using make and VS Code, my plugin is a must-have:

https://marketplace.visualstudio.com/items?itemName=lfm.vsco...

It allows you to click above a target to run it.

web3-is-a-scam 3 days ago [-]
I’ve replaced my Make usage mostly with taskfile.dev and/or Warp workflows/notebooks
wodenokoto 3 days ago [-]
I'm a bit unsure what this does better than just calling those few lines directly in a shell script.
john-tells-all 3 days ago [-]
I adore Makefiles as they're a very simple way to create a higher-level abstraction in my projects. It builds data and runs small bits of code, so I can concentrate on the business-level value. I haven't found a better/simpler way than good old Makefiles.
TheRealPomax 3 days ago [-]
The fact that make is basically useless on Windows still makes it "not the first thing to try", to be honest. And it's rare to still see a project that can't be cross-platform, only lots that went "well my computer runs XYZ so I'm only compiling on that" =)
alextingle 3 days ago [-]
Make works just fine on Windows.
legobmw99 2 days ago [-]
If you're very careful with forward-vs-back slashes and your username isn't "Firstname Lastname", sure. But in practice there are tons of issues, especially for non-expert users.
shepherdjerred 3 days ago [-]
Make is excellent if you use it properly to model your dependencies. This works really well for languages like C/C++, but I think Make really struggles with languages like Go, JavaScript, and Python, or when you're using a large combination of technologies.

I've found Earthly [0] to be the _perfect_ tool to replace Make. It's a familiar syntax (combination of Dockerfiles + Makefiles). Every target is run in an isolated Docker container, and each target can copy files from other targets. This allows Earthly to perform caching and parallelization for free, and in addition you get lots of safety with containerization. I've been using Earthly for a couple of years now and I love it.

Some things I've built with it:

* At work [1], we use it to build Docker images for E2E testing. This includes building a Go project, our mkdocs documentation, our Vue UI, and a ton of little scripts all over the place for generating documentation, release notes, dependency information (like the licenses of our deps), etc.

* I used it to create my macOS cross compiler project [2].

* A project for playing a collaborative game of Pokemon on Discord [3]

IMO Makefiles are great if you have a few small targets. If you're looking at more than >50 lines, if your project uses many languages, or you need to run targets in a Docker container, then Earthly is a great choice.

[0]: https://earthly.dev/

[1]: https://p3m.dev/

[2]: https://github.com/shepherdjerred/macos-cross-compiler

[3]: https://github.com/shepherdjerred/discord-plays-pokemon

renewiltord 3 days ago [-]
Taskfiles are pretty cool. The tab/space thing makes Makefiles painful for me.
IkemenSensei 3 days ago [-]
I learned to stop worrying about Makefiles and loved Rakefiles instead because they allow you to write Ruby code anytime you need to do slightly complex tasks.
___timor___ 3 days ago [-]
Rakefile or Vagrantfile. Ruby's syntax is perfect for custom DSLs.
xg15 2 days ago [-]
Obligatory reminder that the phrase "How I stopped worrying and loved X" was supposed to be a sign of insanity, not an endorsement of X.
banish-m4 3 days ago [-]
For simple things, make is fine. It does file caching, and the GNU tool does parallel execution.

For anything needing light-to-moderate software configuration management, cmake is readily available and simple.

If you're running a Google then you're using a build system and DVCS that make use of extensive caching, synthetic filesystems, and tools that wrap other tools.

I won't say what not to use. ;)

alhirzel 3 days ago [-]
> on my system binary is only 16kB in size,

I don't believe that.

patrickmay 3 days ago [-]
119k on MacOS.
DeathArrow 3 days ago [-]
$ make love

make: *** No rule to make target `love'. Stop.

lessbones 3 days ago [-]
tell me you've never seen Dr. Strangelove without telling me you've never seen Dr. Strangelove
throwaway290232 3 days ago [-]
Makefiles fill that great need for a high-level 'scripting DSL', where you have a lot of different programs (or scripts), with a loose set of dependencies or order of operation, and you want a very simple way to call them, with some very simple logic determining the order, arguments to pass, parallelization, etc. Their ubiquity on all platforms makes it even easier to use them.

I much prefer Make to alternatives like Just or Taskfile. Besides the fact that more people know Make, Make actually has incredibly useful functionality that alternatives remove 'for simplicity', but then later you realize you want that functionality and go back to Make. Sometimes old tricks are the best tricks.

smackeyacky 3 days ago [-]
Seeing people reinvent make every 10 years is very frustrating. Just learn make!
kbolino 3 days ago [-]
Make is conceptually great but brings a lot of legacy baggage. You often need to set up .PHONY targets, reset .SUFFIXES, and/or set MAKEFLAGS += --no-builtin-rules. There are also the dollar-sign variables (which plague Perl and shell as well), which made lots of sense in the 1970s with teletypes but hinder readability today (what the hell was $@ again?).
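
The boilerplate in question usually ends up looking something like this:

    # opt out of the built-in implicit rules and suffix rules
    MAKEFLAGS += --no-builtin-rules
    .SUFFIXES:

    # declare the targets that aren't real files
    .PHONY: all test clean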
chuckadams 3 days ago [-]
Or the fact that $FOO is interpreted as $(F)OO without the slightest warning. And of course if you're in a script line, you probably meant $$FOO.

Make certainly has some obscure variables, but of all the basic knowledge of Make you need to learn, $@ is near the top of the list (it's the "target"; an @ sign looks kind of like a bullseye, and if you want to see it as visiting a dependency graph, it's the dependency you're currently "at").
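
For reference, a tiny example of the common automatic variables:

    # $@ = the target, $< = the first prerequisite, $^ = all prerequisites
    out.o: in.c defs.h
        $(CC) $(CFLAGS) -c $< -o $@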

kbolino 3 days ago [-]
Sure, if you use make enough, you likely remember the most important dollar-symbol stuff. But $@ is "all arguments (obeying quoting)" in bash, which is nothing like what it means in make.
chuckadams 3 days ago [-]
That of course becomes "$$@" in a Makefile recipe if you want bash's behavior and not make's... which is one of the reasons I tend to keep my shell scripts in separate files, and only grow a Makefile to wrap them later if I have to. These days I just have the directory of scripts and no Makefile. Even the rare times I do C, I prefer a script that recompiles everything and slapping ccache on top of it (but usually I'm dealing with an existing Makefile, and I just pray that it's not generated by automake)
VS1999 3 days ago [-]
Practically any language will do for running a set of tasks to compile a program. Unless you already love this "wheel" intimately, just learn a real language and use that.
lelanthran 3 days ago [-]
> Practically any language will do for running a set of tasks to compile a program.

Like what? None of the mainstream languages have any sort of built-in for dependency resolution.

gosub100 3 days ago [-]
Yeah, remember tup? It was on HN probably a decade ago; people put a ton of work into it. Now "just" is the hot new thing.