To put it differently: Who's to say whether they were using the checks wrong, or just doing the wrong checks?
Not making an excuse for them though. They should have done that.
I guess it depends in what way it was "unclear/confusing".
“Almost never is” except when an attacker knows you’re counting on that fact and poisons a good bundle with a bad binary: one your code skips over during validation, but which macOS doesn’t skip when it executes the binary.
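That mismatch is the whole attack. A minimal sketch of the logic (all names here — verify_slice, flawed_is_trusted, and the fake "signer" field — are illustrative stand-ins, not any real API):

```python
# Hypothetical sketch of the flaw. verify_slice() stands in for real
# per-slice signature validation; the "signer" field is a toy stand-in.

def verify_slice(s):
    # Stub: a slice counts as validly signed if its fake signer says so.
    return s["signer"] == "Apple"

def flawed_is_trusted(slices):
    # BUG: trusts the whole file after checking only the first slice.
    return verify_slice(slices[0])

def robust_is_trusted(slices):
    # Every slice must verify, because the kernel loader -- not this
    # check -- decides which slice actually runs.
    return all(verify_slice(s) for s in slices)

# A poisoned fat binary: a legitimate Apple-signed slice up front,
# a malicious slice that the loader on a 64-bit host would prefer.
bundle = [{"arch": "i386",   "signer": "Apple"},
          {"arch": "x86_64", "signer": "attacker"}]
```

With this bundle, flawed_is_trusted() returns True while robust_is_trusted() returns False — the checker blesses the file, the loader runs the attacker's slice.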
Edit: actually, even in today's mostly-x86_64-only world, there are fat binaries in macOS, because there is a separate "x86_64h" architecture for "Haswell and better". So even in a pure 64-bit Intel world, there are going to be fat binaries for a while. For example, "file /usr/lib/libobjc.dylib" shows three slices on macOS 10.13:

    libobjc.dylib: Mach-O universal binary with 3 architectures: [x86_64:Mach-O 64-bit dynamically linked shared library x86_64] [x86_64h]
    libobjc.dylib (for architecture x86_64): Mach-O 64-bit dynamically linked shared library x86_64
    libobjc.dylib (for architecture i386): Mach-O dynamically linked shared library i386
    libobjc.dylib (for architecture x86_64h): Mach-O 64-bit dynamically linked shared library x86_64h

On iOS, a non-fat binary is almost the exception rather than the rule. For the longest time, it was common to have both armv6 and armv7 slices, and these days, armv7 and aarch64 slices. Granted, with iOS 11 dropping armv7 and apps starting to drop iOS 10 support, we'll have a run with non-fat aarch64 binaries for a while. This makes a noticeable difference in compile times and compile errors during development! Also, for iOS, there's bitcode and app thinning, which means end-user devices are often served a single-slice non-fat binary anyway.
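The slice list that `file` prints above comes straight from the fat header at the front of the binary: a big-endian magic (0xCAFEBABE), a slice count, then one fat_arch record per slice. A minimal sketch of reading it (the CPU-type constants match mach-o/fat.h and mach/machine.h; x86_64h is the x86_64 cputype with cpusubtype 8):

```python
import struct

FAT_MAGIC = 0xCAFEBABE  # big-endian magic at the start of a fat/universal binary

# Human-readable names for a few Mach-O cputype values.
# 0x01000000 is the 64-bit ABI flag OR'd into the base type.
CPU_NAMES = {0x00000007: "i386", 0x01000007: "x86_64",
             0x0000000C: "arm",  0x0100000C: "aarch64"}

def list_slices(data: bytes):
    """Return (arch name, cpusubtype, offset, size) for each slice."""
    magic, nfat_arch = struct.unpack_from(">II", data, 0)
    if magic != FAT_MAGIC:
        raise ValueError("not a fat binary")
    slices = []
    for i in range(nfat_arch):
        # struct fat_arch: cputype, cpusubtype, offset, size, align (5 x u32, BE)
        cputype, cpusub, off, size, _align = struct.unpack_from(">5I", data, 8 + 20 * i)
        slices.append((CPU_NAMES.get(cputype, hex(cputype)), cpusub, off, size))
    return slices
```

Each slice is a complete, independently loadable Mach-O image at its recorded offset, which is exactly why a signature check has to walk all of them rather than stop at the first.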
Vendors of closed-source iOS libraries, such as the "Google Maps for iOS" SDK, often ship fat binaries for the .dylibs containing armv7, aarch64, i386, and/or x86_64 slices. Why are Intel slices for iOS a thing? To be able to run your app and the library in the Xcode iOS simulator, which actually runs x86 code only. (That's why it's called a "simulator" and not an "emulator".)
The history of fat binaries in macOS goes all the way back to NeXTSTEP (of course, since macOS is basically a modern NeXTSTEP, with NSObject's "NS" prefix still showing off the legacy behind the curtain to new iOS developers), where even m68k was a common slice. At times this even exploded into "quad-fat binaries" containing slices for m68k, i386, PA-RISC, and SPARC all together in one executable: https://en.wikipedia.org/wiki/Fat_binary#NeXTSTEP_Multi-Arch...
The bug described in the article is that some third-party code-signature validation methods were flawed and failed to detect unsigned code, which the third-party programs would then go on to execute.