A History of APL in Fifty Functions (2016) (jsoftware.com)
Y_Y 1581 days ago [-]
I wish the "Iverson bracket" (number 0 in this list) had caught on in a bigger way. It can really simplify writing conditional expressions, as opposed to e.g. a case expression.
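For instance, something like this rough Python sketch (Python's bool already acts as 0/1 in arithmetic, so it stands in for the bracket; the clamp function is just made up for illustration):

    # Clamp x to [lo, hi] using Iverson-bracket terms instead of an if/elif chain.
    def clamp(x, lo, hi):
        return lo * (x < lo) + x * (lo <= x <= hi) + hi * (x > hi)

    assert clamp(5, 0, 10) == 5
    assert clamp(-3, 0, 10) == 0
    assert clamp(42, 0, 10) == 10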
contravariant 1581 days ago [-]
It's also an exceptionally useful technique in mathematics if you've got a few nested sums with dependent indices. By writing the condition in an Iverson bracket it becomes trivial to change the order of summation or change one of the variables.
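A textbook instance of what I mean (a standard identity, just written out here): once the range condition sits inside the bracket, both orders of summation are literally the same expression,

    \sum_{i=1}^{n} \sum_{j=1}^{i} a_{i,j}
      = \sum_{i,j \ge 1} [\, j \le i \le n \,]\, a_{i,j}
      = \sum_{j=1}^{n} \sum_{i=j}^{n} a_{i,j}

and swapping the order is just a matter of reading the bracket the other way.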
dubya 1581 days ago [-]
I don't know if Knuth popularized it, but it's somewhat common in combinatorics. Similar idea to using indicator functions when doing change of variables in calculus.

https://arxiv.org/abs/math/9205211

david_draco 1581 days ago [-]
You can do that easily in numpy: a comparison like (x > 0) already gives a boolean array, and (x > 0) * 1 (or .astype(int)) turns it into the zeros and ones of the Iverson bracket.
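Roughly (a tiny sketch, array contents made up):

    import numpy as np

    x = np.array([-2.0, 0.0, 3.5])
    bracket = (x > 0) * 1        # array([0, 0, 1]) -- the Iverson bracket [x > 0]
    count = int((x > 0).sum())   # 1; summing the bracket counts matching entries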
andrewla 1581 days ago [-]
I think the most interesting bit of trivia here is the discussion of Bertelsen's Number: 50847478, which is famous for being a published but incorrect count of the primes less than 1e9. 50847534 is the correct answer, but the incorrect number was derived in the 19th century and persisted in textbooks through the 20th.

In some ways, with computers, this is pretty trivial; what's impressive is that this was computed long before computers were available, and although incorrect, was remarkably close. Computers have spoiled us.
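Trivial in the sense that something like this plain numpy sieve settles it (just a sketch; it needs on the order of a gigabyte of RAM at n = 10**9, and prime_pi is a name I made up):

    import numpy as np

    def prime_pi(n):
        # Sieve of Eratosthenes over [0, n], then count the survivors.
        sieve = np.ones(n + 1, dtype=bool)
        sieve[:2] = False
        for p in range(2, int(n ** 0.5) + 1):
            if sieve[p]:
                sieve[p * p :: p] = False
        return int(sieve.sum())

    # prime_pi(10 ** 9) -> 50847534, not Bertelsen's 50847478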

derefr 1581 days ago [-]
Cool, but I was disappointed to find that it assumes you already know enough APL to understand the extensional definitions or analogies it gives. I was hoping it would be more of a “bootstrapping APL from a simple, explicitly-defined-in-the-text set of operators” kind of thing.
tumba 1581 days ago [-]
APL is not so much a model of computation as a new notation for semantics which already exist. The real semantic explanation of APL is set theory and traditional mathematics.

Probably the best deep ground-level explanation of APL is Iverson’s paper “Notation as a Tool of Thought.” [0]

The bootstrapping explanation you describe sounds a lot like what Paul Graham did in “On Lisp” [1] and in a much more complex fashion, Queinnec in “Lisp in Small Pieces” [2], both highly recommended.

[0] https://www.jsoftware.com/papers/tot.htm

[1] http://www.paulgraham.com/onlisptext.html

[2] https://www.amazon.com/dp/0521545668

kd0amg 1581 days ago [-]
It takes a fair amount of work on top of "set theory and traditional mathematics" if you want to actually state a semantics for APL.
xvilka 1581 days ago [-]
If anyone is willing to contribute to making APL more popular, feel free to send a pull request to the Learn X in Y Minutes site [1].

[1] https://github.com/adambard/learnxinyminutes-docs/issues/358...

coldcode 1581 days ago [-]
I learned APL in the mid-'70s in college but never got to use it for much (I do remember writing a table tennis game, which was pretty bizarre). But I always remember how amazing it was to learn the "combos" that did so much work in so few symbols. It was like learning to master a video game like Street Fighter. But APL was always a mostly write-only language: unless you used it every day, your code quickly became WTF-is-that.
Athas 1581 days ago [-]
APL really is a wonderful core notation, even though the full language is rather crufty. I've long been saddened that its promises of parallelism never seemed to work out.
lelf 1581 days ago [-]
Modern APL incarnations (kdb, Dyalog APL) are quite damn parallel.
Athas 1581 days ago [-]
I've never used K, but isn't Dyalog mostly parallel through isolates? Last I checked, basic array operations were not automatically executed in parallel. Did this change?
jharsman 1581 days ago [-]
Dyalog launches multiple threads for certain operations if the arrays are large enough to justify the overhead.
derefr 1581 days ago [-]
That seems like the right approach for CPU-targeted code. Has an APL descendant ever been created to target a GPGPU compute-kernel, or even to compile to an FPGA netlist description of a DSP?
jodrellblank 1581 days ago [-]
Aaron Hsu’s co-dfns is a compiler of a subset of APL, written in APL, which compiles to GPU code.

Do an HN Algolia search for his username “arcfide” to find a lot of discussion, and there are a couple of YouTube talks: one of him talking through the codebase on a livestream, another a recording of a conference-style talk introducing it to people.

It needs Dyalog APL, but that’s now easily available for non-commercial use.

smabie 1581 days ago [-]
kdb+/q supports massive parallelism.
ngcc_hk 1581 days ago [-]
Interesting. Hope it can be written for APL novices.
contingencies 1581 days ago [-]
Disappointing they didn't webfont the thing. ⍢
aloukissas 1581 days ago [-]
Did anyone else also think about dry-aged steaks when reading this title? If you're in LA, you'll know what I mean :D
aloukissas 1580 days ago [-]
Haha, so apparently you can get downvoted for humor on HN #whysoserious
FullyFunctional 1580 days ago [-]
Because HN is not Reddit. The value of HN comes from good and insightful comments. Humor is noise, and noise drowns the signal. EDIT: Humor isn't a problem per se if it accompanies useful comments.