Hyperdimensional Computing (github.com)
maxander 1235 days ago [-]
I get why everyone is wondering whether this is autogenerated jargon, but I think the basic thrust of this is that machine learning algorithms work better as they use larger and larger vectors for representation. I'm pretty sure this is just a well-known thing in ML, although the projects listed here may be pushing vector size farther than most.

So for instance, a typical NLP algorithm (although not GPT-3, IIRC) might represent a word as a 500-float-long vector, which is the same as saying the algorithm considers each word as a point in 500-dimensional space. This turns out to have weirdly useful properties, to the point where directions in this 500-dimensional space start to have semantic correspondences (e.g. [0], still one of the coolest things in ML, IMHO). You can't do the same trick with a 3D space: the algorithm doesn't have enough to work with when all it knows about a word is three numbers.
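
A toy version of the analogy trick in Python (the four-number "embeddings" here are made up just for illustration; real ones are learned and hundreds of numbers long):

    import numpy as np

    # Made-up 4D "embeddings"; real ones are learned, not hand-picked.
    emb = {
        "king":  np.array([0.9, 0.8, 0.1, 0.3]),
        "man":   np.array([0.1, 0.8, 0.1, 0.2]),
        "woman": np.array([0.1, 0.8, 0.9, 0.2]),
        "queen": np.array([0.9, 0.8, 0.9, 0.3]),
    }

    def cosine(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    # The classic analogy: king - man + woman lands near queen,
    # because the "gender" direction is roughly the same everywhere.
    target = emb["king"] - emb["man"] + emb["woman"]
    print(max(emb, key=lambda w: cosine(emb[w], target)))  # -> queen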

Another cool example: in gradient descent, you're constantly trying to find the lowest point in a "fitness landscape"; in a 3D landscape, you might easily find yourself in a "valley" where every direction is worse than where you currently are (a local minimum), and you won't know where to go. In a 500D landscape, it's unlikely that you'll find yourself in a valley where all 500 available directions lead somewhere worse. So the algorithm will be much less likely to get stuck, and this effect gets more robust the more dimensions you have.

[0] https://colah.github.io/posts/2014-07-NLP-RNNs-Representatio...
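
A crude toy model of the local-minimum point (not a real loss surface; it just assumes that each direction at a critical point independently curves up or down with probability 1/2):

    import numpy as np

    rng = np.random.default_rng(0)

    def frac_local_minima(dims, trials=100_000):
        # A critical point is a local minimum only if the curvature is
        # positive in every direction; under the 50/50 toy assumption
        # that happens with probability 2**-dims.
        curves_up = rng.random((trials, dims)) < 0.5
        return curves_up.all(axis=1).mean()

    for d in (1, 3, 10, 20):
        print(d, frac_local_minima(d))
    # The fraction shrinks like 2**-d; at 500 dimensions a critical
    # point is essentially never a minimum in every direction at once.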

feoren 1235 days ago [-]
That's a very good explanation of why high-dimensional vectors are so useful, but it doesn't seem to have much to do with this post. How many bits is a 500D vector? Is it 10,000 bits, like in the post? Which "coding" does it use, real? Is it a memory-centric 10,000-bit hyperdimensional real-coded vector that can be combined with permutation? What benefit does your 500D vector get from the fact that it's holographic, not micro-coded? How do you do gradient descent without backpropagation? None of those words make any sense and that's exactly why this post is being lambasted. Thank you for trying to make sense of it, but just because it's possible to write sensibly about similar topics does not mean this post belongs anywhere but the trash can.
IcePowder 1236 days ago [-]
Vector Symbolic Architecture (VSA) is the European name. The foundation is Pentti Kanerva's 1988 book, "Sparse Distributed Memory". It's now referred to as "HDC" or "HD". Search for Pentti Kanerva, now a staff scientist at the UC Berkeley Redwood Center for Theoretical Neuroscience, or Mohsen Imani, now at UC Irvine, formerly at UC San Diego.

Introductory article - "Hyperdimensional computing and its role in AI" by Givi Odikadze:

https://medium.com/dataseries/hyperdimensional-computing-and...

throwaway_pdp09 1236 days ago [-]
Thanks for giving the name, it's been bugging me for the past hour at least. I just couldn't get it.

When I read over SDM I got the impression it wasn't terribly useful; if anyone wants to set me right, please do.

forgotmypw17 1236 days ago [-]
Accessibility link: https://archive.vn/tHun8
nmrcq 1236 days ago [-]
My first thought was that this was related to multilinear algebra (the algebra of tensors and higher-order vector spaces), which has led to many interesting and suggestive advances in machine learning, but apparently it's not. What a truly opaque collection of nomenclature they've chosen...

After glancing through a couple of papers I can't entirely shake the feeling that the entire thing might be a social experiment to see how much jargon and how many fantastical-sounding words can be mashed together before people notice it's nonsense (even though I realise it's not).

marcosdumay 1235 days ago [-]
> even though I realise it's not

Are you sure? The jargon is hard to parse, but the claims are just plainly absurd.

All the jargon is probably there to cover up the fact that the claims tell only half of the story, and you won't like the other half.

kortex 1235 days ago [-]
A few thoughts:

- holy jargon, Batman! From what I've gleaned, "holographic" refers to Tony Plate's Holographic Reduced Representations (rough sketch of the binding operation after this list)

- "hypervector" is just a swanky term for very-high-dimensionality vectors, a la "hypersonic", "hypervisor", hyper-encabulator, etc.

- this reminds me an awful lot of thought vectors. The first abstract linked mentions HRRs alongside semantic vectors, so both are types of hypervectors I think?

- this also reminds me of Numenta/NuPIC/sparse distributed representations. I think SDRs are also hypervectors?
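
A rough sketch of what HRR-style binding looks like, as I understand Plate's scheme (circular convolution to bind, correlation to unbind; the recovered vector is noisy but recognizable):

    import numpy as np

    rng = np.random.default_rng(1)
    n = 1024
    a = rng.standard_normal(n) / np.sqrt(n)
    b = rng.standard_normal(n) / np.sqrt(n)

    def bind(x, y):
        # circular convolution, computed in the Fourier domain
        return np.fft.irfft(np.fft.rfft(x) * np.fft.rfft(y), n)

    def unbind(xy, x):
        # correlate with x, i.e. convolve with its approximate
        # inverse (x[0], x[n-1], ..., x[1])
        return bind(xy, np.concatenate(([x[0]], x[:0:-1])))

    pair = bind(a, b)
    b_hat = unbind(pair, a)
    sim = b_hat @ b / (np.linalg.norm(b_hat) * np.linalg.norm(b))
    print(sim)  # well above chance: b is recoverable (noisily) from the pair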

byecomputer 1235 days ago [-]
>"hypervector" is just a swanky term for very-high-dimensionality vectors, a la "hypersonic", "hypervisor", hyper-encabulator, etc.

Or a la hypercube. Hyper- is an established prefix for describing n-dimensional objects.

scottlocklin 1236 days ago [-]
Bizarro: the one on-topic comment here is "Dead." I came across this while reviewing echo state networks, of which this appears to be a distant relative. The code samples made it useful.
_Microft 1236 days ago [-]
This referred to IcePowder's comment; I revived it since the information given in it seems to be accurate, as far as I can tell.

You can do this yourself, by the way: click the time of the post (".. minutes ago" / ".. days ago") to view the comment on its own, and click the "Vouch" link in the top line, where the name and the time since commenting are also shown.

scottlocklin 1236 days ago [-]
Thanks kindly for that; I didn't know I had this power.
two_almonds 1236 days ago [-]
Have you come across any other interesting resources? I've recently been delving into topics surrounding echo state networks for my thesis, so I'd appreciate any pointers.
scottlocklin 1236 days ago [-]
Some of the Italians at University of Pisa seemed to want to understand the echo state property better. Gallicchio and Filippo Maria Bianchi (I assume his student) wrote a couple of interesting papers I haven't totally absorbed yet, probing the critical ridge in various ways.

That said, it's pretty marginal stuff. ESNs are weird. Cool weird anyway.

If I were looking at it in a serious way rather than reading the funny papers, I'd think about ways of applying topological ideas to ESNs. ESNs on the critical ridge probably have an interesting graph when projected onto a topological space.

two_almonds 1235 days ago [-]
Thanks!
forgotmypw17 1236 days ago [-]
It might be because of the medium.com link, which is disliked by many for its accessibility issues.
egfx 1236 days ago [-]
Whoa. Hyperdimensional what? Parsing through this felt a bit like being dumped into a pit of different puzzle sets mixed together. And I was lost in the terminology. It’s all very nebulous but I guess that’s the point.
jampekka 1236 days ago [-]
I had the same impression. I had a peek at some of the papers and did a quick search on the topic, but everything is so covered in hype that it's difficult to get a grasp of what's actually going on.

My quick impression is that it's something similar to "embeddings", where some features (e.g. words) are mapped to a high-dimensional vector space and computations are done in that space. What those computations are seems to vary quite a bit from paper to paper.

klyrs 1236 days ago [-]
I'm having trouble too... it sounds like they mean "1- and 2-dimensional vectors which are large" and not, say, billion-dimensional tensors. But if I'm reading that wrong I'd love to hear what's actually interesting here.
bitwize 1236 days ago [-]
Just the other day we had spatial computing; this sounds like when you get to the "but first we have to talk about parallel universes" bit of that.
IcePowder 1235 days ago [-]
The following patent application by Intel is an interesting use case for Hyperdimensional Computing in a localizing key-value store. It's also behind the recent move to enriched metadata in RDBMSs (VLDB Endowment):

https://patents.google.com/patent/US20190227739A1/en?oq=US20...

6gvONxR4sf7o 1236 days ago [-]
This reads like someone got really carried away with the buzzwords. Hyperdimensional hypervectors?
Rhinobird 1235 days ago [-]
By 'vector' they just mean a list of numbers (e.g. [32,5,8,11]).

By 'dimension' they mean the number of items in the list; the example above is a 4-dimensional vector.

Make the numbers binary digits and voila.

By 'hyperdimensional' they mean the list has 500 or more items.
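
A minimal sketch of that (binary vectors with D = 10,000 as in the post; XOR as the binding operation is one common choice):

    import numpy as np

    rng = np.random.default_rng(42)
    D = 10_000  # "hyperdimensional": each symbol is a random 10,000-bit vector

    a = rng.integers(0, 2, D, dtype=np.uint8)
    b = rng.integers(0, 2, D, dtype=np.uint8)

    # Two random hypervectors are nearly orthogonal: they differ in
    # about half their positions, and at D = 10,000 "about half" is
    # very tightly concentrated around 0.5.
    print((a != b).mean())               # ~0.5

    # XOR-binding gives a vector unlike either input...
    bound = a ^ b
    print((bound != a).mean())           # ~0.5

    # ...but XOR with one input recovers the other exactly.
    print(np.array_equal(bound ^ a, b))  # True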

IIAOPSW 1235 days ago [-]
I don't know whether the hypothesis that brains function by using hyperdimensional vectors is true. But if it is, then it seems like the type of problem that is highly suited to quantum computers: a quantum state of just 10 qubits is a 1024-dimensional complex-valued vector.
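
To make the arithmetic concrete (a trivial numpy sketch):

    import numpy as np

    n_qubits = 10
    dim = 2 ** n_qubits                  # 2^10 = 1024 complex amplitudes
    state = np.zeros(dim, dtype=complex)
    state[0] = 1.0                       # the |00...0> basis state
    print(dim, state.shape)              # 1024 (1024,)
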
ilaksh 1236 days ago [-]
I remember looking into this before. My conclusion was that the name is kind of like clickbait.
hansdieter1337 1236 days ago [-]
Is that a GPT-3 paper generator?
feoren 1236 days ago [-]
The original machine had a base plate of prefabulated amulite, surmounted by a malleable logarithmic casing in such a way that the two main spurving bearings were in a direct line with the panametric fan. The latter consisted simply of six hydrocoptic marzlevanes, so fitted to the ambifacient lunar waneshaft that side fumbling was effectively prevented. The main winding was of the normal lotus-o-deltoid type placed in panendermic semi-boloid slots in the stator, every seventh conductor being connected by a nonreversible tremmie pipe to the differential girdlespring on the "up" end of the grammeters.
alentist 1235 days ago [-]
Top-level comments should be informative/substantive.

This isn’t Reddit.

ur-whale 1235 days ago [-]
The top comment is actually pretty informative in that it gives a very good sense of what to expect if you dive into the subject.
alentist 1235 days ago [-]
> The top comment is actually pretty informative

It’s not.

ttul 1236 days ago [-]
Gotta love those grammeters!