The Neuralink YouTube channel (which is apparently a thing that exists) released a demo of their technology using Pager, a nine-year-old macaque monkey.



Video Overview

In the video, Pager plays two games using a joystick. For the first, he moves a cursor to an orange square in a grey grid, then on to each new square as it pops up. For the second, he plays his favorite game, Pong.


While he plays, the Neuralink team analyzes the neural activity from a Neuralink device implanted in his brain. They receive the data in real time and figure out which patterns of activity correspond to each hand movement.


The voiceover states that "After only a few minutes of calibration, we can use the output from the decoder to move the cursor instead of the joystick". The team then unplugs the joystick, and Pager keeps playing: just by thinking about moving his arm, he plays Pong with his mind.
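As a rough illustration of what that calibration involves, here is a toy linear decoder fit to synthetic data: record spike counts alongside joystick velocity, fit a ridge-regularized linear map, then drive the "cursor" from neural data alone. This is purely a sketch under made-up data shapes and a made-up tuning model; Neuralink has not published the decoder they actually use.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "calibration" session: spike counts from 64 channels over
# 2,000 time bins, plus the joystick velocity (x, y) recorded in each bin.
n_bins, n_channels = 2000, 64
true_w = rng.normal(size=(n_channels, 2))  # hypothetical neural tuning
spikes = rng.poisson(2.0, size=(n_bins, n_channels)).astype(float)
velocity = spikes @ true_w + rng.normal(scale=0.5, size=(n_bins, 2))

# Calibration: fit a ridge-regularized linear map from spikes to velocity.
lam = 1.0
w_hat = np.linalg.solve(spikes.T @ spikes + lam * np.eye(n_channels),
                        spikes.T @ velocity)

# "Unplug the joystick": predict cursor velocity from neural data alone.
decoded = spikes @ w_hat
r = np.corrcoef(decoded[:, 0], velocity[:, 0])[0, 1]
print(f"decoded vs. actual x-velocity correlation: {r:.3f}")
```

On data that really is linear, a few minutes' worth of samples is plenty for a fit like this, which is consistent with the short calibration time the voiceover claims.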

Pager plays MindPong



First of all, Neuralink was launched 1 year ago, and we already have monkeys playing games with their minds. I predict with 70% confidence that, within a year, a Neuralink will be placed in a human and will have basic functionality. If I'm wrong, I think it will mainly be because Neuralink isn't legally allowed to conduct a human trial, or because of long-term safety concerns as opposed to short-term ones.

Comments

My impression from people who have been working on BCI for a long time is that this isn't that impressive. Not that the hardware isn't impressive, but they've done similar-ish things with worse hardware for a long time.

Just from a quick Google search: I'm not exactly endorsing this article, but it seems to support my impression.

The hardware is impressive; it's best-in-class. But the presentation was mostly theatrics: we've had brain-computer interfaces for cursor control for 30 years, and Pong can be reduced to 1D cursor control, which is even simpler than the first task.
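The reduction mentioned above is easy to see: a Pong paddle moves along a single axis, so each time step needs only one decoded number. A minimal sketch (the decoded velocities here are made up, standing in for whatever a real decoder would emit):

```python
def step_paddle(paddle_y: float, decoded_velocity: float,
                court_height: float = 1.0) -> float:
    """Advance the paddle by one decoded 1D velocity, clamped to the court."""
    return min(max(paddle_y + decoded_velocity, 0.0), court_height)

# A stream of hypothetical decoder outputs drives the paddle directly.
y = 0.5
for v in [0.2, 0.2, 0.2, -0.1]:
    y = step_paddle(y, v)
print(round(y, 3))
```

Cursor control in the grid task needs two such values per step (x and y), which is why Pong is even simpler.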

It's just a lot cooler to the public when it's Musk getting a monkey to play a video game.

If this has been a thing for 30 years, why is the hardware best-in-class? Also, is there a presentation that is more impressive/innovative but perhaps less theatrical?

The hardware should be best-in-class due to the massive number of channels (over 1,000), and the fact that each channel is surgically implanted in the brain. For comparison, 16 channels is on the high end for consumer-grade BCI kits, and each of those channels is a sensor that rests on top of the skin.

As for why they aren't making use of its capabilities to do something more impressive, I don't know.

For what I would consider a more technically impressive presentation, see this video of a man controlling two prosthetic arms in 3D space to slice bread.

I assume because Musk put a lot of money into it and got the right people together.

I think that monkey video is the most advanced thing Neuralink has put out.

but that they've done similar-ish things with worse hardware for a long time.

Including people doing things like the monkey does in the video, right? 'Mind pong'.

I'm fairly confident I saw Pong being played with a BCI many years ago. I'm sure you could find some videos without too much trouble, but Rabrg just posted a video of something I think is even more impressive than Pong.

Neuralink is cool and very hyped, but I also think this is more subtle and perhaps even cooler: Facebook bought a company that creates a wrist-based human interface device. They claim they can sense hand and finger position from the signals detected by a specialized wrist strap.

Given how much expertise humans have in fine motor control of their hands, and the astonishingly generalizable capability our hands have displayed (in sports, writing, crafts, fighting), I am optimistic about a wrist-based input device becoming commonplace, simply because there is no onerous requirement of surgery.

I suspect that the first use case will be like the monkey example, except with humans typing on a phantom keyboard. From there, people will start learning entirely new ways to communicate using only their hands, possibly as their primary interface to any computer.

Woah, just on a watch-like device! How far along is this technology?
