Watch a film through the eyes of a MOUSE: Scientists use AI to decode a rodent’s brain signals in real time – and the results are scarily accurate

  • Scientists trained an algorithm to match movie frames to mouse brain activity
  • It could then predict what frame a mouse was looking at from its neural data

Have you ever struggled to describe to a friend something you watched on TV last night?

Soon, you might be able to project your mental images onto the big screen, as scientists have already done with mice.

A team from École Polytechnique Fédérale de Lausanne (EPFL) developed an artificial intelligence (AI) tool that can interpret the rodents’ brain signals.

The algorithm, named CEBRA, was trained to map neural activity to specific frames in videos, so it could then predict and reconstruct what a mouse is looking at.

The news comes shortly after researchers at the University of Texas at Austin used AI to turn people’s thoughts into text in real time.

Dr Mackenzie Mathis, the study’s principal investigator, told MailOnline: ‘In the future, since CEBRA is not limited to vision, we think it’s a powerful tool for brain machine interfaces. 

WHAT IS CEBRA?

CEBRA is a machine learning algorithm – a computer program that can improve its performance on a task by learning from data.

It was provided with movies that mice had watched, along with their real-time brain activity.

Using this data, CEBRA learnt which brain signals are associated with which frames.

It could then be given some new brain activity it had not come across before, and from that it was able to predict what the mouse had been watching at the time.

The researchers were able to turn this information into a CEBRA-generated film that could be compared with the original.

‘For example, it could be used for controlling computer cursors in patients that can’t move, or be used to help provide visual sensations in the visually impaired if it’s paired with real-time stimulation of the brain.

‘Of course, I can’t predict this fully and it’s years away, but these are areas I am excited to see people use CEBRA for.’

For the study, published today in Nature, the researchers trained CEBRA using movies watched by mice and their real-time brain activity.

Some of the activity was measured directly with electrode probes inserted into the visual cortex area of the brain.

The rest was collected using optical probes on genetically engineered mice whose neurons turn green when activated.

Using this data, CEBRA learnt which brain signals are associated with which frames of a specific movie.

Then it was given some new brain activity it had not come across before, from a mouse watching a slightly different example of the movie clip.

From that, it was able to predict what frame the mouse had been watching in real time, and the researchers turned this data into a film of its own.
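
CEBRA has been released as an open-source Python library, so the broad shape of this pipeline can be sketched in a few lines of code. The snippet below is a simplified illustration rather than the study’s actual analysis: the arrays train_neural, train_frames, test_neural and test_frames are hypothetical placeholders for recorded brain activity and the frame numbers shown at each moment, and the settings are illustrative rather than those used in the paper.

```python
# Simplified sketch of the frame-decoding pipeline, using the open-source
# `cebra` library (pip install cebra). All array names are hypothetical.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from cebra import CEBRA

# Step 1: learn a low-dimensional embedding of the neural recordings.
model = CEBRA(
    model_architecture="offset10-model",
    output_dimension=8,    # size of the learnt embedding space
    max_iterations=5000,
    batch_size=512,
)
model.fit(train_neural)                      # shape: (time steps, neurons)
train_embedding = model.transform(train_neural)

# Step 2: learn which embedding points correspond to which movie frames.
decoder = KNeighborsClassifier(n_neighbors=10)
decoder.fit(train_embedding, train_frames)   # frame index (0-899) per time step

# Step 3: from unseen brain activity, predict the frame being watched.
predicted = decoder.predict(model.transform(test_neural))
accuracy = np.mean(predicted == test_frames)
print(f"Frame decoding accuracy: {accuracy:.1%} (chance would be 1/900)")
```

The nearest-neighbour decoder here is a simple stand-in: the point is that once brain activity and movie frames share an embedding, predicting the frame becomes a straightforward lookup.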

Dr Mathis told MailOnline: ‘We don’t predict each pixel, but rather the frame. 

‘Chance level would be 1/900, so over 95 per cent accuracy is, we think, quite exciting. But this pixel-wise decoding is something we plan to do next.’

In an example video, the mouse can be seen watching a 1960s black and white movie clip of a man running to a car and opening the trunk.

A separate screen shows what CEBRA thinks the mouse is looking at, which is a near-identical video, albeit a bit more glitchy.

The algorithm is able to do this using data from just one per cent of the neurons in a mouse’s visual cortex, which contains around 0.5 million neurons in total.

‘We wanted to show how little data – both in terms of movie clips and neural data – we could use,’ Dr Mathis told MailOnline.

‘This makes it much more realistic for clinical applications in the future.

‘Notably, the algorithm can run in real-time, so it takes less than one second for the model to predict the whole video clip.’

The researchers say that CEBRA is not limited to just interpreting visual information from brain data.

It can also use brain data to predict arm movements in primates, and to determine where a rat is located in its pen as it runs around freely.
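
As a rough illustration of the same idea applied to movement, the sketch below pairs a CEBRA embedding with a simple regressor to decode an animal’s position. Again, the arrays neural, position and new_neural are hypothetical stand-ins for recorded data, and the study’s own analyses were more involved.

```python
# Rough sketch of position decoding with CEBRA plus a simple regressor.
# `neural` is (time steps, neurons); `position` is the animal's (x, y)
# location at each time step. All array names are hypothetical.
from sklearn.neighbors import KNeighborsRegressor
from cebra import CEBRA

model = CEBRA(
    model_architecture="offset10-model",
    output_dimension=8,
    max_iterations=5000,
    conditional="time_delta",   # lets the position labels shape the embedding
)
model.fit(neural, position)
decoder = KNeighborsRegressor(n_neighbors=10)
decoder.fit(model.transform(neural), position)

# Given brain activity the model has never seen, estimate where the rat is.
predicted_position = decoder.predict(model.transform(new_neural))
```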

Dr Mathis said: ‘[CEBRA] can also give us insight into how the brain processes information and could be a platform for discovering new principles in neuroscience by combining data across animals, and even species. 

‘The potential clinical applications are exciting.’

A similar technology was unveiled by a team from Osaka University last month, which works on human brain data.

Their AI-powered algorithm reconstructed around 1,000 images, including a teddy bear and an airplane, from brain scans with 80 per cent accuracy.

It used the popular Stable Diffusion model, similar to OpenAI’s DALL-E 2, which can create imagery based on text inputs.

The researchers showed participants individual sets of images and collected fMRI (functional magnetic resonance imaging) scans, which the AI then decoded.

Similarly, just this week, scientists at the University of Texas at Austin revealed a technology that turns a person’s brain activity into text.

Three study participants listened to stories while lying in an MRI machine, as an AI-powered ‘decoder’ analysed their brain activity.

They were then asked to read a different story or make up their own, and the decoder turned the MRI data into text in real time.

The breakthrough raises concerns about ‘mental privacy’ as it could be the first step in being able to eavesdrop on others’ thoughts.

HOW ARTIFICIAL INTELLIGENCES LEARN USING NEURAL NETWORKS

AI systems rely on artificial neural networks (ANNs), which try to simulate the way the brain works in order to learn.

ANNs can be trained to recognise patterns in information – including speech, text data, or visual images – and are the basis for a large number of the developments in AI over recent years.

Conventional AI uses input to ‘teach’ an algorithm about a particular subject by feeding it massive amounts of information.   
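
Stripped to its essentials, that ‘teaching’ process is a loop: show the network labelled examples, measure its error, and nudge its internal weights to shrink that error. The minimal sketch below, written with the PyTorch library and random stand-in data rather than any real dataset, shows the shape of the loop.

```python
# Minimal illustration of supervised training: random stand-in data, not a
# real dataset. The network learns to map 64 input features to 10 classes.
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 10))
optimiser = torch.optim.Adam(net.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

inputs = torch.randn(1000, 64)           # 1,000 examples, 64 features each
labels = torch.randint(0, 10, (1000,))   # a class label for each example

for epoch in range(10):          # repeatedly show the data to the network
    optimiser.zero_grad()
    loss = loss_fn(net(inputs), labels)  # how wrong is the network?
    loss.backward()                      # work out how to adjust each weight
    optimiser.step()                     # nudge the weights to reduce error
```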

AI systems rely on artificial neural networks (ANNs), which try to simulate the way the brain works in order to learn. ANNs can be trained to recognise patterns in information – including speech, text data, or visual images

Practical applications include Google’s language translation services, Facebook’s facial recognition software and Snapchat’s image altering live filters.

The process of inputting this data can be extremely time-consuming, and is limited to one type of knowledge. 

A new breed of ANNs, called generative adversarial networks (GANs), pits the wits of two AI systems against each other, which allows them to learn from each other. 

This approach is designed to speed up the process of learning, as well as refining the output created by AI systems. 
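
As a compressed sketch of that adversarial set-up (again with random stand-in data), a generator network produces fakes while a discriminator network learns to tell fake from real, and each is trained against the other:

```python
# Compressed sketch of adversarial training: the generator tries to produce
# convincing fakes; the discriminator learns to tell fake from real.
import torch
import torch.nn as nn

generator = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 64))
discriminator = nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 1))
g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
loss_fn = nn.BCEWithLogitsLoss()

real_data = torch.randn(128, 64)  # stand-in for a batch of real examples

for step in range(100):
    # Train the discriminator: real data should score 1, fakes should score 0.
    fake_data = generator(torch.randn(128, 16))
    d_loss = (loss_fn(discriminator(real_data), torch.ones(128, 1))
              + loss_fn(discriminator(fake_data.detach()), torch.zeros(128, 1)))
    d_opt.zero_grad()
    d_loss.backward()
    d_opt.step()

    # Train the generator: make the discriminator score its fakes as real.
    g_loss = loss_fn(discriminator(generator(torch.randn(128, 16))),
                     torch.ones(128, 1))
    g_opt.zero_grad()
    g_loss.backward()
    g_opt.step()
```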
