Researchers have been working on mind-control gaming machines for a long time. These devices, which can be controlled with thought alone, sound like something straight out of fiction, yet they are becoming quite real as gaming hardware advances. But researchers have now found that these machines have a darker side.
Imagine using a mind-control gaming machine sometime in the future. If a hacker with fairly advanced equipment is able to compromise your system and read the brain signals transmitted through such a machine, he may be able to extract accurate hints about very private information, such as the location of your house or your credit card PIN.
This has been revealed by a group of researchers from the University of California at Berkeley, Oxford University and the University of Geneva. During the study, the researchers tested 28 subjects who used brain-machine interface devices built by companies such as Neurosky.
Commenting on the findings of the study, Ivan Martinovic, a member of the faculty in the department of computer science at Oxford, wrote, “These devices have access to your raw EEG [electroencephalography, or electrical brain signal] data, and that contains certain neurological phenomena triggered by subconscious activities. So the central question we were asking with this work was, is this a privacy threat?”
The researchers were able to gather private data by showing the subjects different pictures and numbers. Every time a subject recognized one of these, his brain emitted a distinctive signal known as the P300 response, confirming that he knew what was being shown or that it held some significance for him. To find the locations of the subjects' homes, the researchers showed them maps, and again their brain signals hinted at where their homes were. The researchers were able to guess this data correctly with roughly 60% accuracy.
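To make the mechanism concrete, here is a minimal, hypothetical sketch of how such an attack could score stimuli by the P300 response. Everything in it is an illustrative assumption rather than the study's actual pipeline: the sampling rate, the 250-450 ms scoring window, the simulated signal model, and the candidate PINs are all invented for the example. The idea is simply that epochs recorded after each stimulus are averaged, and the stimulus whose average shows the strongest positive deflection around 300 ms is taken as the one the subject recognized.

```python
# Hypothetical P300-style attack sketch (all parameters are assumptions).
import numpy as np

FS = 250                       # assumed EEG sampling rate, Hz
EPOCH_S = 0.8                  # assumed epoch length after each stimulus, s
T = np.arange(int(FS * EPOCH_S)) / FS

def simulate_epoch(rng, target):
    """One noisy single-channel epoch; targets get a bump near 300 ms."""
    noise = rng.normal(0.0, 1.0, T.size)
    if target:
        # Simulated P300: positive Gaussian deflection centered at 300 ms.
        return noise + 3.0 * np.exp(-((T - 0.3) ** 2) / (2 * 0.05 ** 2))
    return noise

def p300_score(epochs):
    """Mean amplitude of the averaged epoch in the 250-450 ms window."""
    avg = epochs.mean(axis=0)
    window = (T >= 0.25) & (T <= 0.45)
    return float(avg[window].mean())

rng = np.random.default_rng(0)
stimuli = ["PIN 1234", "PIN 5678", "PIN 9012"]   # hypothetical candidates
secret = "PIN 5678"            # the value the simulated subject recognizes

# Average 30 repetitions per stimulus, then pick the highest score.
scores = {
    s: p300_score(np.stack([simulate_epoch(rng, s == secret)
                            for _ in range(30)]))
    for s in stimuli
}
guess = max(scores, key=scores.get)
print(guess)
```

Averaging many repetitions is what makes this feasible: the random EEG noise cancels out while the stimulus-locked P300 deflection reinforces itself, which is why an attacker embedding repeated stimuli in a game could recover a reliable signal.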
The researchers suggest that using this technique to harvest personal data will not be too difficult for a potential attacker in the future. According to professor Dawn Song, who was part of the group that led the Berkeley chapter of the research, “In this threat model, the attacker doesn’t need to compromise anything. He simply embeds the attack in an app, such as a game using [brain-machine interface] that the user downloads and plays. In this case, the malicious game designer knows the visual stimuli the user is looking at and also gets the brain signal reading at the same time.”
Naturally, this puts users’ privacy in jeopardy, and researchers as well as gaming equipment vendors will have to find ways to shield users from such threats and vulnerabilities.
Source: Research paper