Software Turns Mental Handwriting Into On-screen Words, Sentences

Dr. Krishna Shenoy of Stanford University is an electrical engineer. His team recently developed a technique that can display the thoughts of a person who is unable to speak or write as words and sentences on a computer screen; the team calls it 'mindwriting'. In the experiment, Dr. Shenoy's team asked a patient with paralysis to imagine writing the letters of the alphabet. Sensors in a device implanted in the patient's brain read its signals and, with the help of artificial intelligence (AI), rendered them as text on a connected computer screen. Dr. Shenoy and his team have, in effect, created a brain-computer interface that can be implanted in the brain of a paralyzed person.

Successful findings published in 2017
The software was able to decode the thoughts in the patient's mind and convert them into text on the computer screen. Scientists from Stanford University first published the findings of this experiment in the journal eLife in 2017. Based on newer research results, Dr. Shenoy and his team say this technology could one day give a voice to the many people who have lost the ability to speak because of spinal injuries, strokes or paralysis. It could be particularly useful for people with amyotrophic lateral sclerosis (ALS), also known as Lou Gehrig's disease.

This is how technology works
The technology lets these patients write much the way an able-bodied person types a text message on a smartphone. Patients participating in the study typed out their imagined handwriting on a computer screen at about 18 words per minute, while able-bodied people of the same age type about 23 words per minute on a smartphone. Each chip in the device carries 100 electrodes that pick up signals from firing neurons in part of the motor cortex, the outer region of the brain that controls hand movement. The sensors relay these signals to a computer, where AI algorithms decode them and estimate the patient's intended hand and finger movements. The algorithm was designed at Stanford's Neural Prosthetics Translational Lab.
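The pipeline described above (electrode signals in, decoded characters out) can be sketched in a few lines of code. This is only a toy illustration on synthetic data: the real Stanford system decodes actual 100-electrode recordings with a trained neural network, whereas the template-matching classifier and all names below are assumptions made for the demo.

```python
# Toy sketch of a handwriting BCI decoding pipeline (illustrative only;
# the actual Stanford decoder is an AI model trained on real recordings).
import numpy as np

N_ELECTRODES = 100          # electrodes per chip, as described above
CHARS = ["a", "b", "c"]     # decode a tiny alphabet for the demo

rng = np.random.default_rng(0)

# Pretend each imagined character evokes a characteristic firing pattern
# across the 100 electrodes (a "template"). Synthetic data, not real neurons.
templates = {c: rng.normal(size=N_ELECTRODES) for c in CHARS}

def record_trial(char, noise=0.3):
    """Simulate a noisy neural recording of one imagined character."""
    return templates[char] + rng.normal(scale=noise, size=N_ELECTRODES)

def decode(signal):
    """Classify a recording by nearest template (stand-in for the AI decoder)."""
    return min(CHARS, key=lambda c: np.linalg.norm(signal - templates[c]))

# "Mindwrite" a short string and decode it back from the simulated signals.
decoded = "".join(decode(record_trial(c)) for c in "abcba")
print(decoded)
```

With the low noise level used here, the decoder recovers the imagined string; the real problem is far harder because neural activity varies from trial to trial and the decoder must also segment a continuous signal into characters.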

Source: Patrika, India's Leading Hindi News Portal
