A squat, circular robot scurries along the floor of a laboratory, moving left, then right, then left again, before coming to a stop.

A Northeastern University student researcher commands the gadget through a brain-computer interface that controls the movement of the robot using signals produced by his visual cortex.

The innovative technology was developed for a senior capstone project under the direction of electrical and computer engineering professors Deniz Erdogmus and Bahram Shafai. The team members, who won first prize in the electrical and computer engineering capstone project competition, included Saumitro Dasgupta, Mike Fanton, Jonathan Pham and Mike Willard.

Erdogmus says that the technology could serve a variety of practical uses, from assisting or enhancing human cognitive or sensory-motor functions in disabled or neurologically impaired users, to controlling military vehicles, light switches or TVs.

Scientific researchers have long experimented with brain-computer interfaces, but only in the past decade has the technology advanced far enough to make such systems practical.

“People with disabilities will soon be able to communicate through the computer to operate wheelchairs or other vehicles or devices,” Erdogmus says.

A user’s ability to command the robot to move in any direction is based on a neurological principle: when the retina is exposed to a visual stimulus flickering at a frequency from 3.5 Hz to more than 100 Hz, the brain generates electrical activity at that same frequency.
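
The effect described here is what the brain-computer interface literature calls the steady-state visually evoked potential: cortical activity locks to the flicker rate of the stimulus, so the stimulus frequency can be read back out of the signal’s spectrum. The short Python sketch below illustrates the principle with a simulated signal; the sampling rate, flicker frequency and noise level are illustrative assumptions, not values from the team’s system.

```python
# A minimal numerical sketch of the frequency-following principle the
# article describes. The "EEG" here is simulated: a weak oscillation at
# the stimulus frequency buried in broadband noise.
import numpy as np

FS = 250.0       # assumed EEG sampling rate in Hz
STIM_HZ = 15.0   # hypothetical flicker frequency of the stimulus
DURATION = 2.0   # seconds of signal

t = np.arange(0, DURATION, 1.0 / FS)
eeg = 0.5 * np.sin(2 * np.pi * STIM_HZ * t) + np.random.randn(t.size)

# The spectral peak (ignoring the DC bin) lands at the stimulus frequency.
spectrum = np.abs(np.fft.rfft(eeg))
freqs = np.fft.rfftfreq(t.size, d=1.0 / FS)
peak_hz = freqs[1:][np.argmax(spectrum[1:])]
print(f"stimulus: {STIM_HZ} Hz, recovered peak: {peak_hz:.1f} Hz")
```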

Team members exploited this phenomenon. They divided a computer screen into four checkerboard patterns that flash at different frequencies and represent different control commands for the robot, including the ability to move left, right and forward.
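
In spirit, the screen-to-command mapping amounts to a lookup from flicker frequency to action, with classification picking whichever candidate frequency carries the most spectral power. The sketch below shows one plausible way to express that; the four frequencies, the fourth command (“stop”) and the function names are hypothetical, since the article names only left, right and forward.

```python
# A hedged sketch of mapping four flicker frequencies to robot commands.
# The frequencies and command names are illustrative assumptions, not
# the team's actual values.
import numpy as np

COMMANDS = {8.0: "left", 10.0: "right", 12.0: "forward", 15.0: "stop"}

def classify(eeg: np.ndarray, fs: float) -> str:
    """Pick the command whose flicker frequency carries the most power."""
    spectrum = np.abs(np.fft.rfft(eeg))
    freqs = np.fft.rfftfreq(eeg.size, d=1.0 / fs)

    def power_at(f: float) -> float:
        # Power in the spectral bin nearest the candidate frequency.
        return spectrum[np.argmin(np.abs(freqs - f))]

    return COMMANDS[max(COMMANDS, key=power_at)]
```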

When a user stares at one of the checkerboard patterns, the resulting brain signals are picked up by electrodes attached to the back of the head and sent to a software program. The program wirelessly transmits the corresponding control command to a laptop mounted on the robot, moving it according to the command generated by the user’s visual cortex.
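
The wireless hand-off could be as simple as pushing the decoded command string over a network socket to the laptop riding on the robot. The address, port and message format below are invented for illustration; the article does not say how the team’s program transmits its commands.

```python
# A minimal sketch of sending a decoded command to the robot's laptop.
# Host, port, and message format are hypothetical.
import socket

ROBOT_LAPTOP = ("192.168.1.50", 9000)  # assumed address of the robot's laptop

def send_command(command: str) -> None:
    """Transmit one control command (e.g. 'left') to the robot."""
    with socket.create_connection(ROBOT_LAPTOP, timeout=2.0) as sock:
        sock.sendall(command.encode("utf-8"))

# On the robot side, a listener would decode the string and drive the motors.
```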

The user can track the robot’s whereabouts through real-time video feedback over Skype.

Noting the “good accuracy and speed” of his team’s system, Dasgupta says, “Navigating through obstacles requires a degree of fine control. You can use this system to control different types of environments.”

Next year, a new team of electrical and computer engineering students will advance the project a step further by designing a brain-computer interface for a wheelchair, Dasgupta says.