
Researchers Use AR To Study How Electric Fish Sense Their Surroundings

NJIT researchers are using AR to hack the electric sensors of the glass knifefish.

All living creatures rely on their own set of unique senses to get through the day, from catching their next meal to defending themselves from predators. A spider feels vibrations to know when a fly is caught in its web, a bat uses sound waves to avoid flying into the side of a house, and your dog uses its keen sense of smell to know when you have a fresh pepperoni pizza with extra cheese in your lap. Even humans make minuscule eye movements to detect objects in their field of vision.

According to a recently published report from the New Jersey Institute of Technology, a group of researchers has begun hacking the electrosensory organs of a species of electric fish known as Eigenmannia virescens, or the glass knifefish, by placing the creatures into a tank and tricking them into thinking they are hiding inside a smaller shelter generated entirely in augmented reality. The goal is to see whether the researchers can change the communication between the fish’s sensory and motor systems in a way that doesn’t completely unlink them, but instead alters the animal’s behavior.

In layman’s terms, researchers are creating AR shelters for the fish to swim under or hide in, and then moving that shelter to see if the fish follow their protective cover.

Fortune’s lab uses real-time video tracking of Eigenmannia virescens in an artificial refuge environment to learn how the fish control the sensing behavior used for station-keeping. Image Credit: NJIT/Johns Hopkins

“We’ve known for a long time that these fish will follow the position of their refuge, but more recently we discovered that they generate small movements that reminded us of the tiny movements that are seen in human eyes,” said Eric Fortune, associate professor of biology at NJIT, adding, “That led us to devise our augmented reality system.”

To conduct their research, Fortune and his team – which includes graduate student Debojyoti Biswas, undergraduate researcher Luke A. Arend, postdoctoral fellow Sarah A. Stamper, and associate research engineer Balázs P. Vágvölgyi, all from Johns Hopkins University – placed glass knifefish into a tank equipped with a digital AR shelter that moved in response to the real-time back-and-forth motion of each fish. The team then studied the fish’s behavior and movement in the AR shelter across two different categories of experiments (see the sketch after this list):

Closed Loop Experiments – The motion of the AR shelter is synced in real time to the fish’s back-and-forth movement.

Open Loop Experiments – The motion of the AR shelter is “replayed” to the glass knifefish like a tape recording, decoupled from the fish’s current movement, while researchers observed how the fish responded.
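
To make the difference between the two setups concrete, here is a minimal sketch in Python. This is not the team’s actual experimental code: the toy fish model, the gain parameter, and every function name are illustrative assumptions standing in for the lab’s real-time video tracking system.

```python
import random

def fish_step(fish_pos, shelter_pos, noise=0.05):
    """Toy fish model (an assumption, not the lab's): each frame the fish
    drifts toward the shelter, plus small exploratory jitter reminiscent of
    the eye-movement-like wiggles Fortune describes."""
    return fish_pos + 0.2 * (shelter_pos - fish_pos) + random.gauss(0, noise)

def closed_loop(steps=200, gain=1.0):
    """Closed loop: the AR shelter's position is recomputed every frame from
    the fish's own movement (here, simply scaled by a feedback gain)."""
    fish, shelter_trace = 0.0, []
    for _ in range(steps):
        shelter = gain * fish              # shelter mirrors the fish
        fish = fish_step(fish, shelter)
        shelter_trace.append(shelter)
    return shelter_trace

def open_loop(shelter_trace):
    """Open loop: the same shelter motion is replayed like a tape recording,
    ignoring what the fish is doing right now."""
    fish = 0.0
    for shelter in shelter_trace:          # prerecorded, decoupled from the fish
        fish = fish_step(fish, shelter)
    return fish

recorded = closed_loop(gain=1.0)   # record a session where the fish drives the shelter
open_loop(recorded)                # replay that identical motion back to the fish
```

The stimulus the fish sees can be identical in both runs; the only difference is whether the fish’s own movements are driving it, which is exactly the contrast the study exploits.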

A species closely related to the glass knifefish, the brown ghost knifefish (Apteronotus leptorhynchus), displays its station-keeping ability. Image Credit: NJIT/Johns Hopkins

Researchers observed that the fish swam the farthest to collect sensory information during closed loop experiments when the AR system’s positive “feedback gain” was cranked up, that is, when the AR shelter closely mirrored the fish’s own movement.
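
A back-of-the-envelope way to see why mirroring provokes more swimming, using our own notation rather than the report’s: if the shelter position simply scales the fish’s position by a positive gain, then

```latex
% Hypothetical notation (not from the report):
% f(t) = fish position, s(t) = AR shelter position, g = feedback gain
s(t) = g\,f(t) \qquad\Rightarrow\qquad s(t) - f(t) = (g - 1)\,f(t)
```

As g approaches 1 (perfect mirroring), the relative motion between fish and shelter shrinks toward zero, so the fish must make larger movements of its own to generate the same amount of sensory information.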

From the perspective of the fish, the AR action in both closed and open loop experiments is the same. “From the perspective of control, one test is linked to behavior and the other is unlinked,” said Noah Cowan, professor at Johns Hopkins University and co-author of the study. “It is similar to the way visual information of a room might change as a person is walking through it, as opposed to the person watching a video of walking through a room.”

Through their AR tank experiments, the researchers found that the fish behave differently when they control the stimulus themselves than when the same stimulus is merely played back to them. Put simply, the fish knows when it is controlling the sensory world around it.

Image Credit: Johns Hopkins University

Up next for the research team is identifying the neurons responsible for each control loop and examining how the results could inform the study of active sensing behaviors in humans or the development of advanced robotics.

Fortune and his team also hope to conduct similar experiments that could shed light on human vision and neurobiology, work that could eventually improve eye tracking in both virtual and augmented reality and make those digital experiences feel more realistic.

The New Jersey Institute of Technology research was conducted in collaboration with Johns Hopkins University and supported by James McDonnell Foundation Complex Systems Scholar Award grant 112836, Collaborative National Science Foundation Award grants 1557895 and 1557858, and National Science Foundation Research Experiences for Undergraduates grant 1460674.

