Ditch those virtual meeting blues with this creepy and cool VR solution.
Since the pandemic started, many of us have been relying on video calls to stay connected with colleagues. Unfortunately, that often means staring at a grid of tiny talking heads on a flat computer screen, a much less engaging experience than meeting in person.
Lorraine Underwood, author of the book Save the World with Code and one of the hosts of the YouTube series Element 14 Presents, points out that during a video call with multiple people, you sometimes can’t tell who is looking at whom.
So Underwood came up with a plan to remove that confusion using her expert coding skills and a small handful of tech: a couple of servo-driven Pimoroni Pan-Tilt HATs, cameras, a web server, a Raspberry Pi, and VR.
Sure, she could have jumped into VR platforms like Engage or AltSpace to meet with her colleagues, but then she’d be missing out on all the fun of building a solution from scratch. Plus, Underwood likes a good challenge.
The first thing she did was sketch out a pathway to solving her video call problem.
- Build and test the Pimoroni Pan-Tilt HATs. These are the mechanisms that move the heads.
- Set up the code for the web server so people attending the meeting can access the webpage.
- Download and install the code allowing for VR support.
- Set up remote access. This allows those in the meeting to join the network.
- Call her colleagues and conduct a live meeting.
Step 1 – Build
Underwood built a moving head for each person in the meeting by connecting a servo-driven Pimoroni Pan-Tilt HAT to a Raspberry Pi. Each servo-bot is controlled by the head movements of those in VR; a small camera is attached to the servo-bot so users can see who they are speaking to.
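The article doesn’t show Underwood’s servo code, but a first smoke test of a Pan-Tilt HAT build might look something like this sketch. It assumes Pimoroni’s `pantilthat` Python library, whose pan and tilt angles run from -90 to 90 degrees; the sweep values here are just illustrative.

```python
import time


def clamp(angle, lo=-90, hi=90):
    """Keep an angle inside the Pan-Tilt HAT's servo range (-90..90 degrees)."""
    return max(lo, min(hi, angle))


def sweep(hat, step=30, delay=0.5):
    """Sweep both servos through their range as a smoke test, then re-center."""
    for angle in range(-90, 91, step):
        hat.pan(clamp(angle))
        hat.tilt(clamp(angle))
        time.sleep(delay)
    # Return the head to looking straight ahead
    hat.pan(0)
    hat.tilt(0)


# On the Pi itself, with the HAT attached, this would be roughly:
#   import pantilthat  # Pimoroni's Pan-Tilt HAT driver
#   sweep(pantilthat)
```

Passing the driver module in as `hat` keeps the sweep logic testable off the Pi, since `pantilthat` only works with the hardware present.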
Step 2 – Set Up the Webserver
This step involved a lot of coding, and the details are well beyond my own coding knowledge (maybe I should read Underwood’s book), so I won’t attempt to explain it here and make a giant mess of it.
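For a rough sense of the idea, here is a minimal sketch of the kind of server that could sit between the headset and the servo-bots. Everything in it is an assumption, not Underwood’s actual code: it imagines the headset POSTing its orientation as JSON (`{"pan": ..., "tilt": ...}`) and the Pi moving the servos in response.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


def parse_orientation(body):
    """Extract pan/tilt angles from a JSON request body (bytes)."""
    data = json.loads(body)
    return float(data["pan"]), float(data["tilt"])


class OrientationHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        pan, tilt = parse_orientation(self.rfile.read(length))
        # In the real build, this is where the servo-bot would move,
        # e.g. pantilthat.pan(pan); pantilthat.tilt(tilt)
        self.send_response(200)
        self.end_headers()


# To run on the Pi:
#   HTTPServer(("0.0.0.0", 8000), OrientationHandler).serve_forever()
```

Underwood’s real server also has to serve the webpage attendees load and stream the camera feeds, which is where most of the coding effort would go.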
Step 3 – Set Up the Code for VR
Underwood connected her servo-bots to the web, making sure the code renders the camera feed in StereoEffect for the headset and that the servo-bots match the wearer’s physical head movements.
Once everything had been put together, she then needed to test out her solution using a Gear VR headset and her Samsung Galaxy S9. Using the Oculus VR environment and a little trial and error, Underwood found that the Samsung Internet browser gave her the results she was looking for. She was able to view the real world through her VR headset. When she turned her head, the servo-bots would respond with her exact head movements.
Step 4 – Set Up Remote Access
Next on the list was setting up a way for everyone to safely join her VR meeting. She created an access point through her router and shared the information with the intended guests.
Step 5 – Set Up a Meeting
Finally, Underwood set up both servo-bots in her home office. She then invited her husband and her brother-in-law to her remote meeting with each of them having their own designated servo-bot. Once they were all logged in, Underwood was able to talk to both bots and look at them individually. At the same time, her husband and brother-in-law were able to turn their heads to look at each other or at Underwood. Meeting success!
Sort of.
Underwood did say that her solution was far from perfect, running into issues such as lag, crashing computers, and cybersecurity concerns. Despite these setbacks, Underwood isn’t giving up on her idea. She’s asking the public for suggestions on how to improve her video call solution.
You can check out more of Underwood’s work on her website.
Image Credit: Lorraine Underwood