Wireless engineering professor exploring ways to optimize AR/VR device speeds

Can Augmented Reality/Virtual Reality (AR/VR) wireless devices run at optimal real-time speed? Shiwen Mao, director of the Wireless Engineering Research and Education Center, will find out soon.

His three-year, $993,319 study, “Learning based Resilient Immersive Media-Compression, Delivery, and Interaction,” was one of 37 studies selected for the Resilient and Intelligent NextG Systems (RINGS) program from the National Science Foundation (NSF). RINGS is jointly funded by the NSF, the Office of the Under Secretary of Defense for Research and Engineering (OUSD R&E), the National Institute of Standards and Technology (NIST), and several industry partners, and it seeks to accelerate research in areas that could have a significant impact on emerging next-generation wireless and mobile communications, networking, sensing, and computing systems.

Mao, who will explore innovative technologies to provide a unified multimedia compression, communication and computing framework for real-time AR/VR, believes the study can have a significant impact on both the research community and society.

“Immersive media, such as augmented reality/virtual reality, has been recognized as a transformative service for NextG network systems, while wireless AR/VR will provide great flexibility and an enhanced immersive experience for users and will unleash a plethora of new applications,” Mao said.

For example, retail inventory managers can don wireless AR/VR glasses, connect to specific servers, and, voila, suddenly have real-time location access…as if they were there. “But the challenge is that it has to be real-time,” Mao said, noting that wireless devices – which give users more freedom – aren’t as fast as their wired counterparts, and that immersive media applications typically involve too much data to transmit.

The project has five thrusts, Mao said. Thrusts one and two focus on learning-based immersive multimedia compression, namely the development of high-efficiency light field and point cloud compression solutions. Thrusts three and four explore fundamental performance concepts and techniques that facilitate wireless AR/VR transmission and interaction. Thrust five will integrate all of the developed techniques and validate their performance through simulation studies using open-source datasets and experimental studies using an AR/VR testbed and publicly available wireless and cloud-based platforms.
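To make the compression idea concrete, the sketch below is a minimal, hypothetical illustration rather than a method from the study: it applies voxel-grid downsampling, one of the simplest ways to shrink a point cloud before sending it over a wireless link. The voxel size and the synthetic test cloud are arbitrary assumptions.

```python
# Minimal sketch (assumption, not from the project): voxel-grid downsampling,
# a basic way to cut the number of points that must be transmitted per frame.
import numpy as np

def voxel_downsample(points: np.ndarray, voxel_size: float) -> np.ndarray:
    """Replace all points falling in the same voxel with their centroid."""
    # Integer voxel coordinates for every point.
    voxels = np.floor(points / voxel_size).astype(np.int64)
    # Group points that share a voxel.
    _, inverse, counts = np.unique(voxels, axis=0, return_inverse=True, return_counts=True)
    inverse = inverse.reshape(-1)
    # Sum the points in each voxel, then divide by the count to get centroids.
    sums = np.zeros((counts.size, points.shape[1]))
    np.add.at(sums, inverse, points)
    return sums / counts[:, None]

if __name__ == "__main__":
    cloud = np.random.rand(100_000, 3)       # synthetic cloud in a unit cube (placeholder data)
    reduced = voxel_downsample(cloud, 0.05)  # arbitrary voxel size
    print(f"{len(cloud)} points -> {len(reduced)} points")
```

Real light field and point cloud codecs are far more sophisticated, but the trade-off they manage is the same: fewer bits per frame in exchange for some loss of fidelity.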

Mao, the lead researcher, is collaborating with Zhu Li, co-researcher and associate professor of computer science and electrical engineering at the University of Missouri-Kansas City. “Our team is unique,” Mao said. “Dr. Li is an expert in video compression and signal processing, and my expertise is in communications and wireless networking. We complement each other, so we can deliver results.”

“Receiving this award is more than an honor,” Mao said. “It’s also a big responsibility. We must deliver.”

But Mao wants to push this study a little further than the RINGS program. He wants to “connect what is not connected.”

“Forty-four million homes in America don’t have a broadband connection,” he said. “Many are in rural areas with limited access to schools or education. What if we could offer immersive multimedia applications? What if we could bring virtual classrooms to them? We want to use this kind of AR/VR – using the tools developed in this study – to bring real-time education to places that can’t offer it.

“We can impact the world, and that’s what’s most important.”
