Researchers at University of Colorado Boulder’s ATLAS Institute have recently created RoomShift, a haptic and dynamic environment that could be used to support a variety of virtual reality (VR) experiences. This new haptic environment, introduced in a paper pre-published on arXiv and presented at the ACM CHI Conference on Human Factors in Computing Systems 2020 (CHI ’20), uses a team of small robots that can rearrange furniture inside a room.
“In virtual reality, your experience is usually only visual; you can see objects in VR, but you cannot touch and feel walls and furniture,” Ryo Suzuki, one of the researchers who carried out the study, who is now an assistant professor at the University of Calgary, told TechXplore. “This limits the sense of full immersion in the virtual world. We wanted to make VR more immersive by adding tactile and haptic experiences.”
Suzuki and his colleagues essentially set out to explore how users can physically interact with VR environments, not just through their hands (i.e., using gestures or a controller), but using their whole bodies. In other words, they wanted to create an environment that would allow users to touch, walk around, sit on, grasp, manipulate and lean against objects in VR, just as they would interact with objects in the real world.
Their recent study builds on a number of previous projects where the researchers explored the potential of integrating robots with visual experiences. This includes the development of ShapeBots, a swarm of shape-changing robots that can be used to create visual displays, and LiftTiles, a set of actuator-based building blocks for creating shape-shifting interfaces; both previously featured in TechXplore.
“In our previous work, we encountered two key limitations: one associated with speed (e.g., inflatables are too slow to rapidly reconfigure the environment) and the other with the size and stability of the robots (e.g., the mechanical structure of shape-changing robots can easily break under load and may not be able to withstand humans sitting on them),” Suzuki said. “These trade-offs were the main problem of prior work, as we found that inflatable structures are stable but slow, while mechanical structures are fast but not stable enough to support heavy loads. After extensive prototyping and several design discussions, we decided to explore a new approach where, rather than having robots render the environment directly, they instead lift, move and place furniture.”
The new approach allows for the furniture inside a room to be rapidly rearranged using items that are robust enough to support physical interactions involving all body parts. Using a swarm of robots to rearrange chairs, desks, walls and other furniture, it thus enables the addition of a whole-body haptic component to VR experiences.
Essentially, the robots used by the researchers move the furniture in a room to recreate the general features of the environment that a user is navigating in VR. This ultimately allows users to physically interact with items in their surroundings (e.g., sit on a chair, lean against a wall, etc.), making their VR experience even more immersive.
“Uniquely, we can now support room-scale haptic interactions,” Suzuki said. “Previously, researchers primarily proposed handheld or wearable haptic devices and/or haptic interfaces that use tabletop-size robots. In contrast, RoomShift can simulate large-scale environments, such as walls, surfaces, furniture, and floors, which would be difficult to achieve with existing approaches.”
The robots Suzuki and his colleagues used to implement RoomShift each consist of a commercially available Roomba robotic vacuum base, which moves across the floor, topped with a mechanical scissor lift. The scissor lift can extend and retract vertically, growing from 30 cm to 100 cm in height, and can support a maximum weight of 22 kg.
Their unique design allows the robots to move underneath items of furniture, lift them and move them across a room to different locations. The robots can thus rapidly ‘reconfigure’ a physical environment, so that it matches the arrangement of furniture in the virtual environment that a user is navigating at a given point in time.
So far, Suzuki and his colleagues have fabricated nine robots and programmed them to coordinate with one another to rearrange furniture in specific ways. Building the robots was relatively simple and inexpensive, as their primary components were a Roomba robotic platform, a metal laundry rack bought from Target and two linear actuators.
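The coordination problem such a swarm must solve — deciding which robot should move which piece of furniture so that the room matches the virtual layout — can be illustrated with a simple greedy nearest-robot assignment. The sketch below is purely illustrative: the function names, data structures and greedy strategy are assumptions for this example, not the planner described in the paper.

```python
import math

def dist(a, b):
    """Euclidean distance between two (x, y) floor positions."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def assign_tasks(robots, furniture_moves):
    """Greedily assign each furniture move (pickup -> target) to the
    free robot currently closest to the pickup location.

    robots: dict mapping robot name to its (x, y) position
    furniture_moves: list of (pickup, target) coordinate pairs
    Returns a list of (robot_name, pickup, target) assignments.
    """
    free = dict(robots)  # robots not yet assigned a task
    plan = []
    for pickup, target in furniture_moves:
        # choose the free robot nearest to the furniture's pickup point
        name = min(free, key=lambda r: dist(free[r], pickup))
        plan.append((name, pickup, target))
        del free[name]  # that robot is now busy
    return plan

robots = {"r1": (0, 0), "r2": (5, 5), "r3": (9, 0)}
moves = [((1, 1), (4, 4)), ((8, 1), (2, 2))]
print(assign_tasks(robots, moves))
```

A real planner would also need collision-free path planning and lift scheduling; this sketch only captures the task-allocation step.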
“Only a few existing systems can support room-scale haptic environments, so our work will open up a set of new and interesting applications and opportunities that were previously unexplored,” Suzuki said. “For example, we explored how this system can support architectural application scenarios, such as rendering physical room interiors for virtual real estate tours and collaborative architectural design, two increasingly common application areas for VR.”
In the future, this haptic environment could have many applications. In addition to enabling new forms of VR-assisted architectural planning, it could give those looking to buy a home a chance to view properties remotely in an interactive way.
“Virtual real estate tours reduce the time and cost compared to on-site viewings, but currently lack the bodily experience of being able to touch surfaces and sit down,” Suzuki explained. “In architectural design, on the other hand, VR aids the communication between architects and clients, where proposed designs can be experienced, discussed and modified before they are built.”
RoomShift could also make video games or other forms of VR-based entertainment more engaging. For instance, a version of the environment could be introduced in amusement parks, offering visitors the possibility to experience VR in an entirely new way. Moreover, it could be applied in industrial settings, for instance, as a tool to design and test the layout of cars, furniture or airplanes.
“We are motivated by how RoomShift can enable people with various physical abilities to experience, test and co-design these environments with their bodies,” Suzuki said. “The potential of RoomShift may not be limited to VR haptics applications. We are also interested in exploring how these robots can be deployed in our environment and support everyday life (e.g., rearranging the chairs and desks of a room to transform it from a meeting space into a teaching space).”
So far, the researchers have primarily demonstrated the potential of the haptic environment they created as a tool for architectural design. However, they hope that other research teams worldwide will soon feel inclined to test RoomShift for other applications.
“We demonstrated a novel approach for creating room-scale haptic environments using a furniture-moving robot swarm,” Suzuki added. “We think that this will inspire the research community to further explore many different approaches and applications. In our future work, we also want to expand our approach using different form factors and robot designs, such as drones or robots that can move on a ceiling to provide room-scale mid-air haptic sensations.”
Suzuki et al., RoomShift: Room-scale dynamic haptics for VR with furniture-moving swarm robots. arXiv:2008.08695 [cs.RO]. arxiv.org/abs/2008.08695
© 2020 Science X Network
RoomShift: A room-scale haptic and dynamic environment for VR applications (2020, September 30)
retrieved 30 September 2020