Excitement around virtual reality (VR) was high when it made a splashy debut two decades ago, followed more recently by its close cousin, augmented reality (AR). Each held promise to revolutionize many industries by enabling users to enter virtual or mixed reality worlds and interact with objects just as they would in the real world.
Engineers could design their products intuitively without the interference of traditional computer user interfaces. People could walk through homes that were not yet built. Customers could try out products that existed only digitally, testing and falling in love with them prior to final design. It sounded too good to be true, and sadly, it was.
The real reality was that the technology, while exciting, was in its infancy. Many of the required elements of VR and AR, on both the hardware and software sides, were not yet mature enough to create environments that mimicked the real world. After all, to fulfill its promise, users needed to feel as if they were truly immersed in either a virtual or mixed reality world, which required more powerful computers, better input devices (helmets, gloves, etc.), and software that took advantage of both, all at price points that made the technology economically feasible.
Where are we now?
The good news is that significant advances have been made on all fronts. Computers are faster and several heavy hitters (Google, Microsoft, Facebook, to name a few) have spent a considerable amount of time and capital developing input devices that have the potential to bring VR to the masses.
But how could these advances impact the use of VR/AR in product design? Engineers have long known the benefits of virtual prototyping, being able to conduct simulations to prove out designs long before manufacturing. Taking that one step further, what if you could immerse that virtual product in its real world, which by definition is AR?
Our users have been doing just that since 2013, when it was announced that eDrawings would be adding an AR-based update to its iOS app. What AR brought to the eDrawings app is the ability to accurately communicate scale and proper context for virtual products. That's important because whether you're designing a small product, such as a remote control, or a mammoth object, such as a piece of industrial machinery, everything ends up the same size on your screen. While you can zoom-to-fit to maximize the work space, there is no visual reference on the screen to convey the product's real size or how it would compare with objects in its real-world environment.
Wouldn’t it be great if you could take your design and put a virtual model of it on the table right next to you or in your room so you could then look at it next to real products, slide and spin it around on the table, and get a complete sense of the product’s true size, scale, and proportions? The eDrawings app lets you see a product at full scale so its true size is clearly communicated.
Where do we go from here?
There are several areas of product development that could benefit from mixed reality solutions in the future. The first is sales and marketing. Letting customers try out products in these environments would not only allow companies to know in advance how the market will respond to their products and which areas need to be tweaked prior to release, but also help customers connect with products emotionally, in a way they have never been able to before. This is also the area with the easiest immediate path to ROI.
The second area is collaboration: enabling design participants to enter a common virtual environment in which they can naturally interact with virtual products, provide more accurate feedback, and take part in real-time discussions or interactive design reviews. The result would be better, richer information fed back into the development pipeline.
The final area, and perhaps the one with the highest bar of requirements, is the idea of designing products in either virtual or mixed reality environments. Will the devices and UX evolve enough that designers can be immersed for long periods of time in order to accomplish design tasks? In addition, will these devices be accurate enough to maintain the tolerances that engineering precision requires? Will the costs come down to a level at which this is economically feasible for companies?
It's really a new paradigm that would have to be adopted. Early adopters will most likely be younger engineers and designers who have never known a non-digital world. For this group of users, there is no cultural hurdle to adoption and no fear of mistakes; they just want to explore. For other users, there will be more apprehension, much like there was when the space mouse first came out and introduced a new paradigm in the way we interacted with computers.
There are many challenges in integrating AR and VR into production environments. When you take into account the newness of these devices and the mixed stages of our users (younger early adopters and more reluctant users), bridging the gap to develop a cross-reality design experience is not easy. We are, however, committed to investigating the best ways to provide future solutions that leverage the best of VR, AR, and MR.
We already have the tools and the know-how to help people work together in a more immersive, collaborative way. Our millions of users are already creating amazing 3D content; this new paradigm will empower them with a better way to share the experience of that content, in a beautiful, engaging way. Users trust us to deliver premium experiences for building and showing off their products. We want to make sure we're addressing a wide range of users and devices, to lower the barrier to entry for our users.
Note: This post was co-authored by Chin-Loo Lama, User Experience Design Senior Manager, and David Randle, Senior Business Development Manager at Dassault Systemes SOLIDWORKS.