How virtual reality is changing engineering
The UK has a long history of work in virtual reality (VR). In the 1990s, pioneering UK companies such as Virtuality and Division provided turnkey immersive systems to a wide variety of industries. Although the hype died down and those companies faded away, pockets of use of immersive systems have emerged over the past 20 years. In his 1999 article ‘What’s real about virtual reality?’, published in IEEE Computer Graphics & Applications, Professor Frederick P Brooks Jr described examples of the two ‘traditional’ engineering uses of VR: design review and vehicle training. In that era, with high-end VR systems costing hundreds of thousands of pounds, the investment could only be justified where the training would otherwise be very dangerous, or where there were well-understood cost savings to be made in the design process.
Since the late 1990s, desktop computers have become capable of handling much more complex three-dimensional (3D) models and processing. Over the past two years, a watershed has been crossed: high-end desktop computers can now drive real-time graphics fast enough to deliver a low-latency experience in an immersive system.
At the beginning of 2017, Samsung announced that it had shipped over five million Gear VR devices, which convert its high-end smartphones into head-mounted displays. This type of display, and similar displays such as Google Cardboard, allows users to be immersed inside a computer-generated image. By turning their heads, they can experience a virtual scene as if it were surrounding them. More advanced systems such as the HTC Vive and Oculus Rift add motion tracking of the head-mounted display and handheld tracking devices, which allow the experience to be much more interactive. Motion tracking lets users move around and use their hands to interact; combined with appropriate software, this enables them to experience the virtual scene as if it were real. This style of user interface, where users employ their own motions, is easy to learn and very flexible: they can at least attempt to interact with the objects that they hear and see in a similar way to the real world.
The fact that the virtual scene can be modelled on both the appearance and behaviour of real scenes suggests obvious routes for exploitation in engineering. While simulation and visualisation have long been tools that engineers use, VR promises to make these technologies even more accessible to them.
Enhancing design efficiency
Jaguar Land Rover has used immersive VR since 2006, employing the technology to design cars more efficiently. Brian Waterfield, Virtual Reality and High-end Visualisation Technical Lead, initiated a project to build a state-of-the-art CAVE-like display, an immersive virtual environment in which users stand in a small room where the walls are displays. It shows stereo imagery (two views of the same scene, one for each eye) on four walls at 4K by 4K resolution (ultra-high definition). Although the displays are not head-mounted, the user still has to wear a pair of glasses to separate the views for their left and right eyes. As with a head-mounted display, the images on each wall can be drawn from a first-person point of view, so that the user sees 3D objects that appear to be inside the display, and they can walk around and duck as they would do in the real world (although without the risk of banging their head).
In 2006, the displays, including the projectors and the 16 PCs needed to drive them, were a very significant investment. This spend was justified for a specific problem: designing the ‘packaging’ of a new vehicle. This is the part of the early design process where the interior spaces of vehicles are laid out and tested for ergonomic fit and utility, and where the design team needs to make judgements about factors such as lines of sight out of the vehicle and reach to the controls. It also needs to address interactions between passengers and the driver, and how the vehicle’s users will interact with non-driving features, such as the boot space.
Using VR, the team was able to give early feedback about the impact of new designs on the requirement for all-round vision; this was something that was hard to assess from plans and renderings alone. To enable this, Waterfield’s VR team used software to set up a simple process for taking 3D models of vehicles from their PLM (product lifecycle management) software to the display. VR became part of individuals’ work, as well as playing a major role in weekly cross-team design reviews. As packaging is a part of the design where interaction and user assessment are extremely important, the CAVE-like display, with its size, reasonable resolution and very wide field of view, was a good fit for these requirements.
Jaguar Land Rover subsequently started to invest in VR in other areas. In 2011, it built a large powerwall display – an ultra-high-resolution display – that is used for two-dimensional images and to aid visual design decisions. The company also invested in a high-end head-mounted display and an accurate large-scale system to track the user’s body. This enabled assessment of manual interactions with the vehicle, including testing assembly and maintenance procedures. With the release of low-cost consumer VR, more teams across Jaguar Land Rover are now looking at using it in their processes. Most visibly, a consumer VR experience was part of the launch of the Jaguar I-PACE concept, an electric-powered sports car. While several VR product visualisations exist, the novel aspect of this experience was that it was a social experience shared between HTC Vive head-mounted displays across the globe. More than 300 guests at the launch event were transported into a specially created virtual space where they watched projections of the car’s creators, could interact with each other, and put themselves in the concept, ‘sitting’ on its virtual seats and having it built around them.
Modelling buildings
Modelling in 3D has been a ubiquitous tool in many engineering disciplines for decades; the design, construction and use of buildings and infrastructure is one area where 3D models have had a very broad impact. However, access to 3D models still requires that users have the skills to interact with the model and interpret the images that they see. Global engineering consultancy Arup has been using real-time 3D models within its design and engineering processes since 2001, and their use has constantly evolved the organisation’s practice as new graphics technologies and VR have emerged.
In 2001, Arup started implementing real-time visualisation of 3D models, generally commissioned by clients as a way to explore design issues. Traditionally, visualisation was achieved with static renderings and fly-through videos, but the addition of an interactive element introduced new opportunities. Alvise Simondetti, Global Leader of Virtual Design at Arup, explains that basic computer game software was used to create virtual walkthroughs of sites. The ability to walk through models was helpful in consultations with stakeholders, as elements that are hard to convey on video, such as crowd movements, could be visualised. However, moving models from a computer-aided design (CAD) format to the game software proved time consuming and involved remodelling parts of the original so that they would work in real time.
In later projects, such as Arup’s redevelopment work on King’s Cross Station, real-time 3D models became much more important for interdisciplinary working. For example, Simondetti recounts how a 3D model of the new Western Concourse was used in areas ranging from the design of the CCTV coverage through to the marketing of the commercial restaurant units. In this and other station models, one interesting use of VR that had significant impact was the investigation of crowd movement. In trials, users could explore station models to follow routes that were expected to be challenging. A maximum journey time between any two points in the station had been agreed, and the records of paths that users took around the models in VR could be used to examine potential design problems, such as congestion caused by missing, misplaced or contradictory signage.
More recently, Arup has been exploiting new 3D technologies. Its work on High Speed 2 is using interactive visualisation to enable engagement with stakeholders, the media and the public. Because of the complexity and extent of the planning involved in the project, a 3D model has been built that integrates bird’s-eye-view navigation (similar to Google Earth) with panoramic footage in which real photography has been augmented with renderings of the future line infrastructure. The combination allows a non-specialist to access the very large model quickly and easily.
Transforming factory processes
A relative newcomer to VR, Siemens invested in display systems in 2014 at its manufacturing facility in Congleton, Cheshire, where it designs and manufactures variable speed drives for motors. Its customers come from a variety of sectors, including automotive, machine building and the airport industry. A key part of the work carried out at the factory is the design of individual workcells to manufacture a new product, which can be a costly and time-consuming process.
Anil Thomas, a transformation manager at Siemens, explains that the process for designing a new workcell or revising an old design takes 12 weeks. A key part of this is when operators from the assembly line meet designers and engineers for an intensive five-day design session. The team previously interacted by using sketches or building physical cardboard prototypes, and VR was brought in to make this process leaner and more efficient. The company invested in an ActiveWall system, designed by VR specialist Virtalis, which comprises a large, wide projection wall and a floor. While less common than the CAVE-like display format used by Jaguar Land Rover, the ActiveWall allows groups of up to 10 people to work together in the immersive space and collaborate more easily. It is also large enough to show a typical workcell at one-to-one scale, while letting users look down onto the working surfaces.
A high-end head-mounted display, which can be used by another person in parallel with the ActiveWall, also enables the designers to investigate reach and interaction around and between workcell operators, as some workcells require one person to feed material and another to operate tools. As well as removing the need to build a physical prototype, VR has improved the quality of the design process, and Siemens is keen to introduce it to more of its process engineering and assembly to test whole-factory performance.
The future for VR
Despite the opportunities demonstrated, engineering is still a challenging area for VR, as engineering models are large and complex. Much of the content for the first wave of consumer VR has used simple models with a cartoon-like appearance, or panoramic video, because the use of a head-mounted display requires reliable, high frame rates (around 90 frames per second for headsets such as the HTC Vive and Oculus Rift) so that users do not become disorientated when making rapid head movements. Over the next three to five years, as graphics cards capable of driving VR become cheaper, higher-end cards will be able to drive very large models of millions of polygons with complex lighting and shading.
Beyond improvements in computing hardware, significant advances in consumer head-mounted displays can be expected, with potential developments such as built-in eye-tracking to enable gaze-dependent rendering and social interaction, or a head-mounted display with 4K resolution per eye. Although consumer devices are driving the acceptance and visibility of the technology, there is ample room for a high-end VR industry, supporting systems with higher display and tracking quality and new input modalities.
Powerwall and CAVE-like displays that allow multiple users to experience VR simultaneously will also have a place. There are prototype consumer room-scale projection systems, such as Razer’s Project Ariana projector prototype shown at the Consumer Electronics Show (CES) 2017, which should enable large-wall interaction at a much lower price. A slightly different technology trend is mixed reality, where VR is combined with video to enable interaction between multiple headsets within a workplace.
Aside from the hardware systems, there are several software engineering challenges left to tackle. VR addresses user input and output, but so far there are no standards that enable easy integration into other software stacks. Over the next couple of years, initiatives such as the WebVR application programming interface (API) and the efforts of bodies such as the Khronos industry consortium should allow VR to be supported within standard web stacks. This will greatly simplify the integration of VR in larger systems.
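To give a flavour of what that integration looks like, the sketch below shows roughly how the WebVR 1.1 draft API exposed a head-mounted display to ordinary web code at the time of writing: the page discovers a connected display, asks it to present a WebGL canvas, and renders a stereo view each frame using the pose supplied by the browser. This is a minimal illustration only; the API was still a draft with varying browser support, and the drawScene function stands in for an application’s own rendering code.

```typescript
// Minimal WebVR 1.1 sketch (draft API, circa 2017): discover a display,
// present a WebGL canvas to it, and render one view per eye each frame.
// Types are kept loose ('any') because WebVR never shipped in TypeScript's DOM library.

const canvas = document.querySelector('canvas') as HTMLCanvasElement;
const gl = canvas.getContext('webgl') as WebGLRenderingContext;

async function startVR(): Promise<void> {
  const nav = navigator as any;
  if (!nav.getVRDisplays) {
    console.warn('WebVR is not available in this browser');
    return;
  }

  const displays = await nav.getVRDisplays();
  if (displays.length === 0) return;
  const display = displays[0];

  // Size the canvas to hold both eye views side by side.
  const leftEye = display.getEyeParameters('left');
  const rightEye = display.getEyeParameters('right');
  canvas.width = leftEye.renderWidth + rightEye.renderWidth;
  canvas.height = Math.max(leftEye.renderHeight, rightEye.renderHeight);

  // Presenting usually has to be triggered by a user gesture (e.g. a button click).
  await display.requestPresent([{ source: canvas }]);

  const frameData = new (window as any).VRFrameData();

  function onFrame(): void {
    display.requestAnimationFrame(onFrame);   // display-driven loop (e.g. 90 Hz)
    display.getFrameData(frameData);          // fills pose plus view/projection matrices

    // Left eye
    gl.viewport(0, 0, canvas.width / 2, canvas.height);
    drawScene(frameData.leftProjectionMatrix, frameData.leftViewMatrix);

    // Right eye
    gl.viewport(canvas.width / 2, 0, canvas.width / 2, canvas.height);
    drawScene(frameData.rightProjectionMatrix, frameData.rightViewMatrix);

    display.submitFrame();                    // hand the rendered frame to the headset
  }
  display.requestAnimationFrame(onFrame);
}

// Placeholder for the application's own WebGL drawing code.
function drawScene(projection: Float32Array, view: Float32Array): void {
  gl.clearColor(0.1, 0.1, 0.1, 1.0);
  gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
  // ...draw the engineering model using the supplied projection and view matrices...
}
```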
It is clear that engineering is a field where VR is already making a huge impact. In general, it allows access to situations or simulations that would otherwise be difficult to visualise, or inaccessible to anyone except the specialists involved. By realising a virtual model, communication and interaction with designs and processes are made more efficient. Over the next two years, consumer technology should begin to meet the needs of engineering, and there are a number of opportunities for companies to innovate in this area through software that combines established engineering tools with the new interactive opportunities of VR.
***
This article has been adapted from "How virtual reality is changing engineering", which originally appeared in the print edition of Ingenia 70 (March 2017).
Contributors
Professor Anthony Steed leads the Virtual Environments and Computer Graphics group in the Department of Computer Science at UCL. His research area is real-time interactive virtual environments, with particular interest in mixed-reality systems, large-scale models and collaboration between immersive facilities. The group runs a CAVE-like facility and several other high-end virtual reality systems.