The Road to Virtual Safety Simulations
By SAP Insights | 15 min read
Sure, vehicles are safer than they’ve ever been. The automotive industry has used decades of crash-testing data to improve designs and minimize the impact of accidents. But while vehicle design has changed drastically since Cadillacs ruled the roads, crash testing has evolved more slowly.
And that’s a problem.
The latest cars use different materials, are structured differently, and have different safety equipment. Humans aren’t all the same either. Yet crash test dummies are based on the average male of the 1970s – 5 feet, 9 inches tall and 171 pounds – and this model drives vehicle safety designs. And even though the U.S. National Highway Traffic Safety Administration has smaller dummies available to test for women, children, and even infants, the agency’s recent studies have shown that female drivers and passengers wearing a seatbelt are 17% more likely to die in a crash than a male occupant. It will take many more reengineered tests to provide more precise data on women and children.
This time-warp effect means that today’s testing isn’t as effective as it could be, which puts revenues and profits – and especially lives – at risk.
Not to mention that physical crash tests are expensive, time-consuming, and wasteful. In automotive testing, they mean taking a finished prototype and destroying it.
Nor is this situation limited to the auto industry. Many other sectors that have traditionally relied on physical testing, such as aerospace, construction engineering, and the extractive industries, are looking for ways to make their testing broader and more effective.
Vintage testing techniques are ripe for disruption, and digital is that disrupter. Digital technologies are making testing quicker, simpler, and cheaper, and industries from transportation to construction engineering are beginning to implement virtual testing (also known as simulation) or to expand their use of it.
These developments are still in the early stages, but they point to an urgent need: the sooner we adopt advanced digital techniques, the sooner we’ll be able to build better, save lives, and limit environmental impact. We’ll see fewer serious injuries in accidents and fewer physical faults, recalls, and warranty claims, plus better, more reliable components. All these things will save businesses time and money and help them manage risks and their reputations.
Traditional testing is widespread. Almost every physical object we interact with, from construction materials to our smartphones, is stress tested for operational and safety assurances – but materials are often unpredictable given the many interactions and scenarios we can subject them to. Cohesive information from every possible collision, bump, or drop is difficult and expensive to gather. These factors make virtual testing harder to do than it sounds. And yet getting it right is a matter of life and death.
Moving testing to the virtual realm gives us methods like digital twins (simulations that recreate a virtual copy of a physical item), neural networks, and self-aware structures that can predict structural damage before it becomes dangerous. By using a virtual testing environment, we’ll be able to unleash even more innovations – and provide more safety.
Modernizing and digitizing vehicle safety
Many tests, both physical and virtual, are already being done on cars and the materials used to build them, but those tests are limited.
By law, specific tests, like frontal impact and rollovers, are required to certify a car, says Michael Worswick, professor in the Department of Mechanical and Mechatronics Engineering at the University of Waterloo and executive director of the Waterloo Node of the Advanced Manufacturing Consortium. These are the final tests out of many. Yet making late-stage changes based on such testing is not always easy. If, for example, a car fails a test of its crumple zone – the section at the front of a vehicle designed to absorb a collision by deforming – changes can be costly. “If there’s a problem with a crash, it’s very, very late in the game, and any changes can be really, really expensive,” Worswick says.
Vehicle companies already model every component down to every spot weld – there can be 5,000 spot welds in one car – which requires massive models and large supercomputer clusters, says Worswick. These companies want to minimize the number of tests they have to do and ensure the tests they do perform run to perfection. That’s where most of the crash simulation work happens now.
Vehicle manufacturers are also using more advanced materials, like ultra-high-strength steel and high-strength aluminum. But these require trade-offs: gaining strength, for example, at the cost of the ability to deform without cracking. Part of Worswick’s job is to help companies evaluate material properties under crash conditions in their quest to find the best-performing materials – metals and composites – for their vehicles. “We do an endless series of characterizations of all kinds of materials for these companies, and then we feed them the raw material data, or in some cases, we feed them the computer model for the material.”
Navigating the data-sharing problem
But much of the research that engineers like Worswick conduct occurs in a vacuum because companies aren’t eager to share.
Jessica Jermakian, vice president of vehicle research at the Insurance Institute for Highway Safety, says that sharing data is a roadblock to digitizing testing. An automaker’s computer model is part of its intellectual property, yet if it’s used to evaluate safety, then it needs to be shared with regulators, external testing agencies, and even other companies. Anyone on the receiving end would need to be confident that the model does what it purports to do, she says. “There needs to be some sort of verification of that,” she says. “There are a whole lot of data-sharing issues when you get to sharing proprietary data and having to evaluate it from an outside perspective.”
Some sharing is already happening, however. Earlier this year, Toyota began offering its virtual crash test software, the Total Human Model for Safety (THUMS), for free. The company got much of the baseline data to build those six virtual human dummies from Stewart Wang, a trauma surgeon and researcher. Wang serves as executive director of the Morphomic Analysis Group and director of the International Center for Automotive Medicine, and he’s a professor in the division of acute care surgery and director of burn surgery at the University of Michigan in Ann Arbor. He has seen firsthand the results of vehicle crashes when patients arrive in the emergency department, and he quickly realized that preventive measures are better than post-collision treatment.
To improve accuracy, Wang works on calibrating the data from virtual dummies with data from humans who have been in crashes. He used custom software to analyze thousands of emergency room scans of patients who have been in vehicle crashes. The extremely detailed resulting measurements were assembled into a massive data set that helps link human body sizes, shapes, and anatomy with vehicle, crash, and injury outcomes. “We’ve used high throughput statistical techniques to identify what specific components of the body geometry are important for each class of injury,” like head injuries and chest injuries, Wang says.
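To make the idea concrete, here is a minimal, hypothetical sketch of the kind of analysis Wang describes: a regularized statistical model links body-geometry measurements to one class of injury and reports which measurements carry the most weight. The feature names and data below are invented for illustration and are not drawn from Wang’s data set or software.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical illustration only: synthetic stand-ins for body-geometry
# measurements and one injury class (chest injury yes/no).
rng = np.random.default_rng(0)
features = ["torso_depth_cm", "bone_density", "waist_circumference_cm", "sitting_height_cm"]
X = rng.normal(size=(2000, len(features)))

# Fabricate an outcome in which only two of the four measurements matter.
logit = 1.2 * X[:, 0] - 0.8 * X[:, 1] + rng.normal(scale=0.5, size=2000)
y = (logit > 0).astype(int)

# An L1-regularized logistic regression keeps only the influential measurements.
model = LogisticRegression(penalty="l1", solver="liblinear").fit(X, y)
for name, coef in zip(features, model.coef_[0]):
    print(f"{name:>24}: {coef:+.2f}")   # near-zero weight = less important for this injury class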
Physical crash test dummies are created based on historical research from the 1960s through the 1980s, when corpses were used in crash tests. But, as Wang points out, deceased bodies don’t behave the same as living bodies for a variety of reasons, like tissue stiffening. Virtual bodies allow car companies to model crash incidents on a variety of body shapes, sizes, and ages. New technologies and data analytics are providing us with much more granular vehicle and crash data.
“There is a need for these virtual crash dummies,” Wang says. “The advantages are they can more efficiently reflect a broader segment of the population, which is very important.”
Safety testing in an autonomous future
Over the past couple of decades, there have been significant improvements in modeling the design process of vehicles, says Jermakian. But for safety evaluations, physical tests are still necessary, even though these tests are often limited. “What happens out in the real world is much more diverse, the people are much more diverse, the crash configurations are much more diverse,” she says. Historically, testing has not taken into account the diversity of body types and crash configurations. “We are thinking about how we could incorporate computational modeling and virtual testing into our evaluation program, not to eliminate the physical testing, but to supplement it. To maybe look at a broader range of aspects of crashes that happen in the real world beyond what we can test in our physical test program.”
One place Worswick sees virtual tests as both feasible and useful is in handling car model variations. Car manufacturers tweak models from year to year in between major design overhauls, for example. Today, each tweak would require a separate crash test, but down the road, it’s possible those iterations could be simulated instead.
And what about autonomous vehicles? If fancy and futuristic predictions are any measure, passengers in fully driverless vehicles will be watching TV, napping, or playing video games. They will sit in positions, and respond to vehicle movements, that differ from those in conventional automobiles. And that changes how we think of crash testing.
“The physical dummies were designed to be in those standard seating positions, facing front with your hands on the steering wheel in a very specific position,” says Jermakian. “But when we remove the steering wheel and allow people to sit in different positions, our dummies aren’t designed to work that way. So they’re using computational modeling to look at those other scenarios.”
Even with all these materials testing efforts, there is still a need for passive safety – protecting the vehicle’s occupants – says Wang. Without near-perfect safety for passengers, no autonomous vehicle company will survive, he explains, because they will be sued out of existence. “If you’re going to assume responsibility for driving the car, you better be perfect or darn near close to it.”
The future of simulations: Piece by piece, moment by moment
The future of ensuring safety for big objects like planes, trains, and automobiles – not to mention bridges, buildings, and other infrastructure – will start with extremely small parts of these larger wholes. Researchers are working on ways to virtually model objects by breaking them into many discrete data points. Simulations will improve as we are able to garner more granular information about different materials and their reactions to stressful situations.
The improvements will require ongoing research. A big question is how to turn physics equations into working computer simulations, says Dr. Joshuah Wolper, who recently completed his doctorate in computer graphics at the University of Pennsylvania. Computer-generated simulations use physics from engineering and mechanics, fields that have studied how a material’s velocity or other properties like damage change over time. “With computers, we need a way to integrate these equations over space and over time,” he says.
A challenge with crash simulation is breaking the problem into time steps that are small enough, and whose length varies with the material being modeled. Environments with multiple materials are especially hard because each material acts and reacts differently. The more rigid a material, the smaller the time step required, says Wolper. “Something like glass, for example, is very stiff. And so explicit integration might be tricky, because you might have to take extremely small time steps to maintain simulation stability,” he says.
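Here is a minimal sketch of the time-step issue Wolper describes, using a single mass on a spring in place of a full crash model; the stiffness values and step size are illustrative, not taken from any real solver. With an explicit-style update, the same time step that works for a soft material blows up for a stiff one.

# Minimal illustration (not a crash solver): m*x'' = -k*x integrated with a
# semi-implicit (symplectic) Euler update, a common explicit-style scheme.
# Its stable time step shrinks as stiffness grows, roughly dt < 2*sqrt(m/k).
def simulate(stiffness, dt, steps=100, mass=1.0):
    x, v = 1.0, 0.0                     # start displaced, at rest
    for _ in range(steps):
        a = -stiffness / mass * x       # spring acceleration
        v += dt * a                     # update velocity first...
        x += dt * v                     # ...then position with the new velocity
    return abs(x)

dt = 0.01                               # one shared time step
print(simulate(stiffness=1e2, dt=dt))   # "rubbery" material: stays bounded (about 1)
print(simulate(stiffness=1e6, dt=dt))   # "glassy" material: grows astronomically (unstable)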
Wolper is working on the material point method (MPM), a mathematical technique used to simulate the physical properties of objects. The method is promising in the context of fractures and crashes because it can manage very large material deformations, says Wolper. In other words, something can be squashed or stretched to extremes, and MPM tracks how the strain changes from one time step to the next – how much the deformation differs from the previous step. (For more on MPM, see “Simulations Tech Stars in Animated Film” below.)
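For a flavor of that step-by-step bookkeeping, here is a tiny, hedged sketch of the per-particle update used in MPM-style solvers: the particle’s deformation gradient is advanced each time step from the local velocity gradient, and a strain measure is computed from it to see how much the deformation has changed. The velocity gradient here is a made-up placeholder; in a real solver it would be interpolated from the background grid.

import numpy as np

# Illustrative fragment, not a full MPM solver: each particle carries a
# deformation gradient F that is advanced every time step.
dt = 1e-4                                   # illustrative time step
F = np.eye(2)                               # particle starts undeformed
grad_v = np.array([[0.0, 50.0],             # placeholder velocity gradient;
                   [0.0, -30.0]])           # a real solver interpolates this from the grid

for step in range(3):
    F = (np.eye(2) + dt * grad_v) @ F       # standard MPM update: F <- (I + dt * grad_v) F
    strain = 0.5 * (F.T @ F - np.eye(2))    # Green strain derived from F
    print(f"step {step}: max strain component = {np.abs(strain).max():.5f}")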
The road from video games to structures that set off alerts when they’re broken
While Wolper’s current research focuses on realistic effects for movies and video games, MPM holds potential for applications to many other sectors, he says. Apart from crash testing, medicine, bioengineering, and robotics could all benefit. For example, simulations performed with MPM can be used with soft robotics – robots that are flexible and bendy – to see how movement can stress parts. “I think there is real excitement about these methods,” he says. “Rather than busting up your robot, you can simulate it.”
We’ll likely see simulations developed as part of digital twins, an increasingly popular method that creates a virtual representation of a physical object and often uses simulations as part of that process. Formula 1 race teams use digital twins with simulations to model parts, and now digital twins are increasingly being used in aerospace and other industries. For example, the German Aerospace Center recently opened its Virtual Product House, a project that will use simulations, among other technologies, to test and certify aircraft parts.
AI and machine learning that optimize simulations
Advanced technologies, including artificial intelligence (AI) and machine learning, are being used more often as part of simulations. AI is already having a big impact on materials optimization, the search for the best material for a given application, says Waterloo’s Worswick. Researchers in the electronics industry, for example, have fully automated material fabrication, brute-forcing a range of combinations and testing as they go. “So you’re generating this massive data set that the AI needs to train a neural network, and now you can start to identify where you should go in the material optimization,” he says.
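A minimal, hypothetical sketch of that workflow might look like the following: automated experiments produce a dataset of parameter combinations and measured properties, a neural network is trained on it, and the model then points to promising regions of the design space. The measure_strength function is a stand-in for real fabrication-and-test results, not an actual materials model.

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def measure_strength(params):
    # Placeholder for a physical measurement; peak "strength" near params = (0.3, 0.7).
    return np.exp(-np.sum((params - np.array([0.3, 0.7]))**2, axis=1))

X = rng.random((500, 2))                     # 500 "fabricated" parameter combinations
y = measure_strength(X)                      # their measured properties

surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
surrogate.fit(X, y)                          # train the neural network on the test data

candidates = rng.random((10000, 2))          # dense sweep of untried combinations
best = candidates[np.argmax(surrogate.predict(candidates))]
print("next combination to fabricate:", best)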
The boom in machine learning is helping researchers figure out how to integrate data from traditional experiments with data from simulations and combine all of it into a unified framework, says Bianca Giovanardi, assistant professor of aerospace engineering at Delft University of Technology. “In many applications, people are putting together physics-based simulations and machine learning techniques in such a way that they can get answers much faster,” she says.
The combination of the two would provide the best possible view of the status of materials and structures, opening the door to innovations. But there are challenges to using machine learning and AI in simulations. One is feeding historical data, such as years of crash tests, into an AI system, because that data is often scattered and stored in various formats. Another is that computer simulations are often computationally expensive, particularly for something like real-time fracture simulation.
And then there’s the fact that machine learning on its own isn’t aware of the laws of physics, which limits its power to predict situations beyond the data it was trained on. Traditional scientific computing, by contrast, is built on those laws.
Neural networks can be used as a surrogate for physics-based simulations. Offline simulations are run first, and then the network, which has been trained on the completed simulations, can be asked for specific results.
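As a rough sketch of that surrogate idea, the toy example below stands an inexpensive mass-spring-damper impact model in for an expensive crash solver: simulations are run offline over a range of parameters, a network is trained on the results, and the trained network is then queried for near-instant answers. The parameter ranges and model are illustrative assumptions, not any vendor’s tool.

import numpy as np
from sklearn.neural_network import MLPRegressor

def offline_simulation(stiffness, damping, dt=1e-3, steps=2000):
    x, v = 0.0, 10.0                          # impact: initial velocity, no deflection
    peak = 0.0
    for _ in range(steps):
        a = -stiffness * x - damping * v      # toy structural response
        v += dt * a
        x += dt * v
        peak = max(peak, abs(x))
    return peak                               # quantity of interest: peak deflection

rng = np.random.default_rng(1)
params = rng.uniform([50.0, 0.5], [500.0, 5.0], size=(300, 2))   # (stiffness, damping) samples
results = np.array([offline_simulation(k, c) for k, c in params])

surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=3000, random_state=0)
surrogate.fit(params, results)                # train once on the offline runs

# Later, query the trained surrogate instead of re-running the solver:
print(surrogate.predict([[200.0, 2.0]]))      # fast estimate of peak deflection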
However, echoing Michigan’s Wang, a significant caveat is the issue of accuracy. Machine learning algorithms have traditionally been developed for fields where occasionally being wrong isn’t a big deal, says Giovanardi. Netflix’s movie-viewing recommendations may be frustrating when they’re off base, but certainly not life-threatening. In the context of building something like a car, bridge, or airplane, 98% or even 99% accuracy isn’t good enough. “The community has been trying to see how to make the tools much more reliable, before they can be applied on an everyday basis,” states Giovanardi.
There’s an even newer idea: self-aware structures. These structures can report on their own condition by combining machine learning and data-driven digital tools that use real-time sensor data. The machine learning algorithms can be trained on physics-based models and learn “which one best represents the state of the structure,” says Giovanardi.
For example, if a foreign object damages an airplane wing during flight, the sensor data will show how the wing is vibrating, deforming, or otherwise behaving unusually. Eventually, with that information, computer simulations trained using machine learning will be able to describe the damage in real time and recommend what actions to take before things get dangerous.
“If you had this mechanism in place, then you could say, yes, this is a critical engine failure so I definitely need to land this aircraft as soon as possible. Or you can say, no, this is completely safe, you can still go to the destination and have the engine checked there,” explains Giovanardi.
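A hedged sketch of that decision loop, with made-up numbers: candidate physics-based models (healthy, minor damage, critical damage) generate vibration signatures offline, a classifier learns which model best matches a given reading, and live sensor data is then fed to it in flight. The damage states, frequencies, and features below are purely illustrative.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)

# Pretend each damage state shifts the wing's first two vibration frequencies (Hz).
states = {"healthy": (12.0, 31.0), "minor_damage": (11.2, 29.5), "critical": (9.0, 24.0)}

X, y = [], []
for label, (f1, f2) in states.items():
    X.append(rng.normal([f1, f2], 0.3, size=(500, 2)))   # simulated noisy sensor features
    y += [label] * 500
X = np.vstack(X)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

live_reading = [[9.2, 24.6]]                  # hypothetical in-flight sensor features
print(clf.predict(live_reading))              # likely "critical": land as soon as possible
print(clf.predict_proba(live_reading))        # confidence across the candidate states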
These are the types of advances that will minimize risk and danger in the real world. With the type of detailed information provided by simulations, driving a car, flying an airplane, and building a bridge will be quicker, less wasteful, and safer.
Simulations Tech Stars in Animated Film
Fans of the animated film Frozen may not be surprised that the material point method (MPM) – which has been around for about 25 years – was actually popularized by Disney. In 2013, Walt Disney Animation Studios and researchers from UCLA published a research paper about snow-simulation effects. (They’ve since published several more on MPM.) In the real(er) world, it’s been used to model landslides and avalanches.
“We get really natural multimaterial coupling with MPM. You can model Jell-O in a pool of liquid and then throw some snow on top of it and throw a block of metal in there,” says Wolper, who studied computer graphics at the University of Pennsylvania. “You can mix all these materials together and it can handle it all beautifully.”