In a world driven by digitalization, more and more researchers are relying on computer simulations to recreate and expand upon physical experiments.
Whether an investigation is too dangerous to conduct in person, such as detonating explosive materials, or depends on the occurrence of a natural event like a hurricane, simulations offer a quick and cost-effective alternative to traditional studies.
Since 2009, Dr. Rui Tuo, assistant professor in the Department of Industrial and Systems Engineering at Texas A&M University, has worked to enhance the accuracy of computer simulations by developing a new theoretical framework for the calibration of computer models.
Digitally generated data acts as a modern foundation for advancing disaster safety, material handling, component construction and much more. However, for research based on computer modeling to produce lasting and effective results, the simulated output must be as accurate as possible.
“In computer modeling, we always have some parameters that we don't know the true value of,” said Tuo. “The task of calibration is to identify the values of those parameters so that we can predict the physical output accurately.”
These unknown values are known as calibration parameters.
By estimating the calibration parameters, along with the discrepancy between the computer model's output and the physical response, researchers can predict the outcomes of experiments virtually, draw conclusions and share their findings.
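To make the idea concrete, the sketch below shows one simple, generic way calibration can be carried out. The simulator, data and parameter are hypothetical stand-ins, not the specific method from Tuo and Wu's paper: it estimates a single unknown parameter by searching for the value whose simulated output most closely matches noisy physical observations in the least-squares sense.

```python
# A minimal sketch of calibration (illustrative only; the simulator,
# data and parameter here are hypothetical, not from the paper).
import numpy as np

def simulator(x, theta):
    # Hypothetical computer model: output depends on the input x and
    # an unknown calibration parameter theta.
    return np.exp(-theta * x)

# Hypothetical physical observations at inputs x_obs, with small noise.
rng = np.random.default_rng(0)
x_obs = np.linspace(0.1, 2.0, 20)
y_obs = np.exp(-0.7 * x_obs) + rng.normal(0.0, 0.01, x_obs.size)

# Calibration: pick the theta whose simulated output best matches the
# physical data, here by minimizing the sum of squared discrepancies
# over a grid of candidate values.
candidates = np.linspace(0.1, 2.0, 400)
losses = [np.sum((y_obs - simulator(x_obs, t)) ** 2) for t in candidates]
theta_hat = candidates[int(np.argmin(losses))]
print(f"estimated calibration parameter: {theta_hat:.3f}")  # close to 0.7
```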
A multitude of computer codes can be used to model a single subject, each offering a different view of it. Through the process of calibration, researchers determine which values of the unknown parameters give the cleanest and most relevant match between a code's output and the physical data.
As Tuo explained, while the current method of calibration is widely accepted, its theoretical basis is not well understood, and it poses a long-debated identifiability issue: the same physical data can often be explained equally well by different combinations of parameter values and model discrepancy, so the "true" parameter values are not uniquely determined.
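The toy example below, again built on a made-up model rather than anything from the paper, shows why this is a problem: two different parameter values, each paired with its own discrepancy term, can fit the same physical data perfectly, leaving the data unable to single out the "true" value.

```python
# A toy illustration of the identifiability issue (hypothetical model,
# not from the paper): different pairings of calibration parameter and
# discrepancy function can reproduce the same physical response exactly.
import numpy as np

def simulator(x, theta):
    return theta * x  # hypothetical computer model

x = np.linspace(0.0, 1.0, 50)
y_physical = 0.7 * x + 0.1 * np.sin(x)  # hypothetical physical response

# Pairing 1: theta = 0.7, with a small discrepancy of 0.1*sin(x).
delta_1 = y_physical - simulator(x, 0.7)
# Pairing 2: theta = 0.5, with a larger discrepancy absorbing the rest.
delta_2 = y_physical - simulator(x, 0.5)

# Both pairings match the data perfectly, so the data alone cannot say
# which theta is "true" -- the motivation for a rigorous framework.
print(np.allclose(simulator(x, 0.7) + delta_1, y_physical))  # True
print(np.allclose(simulator(x, 0.5) + delta_2, y_physical))  # True
```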
“Nowadays, computer simulations are very popular and almost every simulation study involves calibration, so this is a very fundamental issue,” said Tuo.
In order to solve this, Tuo partnered with Dr. Jeff Wu, Coca-Cola Chair in Engineering Statistics and professor at the Georgia Institute of Technology, to propose a new foundation for calibration in computerized experiments.
Their work was published in the joint Society for Industrial and Applied Mathematics/American Statistical Association Journal on Uncertainty Quantification in a paper titled “A Theoretical Framework for Calibration in Computer Models: Parametrization, Estimation and Convergence Properties.”
By diagnosing the issues with past methods and constructing a rigorous new framework for calibration, Tuo and Wu are making computer experiments a more viable and comprehensive method of research.
Through their calibration framework, computer simulations will be able to produce data sets that are more reliable and more readily applicable to future interdisciplinary studies, while continuing to reduce the monetary and time expenses associated with experimentation. It will also allow calibration parameters to be identified more accurately and interpreted more clearly.
Through their ongoing work, Tuo and Wu are bridging the gap between the digital and the physical. In doing so, they are setting the stage for more effective and efficient simulations and advancing the field of computer-based research.