Designing real-world automated control systems — which do everything from regulating the temperature of skyscrapers to running the widget-making machine in the widget factory down the street — requires expertise in sophisticated physics-based modeling. Data scientists say this need for modeling expertise raises operational costs and restricts automated control to systems in which even marginal performance improvements yield large economic benefits.
With unlimited access to supercomputers and mountains of data, engineers can train artificial intelligence systems such as deep neural networks, a type of machine learning model, to perform automated control. But many organizations lack the computational power, or the ability to generate the volume of data, needed to train a deep-neural-network controller.
What’s more, such deep neural networks are so-called black-box models: the factors they use to make decisions are hidden from the end user.
In addition to the lack of interpretability, the behavior of standard deep neural networks is difficult to certify, which prevents their use in applications where the safety and performance of the controller must be guaranteed, explained Aaron Tuor, a data scientist at the Pacific Northwest National Laboratory (PNNL) in Richland, Wash.
“What we are trying to do is bring this deep-learning–based modeling into a more data efficient regime enabling its use in real-world applications, which may need interpretability and guarantees of operation that black-box deep-learning modeling can’t offer,” he said.