Incorporating uncertainties in aircraft design
By Qiqi Wang | December 2016
The Non-Deterministic Approaches Technical Committee advances the art, science and cross-cutting technologies required for applying non-deterministic modeling and analysis to aerospace systems.
Aircraft design can be viewed as optimizing a complex system under uncertainty. One approach to managing this difficult task is to hierarchically decompose the problem, both by discipline, such as aerodynamics, structures and propulsion, and by subsystem, such as wing, empennage and fuselage. Although this decomposed multilevel optimization approach can offer more efficient solutions, effectively incorporating uncertainties has proven challenging. Mississippi State University is tackling this challenge through variations of the decomposition and of the coordination among internal subproblems, especially for problems involving multiscale systems.
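As a toy illustration of such a decomposition (every variable, cost model and coupling below is invented for the sketch, not taken from the Mississippi State work), a system-level loop can coordinate one shared design variable while each subsystem solves its own subproblem averaged over Monte Carlo samples of an uncertain load:

```python
import random

random.seed(1)

N = 500
loads = [random.gauss(1.0, 0.1) for _ in range(N)]  # uncertain load factor

def wing_cost(s):
    # Wing subproblem: pick the local variable x that minimizes the
    # sample-average mismatch with the shared variable s, plus a weight penalty.
    return min(sum((x * L - s) ** 2 for L in loads) / N + 0.1 * x
               for x in [0.1 * k for k in range(1, 30)])

def fuselage_cost(s):
    # Fuselage subproblem: same idea with its own (invented) coupling to s.
    return min(sum((y + 0.5 * L - 2 * s) ** 2 for L in loads) / N + 0.05 * y
               for y in [0.1 * k for k in range(1, 30)])

# System level: choose the shared variable that minimizes the sum of the
# subsystems' optimal expected costs.
best_s = min([0.1 * k for k in range(1, 30)],
             key=lambda s: wing_cost(s) + fuselage_cost(s))
print(best_s)
```

The point of the structure is that the system level never sees the subsystem internals, only their optimal costs as a function of the shared variable; real multilevel methods replace this brute-force scan with coordinated optimizers.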
Goal-oriented design, such as designing a jet for a specific type of mission, can benefit from the tools of uncertainty quantification. In such scenarios, the goal or mission imposes constraints such as endurance, observability and maneuverability. Designs with a maneuverability constraint must consider flutter, which can be modeled with either a detailed, expensive flow solution or a simplified model; different levels of fidelity like these are often available for each physical discipline. Wright State University in Ohio is using uncertainty quantification to choose the modeling fidelity for each discipline, given the goal of the design and the available computational resources.
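One simple way to frame such a fidelity-selection problem (the disciplines, costs, error figures and budget below are all made up for illustration, and real methods use far richer error models than this sketch) is to enumerate fidelity combinations under a compute budget and keep the one with the smallest combined error:

```python
from itertools import product

# Hypothetical per-discipline fidelity options: (cost in CPU-hours, model error)
options = {
    "aerodynamics": [(1, 0.20), (10, 0.05), (100, 0.01)],
    "structures":   [(1, 0.15), (5, 0.06), (50, 0.02)],
    "flutter":      [(2, 0.25), (20, 0.08)],
}
budget = 60  # total CPU-hours available

best = None
for combo in product(*options.values()):
    cost = sum(c for c, _ in combo)
    if cost > budget:
        continue  # this mix of fidelities is unaffordable
    # Assume independent error sources, so they combine in quadrature.
    err = sum(e ** 2 for _, e in combo) ** 0.5
    if best is None or err < best[0]:
        best = (err, cost, combo)

print(best)
```

Even this crude search captures the trade the article describes: the cheapest affordable upgrade goes to whichever discipline currently dominates the combined error.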
Integrated computational materials engineering can change how aerospace engineers design and manufacture materials. Arizona State University is using deep learning to understand how the structural properties of a material depend on its manufacturing process. The researchers applied a convolutional deep belief network, a technique also used in state-of-the-art voice recognition, to predict what a material looks like at multiple scales. The network's outputs are multichannel binary images representing possible spatial patterns of a material. The resulting images help to accurately reconstruct the stochastic microstructure of materials such as the Ti-6Al-4V alloy widely used in the aerospace industry.
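A deep belief network is built by stacking restricted Boltzmann machines (RBMs), whose Gibbs sampling is what produces binary images like those described above. The following is a minimal sketch of one Gibbs sweep of a single binary RBM over a toy 8x8 "microstructure" image; the dimensions and random weights are invented, and it omits the convolutional structure and training used in the actual work:

```python
import math
import random

random.seed(0)

n_vis, n_hid = 64, 16  # 8x8 binary image pixels, 16 hidden units (toy sizes)
W = [[random.gauss(0, 0.1) for _ in range(n_hid)] for _ in range(n_vis)]
b_vis = [0.0] * n_vis
b_hid = [0.0] * n_hid

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def sample_hidden(v):
    # Each hidden unit turns on with probability sigmoid(its total input).
    h = []
    for j in range(n_hid):
        act = b_hid[j] + sum(v[i] * W[i][j] for i in range(n_vis))
        h.append(1 if random.random() < sigmoid(act) else 0)
    return h

def sample_visible(h):
    # Each pixel turns on with probability sigmoid(its total input).
    v = []
    for i in range(n_vis):
        act = b_vis[i] + sum(h[j] * W[i][j] for j in range(n_hid))
        v.append(1 if random.random() < sigmoid(act) else 0)
    return v

v0 = [random.randint(0, 1) for _ in range(n_vis)]  # random initial image
v1 = sample_visible(sample_hidden(v0))             # one block Gibbs sweep
print(sum(v1))  # count of "solid" pixels in the sampled binary image
```

Repeating such sweeps with trained weights is how the model draws plausible spatial patterns; a convolutional deep belief network shares weights across image patches so the same idea scales to multiple resolutions.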
In June, Sandia National Laboratories held a workshop on “Complex Systems Models and Their Applications: Toward a New Science of Verification, Validation and Uncertainty Quantification.” The workshop illuminated research opportunities that intersect with two important Sandia communities: the complex systems modeling community, and the verification, validation and uncertainty quantification community. The overarching research opportunity is how to quantify the credibility of knowledge gained from complex systems models, such as cyber systems, climate and economics, the electric grid, and disease spread or natural disasters with human reactions. The workshop established a long-term research agenda for answering these questions.
Accounting for uncertainty in engineering problems requires quantifying observational noise, numerical discretization errors and other error sources in mathematical models. For simulations used in design, control, discovery and decision-making, quantifying these error sources with big data helps attach a degree of confidence to their solutions. The University of Texas at Austin has been developing large-scale, data-driven algorithms to quantify the uncertainty in statistical inverse problems. Among the methods constructed and tested are derivative-enhanced Markov chain Monte Carlo techniques, particle-based methods for Bayesian posteriors, randomization methods for big data in inverse problems, and triple (state, parameter and data) model-reduction methods. ★
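A standard example of a derivative-enhanced MCMC method is the Metropolis-adjusted Langevin algorithm, which uses the gradient of the log-posterior to drift proposals toward high-probability regions. Here is a minimal sketch on a one-dimensional Gaussian stand-in for a posterior; the target, step size and chain length are illustrative choices, not the large-scale inverse problems studied at UT Austin:

```python
import math
import random

random.seed(0)

def log_post(x):
    return -0.5 * (x - 2.0) ** 2  # toy posterior: N(2, 1), up to a constant

def grad_log_post(x):
    return -(x - 2.0)

def mala(n, step=0.5, x=0.0):
    samples = []
    for _ in range(n):
        # Langevin proposal: drift along the gradient, then add noise.
        mean = x + 0.5 * step * grad_log_post(x)
        prop = random.gauss(mean, math.sqrt(step))
        mean_back = prop + 0.5 * step * grad_log_post(prop)
        # Metropolis-Hastings correction for the asymmetric proposal density.
        log_q_fwd = -((prop - mean) ** 2) / (2 * step)
        log_q_back = -((x - mean_back) ** 2) / (2 * step)
        log_alpha = log_post(prop) - log_post(x) + log_q_back - log_q_fwd
        if math.log(random.random()) < log_alpha:
            x = prop
        samples.append(x)
    return samples

chain = mala(20000)
mean = sum(chain[5000:]) / len(chain[5000:])  # discard burn-in, then average
print(round(mean, 2))
```

The gradient information is what makes the method "derivative-enhanced": compared with a random-walk proposal, it concentrates proposals where the posterior is large, which matters most in the high-dimensional problems the article mentions.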
Contributors: Yongming Liu and Yi Ren