Summary of LFTE Panel Workshops


Pre-print of paper presented at "Workshop on HPC in support of T&E", Aberdeen, MD, 13-16 July 1998

Physics-Based Models to Support Test and Evaluation

Samuel Blankenship
Test & Evaluation Research and Education Center
Georgia Institute of Technology
Atlanta, Georgia 30332-0840

Frank Mello
OSD/DOTE/LFTE
1700 Defense Pentagon
Washington, DC 20301

Abstract

Models to support the Simulation, Test and Evaluation Process (STEP) and Simulation Based Acquisition (SBA) must provide detailed understanding and accurate predictions of phenomena involved in the interaction of weapons and targets. Current Live Fire T&E (LFT&E) also requires access to such models for both lethality and vulnerability assessments. Models with central algorithms based on well-understood physical principles have an intrinsic advantage over engineering or heuristic models in these applications, particularly when extrapolated well beyond regions of plentiful test data. Until recently, these models ran exclusively on supercomputers; they are now available on High Performance Computers (HPC). While very highly resolved calculations still require one-of-a-kind resources, advances in HPC have made physics-based modeling a viable tool within the assessment community. Mr. James F. O'Bryon, Deputy Director of OT&E, Live Fire Test, established and chaired the Live Fire Test and Evaluation Panel on Physics-Based Models to evaluate the current state of the art of these models and to suggest a set of programs to advance the application of physics-based models to specific T&E problems. The Panel, whose members were drawn from development, research, and test communities, met for over a year in small, highly focused workshops to evaluate models for assessing vulnerability and lethality phenomena. The results of these workshops include a current program to apply physics-based models to the T&E of fuel fires on military aircraft. Observations about the general applicability of physics-based models supported on HPC are made.

Introduction

The science and technology communities have developed models based on first principles physics to describe details of many physical phenomena. These models are often complex and require significant computing resources, but they can also provide very precise descriptions of nature. They have potential utility to the test and evaluation (T&E) community in providing more trustworthy prediction, assessment, and evaluation tools.

To determine the extent of the applicability of such models to support T&E, the Live Fire Test and Evaluation Panel on Physics-Based Models was established. Chaired by Mr. James F. O'Bryon, Deputy Director of Operational Test and Evaluation, the panel established the current state of the art and identified areas for expansion and application of these models to T&E, specifically Live Fire T&E (LFT&E). This restricted scope provided a convenient size for study, and the lessons learned can have applicability throughout T&E. More details about the study are available in Reference 1.

Method

The panel convened a series of workshops, inviting a representative set of the country's experts in various associated fields to discuss specific topics associated with the objectives. Attendees represented different communities: broad T&E representation (not only LFT&E, but also Operational T&E and Developmental T&E), product developers (both U.S. Government and contractor personnel), modelers, and data providers. The workshops were restricted in size; they were working sessions, not conferences. Seven panel workshops met over a period of 18 months. A summary of subjects, locations, and dates is given in Table 1 below.

Table 1. Workshops Summary

Workshop Subject      | Location and Date
Kickoff               | Atlanta, GA, 17-18 Sep 96
Blast and Penetration | Atlanta, GA, 12-13 Nov 96
Underwater Explosions | Atlanta, GA, 25-26 Feb 97
Fire                  | San Antonio, TX, 29-30 Apr 97
Hypervelocity Impacts | Los Alamos, NM, 24-25 Jun 97
High Power Microwaves | Albuquerque, NM, 26-27 Aug 97
Wrap Up               | Las Cruces, NM, 24 Feb 98


Scope

The limitation of the panel scope to LFT&E retains a very large set of problems for investigation. LFT&E is a Congressionally mandated program concerned with establishing the vulnerability and lethality (V/L) of weapon systems, configured for combat and exposed to realistic threat environments. It has a civilian counterpart in automobile crash testing and many other areas.

Vulnerability and lethality can be analyzed in terms of Figure 1, which shows the damage scales, activities, and time scales for an attack upon a target and its eventual return to mission effectiveness. (Reference 2.) From a lethality perspective, generally the best result is producing damage that drives target mission capability below the point where recovery is possible, preferably so that the target shows some unmistakable evidence of being removed from the order of battle. From a vulnerability and survivability perspective, generally the best result is minimizing damage so that capability remains high or returns quickly. For some targets, damage control and mission recovery steps are not possible.

Not all of the processes in Figure 1 can currently be modeled from basic principles of physics or other disciplines. Battle damage repair, for example, is not yet amenable to this level of modeling and may never be. The effects that have been the subject of physics-based modeling efforts include the primary physical damage caused by the direct effects of the weapon on the target, fires initiated by or subsequent to the primary and subsequent phases, and the spread of fires during the progressive damage phase.

Figure 1. Target Effects Time Line

Physics-Based Models

Physics-based models are in principle no different from other software. They consist of input, processing, and output components, all supported by an infrastructure. However, these models do not include hardware-in-the-loop simulations or other real-time modeling. Generally speaking, the models of most interest at the workshops run so slowly compared to real time that the kind of linking often used in real-time applications cannot be made.

The input phase can be characterized as the process of describing the physical traits of the system in terms the software understands. This often laborious task can represent a significant fraction of the human effort in the process. This is particularly true for physics-based models because a computational structure must be constructed in addition to the detailed geometric description, a process that can require extensive user interaction.

Physics-based modeling tools are further distinguished from engineering models or heuristic design tools by the computational resources devoted to processing. Since fundamental conservation equations are solved for the physical state within a computational cell or element, complex global states can be accurately represented by applying a highly resolved computational template. Numerical methods for solving the very large numbers of equations required are the source of continual research and development, which extends from basic software algorithms to specialized hardware designs.
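The cell-by-cell solution of conservation equations described above can be illustrated with a minimal sketch. The code below is not from the paper; the function name and setup are illustrative. It advances a one-dimensional linear conservation law with a first-order upwind finite-volume update. Production hydrocodes solve coupled mass, momentum, and energy equations in two or three dimensions, but the cell-update structure is analogous: the change in each cell's state is the net flux through its faces.

```python
def advect(u, a, dx, dt, steps):
    """Advance cell averages u of du/dt + a*du/dx = 0 (a > 0) by `steps`
    first-order upwind finite-volume updates, with periodic boundaries."""
    u = list(u)
    # Stability requires the Courant number a*dt/dx to lie in [0, 1].
    for _ in range(steps):
        # Flux through the left face of each cell; upwind means the flux
        # carries the value from the cell to the left (periodic wrap at i=0).
        flux = [a * u[i - 1] for i in range(len(u))]
        flux.append(flux[0])  # right face of the last cell wraps around
        # Conservative update: cell change = net flux through its two faces.
        u = [u[i] - (dt / dx) * (flux[i + 1] - flux[i])
             for i in range(len(u))]
    return u
```

Because the update is written in flux form, the scheme conserves the total of u exactly, mirroring the conservation properties of the physical equations.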

One byproduct of highly resolved physics-based calculations is an extreme amount of output data. The quantities of data which can be produced routinely by physics-based codes would have overloaded the mass storage capabilities of just a few computer generations ago. To gain insight from the modeling process, this data must be made accessible by interactive, or at least rapid, interrogation by the user. This is a vital step if modeling and simulation are to be used to support T&E. Analytical measures of performance must be developed that can be related to performance metrics measurable during testing.

The explicit inclusion of well-understood basic physics in the processing algorithms should produce results that are understandable, trustworthy, and readily extrapolated to new conditions, providing the most utility to V/L assessments, particularly in LFT&E. The modeling detail required to realize this expectation is typically practical only in high performance computing (HPC) or supercomputing environments. Recent advances in HPC horsepower have made computing more affordable and allowed researchers to tackle problems that were unmanageable in the past.

Connecting Modeling to Military Utility

Despite the enormous technical challenges of developing accurate, robust physics-based modeling software, one of the most significant challenges the T&E community faces is connecting modeling capabilities to decision processes. Military systems are evaluated on the basis of variables such as lethality and vulnerability, which are not physical quantities. To be effective in this context, physics-based models must be embedded in a process that connects physical variables to military metrics. (Reference 3)

As an example, consider the penetration of a metal rod through tank armor. Physics-based models of penetration can predict the amount of penetration prior to a test, or can be used to interpret the results of a test after the data are available and to explain any differences between expectation and results. An evaluation of the lethality of a weapon must go beyond this and determine the military effect of the penetration upon the system suffering the attack. Whether the system continues to function after the penetration is an engineering and human effects issue, and cannot be decided from these first-principles models. This is shown graphically in Figure 2. The first-principles models are an important part of the method for V/L determinations, but there are other parts, and the first-principles models of penetration alone are insufficient.

Figure 2. Physics-Based Models


Strategies for Uncertainty

To apply physics-based models in the process depicted in Figure 2, the performance requirements at each stage must be consistent with the overall level of fidelity required to support the overall objectives. This sensitivity to context is one of the most conceptually difficult aspects of effectively integrating modeling into a traditionally test-based process. Our ability to refine the predicted behavior of a few elements using physics-based models can provide a sense of accuracy that may not be real.

It is very important to understand the quality of the answer required to support a decision and to ensure that modeling and simulation results of that quality are possible. There are inherent uncertainties in describing any system, and further uncertainties associated with building a model of that system. Uncertainty stemming from imperfect knowledge of the system is irreducible without testing the system outside the context of the modeling process; in other words, the same data cannot be used both to develop and to validate a model. Model uncertainty, which is distinct from inherent uncertainty, can be reduced by improving algorithms or increasing resolution. However, if inherent uncertainty is high, this effort may be wasted.
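The distinction between reducible model uncertainty and irreducible inherent uncertainty can be made concrete with a toy example (entirely illustrative, not from the paper): integrating free fall with a coarse timestep introduces numerical model error that shrinks as resolution increases, while the spread caused by an imperfectly known input, here the gravitational constant, persists at any resolution.

```python
def fall_distance(g, dt):
    """Distance fallen from rest over 1 second under gravity g,
    integrated with forward Euler at timestep dt (illustrative model)."""
    n = round(1.0 / dt)  # number of timesteps covering the 1-second window
    x, v = 0.0, 0.0
    for _ in range(n):
        x += v * dt      # position update uses the old velocity
        v += g * dt
    return x

# Model uncertainty: the error against the exact answer g/2 shrinks as
# dt shrinks. Inherent uncertainty: the spread of answers across the
# plausible range of g (say 9.7 to 9.9) is essentially unchanged by dt.
```

No amount of timestep refinement narrows the spread caused by the uncertain input; only better knowledge of g (i.e., testing) can do that, which is the point made in the paragraph above.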

Uncertain model inputs can be accounted for using mature statistical techniques such as Monte Carlo methods. These are already used in the decision process to estimate the probability of kill given variations in weapon end-game engagement conditions. The requirement to feed these methods with repeated performance estimates is inconsistent with using highly resolved physics models and has fostered the development of fast running engineering models, which use lookup tables or other techniques to quickly interpolate behavior. It is not clear that the error committed in fast running models is justified by the precise statistics that might be generated. Effective, affordable decision making requires a balance between accuracy, precision, and speed. The extensive resources needed to build and to apply physics-based models on HPC require an approach that ensures this balance is identified and maintained.
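The Monte Carlo approach described above can be sketched briefly. In the sketch below, the surrogate model, its coefficients, and the sampled distributions are all hypothetical stand-ins, not from the paper; a real study would sample validated engagement distributions and call a validated fast-running model.

```python
import math
import random

def penetration_surrogate(velocity, obliquity_deg):
    """Hypothetical fast-running surrogate: penetration depth in mm as a
    simple function of impact velocity (m/s) and obliquity (degrees)."""
    return 0.08 * velocity * math.cos(math.radians(obliquity_deg))

def p_kill_monte_carlo(n_samples, armor_mm, seed=0):
    """Estimate P(penetration > armor thickness) by sampling uncertain
    end-game engagement conditions and evaluating the surrogate for each."""
    rng = random.Random(seed)  # fixed seed for a repeatable estimate
    kills = 0
    for _ in range(n_samples):
        v = rng.gauss(1500.0, 100.0)   # impact velocity, m/s (assumed)
        ang = rng.uniform(0.0, 60.0)   # obliquity, degrees (assumed)
        if penetration_surrogate(v, ang) > armor_mm:
            kills += 1
    return kills / n_samples
```

The thousands of surrogate evaluations this loop requires are exactly what makes highly resolved physics models impractical to embed directly, which is the tension the paragraph above describes.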

Workshop Results

The intent of the workshops was to evaluate the state of the art in physics-based modeling, to engage experts in the various fields supporting such modeling, and to articulate the needs of the LFT&E community for making effective use of these tools. It was postulated that the phenomena required to model LFT&E events could be crudely sorted into the categories listed in Table 1. With the participation of several regular attendees and a larger cast of supporting specialists, these workshops were an effective forum for exchanging information and forming action plans. Proposals for advancing the state of the art in each focus area were prepared. Two proposals have resulted in direct support from the LFT&E office, and other organizations interested in these phenomena are invited to take advantage of these efforts.

During the early workshops, participants attempted to identify and prioritize improvements to modeling and simulation tools in a more general sense. They considered improvements to input, processing, and output tools but quickly settled on processing improvements as the most valuable. Since this approach was attempted early in the workshop series, the suggested improvements were prioritized separately for three applications: above the surface refers to blast, penetration, and impact loading; below the surface implies underwater explosion effects; and fire was the last category. Table 2 presents these prioritized lists side by side for comparison. Statistical uncertainties of the prioritization process argue against making too much of these results, but the voting process permitted a rough measure based on the considerable expertise of the workshop participants.

Table 2. Prioritized Processing Upgrade Suggestions

Above the surface    | Below the surface    | Fire
Coupled effects      | Full system modeling | Uncertainties
Non-ideal explosives | Multiple effects     | Fire suppression models
Full system modeling | Stochastic phenomena | Coupled effects
Stochastic phenomena | Coupled effects      | Models of fire itself
Multiple effects     | Non-ideal explosives | Stochastic phenomena
Time scaling         | Length scaling       | Non-linear phenomena
Length scaling       | Time scaling         | New codes
                     |                      | Length scaling
                     |                      | Time scaling

In general, systemic improvements in modeling uncertainty and stochastic phenomena and in the integration of sub-system models into full-system models were viewed as having broad impact.

Later workshops did not attempt to generalize areas of suggested upgrades. After Workshop 4, the Fire meeting, the panel became involved in developing specific tasks to demonstrate the applicability of these models to T&E. These tasks directly addressed shortcomings in current capabilities, but also considered T&E needs for current DoD programs. These tasks concern both vulnerability and lethality and are listed in Table 3 in the order briefed at Workshop 6 by task proponents (ad hoc leaders selected from the workshop participants).

These proposed tasks are the most significant outputs of the workshops. More details about the tasks are available in Reference 1. The Fire Suppression task became part of the Safety and Survivability of Aircraft Initiative funded through the LFT&E office and is the subject of another paper accepted at this conference (Reference 4). Some of the general workshop findings are briefly discussed here.

Table 3. Recommended Tasks to Extend First Principle Model Applications

Task                  | Vulnerability or Lethality? | Proponent
Propellant Reactivity | L | Mike Heard, USAF
Helicopter HPM        | V | Mark Henderson, USN
Building HPM          | L | Pat Vail, USAF
Missile Defense       | L | Bob Weir, SNL
Fire Suppression      | V | Dan Rondeau, SNL
Hydrodynamic Ram      | V | Frank Addessio, LANL
Ship Fire             | V | Ron Reese, IDA

Current capabilities permit the modeling of blast, penetration, and shock with some confidence for both armor design and structural shock loading. Behind-armor spall and some other target effects, such as whole-system effects, are typically less well modeled and are handled with statistical methods. Largely due to the success of lethality and vulnerability methods developed in conjunction with extensive testing of armored vehicles, the focus of the penetration category shifted to the lethality of missile intercepts. Realistic velocities for many of these engagements cannot be reproduced in ground-based testing, where data gathering is easiest. To extrapolate well beyond the available data, unprecedented confidence in our modeling tools is required.

Fire effects models predict temperatures and smoke spread as a function of time. Currently, indoor pool fires are modeled with some confidence. Modeling of fire effects outdoors is less robust, and the fire itself cannot yet be modeled with confidence. Particularly challenging are predictions of initiation, fire/suppressant interactions, and other phenomena such as the transition to detonation. The fire category was accordingly split into two sub-topics: one focusing on fire suppression modeling in engine compartments and the other on shipboard fire spread models.

Models of high-power microwave (HPM) weapon-target interactions work well in the generation and propagation of the microwave beam, and in its bathing of the target surface. The effects of the beam inside the target are less well modeled. Minor variations from target to target can have an influence on the internal effects that is difficult to predict.

As with all modeling and simulation discussions, verification, validation, and accreditation was an underlying concern. Explicit verification and validation of model results against data have typically occurred, but documentation is often informal. Accreditation for application to LFT&E, as outlined in the DoD 5000 series, is almost completely absent, but the occasion for such an exercise has generally not arisen.

Recommendations, Conclusions and Observations

The tasks recommended through the workshop process represent clearly stated technical objectives with direct benefit to defense programs. The workshop participants are to be commended for their willingness to work for the common good in connecting model development initiatives to programmatic needs in a cooperative atmosphere. All of the recommended tasks are worthy of further evaluation and implementation on the following bases:

  1. The tasks extend the capability or range of applicability of current models to support assessments of military worth.
  2. In each focus area basic models exist and the potential for further development is high, resulting in a favorable risk to payoff ratio.
  3. The tasks are important to military systems currently under development and so have near-term utility.

The ability to predict performance based on fundamental understanding has obvious payoff beyond LFT&E. For example, there is an ongoing effort to apply current radio frequency codes to predict the signature of low observable targets in support of T&E requirements. Detailed physics-based models hosted on HPC systems are the best alternative for addressing this critical need.

Current acquisition and T&E reforms require physics-based models and powerful computers. The Virtual Proving Ground and Virtual Range concepts require realistic synthetic environments, generally in real time. Physics-based models on HPC have considerable capability to support these applications. The Simulation, Test and Evaluation Process (STEP), by virtue of its emphasis on simulation, has need for the kind of detail and fidelity that physics-based models on HPC can provide. With high fidelity physics-based models, Simulation-Based Acquisition is a profound challenge; without such models it is impossible.

Finally, among other observations,

  1. Physics-based modeling is not new. Investigation of appropriate applications of these models to T&E is the novelty addressed in this paper.
  2. Physics-based models will not replace testing. Modeling should stimulate the process of discovery, suggesting explanations for observed results that may form the basis for further tests.
  3. Physics-based modeling is not cheap. Planning and budgeting for meaningful model development must proceed at the highest strategic levels.

References

  1. Test & Evaluation Research and Education Center, "First Principles Munitions Effectiveness Modeling", Draft Report, 18 May 1998, prepared for DOTE/LFTE. Some information also available at http://www.terec.gatech.edu under research.
  2. Figure adapted from Fred Fisch, "Modeling Weapons Effects Against Ships", Briefing, 18 September 1996, Atlanta, GA.
  3. Paul Deitz and Michael Starks discuss one such structure in "The Generation, Use, and Misuse of "PKs" in Vulnerability/Lethality Analyses", Proceedings of the 8th Annual TARDEC Symposium, 25-27 March 1997, Monterey, CA; also US Army Research Laboratory Technical Report ARL-TR-1640, March 1998.
  4. Louis Gritzo, Carl Peterson, and Martin Lenz, "Integrated Numerical Modeling, Experiments, and Live Fire Testing for Improved Fire Safety and Survivability", ITEA Workshop on High Performance Computing in Support of Test and Evaluation, Aberdeen, Maryland, 13-16 July 1998.
