Economics of T&E

October 14 - 16, 1997

Schedule

Abstracts: Technical Session I | Technical Session II | Poster Papers

Invited Speakers | Conference Organizing Committee

Conference Summary

The TEREC Conference on the Economics of Test and Evaluation (T&E) was held in Atlanta on 14-16 October 1997. The conference attracted a broad representation from the T&E community, including over 50 attendees from the military and civilian federal government as well as industry and universities.

Many participants voiced the need for better economic data and information. For example, better accounting methods are needed to track cost data accurately, and methods to find the optimum point on a curve of life-cycle cost versus T&E cost would help T&E planners. The participating accountants and economists responded that the tools and methods needed to meet these needs already exist; some, such as cost-effectiveness analyses, have already been applied to T&E issues. The conference also included practical studies on cost and risk, including a description by Bill Krupp of Lockheed Martin of the economic and other considerations weighed during a company exercise to optimize its T&E facility infrastructure.

The feedback from the conference has been very positive. All of the respondents to the conference evaluation forms said that they would recommend the conference to colleagues. There were also many suggestions for ways to improve the conference, such as increasing participation by the economics and business communities, having more cost and benefit study examples particularly for T&E facilities, and broadening the selection of practical short courses. All who mentioned the possibility of another conference on this subject were positive, with some of the responses being very enthusiastic.

Although just an initial effort, the conference elevated interest in the economics of T&E and gave exposure to some of the current work in the area. TEREC is committed to ensuring that the impetus given to this important topic results in an enhanced understanding and improved practice in the economics of T&E.

Conference Schedule

Tuesday, October 14, 1997

1300 - Short Course: Cost Benefit/Cost Effectiveness Analysis for the T&E Community, Professor Peter Sassone

1700 - End

Wednesday, October 15, 1997

0830 - Administrative Remarks, Dr. Samuel Blankenship, Director TEREC

Welcome, George Harrison, Maj Gen (Ret), Director ELSYS Lab, GTRI

0900 - Keynote Addresses: "The Economics of T&E", The Honorable Philip Coyle, Director, Operational Test & Evaluation, OSD

"T&E; Business in Need of Management.", Dr. Patricia Sanders, Director, Test, Systems Engineering and Evaluation, OSD

1000 - Break

1030 - Panel Discussion: "What do decision makers need from economic analysis of T&E?"

Chair: Dr. Marion Williams

Panel:

1200 - Lunch

1315 - "Sewing Buttons on Jello: Quantifying T&E Benefits and Costs and Other Special Skills", Mr. James O'Bryon, Deputy Director, Operational T&E, OSD

1400 - Break

1430 - Technical Session I

Chair: Dr. Martha Nelson

1700 - End

1830 - Social at Hotel

Thursday, October 16, 1997

0830 - Panel Discussion: "What can economic analysis of T&E provide?"

Chair: Dr. Henry Dubin

Panel:

1000 - Poster Papers - Break

1030 - Technical Session II

Chair: David Duma

1200 - Lunch

1300 - "Cost Effectiveness of Industrial Labs: A Case Study", Mr. Bill Krupp

1345 - Break

1400 - Panel Discussion: "What do we do next?"

Chair: Dr. Henry Dubin

Panel:

Pete Adolph, Senior Vice President and Manager, SAIC Test and Evaluation Group

Henry Dubin, Technical Director, US Army Operational Test and Evaluation Command

Alan Yankolonis, Chief, Soldier, Ground and Live Fire Systems Division, US Army Test and Evaluation Command

Richard White, IDA

Jesse Poore, Professor of Computer Science at the University of Tennessee

1530 - End


Abstracts

Technical Session I

Title: The Changing Economics of Wind Tunnel Testing and Computational Fluid Dynamics in the Development of Flight Vehicles

Abstract: Rapid changes in the cost of computer modeling are creating economic and technical parity between CFD and wind tunnels as a source of aerodynamic data for flight systems. Economic parity will accelerate development of an integrated method of testing and modeling, leading to reduced cycle time and, hence, reduced cost to field new flight systems. A technology diffusion model is used to analyze the changing scenario.
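For readers unfamiliar with diffusion models, a common functional form is the logistic adoption curve. The Python sketch below is illustrative only; it is not the authors' model, and the midpoint and rate parameters are hypothetical.

```python
# Hedged sketch: logistic technology-diffusion curve (illustrative only;
# not the model from the paper). Parameters are hypothetical.
import math

def cfd_share(year, midpoint=2005.0, rate=0.4):
    """Hypothetical fraction of aerodynamic-data demand met by CFD."""
    return 1.0 / (1.0 + math.exp(-rate * (year - midpoint)))

for year in (1997, 2000, 2005, 2010):
    print(f"{year}: {cfd_share(year):.0%} of demand met by CFD")
```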

Author: Edward M. Kraft, Micro Craft, Inc.
Dr. Edward M. Kraft is Executive Vice President for Operations at Micro Craft, Inc. in Tullahoma, Tennessee, a woman-owned small business dedicated to providing full spectrum support to research, development, test and evaluation. He is responsible for strategic planning, marketing and overall operations of the company whose products and services include engineering analysis, design, manufacture of prototype hardware, instrumentation, calibration, test support, testing, test analysis, operations and maintenance of test facilities, and manufacture of flight hardware.

Dr. Kraft has over 28 years experience at the U.S. Air Force Arnold Engineering Development Center, where he held a number of technical and management positions, including General Manager of the Micro Craft Technology operating contract for all flight dynamic test facilities. His experience includes wind tunnel test engineering, test technology development, test facility research and development, flow diagnostic development, and the application of computer simulations to T&E. He has been the principal proponent for the integrated T&E process, which melds modeling and simulation with ground testing and flight testing to reduce the time and cost of system development. Application of this integrated T&E approach resulted in major savings to programs like the F-15E and F-22.

A recipient of numerous awards, Dr. Kraft is a Fellow of the American Institute of Aeronautics and Astronautics, a Fellow of the Arnold Engineering Development Center, and a Distinguished Alumnus of the University of Tennessee Space Institute. He is also an adjunct professor of aerospace engineering at the University of Tennessee Space Institute and a member of ITEA.

Phone: (615)455-2617x261
FAX: (615)455-7060

Author: John A. Benek, Micro Craft, Inc.
Dr. John A. Benek is Manager for Integrated Test and Evaluation at Micro Craft Inc. in Tullahoma, Tennessee, a woman-owned small business dedicated to providing full spectrum support to research, development, test and evaluation (RDT&E). He is responsible for planning, marketing, and development of integrated test and evaluation services and associated modeling and simulation capability.

Dr. Benek has over 28 years of experience at the Air Force Arnold Engineering Development Center, where he held a number of technical and management positions, including Manager of the Computational Fluid Dynamics Department. His experience includes experimental and numerical research in subsonic, transonic, and low-density flows. His research and development experience includes development of diagnostic flow field instrumentation using mass spectrometry, laser velocimetry, and temperature- and pressure-sensitive coatings. He has been instrumental in the development of CFD algorithms and simulation methodologies and their practical application to increase the understanding of ground-based simulation issues. He has been a proponent of the integrated test and evaluation approach to the acquisition of aerodynamic data. These efforts have resulted in significant savings in the F-15E and F-22 programs.

Dr. Benek has received numerous awards and is an Associate Fellow of the American Institute of Aeronautics and Astronautics. He is an Adjunct Professor at the University of Tennessee Space Institute and a member of ITEA.

Phone: (615)455-2617x261
FAX: (615)455-7060

 


 

Title: Cost, Benefit and Value - Required Information for the T&E Decision Maker

Abstract: It is almost a cliché to state that upper level management requires accurate and reliable information to support decision making. Indeed, the T&E community embodies this thought by the very nature of its mission; namely, to test, evaluate and provide technical and performance information to its program manager customers. As pointed out in the conference literature, however, this same focus and commitment to information often does not extend to the internal management of the T&E enterprise.

At present, the economic and cost/benefit information required by Defense T&E management is generally inadequate for supporting the difficult decisions that must be made with respect to T&E activities, infrastructure and methodologies. Examples include corporate experience with the resources and capacity information collected to support the BRAC 91, 93, and 95 processes, the routine difficulties associated with comparing financial information between services and T&E sites, the strengths and weaknesses of RUMS, etc.

Using an idealized design approach, it is clear that T&E decision makers need three types of information in order to manage the enterprise: cost, performance and value. Key attributes of these three information areas include:

Cost - Financial information should be structured on cost management, not resource management. At present, the information available to decision makers is based on resources (e.g., civilian labor, capital equipment purchases, etc.) and budget control. As a result, there is a general lack of information on the true cost to own and operate the T&E enterprise.

Cost-based information should be based on these primary attributes: cost element visibility, activity/process resolution, and emphasis on total cost versus operating cost. Activity-based costing, in particular, offers opportunities to address these needs, and provides the scalable information necessary to meet the different information requirements of local, service and national management.
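As a minimal sketch of how activity-based costing rolls resource costs up to a test product (illustrative only; the activities, cost pools, driver volumes, and rates below are hypothetical, not drawn from the paper):

```python
# Minimal activity-based costing (ABC) sketch. All cost pools, drivers,
# and volumes are hypothetical, for illustration only.

# Step 1: resource costs are assigned to activities (cost pools).
activity_pools = {
    "test_planning":   {"cost": 400_000,   "driver": "plan_hours"},
    "range_operation": {"cost": 2_500_000, "driver": "range_hours"},
    "data_reduction":  {"cost": 600_000,   "driver": "data_sets"},
}

# Step 2: annual driver volumes set the rate per unit of driver.
driver_volumes = {"plan_hours": 8_000, "range_hours": 5_000, "data_sets": 1_200}

rates = {name: pool["cost"] / driver_volumes[pool["driver"]]
         for name, pool in activity_pools.items()}

# Step 3: a test program is costed by the driver units it consumes,
# giving total-cost visibility per product rather than per budget line.
program_usage = {"test_planning": 350, "range_operation": 120, "data_reduction": 40}
program_cost = sum(rates[a] * units for a, units in program_usage.items())
print(f"Program cost under ABC: ${program_cost:,.0f}")
```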

Performance - Once cost information is known, it must be complemented by performance data. Management requires performance data to have both an operations focus (e.g., local productivity ratios, range utilization, etc.) and a strategic focus (e.g., corporate capacity, product/portfolio balance, etc.). More importantly, the performance data needs to be process/product based and be capable of supporting enterprise benchmarking.

Value - Once cost and performance information is available on a relational basis, strategic value assessments present opportunities for fact-based advocacy and decision making. Value chain analysis and various portfolio assessment methodologies may be applied to present information on the value that a particular product/process provides to the market and the organization, and provide an opportunity to inject a "profit" motive into T&E operations and decision making.

The presentation illustrates private sector cost, performance and value methodologies that have applicability to the T&E enterprise.

Author: Marc J. Miller, Manager, Federal Services Group
Mr. Miller's work focuses on strategic planning for Department of Defense (DoD) research, development, test and evaluation (RDT&E) activities, other Federal organizations and the defense industry. Mr. Miller's background provides a unique mix of strategic and technical management expertise, and emphasizes the application of organizational assessment, performance measurement, cost analysis and strategy formulation tools in addressing the needs of the Federal RDT&E community.

Representative Accomplishments

Before joining KPMG Peat Marwick, Mr. Miller was an officer in the United States Navy, where he served as the Training and Operating Procedures evaluator for the Pacific Maritime Patrol Aviation community.

Mr. Miller was awarded an M.B.A. from San Jose State University, where he focused on strategic planning, and graduated with honors. He received a B.S. in Engineering from the United States Naval Academy.

Phone: (202)467-3000
FAX: (202) 467-5869
KPMG Peat Marwick LLP, 2001 M Street N.W., Washington, D.C. 20030-3389

 


 

Title: Plan versus Actual on Production Acceptance Test and Evaluation

Abstract: DoD acquisition is typically accomplished by an arrangement of private industry providing defense systems to the government via an acceptance process. This procurement process often requires a T&E program that is driven by user requirements and performance specifications.

This paper discusses how the T&E program manager can design a Production Acceptance T&E (PAT&E) program for weapon systems that will provide risk mitigation and systems confidence while meeting the customer's expectation on cost and schedule. It provides techniques for evaluating how much testing is required and what risk is "acceptable." A process of breaking down the T&E program into work-packages will be shown based on T&E product flow and decision points.

A fresh approach to understanding the customer's requirements is provided to allow the T&E architect the ability to develop a shared purpose and solid relationship with the customer.

Industrial process engineering and earned value performance measurement systems play a critical role in optimizing the T&E flow of execution and measuring customer's product quality. Graphic and statistical examples of effective structures will illustrate T&E program cost and performance and will provide a perspective on what can be expected when academic modeling is applied to real-life T&E programs.
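As a hedged illustration of the basic earned-value measures such systems report (not the author's figures; the budget and progress numbers below are hypothetical):

```python
# Hedged sketch of basic earned-value metrics for a T&E work package.
# All dollar figures are hypothetical.
bcws = 1_200_000  # planned value: budgeted cost of work scheduled
bcwp = 1_000_000  # earned value: budgeted cost of work performed
acwp = 1_250_000  # actual cost of work performed

cpi = bcwp / acwp  # cost performance index (<1 means over cost)
spi = bcwp / bcws  # schedule performance index (<1 means behind schedule)
print(f"CPI = {cpi:.2f}, SPI = {spi:.2f}")

# Estimate at completion, assuming current cost efficiency persists.
budget_at_completion = 4_000_000
print(f"Estimate at completion: ${budget_at_completion / cpi:,.0f}")
```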

Author: David A. Miskimens - Head of the Test, Training, and Evaluation Project Department, Naval Undersea Warfare Center Division.
Mr. Miskimens is currently managing a department of project and program managers responsible for the test and evaluation of undersea systems, including weapons, targets, unmanned underwater vehicles, RDT&E, ASW ship systems, and ranges. His organization applies program management and economic disciplines to develop sponsor requirements into executable T&E programs. Previous assignments include privatization analysis for ASN(RDA), program and line management for the MK48 ADCAP torpedo, production manager in the USW Program Office (PMS404, Washington, D.C.), and project engineering on acoustic systems, including the Trident submarine program.

Mr. Miskimens earned a BSME from Washington State University and an MPA from Indiana University. He was born in Olympia, Washington and currently resides in Poulsbo, WA with his wife Angela.

Phone: (360) 396-7852
FAX: (360) 396-2000

 



 

Abstracts

Technical Session II

Title: A Model of Risk Reduction Using Test and Evaluation

Abstract: Risk is a measure of uncertainty. Risk level is categorized by the probability of occurrence and the consequence of occurrence. In the United States Department of Defense, test and evaluation is a part of the weapon system acquisition process because it provides information that can be used in programmatic risk management to reduce uncertainty and associated risk. However, deciding how much to test or how much to pay for a particular kind of test is subjective and difficult.

This paper will use Statistical Decision Theory to present a decision support mechanism for risk management. We will discuss a quantitative, cost-based economic approach for decision making and present a model of how testing reduces risk. The model provides insight into the comparative value of various kinds of testing, e.g. modeling and simulation, static ground tests, and open-air flight tests. The conclusions will provide a basis for a standard technique that allows risk analysis and test benefits to be explained in decision support terms, within the context of system life cycle costs. This communication approach could help program managers and testers determine whether to do additional testing as a part of an overall risk management approach.
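A minimal sketch of the kind of calculation statistical decision theory supports (illustrative only, not the authors' model; every probability and dollar figure below is hypothetical): a test is worth buying when the expected reduction in decision risk exceeds its cost.

```python
# Hedged expected-value-of-information sketch for a test decision.
# All probabilities and costs are hypothetical.
p_flaw = 0.2               # prior probability of a critical flaw
cost_field_failure = 50e6  # life-cycle consequence if the flaw is fielded
cost_fix_now = 5e6         # cost to fix a flaw found before fielding
test_cost = 1e6            # cost of the proposed test
p_detect = 0.9             # probability the test catches the flaw

# Expected loss if we field without testing.
loss_no_test = p_flaw * cost_field_failure

# Expected loss if we test first: detected flaws are fixed cheaply,
# missed flaws still incur the field-failure cost.
loss_with_test = (test_cost
                  + p_flaw * p_detect * cost_fix_now
                  + p_flaw * (1 - p_detect) * cost_field_failure)

print(f"Expected value of testing: ${loss_no_test - loss_with_test:,.0f}")
# A positive value means the test reduces risk by more than it costs.
```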

Author: Lee Gardner, 412th TW/TSDI
Lee Gardner works at the USAF Flight Test Center at Edwards AFB, CA. He is currently the Program Manager for the Test Instrumentation Management System (TIMS), an operational ground support system for the Airborne Test Instrumentation System (ATIS), Advanced ATIS (AATIS), and the Common Airborne Instrumentation System (CAIS). The TIMS automates the design of Pulse Code Modulation (PCM) data formats and the generation, loading, and pre-flight validation of load images for the on-board instrumentation system units. Mr. Gardner's previous experience includes managing several other large software projects to process, manage, and analyze flight test data. Mr. Gardner has written or co-authored and presented several papers on flight test data management and quantitative methods for risk management to national and international audiences.

In addition to his job at Edwards AFB, Mr. Gardner teaches Computer Information Science courses at Antelope Valley College. He is a member of the Institute of Electrical and Electronics Engineers (IEEE), the Association for Computing Machinery (ACM), and the Society for Computer Simulation (SCS). He currently serves as the International Newsletter Editor for the Society of Flight Test Engineers (SFTE), the Technical Program Chairman for the annual Test Instrumentation Workshop for the International Test and Evaluation Association (ITEA), president of the Palmdale Lions Club, and National Board Secretary of the Lions Project for Canine Companions for Independence (LPCCI).

Mr. Gardner has a Bachelor of Science Degree in Applied Mathematics and has done graduate coursework in Computer Science at California State University. He has a Master of Science with a Certificate in Test and Evaluation from Georgia Institute of Technology.

Phone: (805)275-4359
FAX: (805)275-4488
412th TW/TSDI, North Wolfe Avenue, Building 3949 Room 115, Edwards AFB, CA 93524

Author: Russell W. Lenz, Chief, Systems Integration Division, 412th Test Wing
Mr. Russell Lenz is Chief of the Systems Integration Division in the 412th Test Wing at the Air Force Flight Test Center, Edwards Air Force Base, California. Mr. Lenz was the Chief of the Acquisition and Development Division at the AFFTC, developing systems for test instrumentation, the Edwards range, and avionics and electronic warfare ground test facilities. He also initially developed the flutter and structures capability for the AFFTC. He has been on the staff in the Office of the Secretary of Defense, Director Test and Evaluation, Test Facilities and Resources. He was the Air Force representative in the startup of the Joint Program Office for Test and Evaluation. He has worked for Martin-Marietta as a structures and dynamics engineer. He was a Space Systems Operations Officer in the Strategic Air Command.

Mr. Lenz has Bachelor's and Master's degrees in Aerospace Engineering from the Georgia Institute of Technology and a Master's degree in Public Administration from the Harvard Kennedy School of Government. He is a graduate of the Defense Systems Management College Program Manager course.

Mr. Lenz is a past president of the Antelope Valley International Test and Evaluation Association chapter and was the chapter president of the American Institute of Aeronautics and Astronautics. He is a member of the Society of Flight Test Engineers. He has been chairman of the local Apple computer club.

Phone: (805)275-9069
FAX: (615)455-7060
412th TW/TSS, Air Force Flight Test Center, Edwards AFB, CA 93524

 


 

Title: A Decision Support Framework for Risk, Confidence, and Cost Tradeoffs in Operational Test and Evaluation

Abstract: A framework is presented to help decision makers select and modify operational test designs based on risk, confidence and cost tradeoffs. In the setting defined here, risk is used when observed system performance is to be compared with an operational requirement. Confidence applies when system performance is to be characterized. Risk and confidence can be either objective or subjective. Both are assumed to be affected by measurement errors and sampling errors. We also assume that these errors can be mitigated somewhat by better instrumentation, larger samples, test environments that better match expected operational conditions, and innovative test designs. The first three generally involve increased costs. The latter may not. Tradeoffs are made by defining cost-per-measure as a function of risk level, confidence level, type of level (objective or subjective), and test method. The framework forces decision makers to clearly understand decision rules and to quantify precision requirements in operational terms. One example illustrates how substantial cost savings are possible if decision makers are willing to accept a relatively small amount of subjectivity. A second example shows how large costs can be incurred with little improvement in risk or confidence level when decision rules and operational precision requirements are not clearly understood.
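As a hedged, minimal illustration of one such tradeoff (not the authors' method; the requirement, confidence levels, and per-trial cost are hypothetical), a zero-failure binomial test plan shows how the required number of trials, and therefore cost, grows with confidence level:

```python
# Hedged sketch: confidence-versus-cost tradeoff for a pass/fail test.
# Requirement, confidence levels, and per-trial cost are hypothetical.
import math

requirement = 0.90       # required per-trial probability of success
cost_per_trial = 250_000

# Zero-failure plan: smallest n with requirement**n <= 1 - confidence,
# i.e. n successes in n trials demonstrate the requirement at that
# confidence level.
for confidence in (0.80, 0.90, 0.95):
    n = math.ceil(math.log(1 - confidence) / math.log(requirement))
    print(f"{confidence:.0%} confidence: {n:3d} trials, "
          f"cost ${n * cost_per_trial:,.0f}")
```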

Author: Frank B. Gray, Statistician, AFOTEC
Dr. Frank B. Gray is a statistician in the Chief Scientist's Office, Air Force Operational Test and Evaluation Center. He has bachelor's and master's degrees in Aeronautical and Astronautical Engineering from The Ohio State University, and a Ph.D. in Industrial Engineering from New Mexico State University. After graduating from Ohio State in 1971, he entered Air Force active duty, finished flying training, and completed operational assignments in F-4C/D/E aircraft at Udorn Royal Thai AFB, Thailand, and Eglin AFB, FL. He graduated from the USAF Test Pilot School in 1979 and flew F-4, F-111, and F-15 aircraft in flight test programs at Edwards AFB, CA; Eglin AFB, FL; and Holloman AFB, NM, testing such systems as the imaging IR Maverick, laser Maverick, PAVE TACK, ARN-101, PAVE MOVER, AMRAAM, Low Level Laser Guided Bomb, and F-15 synthetic aperture radar. He is also a graduate of the USAF Fighter Weapons School and the Defense Systems Management College. Special awards include the ATC Commanders Trophy as top graduate from flying training, the R. L. Jones Award as top graduate from USAF Test Pilot School, and the Air Force Association's David C. Schilling Award for work on F-15 test programs. After retiring from active duty in 1991, he finished his Ph.D. and taught math and engineering at New Mexico State University. He has been at AFOTEC since January 1996.

Phone: (505)846-1845
FAX: (505)846-9726

Author: Suzanne Beers, Advisor to the Chief Scientist, AFOTEC
Major Suzanne M. Beers is the Advisor to the Chief Scientist at the Headquarters, Air Force Operational Test and Evaluation Center. She entered active duty in November 1983 at Kirtland Air Force Base, New Mexico, where she served as a Nuclear Systems Physicist, developing design criteria for cruise missile programs such as the Air Launched Cruise Missile, Sea Launched Cruise Missile, and the Advanced Cruise Missile. Assigned to the Ground Launched Cruise Missile's 487th Tactical Missile Wing at Comiso Air Station, Sicily, she served as a Launch Control Officer, Weapon System Instructor, Emergency Actions Procedure Instructor, and Battle Planner. Returning from overseas to the Air Force Operational Test and Evaluation Center, she served as the Lead Operational Effectiveness Analyst for the Short Range Attack Missile II Program, the Survivability Analyst for the Milstar Satellite Communications System, the Test Manager for the Cheyenne Mountain Upgrade Program, and as the Advisor to the Chief Scientist. She has an M.B.A. from New Mexico Highlands University, an M.S.E.E. from New Mexico State University in Las Cruces, and a Ph.D. in Electrical Engineering from the Georgia Institute of Technology. Her Ph.D. research was aimed at developing a new analysis methodology using intelligent techniques to aggregate low-level test data into high-level information for use by decision makers.

Phone: (505)846-9929
FAX: (505)846-9726


Title: Cost-Benefit-Risk Considerations for Live Fire Test and Evaluation (LFT&E)

Abstract: The legislative requirements for Live Fire Test and Evaluation (LFT&E) include Title 10, U.S. Code, Section 2366, which specifies that LFT&E must include "realistic survivability testing" and/or "realistic lethality testing" unless the Secretary of Defense certifies to the Congress that such testing would be "unreasonably expensive and impractical," and a suitable alternative LFT&E program is approved. (The DoD 5000.2-R uses the term "full-up, system-level" for the "realistic" testing of the legislation.)

Questions have arisen for several LFT&E programs as to when testing should be considered "unreasonably expensive and impractical," and as to what constitutes an acceptable alternative to full-up, system-level testing. At various times, the Institute for Defense Analyses (IDA) has been tasked by OSD's Office of the Director, Operational Test and Evaluation/Live Fire Testing (DOT&E/LFT) to provide an analytical basis to assist DOT&E/LFT in its decision making.

This paper will present IDA's review of various cost-benefit-risk considerations, and assess the strengths, limitations and potential usefulness of decision making methodologies for LFT&E applications.

Author: Lowell Tonnessen, Project Leader, IDA
Dr. Lowell Tonnessen has been a research staff member at the Institute for Defense Analyses (IDA) since 1984. He has been IDA's project leader for Live Fire Test and Evaluation (LFT&E), in support of the Office of the Secretary of Defense (OSD), since the legislative requirement for LFT&E was initiated. Prior to his experience at IDA, Dr. Tonnessen was an operations research analyst at the Naval Surface Weapons Center. His academic degrees include B.A. in Mathematics from the University of Delaware; M.S. in Computer Science and Applications from Virginia Tech (VPI&SU); and M.A. and Ph.D. in Mathematics from the University of Wisconsin.

Phone: (703) 845-6921
FAX: (703) 845-6977
Institute for Defense Analyses, 1801 N. Beauregard Street, Alexandria, VA 22311


Abstracts

Poster Papers

Title: Economic Impact of Meteorology on Test and Evaluation

Abstract: Eventually, any weapons system or article must overcome weather and perform to specifications. Weather has often been looked upon as something that must be endured and struggled against to finally get the testing phase of development done so that production can occur. This need not be the case, especially in light of the improvements in measurements and forecasting. A properly run weather operation in a test center will pay for the cost of its operations several times over during the course of a year through savings of time and material, as well as indicate areas requiring improvement in the system or article under test.

Sizable cost savings can be realized by forecasting weather on a regular basis. Early and accurate warning of severe weather, e.g. lightning, will likewise improve cost-efficiencies. Regular and accurate measurements, while less dramatic and obvious, will increase savings and indirectly improve test programs and system/article development through a better understanding of real-world effects on the system or article. Considerable cost savings can also be realized by documenting performance against weather effects, versus repeating tests because the analysis of test data could not separate weather effects from other potential sources of error. Consultation by the meteorologist with developers and testers can identify weaknesses or shortcomings in a system or article early by properly identifying the type of environment it will encounter.

As the RDT&E community moves toward greater emphasis on modeling and simulation (M&S) and the Virtual Proving Ground (VPG), the reliance on accurate modeling of the atmosphere increases. Meteorological data must be modeled in such a manner as to match the scale of the simulation as accurately to real-world situations as possible. Atmospheric variables must be based on real-world data in order for the simulation to accurately approximate actual weather regimes and phenomena. This will allow for the greatest cost efficiencies in the M&S phase of development. Systems/articles tested without benefit of meteorological data will have the greatest propensity to fail in live situations. Once the M&S has been completed, materiel must be tested in an open environment. Documentation of the environment must be sufficiently detailed to ensure credible/defensible analysis and should be a major consideration, as system/sensor performance responds to the environment.

Author: Nicholas F. Borns
Mr. Borns graduated from Purdue University in 1975 with a BS in Meteorology. He pursued continuing education at the graduate level in oceanography at the Naval Postgraduate School, Monterey, CA, and additional graduate work in International Studies at Old Dominion University, Norfolk, VA; an MS was awarded in 1996 from Indiana University, New Albany, IN. He spent 11 years on active duty in the U.S. Navy, working as an oceanographer and meteorologist. Duty assignments included working as an operational forecaster, program manager for a computer upgrade project, and assistant manager for support of field operations out of Norfolk, VA; he commanded two Naval Oceanographic Reserve Activity units. He served with the National Weather Service as an observer (surface and upper air) and forecaster. He has worked for the U.S. Army as a Meteorological Detachment Chief, later moving to the Headquarters, Test and Evaluation Command staff, with duties including budgeting for field operations, equipment evaluation and purchasing, and planning acquisitions for future requirements.

Phone: (410)278-1408

 


 

Title: Re-Inventing RDTT&E by Using a Virtual Complex with a Federation Concept

Abstract: The Research, Development, Testing, Training, and Evaluation (RDT2E) community within DoD has been the center of attention of countless studies and movements to reduce size and cut costs. This briefing will present an alternative concept that has the potential to unite all RDT2E stakeholders to work together in a "leaner - cleaner" environment in which overhead, infrastructure, and costs are reduced and in which overall stability is the end result.

This paper considers case studies underscoring popular approaches, examines successful ventures such as the Range Commanders Council, and explores the concept of the Virtual National Signature Data Center as a model RDT2E structure. Known forces of change, such as the Internet explosion, partnerships, etc., are folded in to arrive at the concept of a Federation of RDT2E headquartered in a Virtual Complex. The foundation of the Federation is the establishment of a charter that will lay out the methodology for addressing tough choices in a sound, effective manner and will leverage the forces of change and technological advances.

Author: James Fasig, Technical Director, Aberdeen Test Center
Phone: (410) 278-2556

Author: James Finfera

Phone: (410) 278-9270

Author: Samuel Harley, Team Leader - VISION
Dr. Harley is a graduate of Morehead State University, KY (BS, Physics and Mathematics) and of Virginia Polytechnic Institute (PhD, Physics). While in government service he has been engaged in the development of advanced test technology and its application to a wide variety of Army materiel. He has held positions as Aberdeen Test Center's Principal Instrumentation Development Scientist, Chief of the Instrumentation Development Division, Chief of the Advanced Concepts Division, and as Deputy Technical Director for Command Reengineering. His current assignment is that of Leader of the team developing the Versatile Information System - Integrated, On-line, Nationwide (VISION). Dr. Harley has been awarded the Department of the Army Research and Development Achievement Award for Technical Achievement for accomplishments related to intelligent instrumentation systems.

Phone: (410) 278-9463


Invited Speakers

Honorable Philip E. Coyle III

Director, Operational Test and Evaluation (DOT&E)

Philip Coyle was confirmed by the Senate as the Director, Operational Test and Evaluation (DOT&E), in the Department of Defense (DoD) on September 29, 1994. In this capacity, he is the principal advisor to the Secretary of Defense and the Under Secretary of Defense for Acquisition and Technology on operational test and evaluation in the DoD. Mr. Coyle is the principal operational test official within the senior management of the DoD.

During the Carter Administration, Mr. Coyle served as Principal Deputy Assistant Secretary for Defense Programs in the Department of Energy (DOE). In this capacity he had oversight responsibility for the nuclear weapons testing programs of the Department.

Mr. Coyle has 20 years' experience in testing and test-related matters. From 1959 to 1979, and again from 1981 to 1993, Mr. Coyle worked at the Lawrence Livermore National Laboratory in Livermore, California. During the more recent period he served as an Associate Director of the Laboratory. First, from 1981 to 1984, he was Associate Director for Test. Later, from 1987 to 1993, he served as Laboratory Associate Director and a Deputy to the Laboratory Director. In November 1993, Mr. Coyle retired from the Laboratory. In recognition of his 33 years' service to the Laboratory and to the University of California, the University President named Mr. Coyle Laboratory Associate Director Emeritus.

Earlier in his career while at Lawrence Livermore, Mr. Coyle was directly responsible for many of the testing programs of the DOE and its predecessor agencies. He served as a Scientific Advisor on testing matters to the Nevada Operations Office. For many years he was a Test Director at the Nevada Test Site and at other testing locations. In 1971 he was the Test Director for the full-scale underground test of the Spartan warhead on Amchitka Island in the Aleutians. In the mid-1970s, Mr. Coyle also served as a Deputy in the Laboratory's laser program, developing high power lasers for fusion, isotope separation, and other applications.

Mr. Coyle has been active in community and educational programs. In 1991 he was named as a Commissioner of the East Bay Conversion and Reinvestment Commission which developed defense conversion plans for Alameda Naval Air Station and the East Bay. He was a member of the Board of the Alameda County Economic Development Advisory Board. He also served on the boards of several educational organizations.

Mr. Coyle graduated from Dartmouth College with an MS in Mechanical Engineering (1957) and a BA (1956). His wife, Martha Krebs, currently serves as Director of Energy Research in the DOE. They have four grown children and live in Washington, D.C.

 


 

Dr. Patricia Sanders

Director, Test, Systems Engineering and Evaluation (DTSE&E)

Dr. Patricia Sanders is the Director, Test, Systems Engineering and Evaluation (DTSE&E) for the Department of Defense (DoD) where she is responsible for ensuring the effective integration of all engineering disciplines into the system acquisition process. These include design, production, manufacturing and quality, acquisition logistics, modeling and simulation, and software engineering with emphasis on test and evaluation as the feedback loop. She is also responsible for oversight of the Department of Defense's Major Range and Test Facility Base (MRTFB) and the development of test resources such as instrumentation, targets and other threat simulators. The MRTFB comprises more than 50 percent of the DoD land resources, represents a capital investment of more than $25 billion, and employs approximately 47,000 government and contractor personnel. Dr. Sanders chairs the Defense Test and Training Steering Group, the Systems Engineering Steering Group, and the Acquisition Council on Modeling and Simulation. She reports directly to the Principal Deputy Under Secretary of Defense for Acquisition and Technology.

Dr. Sanders has over twenty-two years of experience in the Department of Defense with particular emphasis in the areas of test and evaluation, modeling and simulation, resource allocation, and strategic planning. Prior positions within the Office of the Secretary of Defense included serving as the Deputy Director for Test Facilities and Resources, the Director of Land Forces in the Office of the Assistant Secretary of Defense for Program Analysis and Evaluation, and as a Staff Specialist for the Director of Operational Test and Evaluation. Other assignments have included serving as Deputy Director for Analysis, United States Space Command, Science Advisor to the Command, Control, Communications, and Countermeasures Joint Test Force, and Chief of Modeling and Simulation and Technical Advisor to the Electronics Systems Division at the Air Force Operational Test and Evaluation Center. Her government career was preceded by university faculty positions.

Dr. Sanders received her doctorate in mathematics in 1972 as a National Science Foundation Fellow at Wayne State University and is a 1992 graduate of the Senior Executive Fellows Program, John F. Kennedy School of Government, Harvard University. She is a member of the Senior Advisory Board and a past President of the International Test and Evaluation Association (ITEA). She is a Fellow of the American Institute of Aeronautics and Astronautics, and a member of the Board of Directors of the Military Operations Research Society.

 


 

James F. O'Bryon

Deputy Director, Operational Test & Evaluation Live Fire Testing

Mr. O'Bryon accepted the Senior Executive Service position of Assistant Deputy Under Secretary of Defense, Live Fire Testing in November 1986, a position created in response to an act of Congress. The legislation requires that realistic Live Fire Testing be performed on our major conventional weapons and that an independent Live Fire Test Report be prepared and submitted to the Congress before these systems enter full-rate production. Since that time, he has also served within OSD as Deputy Director, Test and Evaluation; as Director, Live Fire Testing; and as Director, Weapon Systems Assessment. He is currently serving as Deputy Director, Operational Test and Evaluation, a Deputy Assistant Secretary of Defense position. Mr. O'Bryon reports to the Director, OT&E, the Honorable Philip Coyle.

Jim has more than 25 years of leadership experience in weapon system technology and survivability. Mr. O'Bryon's technical experience includes work in the biophysics department at IBM's Thomas J. Watson Research Center, the Actuarial Department at the home office of New York Life Insurance Company, the Ballistic Research Laboratories, the Army Material Systems Analysis Activity at Aberdeen and, since 1986, for the Office of the Secretary of Defense at the Pentagon.

Jim O'Bryon received his undergraduate degree in Mathematics. He also has graduate degrees from George Washington University in Operations Research and from MIT through the Electrical Engineering Department.

He is the author of over 60 technical publications and holds several copyrights. His honors include listings in Who's Who in America, Outstanding Young Men in America, Who's Who in Government, and Who's Who in Engineering, membership in Sigma Xi, and recognition as a Distinguished Lecturer at the Defense Systems Management College. Mr. O'Bryon is also a Fellow of the Center for Advanced Engineering Study at MIT and is Chairman of the Test and Evaluation Division of the American Defense Preparedness Association and National Security Industrial Association.

 


 

W.E. Krupp

Acting Director
Test and Evaluation
Lockheed Martin Aeronautical Systems

Mr. Krupp is the Acting Director of Test & Evaluation at Lockheed Martin Aeronautical Systems, where he is responsible for flight test, structural test, materials testing, aircraft systems testing, and wind tunnels. Mr. Krupp has a twenty-nine-year career at Lockheed Martin Aeronautical Systems, with six years in Fatigue and Fracture Mechanics Research and Development and 23 years in management of Test and Evaluation organizations.

Conference Organizing Committee

Dr. Samuel M. Blankenship
Director,
Test & Evaluation Research and Education Center

Dr. Peter G. Sassone
Associate Professor
School of Economics
Georgia Institute of Technology
Atlanta, GA 30332-0615
Work: (404)894-4912
Fax: (404)894-1890

David Duma
SAIC
Crystal Square 5, Suite 202
1755 Jefferson Davis Highway
Arlington, VA 22202
Work: (703)413-3137
Fax: (703)413-5116

Dr. Martha K. Nelson
Associate Professor
Department of Business Administration
Franklin & Marshall College
Lancaster, PA 17604-3003
Work: (717)291-3937
Fax: (610)429-4912

Edward F. Smith
Institute for Defense Analyses
1801 N. Beauregard Street
Alexandria, VA 22311
Work: (703)845-6938
Fax: (703)845-2274

Kathie C. Prado
Technical Editor
Test & Evaluation Research and Education Center
Georgia Institute of Technology
Atlanta, Georgia 30332-0840
Work: (404)894-7311
Fax: (404)894-8636


Last Updated January 15, 2008