NAVAL POSTGRADUATE SCHOOL

Monterey, California


Autonomous Agent-Based Simulation of an AEGIS Cruiser Combat Information Center Performing Battle Group Air-Defense Commander Operations

 

by

 

Sharif H. Calfee

 

March 2003

 

   Thesis Co-Advisors:   Neil C. Rowe

                         John Hiles


THESIS

 

This thesis done in cooperation with the MOVES Institute

Approved for public release; distribution is unlimited





             REPORT DOCUMENTATION PAGE

Form Approved OMB No. 0704-0188

Public reporting burden for this collection of information is estimated to average 1 hour per response, including the time for reviewing instruction, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. Send comments regarding this burden estimate or any other aspect of this collection of information, including suggestions for reducing this burden, to Washington headquarters Services, Directorate for Information Operations and Reports, 1215 Jefferson Davis Highway, Suite 1204, Arlington, VA 22202-4302, and to the Office of Management and Budget, Paperwork Reduction Project (0704-0188) Washington DC 20503.

1. AGENCY USE ONLY (Leave blank)

 

2. REPORT DATE 

March 2003

3. REPORT TYPE AND DATES COVERED

Master’s Thesis

4. TITLE AND SUBTITLE:  Autonomous Agent-Based Simulation of an AEGIS Cruiser Combat Information Center Performing Battle Group Air-Defense Commander Operations

5. FUNDING NUMBERS

 

6. AUTHOR(S) Sharif H. Calfee

7. PERFORMING ORGANIZATION NAME(S) AND ADDRESS(ES)

Naval Postgraduate School

Monterey, CA  93943-5000

8. PERFORMING ORGANIZATION REPORT NUMBER   

9. SPONSORING /MONITORING AGENCY NAME(S) AND ADDRESS(ES)

N/A

10. SPONSORING/MONITORING

      AGENCY REPORT NUMBER

11. SUPPLEMENTARY NOTES  The views expressed in this thesis are those of the author and do not reflect the official policy or position of the Department of Defense or the U.S. Government.

12a. DISTRIBUTION / AVAILABILITY STATEMENT 

Approved for public release; distribution is unlimited

12b. DISTRIBUTION CODE

 

13.    ABSTRACT (maximum 200 words)

 

The AEGIS Cruiser Air-Defense Simulation is a program that models the operations of a Combat Information Center (CIC) team performing the Air-Defense Commander (ADC) duties for a battle group, using Multi-Agent System (MAS) technology implemented in the Java programming language.  Set in the Arabian Gulf region, the simulation is a top-view, dynamic, graphics-driven software implementation that depicts the CIC team grappling with a challenging, complex problem.  Conceived primarily as a system to assist ships, waterfront training teams, and battle group staffs in ADC training and doctrine formulation, the simulation was designed to provide insight into the numerous factors (skills, experience, fatigue, aircraft numbers, weather, etc.) that influence the performance of the overall CIC team and its individual watchstanders.  The program explores the team’s performance under abnormal or high-intensity, high-stress situations by simulating the watchstanders’ mental processes, decision making, communications patterns, and cognitive attributes.  Every event in a scenario is logged, which allows the reconstruction of events of interest (e.g., watchstander mistakes, chain-of-error analysis) for use in post-scenario training as well as the creation of new, more focused themes for actual CIC team scenarios.  The simulation also tracks various watchstander and CIC team performance metrics for review by the user.

14. SUBJECT TERMS 

Battle Group Air-Defense, Multi-Agent Systems, Artificial Intelligence, Air-Defense Commander, Naval Simulations, Combat Information Center, Air-Defense Simulation, AEGIS, Cruiser, CG, Human-Computer Interface (HCI), Watchstander Training, Naval Air Defense, Threat Assessment, Decision Making, Cognitive Factors, AEGIS Doctrine, Air-Defense Doctrine, Interactive Training Systems, Watchstander Fatigue, Link-16/TADIL J, Link-11/TADIL A, USS Vincennes

15. NUMBER OF PAGES

 

16. PRICE CODE

17. SECURITY CLASSIFICATION OF REPORT

Unclassified

18. SECURITY CLASSIFICATION OF THIS PAGE

Unclassified

19. SECURITY CLASSIFICATION OF ABSTRACT

Unclassified

20. LIMITATION OF ABSTRACT

 

UL

NSN 7540-01-280-5500                                                                                                                      Standard Form 298 (Rev. 2-89)

                                                                                                                                                            Prescribed by ANSI Std. 239-18




Approved for public release; distribution is unlimited

 

 

Autonomous Agent-Based Simulation of an AEGIS Cruiser Combat Information Center Performing Battle Group Air-Defense Commander Operations

 

 

Sharif H. Calfee

Lieutenant, United States Navy

B.S., United States Naval Academy, 1996

 

 

Submitted in partial fulfillment of the

requirements for the degree of

 

 

MASTER OF SCIENCE IN COMPUTER SCIENCE

 

 

from the

 

 

NAVAL POSTGRADUATE SCHOOL

March 2003

 

 

 

Author:             Sharif H. Calfee

 

 

Approved by:               Neil C. Rowe

Thesis Co-Advisor

 

 

John Hiles

Thesis Co-Advisor

 

 

Peter J. Denning

Chairman, Department of Computer Science

 



ABSTRACT

 

 

 

The AEGIS Cruiser Air-Defense Simulation is a program that models the operations of a Combat Information Center (CIC) team performing the Air-Defense Commander (ADC) duties for a battle group, using Multi-Agent System (MAS) technology implemented in the Java programming language.  Set in the Arabian Gulf region, the simulation is a top-view, dynamic, graphics-driven software implementation that depicts the CIC team grappling with a challenging, complex problem.  Conceived primarily as a system to assist ships, waterfront training teams, and battle group staffs in ADC training and doctrine formulation, the simulation was designed to provide insight into the numerous factors (skills, experience, fatigue, aircraft numbers, weather, etc.) that influence the performance of the overall CIC team and its individual watchstanders.  The program explores the team’s performance under abnormal or high-intensity, high-stress situations by simulating the watchstanders’ mental processes, decision making, communications patterns, and cognitive attributes.  Every event in a scenario is logged, which allows the reconstruction of events of interest (e.g., watchstander mistakes, chain-of-error analysis) for use in post-scenario training as well as the creation of new, more focused themes for actual CIC team scenarios.  The simulation also tracks various watchstander and CIC team performance metrics for review by the user.


 


TABLE OF CONTENTS

 

 

 

I.          introduction........................................................................................................ 1

A.        the aegis cruiser battle group Air-Defense simulation 1

B.        scope of the cruiser Air-Defense simulatiOn proJECT... 2

1.         ADC Simulation Project Thesis........................................................... 2

2.         Interviews with Air-Defense Experts.................................................. 3

3.         ADC Simulation Design....................................................................... 3

4.         Testing and Analysis of ADC Simulation and Conduct of Reality Survey 5

C.        relevance of the adc simulation in training for the complex and challenging task of Air-Defense OPERATIONS in the modern era           6

1.         Situation of Concern............................................................................. 6

2.         Current Training Needs and the ADC Simulation.............................. 6

a.         Current Situation..................................................................... 6

b.         The Need for New Systems to Assist Training Teams............ 6

c.         A Potential Solution................................................................ 7

D.        brief history of naval AND battle group Air Defense... 7

E.         watchstander organization of a cruiser combat information center......................................................................................................................... 11

1.         Overview of a CIC Organization........................................................ 11

2.         Brief Description of the CIC Air-Defense Watchstanders.............. 12

a.         Force Tactical Action Officer (F-TAO)................................ 12

b.         Force Anti-Air Warfare Coordinator (F-AAWC)................. 12

c.         Ship Tactical Action Officer (S-TAO).................................. 12

d.         Ship Anti-Air Warfare Coordinator  (S-AAWC).................. 13

e.         Electronic Warfare Control Officer (EWCO)....................... 13

f.          Radar Systems Controller (RSC)........................................... 13

g.         Tactical Information Coordinator (TIC).............................. 14

h.         Identification Supervisor (IDS)............................................. 14

i.          Combat Systems Coordinator (CSC)..................................... 14

j.          Missile Systems Supervisor (MSS)......................................... 14

k.         Red Crown (RC)..................................................................... 15

F.         application of multi-agent system technology in the adc simulation......................................................................................................................... 15

II.        related work in the area of naval Air-Defense simulation 19

A.        related work introduction........................................................ 19

B.        Area Air-Defense commander (aadc) battle management system         20

c.        tactical decision-making under stress (tadmus) DECISION SUPPORT SYSTEM......................................................................................................... 21

d.        multi-modal watch station (mmws) program.................. 23

e.         naval Air-Defense thReat assessment:  cognitive factors model         25

f.         Air threat assessment:  research, model, and display guidelines        26

g.        cognitive and behavioral task implications for three dimensional displays used in combat information/direction centers    28

h.        battle force tactical training (bftt) SYSTEM.................. 29

i.          naval combat SIMULATION video games:  the precursor to modern-day air-defense simulations.................................................................. 30

1.         Strike Fleet:  The Naval Task Force Simulator™............................ 31

2.         Fifth Fleet™........................................................................................ 32

3.         Harpoon:  Modern Naval Combat Simulation™ Series Video Games..... 33

4.         Summary............................................................................................. 35

j.         comparison AND contrast of the cruiser adc simulation program    36

k.        research questions posed for the cruiser adc simulation program 37

iII.       User-Centered Design (UCD) process of the adc simulation human-computer interface (hci).................................................................................................... 39

A.        need for utilization of user-centered design (ucd) process in developing computer program interfaces....................... 39

B.        ucd process phase one:  problem statement...................... 40

1.         Problem Statement............................................................................. 40

2.         Activity/Utility to Users...................................................................... 41

3.         Users................................................................................................... 41

4.         Criteria for Judgment......................................................................... 41

C.        ucd process phase two:  requirements gathering......... 41

1.         Needs Analysis................................................................................... 41

a.         Situation of Concern.............................................................. 41

b.         Need/Utility of System............................................................ 42

c.         Features of System.................................................................. 43

2.         User Analysis...................................................................................... 44

a.         Utility of the Simulation........................................................ 44

b.         Collective Team Skills and Experience Required (User Characteristics)  44

c.         Frequency of Simulation Use................................................ 45

3.         Task Analysis..................................................................................... 45

d.        ucd process phase three:  conceptual design of adc simulation program.................................................................................................... 46

1.         Conceptual Design Introduction........................................................ 46

2.         Conceptual Design.............................................................................. 47

a.         Agents...................................................................................... 47

b.         Objects..................................................................................... 48

c.         Necessary Attributes of Agents.............................................. 49

d.         Necessary Attributes of Objects............................................. 52

e.         Agent Relationship................................................................. 55

f.          Object Relationships.............................................................. 56

g.         Actions on Agents and Objects.............................................. 57

3.         Visual Design...................................................................................... 57

4.         Early Analysis..................................................................................... 58

a.         Reviewer #1 Comments.......................................................... 59

b.         Reviewer #2............................................................................. 59

e.         ucd process phase four:  adc simulation interface implementation  60

f          ucd process phase five:  usability analysis of adc SIMULATION interface......................................................................................................................... 61

1.         Usability Analysis Introduction.......................................................... 61

2.         Task List Overview............................................................................ 62

3.         Subject Profile..................................................................................... 63

4.         Data Collection................................................................................... 64

5.         Analysis of Task Data........................................................................ 65

6.         Analysis of Subject Evaluation Surveys............................................ 65

a.         Screen Layout......................................................................... 65

b.         Overall Display Layout Relative to Menu-Bars and Pop-Up Menus        65

c.         Menu Location and Wording................................................. 66

d.         Ease of Performance of the Task Completion List............... 66

7.         Recommendations.............................................................................. 67

a.         Subject #1................................................................................ 67

b.         Subject #2................................................................................ 67

c.         Subject #3................................................................................ 67

d.         Subject #4................................................................................ 67

e.         Subject #5................................................................................ 68

g.        ucd process phase six:  interface modification/redesign    68

iV.       description of the adc simulation program design AND structure    69

A.        program language and SYSTEM REQUIREMENTS for adc simulation      69

B.        discussion about multi-agent systems.................................. 69

1.         Coordinated Collaboration................................................................. 71

2.         Anticipative-Reactive Agents............................................................ 72

3.         Adaptation and Evolution................................................................... 73

4.         Cooperation within the Multi-Agent System..................................... 73

5.         Connector-Based Multi-Agent Systems (CMAS)............................ 75

C.        OVERALL VISUAL DESIGN OF THE SIMULATION............................ 76

1.         Tactical Display.................................................................................. 76

2.         Contact Data Display......................................................................... 78

3.         Scenario Control Buttons................................................................... 79

4.         CIC Watchstander Display and Watchstander Attributes Display. 80

d.        adc simulation program:  menu options.............................. 81

1.         File Menu Options.............................................................................. 81

2.         Watchstander Attributes Menu......................................................... 82

3.         CIC Equipment Setup Menu.............................................................. 82

4.         Scenario External Attributes Menu................................................... 83

5.         Doctrine Setup Menu......................................................................... 83

6.         Simulation Logs Menu....................................................................... 83

7.         Task Times and Probabilities Menu.................................................. 84

8.         Time Factor Ratio and Simulation Time Windows............................ 84

E.         design/structure of AIRCRAFT contacts............................... 84

1.         Overview............................................................................................. 84

2.         Aircraft Behaviors.............................................................................. 86

a.         Neutral Aircraft...................................................................... 86

b.         Hostile Aircraft....................................................................... 86

c.         Friendly Aircraft.................................................................... 87

3.         Aircraft Contact Generation Module................................................ 88

f.         RELEVANT SIMULATION POP-UP WINDOWS..................................... 88

1.         Modify Contact Attributes Window (Figure 19)................................ 88

2.         Scenario Setup Wizard Selection Window  (Figure 20)..................... 89

3.         Select Specific Contact Window  (Figure 21).................................... 90

4.         Scenario Run Time Input Window  (Figure 22)................................. 90

G.        design/structure of watchstander agents....................... 90

1.         Watchstander Attributes.................................................................... 90

a.         Skills........................................................................................ 90

b.         Experience.............................................................................. 91

c.         Fatigue.................................................................................... 92

d.         Decision-Maker Types............................................................ 93

2.         Watchstander Communication........................................................... 94

a.         Input/Receive Message Queue............................................... 95

b.         Watchstander Message Priority Processor............................ 95

c.         High/Medium/Low Priority Message Queue......................... 95

d.         Watchstander Action Processor............................................. 95

e.         Output/Transmit Message Queue.......................................... 96

3.         Watchstander Agents Skill Listings.................................................. 96

H.        combat information center (cic) Combat systems equipment     98

1.         Overview............................................................................................. 98

2.         SPY-1B Radar System....................................................................... 99

3.         SLQ-32 Electronic Signal Detection System................................... 101

4.         Identification Friend or Foe (IFF) System....................................... 102

5.         Link 11 (TADIL A)/Link 16 (TADIL J) System............................. 103

6.         External Communications System................................................... 104

7.         Vertical Launching System (Surface-to-Air Missiles).................... 104

8.         Close-In Weapons System (CIWS)................................................. 105

I.          simulation log records AnD event reconstruction. 105

1.         Overview........................................................................................... 105

2.         Scenario Events Log......................................................................... 105

3.         Watchstander Decision History Log............................................... 106

4.         CIC Equipment Readiness Log....................................................... 107

5.         Watchstander Performance Log...................................................... 107

6.         Parser/Analyzer Log......................................................................... 108

J.         adc simulation external/environmental attributes 109

1.         Overview........................................................................................... 109

2.         Atmosphere/Weather....................................................................... 109

3.         Contact Density................................................................................ 110

4.         Scenario Threat Level...................................................................... 110

5.         Hostile Contact Level....................................................................... 110

K.        adc simulation DOCTRINE attributeS.................................... 111

1.         Overview........................................................................................... 111

2.         AEGIS Doctrine................................................................................ 111

L.         discussion of probability AND skill-time values in adc simulation     112

M.       air-defense contact identification, threat assessment AND classification in the simulation............................................. 113

N.        air-defense decision-making:  inside the heads of the f-tao AND f-aawc watchstander agents.................................................................... 117

V.        research question results AND EVALUATION OF THE SIMULATION   123

A.        research question Introduction........................................... 123

1.         Overview........................................................................................... 123

2.         Testing Methodology....................................................................... 123

a.         Scenario Default Settings.................................................... 123

b.         Number of Runs.................................................................... 124

c.         Limitation of Variability in Testing.................................... 124

3.         Philosophy of Testing and Data Results Analysis.......................... 124

4.         Philosophy of the Use of the ADC Simulation................................. 125

5.         Simulation Testing Input Settings and Measurements Lists......... 125

a.         Inputs and Functions........................................................... 125

b.         Independent Variables......................................................... 126

c.         Dependent Variables............................................................ 126

d.         Test Categories..................................................................... 126

B.        radar systems controller (rsc) agent testing and analysis results....................................................................................................................... 127

1.         Expected Results Based on Air-Defense Expert Interviews......... 127

2.         Results from the Simulation (See Appendix C Section A for Graphs) 127

3.         Analysis of Results and Recommendations.................................... 129

a.         Radar Operations Skill Results........................................... 129

b.         Experience Level Results..................................................... 129

c.         Fatigue Level Results........................................................... 129

d.         SPY-1B Radar Results......................................................... 129

C.        electronic warfare control officer (ewco) agent testing and analysis results................................................................................. 130

1.         Expected Results Based on Air-Defense Expert Interviews......... 130

2.         Results from the Simulation (See Appendix C Section A for Graphs) 130

3.         Analysis of Results and Recommendations.................................... 132

a.         ES Analysis Skill Results..................................................... 132

b.         Experience Level Results..................................................... 132

c.         Fatigue Level Results........................................................... 132

d.         SLQ-32 System Results........................................................ 133

d.        force tactical action officer (f-tao) agent testing and analysis results..................................................................................................... 133

1.         Expected Results Based on Air-Defense Expert Interviews......... 133

2.         Results from the Simulation (See Appendix C Section A for Graphs)      133

3.         Analysis of Results and Recommendations.................................... 134

a.         Situation Analysis Skill Results.......................................... 134

b.         Experience Level Results..................................................... 134

c.         Fatigue Level Results........................................................... 135

d.         Decision-Maker Type Results.............................................. 135

E.         combat information center (cic) watch team attribute PROFILE testing and analysis........................................................................ 135

1.         Expected Results Based on Air-Defense Expert Interviews......... 135

a.         Trial Profile Summary......................................................... 135

b.         Expectations......................................................................... 135

2.         Results from the Simulation (See Appendix C Section A for Graphs) 136

3.         Analysis of Results and Recommendations.................................... 136

f.         combat information center (cic) watch team testing and analysis of weather options................................................................................. 136

1.         Expected Results Based on Air-Defense Expert Interviews......... 136

2.         Results from the Simulation (See Appendix C Section A for Graphs) 137

3.         Analysis of Results and Recommendations.................................... 137

g.        results of the survey of the atrc detachment, san diego air-defense experts..................................................................................................... 137

1.         Survey Overview.............................................................................. 137

2.         RSC Watchstander Questions and Results.................................... 138

a.         Questions Posed.................................................................... 138

b.         Results (See Appendix C Section B for Graphs)................. 139

c.         Analysis and Recommendations.......................................... 139

3.         EWCO Watchstander Questions and Results................................ 140

a.         Questions Posed.................................................................... 140

b.         Results (See Appendix C Section B for Graphs)................. 141

c.         Analysis and Recommendations.......................................... 141

4.         F-TAO Watchstander Questions and Results................................. 141

a.         Questions Posed.................................................................... 141

b.         Results (See Appendix C Section B for Graphs)................. 142

c.         Analysis and Recommendations.......................................... 142

5.         CIC Team Questions and Results................................................... 143

a.         Questions Posed.................................................................... 143

b.         Results (See Appendix C Section B for Graphs)................. 144

c.         Analysis and Recommendations.......................................... 144

6.         Additional CIC Team Questions and Results................................. 144

a.         Questions Posed.................................................................... 144

b.         Results (See Appendix C Section B for Graphs)................. 145

c.         Analysis and Recommendations.......................................... 145

vI.       future work and development of the cruiser adc simulation   147

A.        future work introduction......................................................... 147

B.        future work to expand the scope and detail of the adc simulation   148

1.         Implement Networked Simulation of Battle Group Air-Defense Operations        148

2.         Implement a More Detailed Watchstander Fatigue/Vigilance Model 150

3.         Implement Aircraft Contacts as Watchstander Agents.................. 150

4.         Implement a More Detailed Log Parser Using XML.................... 151

5.         Implement a More Detailed Capability for AEGIS and Air-Defense Doctrine     151

6.         Implement Alternate Scenario Locations........................................ 151

7.         Implement More Detailed Treatment of SPY-1B Radar System, SLQ-32 System, and Communications System.................................................................. 152

8.         Conduct a More In-Depth Study of Metrics for Watchstander Performance Attributes (OR/OA)............................................................................................ 152

9.         Implement the Capability to Replay Previous Scenarios and/or Portions of Those Scenarios........................................................................................................... 153

10.       Implement the Capability to Build Scenarios with Specified Contact Aircraft of Various Types and Behaviors........................................................................ 153

c.        future work TO adapt the adc simulation for advanced training of watchstanders................................................................................... 153

1.         First Phase Single Watchstander Training System......................... 153

2.         Second Phase Multi-Watchstander, Interlinked Training System. 154

vii.     summary AND conclusion......................................................................... 157

appendix A.  ucd process phase three data................................................. 159

A.        conceptual design sketches...................................................... 159

Appendix B.  ucd process phase five data..................................................... 163

A.        Analysis of Task Data...................................................................... 163

B.        Simulation Evaluations................................................................ 177

1.         Evaluation Charts (Number of Errors and Task Completion Times) 177

C.        Simulation Evaluation Surveys............................................... 191

1.         Evaluation Survey Charts (Average and Raw Data)...................... 191

Appendix C.  simulation evaluation results AND Air-Defense expert survey results................................................................................................................. 195

A.        adc simulation evaluation results...................................... 195

1.         Evaluation Results for the RSC Watchstander Agent................... 195

2.         Evaluation Results for the EWCO Watchstander Agent............... 201

3.         Evaluation Results for the Force TAO Watchstander Agent......... 207

4.         Evaluation Results for the CIC Team Comparison Trials............. 211

5.         Evaluation Results for the Scenario Weather Trials.......................... 213

B.        Air-Defense EXPERT SURVEYS of adc simulation performance        215

1.         Individual and Averaged Survey Results for the RSC Watchstander Questions  215

2.         Individual and Averaged Survey Results for the EWCO Watchstander Questions          217

3.         Individual and Averaged Survey Results for the Force TAO Watchstander Questions   219

4.         Individual and Averaged Survey Results for CIC Team Questions 221

5.         Individual and Averaged Survey Results for Additional CIC Team Questions     223

bibliography................................................................................................................. 225

INITIAL DISTRIBUTION LIST........................................................................................ 227

 

 

 

 

LIST OF FIGURES

 

 

 

Figure 1.           ADC Simulation Interface.................................................................................... 1

Figure 2.           CIC Air-Defense Organization........................................................................... 11

Figure 3.           ADC Simulation MAS Overview Diagram......................................................... 17

Figure 4.           Cognitively Based Model of Threat Assessment................................................. 26

Figure 5.           Threat Assessment Model.................................................................................. 27

Figure 6.           Strike Fleet™ Video Game............................................................................... 32

Figure 7.           Fifth Fleet™ Video Game.................................................................................. 33

Figure 8.           Harpoon Series™ Video Games........................................................................ 34

Figure 9.           Preliminary Conceptual Sketches of ADC Simulation GUI.................................. 58

Figure 10.         Early Implementation of ADC Simulation GUI before Usability Analysis.............. 61

Figure 11.         Updated ADC Simulation GUI following Usability Analysis................................ 68

Figure 12.         ADC Simulation Tactical Display....................................................................... 77

Figure 13.         ADC Simulation Aircraft Classification Icons...................................................... 78

Figure 14.         Contact Data Display......................................................................................... 79

Figure 15.         Scenario Control Buttons Display....................................................................... 79

Figure 16.         CIC Watchstander Display and Watchstander Attributes Display........................ 81

Figure 17.         ADC Simulation Main Menu Bar....................................................................... 81

Figure 18.         Generalized Aircraft Contact Object.................................................................. 85

Figure 19.         Modify Contact Attributes Popup Window........................................................ 89

Figure 20.         Scenario Setup Wizard Selection Popup Window.............................................. 89

Figure 21.         Select Specific Contact Popup Window............................................................. 90

Figure 22.         Scenario Run Time Input Popup Window........................................................... 90

Figure 23.         Message Handling Structure for all Watchstander Agents.................................... 95

Figure 24.         Link 16 Example............................................................................................. 104

Figure 25.         Scenario Events Log........................................................................................ 106

Figure 26.         Watchstander Decision History Log................................................................. 106

Figure 27.         CIC Equipment Readiness Log........................................................................ 107

Figure 28.         Watchstander Performance Log....................................................................... 108

Figure 29.         Parser/Analyzer Log........................................................................................ 109

Figure 30.         Weather Conditions Window........................................................................... 109

Figure 31.         Contact Density Window................................................................................. 110

Figure 32.         Scenario Threat Level Window........................................................................ 110

Figure 33.         Hostile Contact Level Window........................................................................ 111

Figure 34.         AEGIS (Auto-Special) Doctrine Popup Window............................................. 112

Figure 35.         Skill Probabilities Modification Window........................................................... 113

Figure 36.         Watchstander Agent Collaborative Contact Detection and Reporting Process... 114

Figure 37.         Generic Air Contact Classification Path............................................................ 117

Figure 38.         Contact Classification Artificial Neuron............................................................ 119

Figure 39.         Battle Group Simulation of Air-Defense Operations.......................................... 149

Figure 40.         Live Watchstanders Participating in Air-Defense Training Simulation................. 154

Figure 41.         Early Menu Design Sketches for ADC Simulation............................................. 159

Figure 42.         Early Menu Design Sketches for ADC Simulation............................................. 159

Figure 43.         Early Menu Design Sketches for ADC Simulation............................................. 160

Figure 44.         Early Menu Design Sketches for ADC Simulation............................................. 160

Figure 45.         Early Menu Design Sketches for ADC Simulation............................................. 161

Figure 46.         Average Number of Errors per Task................................................................ 177

Figure 47.         Errors During Performance of Tasks................................................................ 178

Figure 48.         Average Number of Errors per Task................................................................ 179

Figure 49.         Errors During Performance of Tasks................................................................ 180

Figure 50.         Average Number of Errors per Task................................................................ 180

Figure 51.         Average Number of Performance of Tasks....................................................... 181

Figure 52.         Errors During Performance of Tasks................................................................ 182

Figure 53.         Average Task Completion Time....................................................................... 183

Figure 54.         Total Time to Complete Tasks......................................................................... 184

Figure 55.         Average Task Completion Time....................................................................... 185

Figure 56.         Total Time to Complete Tasks......................................................................... 186

Figure 57.         Average Task Completion Time....................................................................... 187

Figure 58.         Total Time to Complete Tasks......................................................................... 188

Figure 59.         Average Task Completion Time....................................................................... 189

Figure 60.         Total Time to Complete Tasks......................................................................... 190

Figure 61.         Screen Layout Survey Averages...................................................................... 191

Figure 62.         Survey Scores................................................................................................. 191

Figure 63.         Overall Display Layout Survey Averages.......................................................... 192

Figure 64.         Survey Scores................................................................................................. 192

Figure 65.         Menu Location and Wording Survey Averages................................................. 193

Figure 66.         Survey Scores................................................................................................. 193

Figure 67.         Task Completion Survey Averages.................................................................. 194

Figure 68.         Survey Scores................................................................................................. 194

Figure 69.         RSC Averaged Times-Radar Operations Skill Level......................................... 195

Figure 70.         RSC Averaged Errors-Radar Operations Skill Levels....................................... 196

Figure 71.         RSC Averaged Number Attempted CIC Classifications, Radar Operations Skill Level.  196

Figure 72.         RSC Averaged Times-Experience Level.......................................................... 197

Figure 73.         RSC Averaged Number Attempted CIC Classifications-Experience Level........ 197

Figure 74.         RSC Averaged Times-Fatigue Level................................................................ 198

Figure 75.         RSC Averaged Errors-Fatigue Levels.............................................................. 198

Figure 76.         RSC Averaged Number Attempted CIC Classifications-Fatigue Level.............. 199

Figure 77.         RSC Averaged Times-SPY-1B Radar Readiness Level................................... 199

Figure 78.         RSC Averaged Errors-SPY-1B Radar Readiness Levels................................. 200

Figure 79.         RSC Averaged Number Attempted CIC Classifications-SPY-1B Radar Readiness Level.         200

Figure 80.         EWCO Averaged Times-ES Analysis Skill....................................................... 201

Figure 81.         EWCO Averaged Errors-ES Analysis Skill...................................................... 201

Figure 82.         EWCO Averaged Number Attempted CIC Classifications-ES Analysis Skill.... 202

Figure 83.         EWCO Averaged Times-Experience Level...................................................... 202

Figure 84.         EWCO Averaged Errors-Experience Level...................................................... 203

Figure 85.         EWCO Averaged Number Attempted CIC Classifications-Experience Level.... 203

Figure 86.         EWCO Averaged Times-Fatigue Levels.......................................................... 204

Figure 87.         EWCO Averaged Errors-Fatigue Levels.......................................................... 204

Figure 88.         EWCO Averaged Number Attempted CIC Classifications-Fatigue Level......... 205

Figure 89.         EWCO Averaged Times-SLQ-32 System Readiness Levels............................ 205

Figure 90.         EWCO Averaged Errors-SLQ-32 System Readiness Levels............................ 206

Figure 91.         EWCO Averaged Number Attempted Classifications-SLQ-32 System Readiness Level.           206

Figure 92.         Force TAO Averaged Times-Situational Awareness Skill Level........................ 207

Figure 93.         Force TAO Averaged Classifications Errors (Percentage)-Situation Assessment Skill Level.      207

Figure 94.         Force TAO Averaged Times-Experience Levels.............................................. 208

Figure 95.         Force TAO Averaged Classification Errors (Percentage) – Experience Level.... 208

Figure 96.         Force TAO Averaged Times – Fatigue Levels.................................................. 209

Figure 97.         Force TAO Averaged Classification Errors (Percentage) – Fatigue Levels........ 209

Figure 98.         Force TAO Averaged Times-Decision-Maker Type.......................................... 210

Figure 99.         Force TAO Averaged Classification Errors (Percentage) – Decision-Maker Type. 210

Figure 100.       CIC Team Profile Trials Averaged Times......................................................... 211

Figure 101.       CIC Team Profile Trials Averaged # of Classification Errors (Percentage)......... 211

Figure 102.       CIC Team Profile Trials Averaged # of Attempted Classifications..................... 212

Figure 103.       Scenario Weather Trials Averaged Times......................................................... 213

Figure 104.       Scenario Weather Trials Averaged # of Classification Errors (Percentage)........ 213

Figure 105.       Scenario Weather Trials Averaged # of Attempted CIC Classifications............. 214

Figure 106.       Respondent Survey Results for RSC Simulation Questions................................ 215

Figure 107.       Averaged Survey Results for RSC Simulation Questions................................... 216

Figure 108.       Respondent Survey Results for EWCO Simulation Questions........................... 217

Figure 109.       Averaged Survey Results for EWCO Simulation Questions............................... 218

Figure 110.       Respondent Survey Results for Force TAO Simulation Questions..................... 219

Figure 111.       Averaged Survey Results for Force TAO Simulation Questions........................ 220

Figure 112.       Respondent Survey Results for CIC Team Simulation Questions....................... 221

Figure 113.       Averaged Survey Results for CIC Team Simulation Questions.......................... 222

Figure 114.       Respondent Survey Results for Additional CIC Team Simulation Questions...... 223

Figure 115.       Averaged Survey Results for Additional CIC Team Simulation Questions.......... 224



LIST OF TABLES

 

 

 

Table 1.            F-TAO............................................................................................................. 49

Table 2.            TAO................................................................................................................. 49

Table 3.            F-AAWC......................................................................................................... 49

Table 4.            AAWC............................................................................................................. 50

Table 5.            CSC................................................................................................................. 50

Table 6.            RSC.................................................................................................................. 50

Table 7.            MSS................................................................................................................. 51

Table 8.            Red Crown....................................................................................................... 51

Table 9.            EWCO............................................................................................................. 51

Table 10.          TIC................................................................................................................... 51

Table 11.          IDS................................................................................................................... 52

Table 12.          List of Tasks...................................................................................................... 63

Table 13.          Usability Analysis Attributes............................................................................... 64

Table 14.          Listing of Watchstander Messages..................................................................... 96

Table 15.          CIC Equipment Levels of Performance.............................................................. 99

Table 16.          Systems Associated with Specified Watchstander Agents................................... 99

Table 17.          Abstracted Model SLQ-32 System Operational Model.................................... 102

Table 18.          Five Categories for the IFF Systems................................................................ 103

Table 19.          Abstracted Model IFF System Operational Model........................................... 103

Table 20.          Force TAO Contact Selection Prioritization Criteria......................................... 115

Table 21.          Force AAWC Contact Selection Prioritization Criteria..................................... 115

Table 22.          Ship TAO Contact Selection Prioritization Criteria............................................ 116

Table 23.          RSC Contact Selection Prioritization Criteria.................................................... 116

Table 24.          EWCO Contact Selection Prioritization Criteria................................................ 116

Table 25.          IDS Contact Selection Prioritization Criteria..................................................... 116

Table 26.          TIC Contact Selection Prioritization Criteria..................................................... 117

Table 27.          Red Crown Contact Selection Prioritization Criteria.......................................... 117

Table 28.          Evaluation Input Cues Used by the Watchstanders........................................... 118

Table 29.          Default Classification Threshold Values............................................................ 120

Table 30.          Scoring (Weighted) Values for the Various Input Cues..................................... 121

Table 31.          Radar Operations Skill Tests............................................................................ 127

Table 32.          Experience Level Tests.................................................................................... 128

Table 33.          Fatigue Level Tests.......................................................................................... 128

Table 34.          SPY-1B Radar Tests....................................................................................... 128

Table 35.          Electronic Signal (ES) Analysis Skill Tests........................................................ 130

Table 36.          Experience Level Tests.................................................................................... 131

Table 37.          Fatigue Level Tests.......................................................................................... 131

Table 38.          SLQ-32 System Radar Tests........................................................................... 131

Table 39.          Situation Awareness Skill Tests........................................................................ 133

Table 40.          Experience Level Tests.................................................................................... 133

Table 41.          Fatigue Level Tests.......................................................................................... 134

Table 42.          Decision-maker Type Tests............................................................................. 134

Table 43.          CIC Watch Team Attribute Profile Tests.......................................................... 136

Table 44.          Weather Option Tests...................................................................................... 137

Table 45.          Results of RSC Questions................................................................................ 139

Table 46.          Results of EWCO Questions............................................................................ 141

Table 47.          Results of F-TAO Questions............................................................................ 142

Table 48.          Results of CIC Team Watchstander Questions................................................. 144

Table 49.          Results of Additional CIC Team Watchstander Questions............................... 145

Table 50.          Errors During Performance of Tasks................................................................ 181

 


ACKNOWLEDGMENTS

 

 

 

The completion of my thesis was the culmination of nearly a year of intense research and development, and this experience has been an extraordinarily rewarding one for me, academically and professionally.  However, this thesis, as well as the enriching experience it provided, would not have been possible without the assistance and guidance of a number of exceptional individuals listed below.  These people shared their invaluable knowledge and experience with me, and I am profoundly grateful for their contribution to my academic and professional growth.  To all of you, Thank You!

 

Naval Postgraduate School, Monterey California

 

Neil C. Rowe, PhD – Professor, Department of Computer Science

John Hiles – Research Professor, Modeling Virtual Environments and Simulation

(MOVES) Institute

 

Donald Gaver, PhD – Associate Professor, Department of Operations Research

Patricia Jacobs, PhD – Associate Professor, Department of Operations Research

 

Robert Harney, PhD – Associate Professor, Department of Systems Engineering

 

AEGIS Training & Readiness Center (ATRC), Detachment San Diego, California

            LCDR J. Lundquist, USN

            LT Brian Deters, USN

           

OSCS (SW) Mackie, USN

            OSC (SW) Couch, USN

            OSC (SW) Coleman, USN

 

Fleet Technical Support Center, Pacific (FTSCPAC)

            FCC (SW) Timothy Simmons, USN

 

Command, Space and Naval Warfare Systems Center (COMSPAWARSYSCEN) San Diego, California

            Glenn Osga, PhD

 

I would like to express my appreciation to COMSPAWARSYSCEN for funding my research through the SPAWAR Research Fellowship Program.


The following is a list of courses that were considerably beneficial to the completion of my thesis:

 

MV-4015        Agent-Based Autonomous Behavior for Simulations

CS-4920          Expert Systems/CS-4311 (Directed Study)

CS-4310          Artificial Intelligence Techniques for Military Applications

MV-4203        Human-Computer Interaction

CS-4322          Artificial Intelligence & Knowledge Engineering Seminar

MV-4920        Human Agents

MV-4920        Multi-Agent System Shipboard Damage Control Trainer (Directed Study)

CS-4554          Computer Network Modeling & Design

CS-3310          Artificial Intelligence

CS-3773          Java as a Second Language

 

 

 


I.       introduction

A.        the aegis cruiser battle group Air-Defense simulation

The Air-Defense Commander (ADC) Simulation is a top-view, dynamic, Java language-based, graphics-driven software implementation of an AEGIS Cruiser Combat Information Center (CIC) team performing the Battle Group Air-Defense Commander duties in the Arabian Gulf region.  Designed using multi-agent systems technology, it is a fully interactive and customizable program that allows the user to configure a wide variety of the simulation parameters to create unique and realistic air-defense scenarios.  The program simulates the mental processes, decision-making aspects, cognitive attributes, and communications of an eleven-member CIC air-defense team performing their duties under stressful conditions caused by the requirement to maintain overall situational awareness of the battle group’s airspace.  Figure 1 below displays the graphical user interface (GUI) of the ADC Simulation.

 

Figure 1.  ADC Simulation Interface.

The ADC Simulation was designed to provide insight into and understanding of the effects on a CIC team or individual watchstander of varying the following parameters (an illustrative configuration sketch follows the list):

·                    Watchstander skill levels

·                    Watchstander experience levels

·                    Watchstander decision-maker types

·                    Watchstander fatigue levels

·                    Combat systems equipment readiness level

·                    Aircraft density  (number of aircraft)

·                    Aircraft types (hostile, unknown, friendly)

·                    Scenario threat level

·                    Weather conditions

·                    AEGIS and air-defense commander battle doctrines
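As a rough illustration of how these user-configurable parameters might be grouped in a Java implementation, the following is a minimal sketch; every class, field, and enumeration name here is hypothetical and chosen for this example only, not taken from the actual ADC Simulation source code, and the default values are placeholders.

// Hypothetical sketch of a scenario configuration object; names and values
// are illustrative only and do not reflect the actual ADC Simulation source.
public class ScenarioConfig {
    public enum SkillLevel   { BASIC, EXPERIENCED, EXPERT }
    public enum FatigueLevel { RESTED, TIRED, EXHAUSTED }
    public enum ThreatLevel  { LOW, MEDIUM, HIGH }
    public enum Weather      { CLEAR, MODERATE, SEVERE }

    // Example per-watchstander attributes (one watchstation shown)
    public SkillLevel   rscSkill   = SkillLevel.EXPERIENCED;
    public FatigueLevel rscFatigue = FatigueLevel.RESTED;

    // Combat systems equipment readiness (0.0 = down, 1.0 = fully operational)
    public double spyRadarReadiness = 1.0;
    public double slq32Readiness    = 1.0;

    // External environment and scenario settings
    public int         aircraftCount    = 40;
    public int         hostileCount     = 4;
    public ThreatLevel threatLevel      = ThreatLevel.MEDIUM;
    public Weather     weather          = Weather.CLEAR;
    public boolean     aegisAutoSpecial = false;
}

Grouping the settings this way would also make the "on-the-fly" modifications described below straightforward, since the running scenario could simply re-read the configuration object.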

The ADC Simulation gives users the capability to design and run scenarios that generate realistic problems a CIC team could encounter.  The GUI allows them to watch as these scenarios unfold and to observe the performance of the simulated CIC team based on the user-specified configuration.  Additionally, the user can modify any of the scenario attributes “on-the-fly” to explore different potential outcomes.  Lastly, all of the events in the scenario are logged for each watchstander and each item of combat systems equipment, which allows for the reconstruction of particular events of interest (i.e., watchstander mistakes, misidentification of aircraft, chain-of-error analysis, etc.).  The simulation also includes a capability to review the performance metrics of each watchstander (number of errors, average time to complete tasks) to ascertain the degree to which modifying various attributes influences the simulated watchstander’s performance.
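A minimal sketch of how such per-watchstander event logging and performance metrics might be represented in Java follows; the class names and fields are illustrative assumptions, not the actual simulation classes.

import java.util.ArrayList;
import java.util.List;

// Illustrative sketch only; the actual ADC Simulation log classes may differ.
class LogEvent {
    final long   simTimeMillis;   // scenario time at which the event occurred
    final String watchstation;    // e.g. "RSC", "F-TAO"
    final String description;     // e.g. "Misidentified track as friendly"

    LogEvent(long t, String ws, String desc) {
        simTimeMillis = t; watchstation = ws; description = desc;
    }
}

class WatchstanderMetrics {
    int  errorCount = 0;
    int  tasksCompleted = 0;
    long totalTaskTimeMillis = 0;

    double averageTaskTimeMillis() {
        return tasksCompleted == 0 ? 0.0 : (double) totalTaskTimeMillis / tasksCompleted;
    }
}

class ScenarioLog {
    private final List<LogEvent> events = new ArrayList<>();

    void record(LogEvent e) { events.add(e); }       // everything in the scenario is logged

    List<LogEvent> eventsFor(String watchstation) {  // supports post-scenario reconstruction
        List<LogEvent> out = new ArrayList<>();
        for (LogEvent e : events) {
            if (e.watchstation.equals(watchstation)) out.add(e);
        }
        return out;
    }
}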

B.        scope of the cruiser Air-Defense simulation project

1.         ADC Simulation Project Thesis

The first phase of this project was an extensive review and analysis of formal scientific literature (reports, papers, books, etc.) and research on the subject of battle group air defense, decision-making under stress, cognitive factors in decision-making, air-defense simulations, other air-defense-related projects, and multi-agent systems.  The second phase involved the conduct of in-depth, detailed interviews with air-defense training experts from the AEGIS Training and Readiness Center (ATRC), San Diego Detachment, to gather direct data from experienced personnel.  The third phase dealt with the design and development of the actual ADC Simulation.  The next phase involved a comprehensive testing of the simulation using parametric analysis, and the recording of the results.  The final phase used the results of the simulation to produce an ADC Simulation Realism Survey that was taken by the ATRC air-defense experts to assess the level of accuracy of the simulation compared to their professional experiences.

2.         Interviews with Air-Defense Experts

The interviews with the air-defense experts at the ATRC detachment in San Diego focused on the various attributes of the watchstanders, with the objective of determining the relationships between these attributes and the performance of the CIC team, both collectively and individually.  The specific attributes discussed during these interviews were skill levels, experience levels, fatigue levels, and decision-maker types.  A considerable portion of the attribute discussions revolved around the problem of differentiating skill from experience in watchstander performance (which is further discussed in Chapter IV as part of the design of watchstanders in the simulation).  These topics were further analyzed to determine varying levels of performance for each of the attributes (i.e., Basic, Experienced, and Expert skill levels), and a set of skill types was assigned to each watchstander.  To complement the skill types, estimates for the probability of success in the conduct of tasks associated with each watchstander’s skills were formulated, and maximum task times (based on the air-defense experts’ experiences) were assigned.
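A minimal sketch of how such expert-derived estimates might be encoded in Java appears below.  The numeric probabilities and times are placeholders for illustration only; the values actually used in the simulation came from the ATRC interviews and are documented later in the thesis.

// Illustrative mapping from a skill level to a probability of task success
// and a maximum task completion time.  Placeholder values only.
enum SkillLevel { BASIC, EXPERIENCED, EXPERT }

class SkillProfile {
    final double probabilityOfSuccess; // chance the task is performed correctly
    final int    maxTaskTimeSeconds;   // upper bound on time to complete the task

    SkillProfile(double p, int maxSeconds) {
        probabilityOfSuccess = p;
        maxTaskTimeSeconds = maxSeconds;
    }

    static SkillProfile forLevel(SkillLevel level) {
        switch (level) {
            case BASIC:       return new SkillProfile(0.70, 30); // placeholder values
            case EXPERIENCED: return new SkillProfile(0.85, 20);
            default:          return new SkillProfile(0.95, 10); // EXPERT
        }
    }
}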

3.         ADC Simulation Design

Utilizing the research gathered from the interviews and the formal scientific literature, the ADC Simulation was developed within a multi-agent system architecture.  The simulation has the following characteristics:

·                    Dynamic – The model represents a system as it changes over time.

·                    Stochastic – The model contains one or more random variables that influence the events in the simulation.

·                    Continuous-State Model – The state variables are continuous.

·                    Continuous-Time Model – The system state is defined at all times.

·                    Exogenous – The model describes activities and events in the environment that affect the system.

·                    Stable – Dynamic behavior of the model is independent of time.

·                    Closed model – All input is generated internal to the model.

The ADC Simulation was designed with the following features:

·                    Graphical User Interface - Displays the aircraft contacts in the battle group’s operational air space with the capability to interact with them to determine the CIC team’s assessment of their classification.

·                    Implements the following watchstanders:

·                    Force Tactical Action Officer (F-TAO)

·                    Force Anti-Air Warfare Coordinator (F-AAWC)

·                    Ship Tactical Action Officer (S-TAO)

·                    Ship Anti-Air Warfare Coordinator (S-AAWC)

·                    Radar Systems Controller (RSC)

·                    Electronic Warfare Control Officer (EWCO)

·                    Identification Supervisor (IDS)

·                    Tactical Information Coordinator (TIC)

·                    Combat Systems Coordinator (CSC)

·                    Missile Systems Supervisor (MSS)

·                    Red Crown Watchstander (RC)

·                    Implements for each watchstander the following attributes:

·                    Skill Types (various)

·                    Experience Level

·                    Fatigue Level

·                    Decision-maker Type (F-TAO, F-AAWC, S-TAO, S-AAWC)

·                    Simulates the following combat systems equipment:

·                    SPY-1B Radar System

·                    SLQ-32 Electronic Signal Detection System

·                    Link 11 (TADIL A) / Link 16 (TADIL J) System

·                    Identification Friend or Foe (IFF) System

·                    External Communications System

·                    Vertical Launching System (VLS) – Surface-to-Air Missiles

·                    Close-In Weapon System (CIWS)

 

·                    Implements the following external environment attributes:

·                    Scenario Weather Options

·                    Scenario Threat Level Options

·                    Scenario Contact Density (Numbers) Options

·                    Scenario Hostile Contact Level (Numbers) Options

·                    Implements an option to activate AEGIS doctrine (Auto-special).

·                    Implements the following log/data recording features:

·                    Overall Scenario Events Log (Major Events)

·                    Decision History Log for each Watchstander

·                    Readiness Log for each Combat Systems Equipment

·                    Performance Metric Log for each Watchstander

4.         Testing and Analysis of ADC Simulation and Conduct of Reality Survey

The final phase of the ADC Simulation Project consisted of the comprehensive testing and analysis of the ADC Simulation, followed by the assessment of the level of realism of the simulation via a survey given to the air-defense experts at the ATRC Detachment in San Diego.  As part of the testing of the simulation, parametric analysis was performed to gather data (the number of errors, the average times to complete tasks) for the following questions (an illustrative sketch of such a parameter sweep follows the list):

·                    For the RSC watchstander, what is the effect of varying the skill, experience, fatigue, and SPY-1B radar equipment readiness levels (singly) on individual watchstander and CIC team performance?

·                    For the EWCO watchstander, what is the effect of varying the skill, experience, fatigue, and SLQ-32 system equipment readiness levels (singly) on individual watchstander and CIC team performance?

·                    For the F-TAO watchstander, what is the effect of varying the skill, experience, and fatigue levels and decision-maker types (singly) on individual watchstander and CIC team performance?

·                    For the CIC team as a whole, what is the difference in performance between an expert but exhausted F-TAO leading a basic/newly qualified but fully rested CIC team, and a basic/newly qualified but fully rested F-TAO leading an expert but exhausted CIC team?

·                    What is the effect of varying the weather attributes on the CIC team performance?
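As referenced above, the following is a minimal sketch of the kind of parameter sweep used in such parametric analysis: one attribute is varied while the others are held fixed, repeated trials are run, and error counts are averaged.  The class names, trial counts, and probabilities are assumptions for illustration; the actual testing ran complete ADC scenarios rather than this simplified loop.

import java.util.Random;

// Illustrative parametric-analysis sketch; not the actual test harness.
public class ParametricSweep {
    enum SkillLevel { BASIC, EXPERIENCED, EXPERT }

    public static void main(String[] args) {
        Random rng = new Random(42);      // fixed seed for repeatable trials
        int trialsPerSetting = 25;
        int tasksPerTrial = 100;

        for (SkillLevel level : SkillLevel.values()) {
            double successProbability;    // placeholder values, not the thesis figures
            if (level == SkillLevel.BASIC)            successProbability = 0.70;
            else if (level == SkillLevel.EXPERIENCED) successProbability = 0.85;
            else                                      successProbability = 0.95;

            int totalErrors = 0;
            for (int trial = 0; trial < trialsPerSetting; trial++) {
                for (int task = 0; task < tasksPerTrial; task++) {
                    if (rng.nextDouble() > successProbability) totalErrors++;
                }
            }
            System.out.printf("%s: average errors per trial = %.1f%n",
                    level, (double) totalErrors / trialsPerSetting);
        }
    }
}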

Once this data was collected and analyzed, an ADC Simulation Realism Survey was created, which used the results of the testing to develop scenarios for the questions.  The questions in the survey were designed to elicit responses from the air-defense experts on the level of realism of the simulation based on their professional experiences.

C.        relevance of the adc simulation in training for the complex and challenging task of Air-Defense OPERATIONS in the modern era

1.         Situation of Concern

Air Warfare is the most rapid, intense, and devastating type of warfare that the U.S. Navy currently trains for, and battle group operations are primarily focused on gaining proficiency in this mission area.  Due to the fast-paced, uncertain, and dangerous nature of air warfare, the battle group commander’s Air-Defense Team (the AEGIS cruiser CIC team) must be trained extensively in the fundamental tenets of these operations in order to effectively protect the aircraft carrier, high-value units, and other naval ships in the vicinity.  Given the immense range of duties and responsibilities, there are multitudes of individual watchstation-specific and collective skill sets that must be mastered in order to effectively perform the ADC duties.

2.         Current Training Needs and the ADC Simulation

a.         Current Situation

The waterfront training teams (AEGIS Training and Readiness Center (ATRC) detachments) are charged by the fleet type commanders with providing Air-Defense Commander (ADC) training to the cruisers, and the quality of the training they provide is typically outstanding.  However, ADC operations are considerably complex, and the waterfront training teams are limited by the available training time as well as the scope of the training attempted.  Furthermore, the interactions (watchstander to watchstander, ship to ship, ship to aircraft, watchstander to equipment, etc.) that are part of daily operations are numerous, and potential ADC team performance deficiencies may not be noticed during the limited training periods.

b.         The Need for New Systems to Assist Training Teams

The limitations of human comprehension of ADC operations, due to the countless interactions, place a barrier on the level, type, and quality of training that can be accomplished.  Because there are many different variables to account for in these operations, the training teams and ships must rely only on their collective past experiences for producing effective training.  This limits the potential gain of the training, since the training teams and ships must formulate ADC scenarios for the future based on experiences from the past; it is simply too much for humans to analyze all of the variables involved.  This begs the question: how can the Navy design training that integrates smoothly with the current (and expected future) CIC team proficiency levels (skills, experience, equipment setup, etc.) to support and improve the training requirements for the ships and waterfront training commands?  This training would need to use the valuable experience of the ships and training commands to create scenarios that accurately simulate the enormous complexities inherent in ADC operations.  To surpass this limitation, both groups require a system that will enable them to build scenarios, based on the current skill and training levels of the ADC team as well as the environment they will face, to assist them in training toward more realistic threats.

c.         A Potential Solution

The ADC Simulation could provide a solution to the problems discussed above.  After an initial assessment of the training, experience, and equipment readiness levels of a specific ship, the initial settings for the ADC team and environment can be entered into the system.  Upon completion of the setup, the program allows the training teams (as well as the ships) to create simulations based upon the ship’s potential operational scenarios in order to discover performance deficiencies.  The training teams and ships would use the results from the simulation to provide more focused training on the areas where deficiencies were noted.  The ADC Simulation could also be useful to battle group staffs in the planning and development of battle group air-defense tactics and operations.  Also, the program can be employed to validate the usefulness of future scenarios intended for use in the training of the ships.  For the doctrine-formulation commands, this simulation would give them the opportunity to evaluate the validity of theoretical changes to ADC and AEGIS doctrine before implementing them in the fleet.

D.        brief history of naval AND battle group Air Defense

Modern battle group air defense is the collective effort by the naval warships and carrier air wing to protect, first, the aircraft carrier and other high-value units such as amphibious and supply ships and, second, the fleet’s warships from attack, disablement, and/or destruction by hostile air, naval, and shore forces.  Essentially, the primary focus of battle group air defense is the preservation of its assets to ensure the ability to project military power ashore in support of the United States’ strategic objectives.  It is primarily an intensive search, detection, and classification process to accurately determine and maintain positive identification of all aircraft and surface vessels within the battle group’s operational area.

The topic and problem of naval and battle group air defense was thrust upon the United States Navy in the early 1920s when General Billy Mitchell demonstrated the vulnerability of naval vessels to air power by sinking a battleship with bombs launched from his aircraft, radically altering the vision and conduct of warfare at sea.  As World War II would prove, the aircraft carrier, not the battleship, was the primary means by which nations (the United States foremost among them) would project military power onto foreign shores.  The aircraft carriers became the centerpiece of the United States’ strategy to drive back the Imperial Japanese Fleet, recapture its lost possessions, and achieve victory.  Recognizing the threat of the aircraft carrier, the Japanese quickly refocused their attacks from the battleships to the carriers, forcing a similar realignment in the thinking of the United States Navy.  Additionally, the technology of radar became a widely used and effective tool to organize the protection of the battle group.  The massive surface fleets of the Navy were now assigned another new primary task:  defend the aircraft carrier.

Early naval air defenses relied upon massive, uncoordinated fire from anti-aircraft artillery such as 20mm, 40mm, three-inch, and five-inch guns…Air defense was made up of a series of local anti-air battles fought close aboard, strictly in self defense.[1]

Towards the end of the war, the danger of the kamikazes led to another reorganization and innovation in battle group air defense known as defense-in-depth. 

Tactics evolved quickly, including tightly grouped defensive ship formations and picket ships for early warning.  Although primitive by current standards, the concept of effective, coordinated defense-in-depth took shape.[2]

Following World War II, the 1950s and 1960s ushered in rapid advances in offensive military technology, which required corresponding changes in defensive tactics to protect the battle group.  Foremost among these advances were the introduction of jet power and unmanned missiles, especially anti-ship missiles.

The advent of unmanned missiles and long-range Soviet bombers led the Navy to develop defensive weapons and enhance ship-to-ship coordination…In the 1950s, the Navy began deploying three guided SAM variants known as 3-T missiles:  long-range Talos, medium-range Terrier, and short-range Tartar.  Simultaneously, a large-scale program to convert previously non-missile ships to missile shooters was initiated with vessels capable of firing one of these missiles.[3]

However, the continual advances (speed, maneuverability, and accuracy) in offensive anti-ship missile technologies reached a point where, despite the capability of the defensive missiles to intercept, the human watchstander became the weak point in the overall air-defense system.  The watchstanders were unable to communicate, coordinate, and react quickly enough to defend against the most advanced and deadly of missile technologies.

Faster and more reliable means of surveillance and identification data exchange were required.  The Navy tactical data system (NTDS) was introduced in 1958, the world’s first shipboard tactical data system based on programmable computers.  This was an initial step in the integration of multi-ship systems in a force-wide air-defense system.[4]

Advances in the capabilities of NTDS would allow for quicker and more accurate transmission of critical air-defense data for battle group air defense.  Eventually, airborne early warning aircraft such as the E-2A Hawkeye (which is still in use as the upgraded E-2C variant) were deployed to increase the surveillance range of the battle group.  By the early 1980s, the long-term AEGIS project reached fruition, and the first cruisers carrying the powerful SPY-1 phased array radar, integrated into a potent command and control system, were introduced into the fleet.  “Introduced operationally in 1983, the heart of the AEGIS weapon system is the SPY-1 phased array radar, which provides automatic detection and fire control quality tracking for hundreds of targets simultaneously.”[5]  The AEGIS cruisers (Ticonderoga class), followed by the AEGIS destroyers (Arleigh Burke class), tremendously improved the Navy’s capability to perform battle group air defense and countered a multitude of previously dangerous anti-ship missile threats.  Since the AEGIS fleet’s arrival, the last twenty years have been marked by the steady advance/counter-advance of offensive versus defensive weapons.  The offensive strides in technology were characterized by greater increases in the speed and lethality of anti-ship missiles.  Similar measures were achieved in defensive missile systems, but one of the most significant advances in overall battle group air defense occurred with the development of Link 11, followed by its more effective follow-on, Link 16.  These battle group data exchange systems (descended from the original NTDS) markedly increased the capability of the battle group units to effectively coordinate their detection information and actions.

Battle Group Air Defense has continued to be a primary mission for the United States Navy.  In the late 1980s, two incidents highlighted the need for research into two fields of study to enhance naval performance: the psychology of decision-making under stress and human factors design.  The first incident involved the USS Stark, which was attacked by two Exocet anti-ship missiles and was nearly sunk.  The second incident occurred in 1988 and involved the USS Vincennes, which mistakenly shot down a civilian Iranian airliner during a surface battle with Iranian naval forces.  Caused by several factors involving CIC communications among watchstanders and exacerbated by CIC systems produced with poor human-computer interaction design, the Vincennes crew believed the ship was involved in a coordinated hostile air-sea battle and reacted accordingly.  Both incidents caused the United States Navy to reassess the importance of the human being in the entire air-defense process, a topic that had previously been relegated to a lower priority than technological advances.  Some of the most prominent research studies and projects that resulted from these incidents are discussed in Chapter II; the overall thrust of these documents asserts that the watchstander should always be at the forefront of the understanding of the performance of battle group air-defense operations.  The ADC Simulation was developed with this premise in mind.

E.         watchstander organization of a cruiser combat information center

1.         Overview of a CIC Organization

Onboard naval ships, the Combat Information Center (CIC) is the nexus of all of the ship’s tactical operations, and it is from this location that the commanding officer and watch teams coordinate these activities.  Often several different warfare operations are conducted simultaneously from the CIC, including Air Warfare (of which Battle Group Air Defense is a part), Surface Warfare, Undersea Warfare, and Strike Warfare.  Information is channeled into the CIC for review, analysis, and assessment by the appropriate warfare teams and, following the decisions by the commanding officer or Tactical Action Officer, the CIC team performs the required actions.  Displayed below (Figure 2) is the organizational diagram of the CIC air-defense team implemented in the ADC Simulation.  Dashed lines indicate indirect leadership control of the watch team members under the specified watchstander.

 

Figure 2.  CIC Air-Defense Organization.

 

The CIC contains a comprehensive assortment of equipment and combat systems to support the watch team, foremost among them the tactical systems consoles.  These consoles have a wide variety of uses, such as activating weapon systems (launching missiles, firing guns), configuring sensor systems (radar, IFF systems), displaying contact tracks (aircraft, ships, submarines, etc.), modifying/displaying this track information, and communicating externally with other ships and aircraft.  Additionally, an internal communications system allows the watchstanders to communicate with each other.

2.         Brief Description of the CIC Air-Defense Watchstanders

a.         Force Tactical Action Officer (F-TAO)

The Force Tactical Action Officer is in overall control of the air-defense operations for the battle group and is responsible for most of the major decisions.  Decisions made at lower levels can be overridden by the F-TAO, if deemed necessary.  Most importantly, the F-TAO makes the final decisions on contact classifications as well as weapon batteries release for ship and aircraft missile engagements.  The F-TAO and F-AAWC work very closely to coordinate the air-defense operations of the battle group.

b.         Force Anti-Air Warfare Coordinator (F-AAWC)

The Force Anti-Air Warfare Coordinator directly runs the air-defense identification process within the battle group’s surveillance airspace picture and is responsible to the F-TAO for the performance of this process.  The F-AAWC coordinates the movement and assignment of friendly aircraft via the external communication circuit to other ships and the Red Crown watchstation.  The F-AAWC can order repositioning of aircraft and orders for visual intercept/identification of unknown aircraft, but requires the authorization of the F-TAO to order an engagement of an aircraft with weapon systems.  Additionally, the F-AAWC via the F-TAO is responsible for the ordering of weapons employment to battle group air-defense ships. 

c.         Ship Tactical Action Officer (S-TAO)

The Ship Tactical Action Officer leads the CIC watch team for the ship and is responsible for most of the major decisions made during air-defense operations for the S-TAO’s ship only.  The Ship TAO is responsible for the defense of his or her ship and is authorized to employ weapon systems in its defense if a perceived threat of attack is imminent.  On ships not assigned the ADC duties, the S-TAO is in charge of the CIC watch team and works for the F-TAO on the ADC ship.

d.         Ship Anti-Air Warfare Coordinator  (S-AAWC)

The Ship Anti-Air Warfare Coordinator directs the aircraft detection and classification process within the ship’s airspace and is responsible to the Ship TAO for the performance of the team identification process.  Although subordinate to the Ship TAO, the S-AAWC receives a substantial amount of air-defense tasking from the F-AAWC, who is coordinating the overall battle group air-defense process.  Among other responsibilities, the S-AAWC controls the movement of friendly aircraft assigned to the ship and, upon proper authorization, employs the ship’s self-defense missile weapons system via the Missile Systems Supervisor.  On ships not assigned the ADC duties, the S-AAWC directs the CIC team in the performance of the air-defense duties.

e.         Electronic Warfare Control Officer (EWCO)

The Electronic Warfare Control Officer is responsible for the operation of the electronic emissions detection equipment, which is used to detect and classify various types of aircraft based on their radar signal emissions.  These radar signal emissions are one of the primary means by which the CIC air-defense team distinguishes friendly and neutral aircraft from potentially hostile/unfriendly aircraft.  Although the EWCO works directly for the Ship TAO, on the air-defense commander cruiser the Force TAO and Force AAWC also use this watchstander’s reports to assist them in their duties.

f.          Radar Systems Controller (RSC)

The Radar Systems Controller operates the SPY-1A/B radar system, which is the primary means by which aircraft are detected and tracked by the ship.  Often, radar detections are the first indications of the presence of an aircraft, and the initial kinematic data (course, speed, altitude, location) influence the initial assessment of the aircraft’s threat potential and priority for observation.  Although the RSC works directly for the Ship TAO, on the air-defense commander cruiser the Force TAO and Force AAWC also use this watchstander’s reports to assist them in their duties.

 

 

g.         Tactical Information Coordinator (TIC)

The Tactical Information Coordinator operates and maintains the Tactical Digital Information Link (TADIL) A/Link 11 and TADIL J/Link 16, which communicates tactical data among the friendly ships and aircraft in the battle group.  These interlinks allow the friendly units to possess an expanded view of the battle group’s airspace, increasing overall tactical situation awareness.  On the ADC cruiser, the TIC has more demanding duties and is responsible for the coordination and control of the entire battle group’s Link 11/16 picture.  The quality of this picture is of significant importance to the primary air-defense decision-makers (Force TAO, Force AAWC).

h.         Identification Supervisor (IDS)

The Identification Supervisor is primarily responsible for performing Identification Friend or Foe (IFF) system challenges on unknown aircraft and inputting the results of this information (and other relevant identification data) into the CIC track database (AEGIS Command and Display system) for viewing by other watchstanders.  Additionally, when directed, the IDS will initiate query and/or warning procedures against specified contacts via the external communications system.  The results of these challenges assist the primary decision-makers in the classification of aircraft contacts.

i.          Combat Systems Coordinator (CSC)

The Combat Systems Coordinator is in charge of the activation, monitoring, and deactivation of the primary and secondary combat systems that support the CIC.  Combat systems equipment degradations and failures are reported to the CSC for resolution and repair.  The CSC also is the primary lead for initiating troubleshooting procedures for certain combat systems equipment including the communication systems, Link 11/16 systems, and IFF systems.  Additionally, the CSC is directly responsible for the input, activation, and deactivation of AEGIS doctrine (weapons, IFF, and identification).

j.          Missile Systems Supervisor (MSS)

The Missile Systems Supervisor is directly responsible for the employment (firing) of the ship’s surface-to-air missiles and the self-defense Close-In Weapon System (CIWS).  The MSS works directly for the Ship AAWC and receives authorizations to activate weapon systems from that watchstander.

k.         Red Crown (RC)

The Red Crown watchstander is responsible for checking friendly aircraft (both launching from and returning to the aircraft carrier) to verify their identity and mission assignment.  These duties require the Red Crown to validate IFF code assignments and communicate with the aircraft directly.  After being cleared by Red Crown, the aircraft are allowed to proceed on their assigned mission or continue their approach to the carrier.

F.         application of multi-agent system technology in the adc simulation

The ADC Simulation watchstanders were implemented using multi-agent system (MAS) technology, where each watchstander was designed as an “agent.”  Within the context of this simulation, an agent is a component of software with the following characteristics:

·                    It is capable of acting in an environment.

·                    It can communicate directly with other agents.

·                    It is driven by a set of tendencies (in the form of individual objectives or of a satisfaction/survival function which it tries to optimize).

·                    It possesses resources of its own.

·                    It is capable of perceiving its environment (but to a limited extent).

·                    It has only a partial representation of this environment.

·                    It possesses skills and can offer services.

·                    Its behavior tends toward satisfying its objectives, taking account of the resources and skills available to it and depending on its perception, its representations and the communications it receives.[6]

Essentially, the watchstander agents in the ADC Simulation contain intent and objectives (to perform their assigned duties), communicate amongst each other to achieve those objectives, and possess resources (skill, experience, fatigue, and decision-maker type attributes as well as combat systems equipment).  They perceive their environment only to a limited extent, since each watchstander agent receives this information either via combat systems sensory equipment or through verbal communications (from other watchstander agents) or CIC watchstation information display systems.  The watchstander agents offer services to each other by disseminating information vital to their performance of the air-defense duties and operations of the CIC, and they exert influence within the environment (and on other agents) through their actions (i.e., the Force TAO classifying an aircraft as Hostile).
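To make the agent characteristics above concrete, the following is a minimal Java sketch of a watchstander agent's internal makeup; the class, field, and method names are hypothetical and are not the actual simulation's API.

// Illustrative sketch of a watchstander agent, following the characteristics
// listed above; names and representations are assumptions for this example.
class WatchstanderAgent {
    // Resources/attributes the agent possesses
    String skillLevel      = "EXPERIENCED";
    String experienceLevel = "EXPERIENCED";
    String fatigueLevel    = "RESTED";

    // Tendencies: objectives (pending tasks) the agent tries to satisfy
    final java.util.Deque<String> pendingTasks = new java.util.ArrayDeque<>();

    // Partial perception: the agent only sees what sensors or reports provide
    void perceive(String sensorReport) { pendingTasks.add(sensorReport); }

    // Direct communication with other agents
    void tell(WatchstanderAgent other, String report) { other.perceive(report); }

    // Behavior driven by objectives, resources, and perception
    void act() {
        String task = pendingTasks.poll();
        if (task != null) { /* evaluate, classify, or report, per skill and fatigue */ }
    }
}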

MAS technology is a blending of the cognitive/social sciences (psychology, ethology, sociology, philosophy), the natural sciences (ecology, biology), and the computer sciences, since multi-agent systems simultaneously model, explain, and simulate natural phenomena (in this case, human behavior in the ADC Simulation) and provide models for self-organization.[7]  Traditional programming is often very mechanistic, hierarchical, and modular and, consequently, does not lend itself well to simulating the often surprising (whether organized or chaotic) behavior of interactive human and environmental systems.  MAS technology, however, is less restrictive in its design, which produces simulation behavior often more akin to that observed in the real world.  The term “multi-agent system” is applied to a system comprising the following elements:

·                    An environment, E, that is, a space which generally has a volume.

·                    A set of objects, O.  These objects are situated; that is to say, it is possible at any given moment to associate any object with a position in E.  These objects are passive, that is, they can be perceived, created, destroyed and modified by the agents.

·                    An assembly of agents, A, which are specified objects (A ⊆ O) representing the active entities of the system.

·                    An assembly of relations, R, which link objects (and thus agents) to each other.

·                    An assembly of operations, Op, making it possible for the agents of A to perceive, produce, consume, transform and manipulate objects from O.

·                    Operations with the task of representing the application of these operations and the reaction of the world to this attempt at modification.[8]

In the ADC Simulation, the watchstander agents perform their duties within a layered environment (the Combat Information Center inside the AEGIS cruiser, within the battle group’s operational area) that contains a multitude of objects (aircraft contacts).  The watchstander agents have the capability to execute a set of operations to perceive the environment as well as the objects in it and to communicate with each other.  Conversely, the objects within the ADC Simulation environment can also perform operations to perceive and interact with the AEGIS cruiser (thereby affecting the watchstander agents) and the aircraft carrier.  These operations are governed by relationships that determine the scope and degree to which the operations can occur.  The diagram below (Figure 3) provides an overview of the implementation of MAS technology in the ADC Simulation.
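A minimal sketch of how the environment (E), objects (O), agents (A), and operations (Op) of the definition above might map onto Java types follows; all names are illustrative assumptions rather than the actual ADC Simulation classes.

import java.util.ArrayList;
import java.util.List;

// Hypothetical mapping of the MAS elements onto the ADC Simulation domain:
// E = the battle group airspace, O = aircraft contacts, A = watchstander
// agents, Op = perceive/classify operations.  Names are illustrative only.
class AircraftContact {                       // an object in O, situated in E
    double x, y, altitude, speed;
    String perceivedClassification = "UNKNOWN";
}

interface Watchstander {                      // an agent in A (A is a subset of O)
    void perceive(Airspace airspace);         // Op: limited perception of E and O
    void act(Airspace airspace);              // Op: modify objects, e.g. classify a track
}

class Airspace {                              // the environment E
    final List<AircraftContact> contacts = new ArrayList<>();
    final List<Watchstander> team = new ArrayList<>();

    void step() {                             // one pass of the scenario clock
        for (Watchstander w : team) { w.perceive(this); w.act(this); }
    }
}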

 

Figure 3.  ADC Simulation MAS Overview Diagram.

 

The integration of the agents, the assembly of relations among the agents and objects, and the assembly of operations among the agents and objects into an environment within the ADC Simulation produces a highly dynamic and realistic model of a challenging and complex task performed by humans.



THIS PAGE INTENTIONALLY LEFT BLANK

II.      related work in the area of naval Air-Defense simulation

A.        related work introduction

During the research for the design and development of the ADC Simulation, extensive resources were discovered relating to the subjects of air defense, human-computer interaction, cognitive modeling, team training systems, naval simulation, and threat assessment.  It was noted that most of the research into the topic of air defense was sanctioned or supported by the United States Navy or other affiliated organizations, and that a sharp increase in such research, notably in the areas of decision-making under stressful conditions and air-threat assessment, occurred starting in the late 1980s.  From reviewing the research, we determined that the likely cause of this surge was the USS Vincennes incident, which sparked an effort by the Navy to understand the underlying factors (mental, physical, and informational) affecting the crew’s performance during the shoot-down of the Iranian commercial airliner.  Some of this research resulted in follow-on projects by the Navy to develop better Combat Information Center (CIC) consoles for the watchstanders to improve performance.  Other related work originated in the commercial software development sector, where a multitude of video games that simulate naval operations have been produced.  The following papers, projects, systems, and programs were most relevant to the research, development, and implementation of the ADC Simulation:

·                    Area Air-Defense Commander (AADC) Battle Management System

·                    Tactical Decision Making Under Stress (TADMUS) Decision Support System

·                    Multi-Modal Watch Station (MMWS) Program

·                    Naval Air-Defense Threat Assessment: Cognitive Factors Model

·                    Air Threat Assessment:  Research, Model, and Display Guidelines

·                    Cognitive and Behavioral Task Implications for Three Dimensional Displays Used in Combat Information/Direction Centers

·                    Battle Force Tactical Training (BFTT) System

·                    Naval Combat Simulation Video Games

B.        Area Air-Defense commander (aadc) battle management system

The Area Air-Defense Commander (AADC) Battle Management System was developed by the Navy for more effective coordination of air-defense planning and execution for multi-service (i.e. Army, Air Force, Navy, & Marines) and coalition (international) operations, following the Gulf War.  The primary mission for an Area Air-Defense Commander is to develop and execute a theater-wide air-defense plan to support the strategic and operational plans of the Joint Forces Commander (JFC) during an operation.  Prior to the development of the AADC system, such air-defense planning could only be accomplished manually, a laborious task that “used to take 10 to 15 people hours or even days to generate air-defense plan.”[9]  To complicate matters, it was necessary to conduct numerous evaluations and war-gaming scenarios against the plan to evaluate its effectiveness.  However, this process was significantly limited by the bounds of human performance since only a minimum number of scenario variables could be modified and tested before the analysis became unwieldy and intractable.

Designed and developed by Johns Hopkins University’s Applied Physics Laboratory (APL), the AADC System had to attain two objectives.  First, it would “provide a single, integrated picture of the battle-space so that a joint commander can quickly gather data on air and missile attacks and defend against them.”[10]  This would greatly enhance the AADC’s ability to maintain an accurate view of the operational area.  Second, the AADC System would allow the air-defense staff to rapidly create, modify, and evaluate plans through the system’s automated features, which substantially reduced the time required for the process.

A number of complex issues surround planning and coordinating wide-area air defense…These variables represent hundreds of courses of action combinations that planners must consider…Every time you change a variable, you change the results…The AADC can repeatedly war game a plan against possible enemy attacks, running a complete scenario up to 25 times to verify results.[11]

Given this capability, the air-defense staff could now evaluate and analyze an air-defense plan, including the extraordinary number of possible variables in a scenario, with a greater level of confidence than previously possible since a larger number of potential outcomes could be explored.

The AADC System is similar to the ADC Simulation in two ways.  First, both programs are designed to improve the Navy’s ability to conduct air defense.  Second, the AADC System and ADC Simulation allow the users to modify variables in the programs to explore the potential outcomes that might result from those changes.  However, these programs differ significantly in their objectives and focus in that the AADC System is developed for theater-wide, strategic and operational planning by the AADC while the ADC Simulation concentrates on battle group air defense as well as the performance of the ADC watchstanders and is implemented using a multi-agent systems architecture. 

C.        tactical decision-making under stress (tadmus) DECISION SUPPORT SYSTEM

The Tactical Decision-Making Under Stress (TADMUS) study was one of the first comprehensive explorations into the causes of the USS Vincennes incident. 

The congressional investigation of this incident suggested that emotional stress may have played a role in contributing to this incident and the TADMUS program was established to assess how stress might affect decision making and what might be done to minimize those effects.[12]

The TADMUS study revealed that the Combat Information Center consoles and systems in use during the timeframe of the study (late 1980s to early 1990s) contained significant Human-Computer Interaction (HCI) flaws, which degraded watchstander performance under stressful conditions in high-contact-density littoral environments.  The direct result of these flaws was that

Teams exhibited periodic losses of situation awareness, often linked with limitations in human memory and shared attention capacity.  Environmental stressors such as time compression and highly ambiguous information increased decision biases.[13]

The following problems associated with short-term memory limitations were identified:

·                    Mixing up track numbers and forgetting track numbers.

·                    Mixing up track kinematic data and forgetting track kinematic data.

·                    Associating past track related events/actions with the wrong track and associating completed own-ship actions with wrong track.[14]

A second set of problems was categorized as decision-bias related and included the following:

·                    Carrying initial threat assessment throughout the scenario regardless of new information (framing error).

·                    Assessing a track based on information other than associated with the track (e.g., old intelligence data, past decision-maker experiences, etc.).[15]

All of these problems occurred during the USS Vincennes shoot-down of the Iranian commercial airliner, and the results of the study demonstrated the significant negative impact the HCI design of the CIC consoles had on the watchstanders.  Once the research and analysis of the current CIC systems was completed, the TADMUS program embarked on a second phase of research with the goal to develop improved CIC display consoles.  This system, known as the Decision Support System (DSS), had the following objectives:

·                    Minimize the mismatches between cognitive processes and the data available in the CIC to facilitate decision-making.

·                    Mitigate the shortcomings of current CIC displays in imposing high information-processing demands and exceeding the limitations of human memory.

·                    Display the data in the CIC in graphical rather than numeric representations wherever appropriate.[16]

The evaluation of the DSS component during training simulations determined that the new system greatly improved the overall performance of the air-defense teams, especially in the area of situational awareness.  Additionally, the watchstander participants rated the DSS component with a higher level of usability than existing CIC console displays.

The TADMUS and DSS research programs contained two issues relevant to the development of the ADC Simulation.  First, the TADMUS project was among the initial studies conducted to examine the cognitive processes of naval air-defense personnel during stressful situations.  The mental models proposed to explain the decision-making process of these personnel created a foundation for more detailed subsequent work.  Many of the general principles of watchstander cognition formulated from the TADMUS and DSS projects were incorporated into the design of the ADC Simulation.  Second, the TADMUS program identified two categories of cognitive errors (short-term memory limitations and decision bias) that occurred when watchstanders experienced stressful situations while performing air-defense operations.  These errors ultimately led to the loss of situational awareness by the watchstanders.  To replicate a reasonable level of realism in the program, the watchstanders in the ADC Simulation were designed so that generic approximations of the cognitive errors listed above occur (based on a random probability function) during scenarios.
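The following is a minimal Java sketch of how such probability-based injection of cognitive errors could be expressed; the class name and probability values are hypothetical placeholders, not the figures or classes used in the simulation.

import java.util.Random;

// Illustrative sketch of injecting TADMUS-style cognitive errors (short-term
// memory slips, decision bias) with a random probability, as described above.
class CognitiveErrorModel {
    private final Random rng = new Random();
    private final double memoryErrorProbability;   // e.g. mixing up track numbers
    private final double biasErrorProbability;     // e.g. retaining an initial assessment

    CognitiveErrorModel(double memoryP, double biasP) {
        memoryErrorProbability = memoryP;
        biasErrorProbability = biasP;
    }

    boolean memorySlipOccurs() { return rng.nextDouble() < memoryErrorProbability; }
    boolean biasErrorOccurs()  { return rng.nextDouble() < biasErrorProbability; }
}

In such a scheme, the probabilities could be raised as a watchstander's fatigue or task load increases, so that losses of situational awareness become more likely under stress, consistent with the TADMUS findings.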

D.        multi-modal watch station (mmws) program

The Multi-Modal Watch Station (MMWS) program was a four-year project focused on the development of specialized watchstation consoles that incorporated improved human-computer interface (HCI) designs to improve the performance of watch-teams during battle group air defense and land-attack warfare operations.

MMWS is a concept design for a future command and control decision support system intended to serve as a test prototype to develop human-computer interface (HCI) design recommendations for future Navy combat and command/control information systems.[17]

Diverging from past, traditional CIC console engineering processes, the MMWS program initially performed a detailed analysis of air-defense watchstanders’ behaviors, interactions, and processes to determine all of the requirements for their task lists and workload. 

Key requirements were identified related to the user tasks or workload and included mission, work management, communication, and HCI control.  User support concepts were developed and refined in relation to work management user tasks, which included the ability to assist the user in the selection of tasks and work strategies…Further, research in workload management led to the refinement of models to access and predict workload during real-time tactical operations.[18]

The comprehensive design phase was a significant departure from the process typically used by military contracting corporations because its primary focus was on developing a system that supported the user’s task and work requirements, as opposed to forcing the user to adapt his or her requirements to the system.  The MMWS project also advanced another substantial research initiative that could transform shipboard CIC console design.  Currently, the Navy contracts for the building of large hardware console systems, which results in an extraordinary cost to maintain the supporting infrastructure for contractors, initial installation, repair parts, and the training pipeline for technicians.  During the development of the MMWS, “… a Java version of the software was developed to test the feasibility of transition for the HCI components into a fielded naval software system.”[19]  If adopted as a standard for implementation in future naval ships, this would allow the Navy to divorce itself from investing in highly expensive, inflexible hardware systems and move towards a common computer display system that would run a software implementation of the older console systems.  Such a step could substantially reduce the cost (production, installation, maintenance, and training) of CIC console systems while also ensuring that upgrades would occur more frequently and at a reduced cost.

The MMWS consoles developed during the project implemented decision-aid user-support tools to increase usability and learnability and decrease the potential for information overload and errors.  The MMWS research conducted extensive interviews and console evaluations with air-defense subject matter experts, which resulted in several succeeding versions of the system.  At the conclusion of the project, the team showed a suite of MMWS consoles that corrected many of the HCI design problems inherent in the current set of AEGIS CIC consoles, which caused information overload, increased the likelihood of errors, and aggravated the potential for loss of situational awareness.  During a comprehensive system evaluation, the project team demonstrated the MMWS consoles could reduce the size of the typical air-defense team by 2-3 people while increasing their overall performance levels.

E.         naval Air-Defense thReat assessment:  cognitive factors model

Another investigation, referenced here, examined the cognitive aspects of the threat assessment process used by naval air-defense officers during battle group operations.  The research evaluated personnel during exercises and operations to determine the factors used in the decision-making involved in identifying and classifying air contacts.

Factors are the elements of data and information that are used to assess air contacts.  Traditionally, they are derived from kinematics, tactical, and other data.  Examples of such data include course, speed, IFF mode, and type of radar emitter.[20]

The research indicated the watchstanders mentally maintained a range of possible-track templates, derived from a set of twenty-two identifying factors, which they used to classify contacts and calculate threat assessments.  Some of the most promising factors are listed below:

·                    Electromagnetic Signal (ES) Emissions

·                    Course (with respect to the battle group)

·                    Speed

·                    Altitude

·                    Point of Origin

·                    Identification Friend or Foe (IFF) Modes 1,2,3, 4, C

·                    Flight Profile

·                    Intelligence Information

“Threat assessment is defined…as the process of evaluating aircraft that are flying in the vicinity of one’s ship, and determining how much of a threat they represent to the ship as well as the battle group.”[21]  A contact’s factors and other data were compared against relevant templates, and the template with the highest degree of fit was used to identify the air contact as well as to make a threat assessment.  The figure below illustrates the threat assessment process.
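As an illustration of the template-matching idea just described, the following Java sketch scores a contact's observed factors against each candidate template and selects the best fit.  The class names, factor names, and equal-weight scoring rule are assumptions made for this example; they are not taken from the cited study or from the ADC Simulation source.

import java.util.List;
import java.util.Map;

// Illustrative template-matching sketch: score a contact's factors against
// each track template and take the highest-fit template.  Placeholders only.
class TrackTemplate {
    final String label;                        // e.g. "COMMERCIAL AIR", "HOSTILE STRIKE"
    final Map<String, String> expectedFactors; // factor name -> expected value

    TrackTemplate(String label, Map<String, String> expectedFactors) {
        this.label = label;
        this.expectedFactors = expectedFactors;
    }

    // Degree of fit: fraction of expected factors matched by the observed contact
    double fit(Map<String, String> observedFactors) {
        if (expectedFactors.isEmpty()) return 0.0;
        int matches = 0;
        for (Map.Entry<String, String> e : expectedFactors.entrySet()) {
            if (e.getValue().equals(observedFactors.get(e.getKey()))) matches++;
        }
        return (double) matches / expectedFactors.size();
    }
}

class ThreatAssessor {
    static TrackTemplate bestMatch(List<TrackTemplate> templates,
                                   Map<String, String> observedFactors) {
        TrackTemplate best = null;
        double bestFit = -1.0;
        for (TrackTemplate t : templates) {
            double f = t.fit(observedFactors);
            if (f > bestFit) { bestFit = f; best = t; }
        }
        return best;                           // template with the highest degree of fit
    }
}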

Figure 4.  Cognitively Based Model of Threat Assessment[22].

 

This research and the ADC Simulation are consistent because the latter models the mental decision-making and threat-assessment processes of the Combat Information Center (CIC) watchstanders as part of their process of identifying and classifying air contacts.  During the initial stages of research gathering, we conducted in-depth interviews with several air-defense experts at the AEGIS Training & Readiness Center (ATRC) Detachment in San Diego, which focused on the above contact threat assessment and identification processes.  The results of our interviews confirm the previous work’s findings about factors, templates, and the contact threat assessment and identification processes.  A similar cognitive model, including the factors and templates, was implemented in the ADC Simulation for the air-defense decision-making of the CIC watchstander agents.

F.         Air threat assessment:  research, model, and display guidelines

The paper reviews several other studies, including the one cited in Section E, as part of an ongoing study into the practice of air threat assessment of contacts during battle group air-defense operations.

The studies provided a theoretical and applied basis for threat assessment by defining specific cue-data relationships and detailing the cognitive processes involved in air defense simulation assessment.  Those processes were incorporated into a proposed model of threat assessment that was successfully validated against threat ratings from experienced air defense decision makers.[23]

The studies addressed by the paper include the Tactical Decision Making Under Stress (TADMUS) program, the subsequent Decision Support System (DSS), the Basis For Assessment (BFA) tool, and several papers covering Naturalistic Decision Making (NDM).  The knowledge gained from these studies was used toward developing a new threat assessment model (displayed below) and the creation/update of guidelines for displaying contact threat assessment data.  The original TADMUS research led to a follow-on project, DSS, to implement specialized air-defense displays for watchstanders.  These displays were designed to enhance the performance of the air-defense personnel by providing them with critical data for decision-making while preventing information/screen overload.

 

Figure 5.  Threat Assessment Model[24].

Updated threat assessment interface guidelines were recommended:

·                    Display a threat assessment window on-screen when a track is hooked.

·                    Compute and display the threat ratings of tracks.

·                    Show threat rating history.

·                    Provide a list of all assessment cues.

·                    Order cues by importance to the decision maker.

·                    Show the impact of each cue on overall threat rating.

·                    Provide a track priority list.[25]

Using these recommendations, the authors produced a limited prototype for demonstration.

G.        cognitive and behavioral task implications for three dimensional displays used in combat information/direction centers

This paper reviews the behavioral and cognitive task analysis of the Joint Maritime Command Information System (JMCIS) for the purpose of determining whether the implementation of three-dimensional displays would be useful.  The objective of JMCIS is to produce a Common Tactical Picture (CTP) for the battle group or joint-force commander to ensure the maintenance of battlespace situational awareness.  As part of this objective, the CTP is designed to integrate the undersea warfare (USW), mine warfare (MW), Surface Warfare (SW), Air Warfare (AW), Amphibious Warfare (AMW), and C4ISR (Command, Control, Communications, Computers, Intelligence, Surveillance, and Reconnaissance) battlespace pictures into one comprehensive display.  During the analysis of the HCI of most combat system displays, this study cited many of the problems associated with poor display designs from the TADMUS project, including short-term memory limitations and loss of situational awareness.

The researchers identified situational awareness as the primary area of concern during the task analysis and selected the Applied Cognitive Task Analysis (ACTA) methodology (developed by Klein and Associates), which addresses the mental models of both novices and experts.

Situation awareness, as defined by Endsley, is a threefold process including (1) perception of the elements in the environment within a volume of time and space, (2) comprehension of the meaning, and (3) projection of status in the near future.  At the first cognitive level, the user detects the target cues or objects in the environment.  During the second cognitive level, the perceived information is processed and integrated into an assessment of the situation.  At the third cognitive level, new projected outcomes are formulated for the situation.[26]

The study further asserted that situational awareness was also affected by the following four factors:  (1) capabilities, (2) training and experience, (3) preconceptions and objectives, and (4) ongoing task workload.  Taking all of these factors into account, “as task workload and stress increase, decision-makers will often lose a ‘Big Picture’ awareness and focus on smaller elements.”[27]

The discussion of situational awareness in this study had a particular relevance to the development of the ADC Simulation.  The four factors affecting situational awareness mentioned above were incorporated into the design of the watchstander agents in the simulation.  Our interviews with the air-defense subject matter experts validated the conclusions the researchers made concerning situational awareness, especially the factors of training and experience with task workload.
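
One hedged way to fold these four factors into a watchstander agent is to compute a single situational-awareness score that rises with capability, training, and accurate preconceptions and falls as workload grows.  The Java sketch below illustrates this idea; the weights and scaling are assumptions made for illustration and are not the simulation's actual agent model.

// Illustrative sketch only: one plausible way to let capability, training and
// experience, preconceptions/objectives, and task workload shape an agent's
// situational awareness, echoing the "Big Picture" loss under high workload.
public class SituationalAwarenessSketch {

    double capability;      // 0..1, innate aptitude
    double training;        // 0..1, training and experience
    double preconception;   // 0..1, how well objectives/expectations match reality
    double workload;        // 0..1, current task workload and stress

    // Assumed relationship: awareness grows with the first three factors and
    // degrades as workload rises (result clamped to [0,1]).
    double situationalAwareness() {
        double base = 0.4 * capability + 0.4 * training + 0.2 * preconception;
        double degraded = base * (1.0 - 0.6 * workload);
        return Math.max(0.0, Math.min(1.0, degraded));
    }

    public static void main(String[] args) {
        SituationalAwarenessSketch agent = new SituationalAwarenessSketch();
        agent.capability = 0.8;
        agent.training = 0.9;
        agent.preconception = 0.7;
        agent.workload = 0.2;
        System.out.printf("Low workload SA:  %.2f%n", agent.situationalAwareness());
        agent.workload = 0.9;   // high stress: awareness of the big picture drops
        System.out.printf("High workload SA: %.2f%n", agent.situationalAwareness());
    }
}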

h.        battle force tactical training (bftt) SYSTEM

The Battle Force Tactical Training (BFTT) system was designed for fleet-wide training of naval units: it provides each ship with a comprehensive training system (using the existing CIC console architecture) run by a specialized computing system, and it is used primarily for air-defense training of the CIC team.

The BFTT system…provides Commanding Officers, the Afloat Training Organization (ATO) and Battle Group/Battle Force (BG/BF) commanders with the ability to conduct coordinated, realistic, high stress combat system training for developing war fighting proficiency and maintaining combat readiness.[28]

On the unit level, BFTT allows the ships to develop realistic training by designing high-fidelity scenarios which inject actual signal information into the ship combat systems to emulate reality.  On a battle force scale, the BFTT system can produce a synthetic theater where an entire fleet of ships and staffs (whether in-port or underway) can participate in a worldwide war-gaming exercise.  Additionally,

By leveraging the BFTT scenario generation environment, replay is familiar to the operator in terms of map appearance, controls, and track features.  It is also an extremely powerful learning tool, displaying both the ground truth and the perceived tracks from one or more exercise participants.[29]

Consequently, upon completion of the training scenario, individual units as well as the entire battle force can immediately conduct a review of the scenario events for each of the watch-teams and provide them with near-instant feedback on their overall performance.  If desired, specific portions of the scenario where a watchstander(s) made a mistake could be replayed so that the person could correct the deficiency. 

Although the objective of the BFTT system is considerably different from that of the ADC Simulation, the two systems share two aspects.  First, similar to BFTT, the ADC Simulation will allow the user to always see the ground truth for air contact identification along with the perceived identification of the aircraft.  Second, the ADC Simulation will maintain a record log of all of the actions, inputs and outputs, and events for each watchstander as well as for the entire scenario so that event reconstruction can be performed.
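
This record keeping amounts to appending time-stamped entries per watchstander and per scenario.  The Java fragment below is a minimal, hypothetical sketch of such a log and its replay; the class and method names are assumptions for illustration, not the simulation's actual logging code.

import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of per-scenario event logging that would support
// post-scenario reconstruction (e.g., tracing a chain of errors).
public class ScenarioEventLogSketch {

    static class Entry {
        final long simTimeSeconds;
        final String watchstander;
        final String event;

        Entry(long simTimeSeconds, String watchstander, String event) {
            this.simTimeSeconds = simTimeSeconds;
            this.watchstander = watchstander;
            this.event = event;
        }

        public String toString() {
            return String.format("[t=%05d] %-7s %s", simTimeSeconds, watchstander, event);
        }
    }

    private final List<Entry> entries = new ArrayList<Entry>();

    void log(long simTime, String watchstander, String event) {
        entries.add(new Entry(simTime, watchstander, event));
    }

    // Replay every logged entry in order, standing in for event reconstruction.
    void replay() {
        for (Entry e : entries) {
            System.out.println(e);
        }
    }

    public static void main(String[] args) {
        ScenarioEventLogSketch log = new ScenarioEventLogSketch();
        log.log(120, "IDS", "classified track 7012 as UNKNOWN (ground truth: HOSTILE)");
        log.log(185, "F-AAWC", "ordered query/warning on track 7012");
        log.replay();
    }
}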

i.          naval combat SIMULATION video games:  the precursor to modern-day air-defense simulations

A review of research and system development in naval air-defense simulations, past and present, would not be complete without an examination of the video game industry’s recreational software programs that attempt to model actual naval operations and combat.  During the 1980s and early 1990s, before the Navy began investing considerable funds in modeling and simulation, the software gaming industry was already producing programs that simulated naval combat in a variety of settings.  First-person “shooter” games like Unreal Tournament™, Quake™, and Medal of Honor™ have recently become templates for several military research projects into the creation of training programs for ground infantry troops.  Likewise, many of these naval games were designed with the assistance of former naval officers who served as advisers on the projects, and in some cases the games were so realistic that the military (due to the absence of equivalent programs) used them to train personnel.  The following games represent some of the most popular and realistic naval simulation games that contained a significant battle group air-defense component as part of the game engine.

1.         Strike Fleet:  The Naval Task Force Simulator™

Making its debut in 1987, Strike Fleet™ was one of the first and most successful video games that simulated naval battle group operations.  The game arrived on the scene during the height of tensions between the United States and Iran in the Arabian Gulf and included familiar scenarios like oil tanker escort (through the Strait of Hormuz) and patrol/combat operations against the Iranian Navy.  Another set of scenarios dealt with the British Navy during the Falklands Island conflict.  Strike Fleet™ also had an option which allowed the user to participate in a structured campaign (series of scenarios) against the Soviet Navy in the Atlantic Ocean and northern European theater.  The game introduced a unique game interface that allowed the user to concentrate on high level battle group operations during a scenario, or the player could control the individual combat performance of ships and helicopters.  This feature offered the players a considerable level of fidelity within the game, especially with respect to selection of task force size and composition (before the scenario commenced), control of the radar (range, active and passive modes), course, speed, and weapons employment (guns, short-range missiles, long-range missiles, torpedoes, CIWS, chaff, helicopter sonobuoys).  Rich with detailed features and challenging opponents (who employed realistic tactics), Strike Fleet™ provided the user with a realistic game that required careful strategic thinking to win.[30]

 

Figure 6.    Strike Fleet™ Video Game.

 

2.         Fifth Fleet™

Fifth Fleet™ was introduced in 1994 and immediately set a standard for the accurate depiction of naval operations and realistic game play for several reasons.  First, the designers developed a different type of engine for a naval simulation and implemented a turn-based game, similar to a number of strategic board games and role-playing games.  The movement of platforms (ships, aircraft, submarines) across a map divided into equal-sized hexagonal grid units was dictated by the speed of those platforms.  The game also more closely emulated actual naval operations by including such detailed features as smaller mission-oriented task groups, plans for coordinated air strikes and other aircraft missions, weather phenomena, and accurate logistical constraints (and consumption) which necessitated task force underway replenishment.  Second, Fifth Fleet™ was designed as if the player were the fleet commander; therefore, numerous features, including automation of control of individual units and task forces (through artificial-intelligence programming), freed the user from becoming mired in repetitive, time-consuming actions.  Third, the game offered the player a variety of realistic, mature, politically-charged scenarios, which occurred in the Arabian Gulf and Indian Ocean regions and involved military forces from nineteen different countries.

    

Figure 7.    Fifth Fleet™ Video Game.

 

Fifth Fleet™ contained an impressive breadth and depth of realistic platforms along with highly accurate representations of the weapon systems (missiles, torpedoes, guns, etc.).  The game differentiated among more than one hundred classes of surface ships and submarines and more than sixty types of aircraft.  Lastly, with the rapid expansion of Internet capability during the early-to-mid 1990s, Fifth Fleet™ introduced Internet game play for naval simulations, allowing users to move beyond simply playing against the computer and giving them the opportunity (and thrill) to challenge each other in the scenarios.[31]

3.         Harpoon:  Modern Naval Combat Simulation™ Series Video Games

The Harpoon™ Series (Harpoon 1-4™) games have been arguably the most popular games of the naval combat simulation genre, and they have spanned nearly fourteen years, with Harpoon 1™ published in 1989 and the most recent version (Harpoon 4™) arriving in March 2003.  Although the game received wide acceptance during its first incarnation, the second version, Harpoon 2™ (1994), became wildly popular (eventually becoming a video-gaming classic) when it introduced a level of realism never before seen nor since surpassed in a naval combat simulation.  The Harpoon™ game series engines were based on a realistic war-gaming and operational analysis model designed by the creator, Larry Bond, a former naval analyst and author.  The games featured exceptionally accurate representations of platforms, weather phenomena, weapon systems, geography, friendly and opponent tactics, as well as believable scenarios and campaigns based on current and future political and/or actual conflicts.

 

 

Figure 8.    Harpoon Series™ Video Games[32].

 

The Harpoon™ games approached the control of the units and fleet from the task-force commander level to allow the player to concentrate on the strategic and operational missions and capabilities of naval operations.  To facilitate this concept, the games employed a sophisticated artificial-intelligence engine (for both friendly and enemy combatants) to manage the behaviors and actions of those units realistically.  The most recent incarnation, Harpoon 4™, contains the following carefully detailed and accurately represented features:

·                    A detailed Order of Battle with over 1,000 ships, subs, and aircraft from the United States, Soviet Union, United Kingdom, Canada, France, Netherlands, Norway, Sweden, Belgium, Finland, Denmark, China, Australia, and Japan.

·                    A highly detailed map of the Northern European region created from satellite imagery.

·                    A map display with a variety of overlays, including weapons, sensor and fuel ranges, as well as bathymetric, weather, cloud cover, threat zone, and satellite data.

·                    A detailed tactical 3D environment where players can view their ships, aircraft, and submarines at critical events.

·                    An extensive database of units, weapons, and sensors.

·                    Accurate sensor and electronic countermeasures modeling.[33]

The game also includes capabilities for Internet online game-play against other people.  The Harpoon™ Series has been considered so accurate in its representation of modern naval operations that several nations’ militaries and military-affiliated organizations have used the game as part of their training, including the United States (United States Air Force Command and Staff College, U.S. Naval Institute), Australia (Australian Department of Defense), and Brazil (Brazilian Naval War College).[34] 

4.         Summary

As discussed above, the video game industry has produced some very realistic, robust, and comprehensive naval simulation games that, for many years, surpassed even some of the military’s best simulations.  Originally designed for entertainment purposes, many of these programs were developed with very accurate models of naval operations, platforms, tactics, environments, and weapon systems and have only grown more accurate over the years.  Consequently, the United States military has become one of the leading advocates of, and contractors for, military (and military-relevant) game simulations to train its personnel, especially since actual (live) training is usually extraordinarily expensive. 

It is precisely because of this mission that the US Military is the world’s largest spender on and user of Digital Game-Based Learning.  The military uses games to train soldiers, sailors, pilots, and tank drivers to master their expensive and sensitive equipment.  It uses games to teach mid-level officers…how to employ joint force military doctrine in battle and other situations.  It uses games to teach senior officers the art of strategy.[35]

The Harpoon™ series of games has attained a level of accuracy so close to actual naval operations that it has been used by the military.  These games have not been used extensively, however, because, despite their accurate operational models, their primary purpose remains entertainment.  Consequently, they lack many of the key features, such as comprehensive logging of events for future analysis (among many others), needed to make them suitable and attractive for widespread employment.  The capability to review the record following a simulation or training event to formulate lessons learned and discover potential areas for improvement is one of the paramount objectives for any type of training conducted by the military.  Because the games often exclude these features, their overall usefulness is limited. 

The ADC Simulation has much in common with these games because it attempts to simulate naval operations such as the air defense of the battle group.  Some of the look, feel, and interactivity of the program’s interface were adopted from the strategy games, as were the capabilities to structure the simulation environment before commencement and to modify the time compression/progression of the scenarios.  However, the ADC Simulation differs from the above video games because its overall objective is to train military personnel and provide insight into the performance of battle group air defense, with an eye towards understanding the mental processes of the watchstanders involved in order to gain experience (and lessons learned), not entertainment value. 

The wargamer [recreational user] wants a historically valid game, but also an enjoyable and entertaining experience; the military gamer wants a historically valid game, but both enjoyment and entertainment are secondary criteria…Generally, military games may be characterized by an extended learning period and an extending playing period – both of which combine to often prohibit the lessons learned because of time constraints.  Thus, certain commercial wargames can offer lessons to the military professional.  Such games offer playability, realistic lessons learned, and/or game aspects, which the military professional could adapt for his own games.[36]

We recognized that the inclusion of certain game-related features into the ADC Simulation would enhance the usability, playability, and satisfaction of the program experience and would improve the training-value of the utility.

j.         comparison AND contrast of the cruiser adc simulation program

Although there are many areas of commonality between the ADC Simulation and previous research, the simulation occupies a unique and relevant niche in the study and development of naval air defense for the following reasons, which support its usefulness:

·                    It focuses on the decision-making and other mental processes of the watchstanders as a function of the operational environment in which they operate.

·                    It examines the performance of battle group air defense by studying performance of the air-defense watchstanders.

·                    It delves into the role of the critical skills necessary for the performance of the watchstanders and explores the influence that a watchstander’s various proficiency levels have on the performance of the air-defense team.

·                    It uses the data from research and interviews with air-defense experts to implement the capability to select various proficiency levels, experience levels, fatigue levels, and type of decision-maker psychology for each of the watchstanders in the simulation.

·                    It allows the user to configure the external environmental attributes for the simulation (i.e. number of contacts, scenario threat level, weather, doctrine, probability and task time settings) to determine the effects of such changes on the performance of the watchstanders.

·                    It allows the user to configure the CIC equipment operational-readiness attributes to determine the effects of such changes on the performance of the watchstander.

·                    It allows the user to watch the performance of the air-defense team over an extended period of time (using time compression) so as to examine the positive actions and mistakes the watchstanders make concerning the identification of air contacts.  The program will display ground-truth information so the user can always compare the actual situation to the perceived situation of the watchstanders.

·                    It employs a Multi-Agent System architecture to simulate the watchstanders, which provides for a realistic reproduction of human behaviors within the simulation.

·                    It allows the user to record into log files all of the actions, inputs, and outputs of each watchstander during a scenario for later analysis and review for performance anomalies or searches for chain-of-errors for incorrect air-defense identifications or engagements.

k.        research questions posed for the cruiser adc simulation program

During the development of the ADC Simulation, we attempted to gain insight into the complex interactions and influences involved in air-defense operations to determine the degree to which individual watchstander performance (skill, experience, fatigue), equipment operational readiness, and the external environment (number of contacts, weather, etc.) affected the overall performance of the ADC team.  The effect of these factors on the performance of the individual watchstander was also explored.  The following questions were posed:

·                    What are the collective critical skills necessary for a CIC team to perform ADC duties/operations effectively?

·                    What are the individual critical skills sets necessary for the primary ADC personnel to perform their responsibilities effectively?

·                    How do you measure the collective proficiency and performance level of the ADC team?

·                    How do you measure the proficiency and performance level of an individual ADC watchstander?

·                    What are the effects (positive and negative) of one CIC watchstander’s performance on another watchstation?

·                    How does the decision-making type of the ADC team leadership (F-TAO, F-AAWC) affect the overall performance of the team?

·                    How does the external environment affect the collective performance of the ADC team and the performance of the individual watchstanders?

·                    What are the maximum effective performance limits of the ADC team, collectively and individually, when maximum external environmental stress is experienced?

·                    What influence or effect can degraded performance of critical air-defense equipment have on the performance of the ADC team, collectively and individually?

·                    What influence or effect can degraded human performance due to fatigue have on the performance of the ADC team, collectively and individually?

While conducting interviews at the AEGIS Training & Readiness Center (ATRC) Detachment in San Diego and collecting data from various sources, we gained insight into some of these questions.  Many of the other questions required the completion of the ADC Simulation before they could be answered, so that specific scenarios could be performed and parametric analysis conducted on the data/results.  Once this analysis was completed, a survey, consisting of scenarios based on the results of the ADC Simulation tests, was given to the air-defense experts to determine the simulation’s realism as compared to their professional air-defense experiences.

III.    User-Centered Design (UCD) process of the adc simulation human-computer interface (hci)

A.        need for utilization of user-centered design (ucd) process in developing computer program interfaces

Almost everyone in the Navy has a story to tell about a particular piece of hardware or a computer program that greatly frustrated them because it was difficult to use.  Despite the effectiveness or necessity of the equipment or software, usability problems that impeded the productivity of the user significantly hampered its utilization.  This situation seemed to reach its apex in the 1980s and early 1990s as technological innovations such as computers transformed the workplaces on naval ships, submarines, bases, and squadrons.  During this period, there were undoubtedly countless instances of systems with poor user interfaces that frequently translated into lost productivity and increased frustration for the “victims.”  However, the 1988 USS Vincennes incident, involving the engagement and shoot-down of an Iranian commercial airliner, highlights the negative impact that combat systems and programs with poor usability can have on an already dangerous and tense situation.  Without recounting the entire situation (other sources provide a comprehensive accounting) or trivializing the other major factors involved in the incident, the usability design of the Combat Information Center (CIC) consoles was considered to have contributed negatively to the processing and dissemination of vital information to the key watchstanders. 

Fortunately, starting in the mid-to-late 1990s, with the growth of the human-computer interface design community, the importance of engineering usability into combat and information systems has increased significantly. 

The last decade of research and practice in user interface design has [created] some good models for designing user interfaces.  Getting input from users early and continuously throughout the design process, using rapid prototyping and iterative design techniques, and conducting formal usability testing are now proven methods for assuring good user interfaces.[37]

The HCI community follows several principles when designing effective interfaces with good usability:

·                    Use Simple and Natural Dialogue

·                    Speak the User’s Language (user knowledge, level of understanding)

·                    Minimize User Memory Load

·                    Ensure Consistency throughout the Interface (improve learnability)

·                    Provide Feedback when Users Perform Actions (keep the user informed)

·                    Provide Useful and Visible Shortcuts to Improve Usability

·                    Provide Clear, Helpful Error Messages (plain language)

·                    Prevent User-Initiated System Errors by Careful Design of the Interface[38]

To ensure that usability, user satisfaction, and good productivity are attained in the program, the User-Centered Design (UCD) Process was used to develop the ADC Simulation interface.  The following six phases make up the UCD Process and will be discussed in greater detail in the next sections:

·                    Phase One:       Creation of the Problem Statement

·                    Phase Two:      Requirements Gathering

·                    Phase Three:     Conceptual Design of the ADC Simulation

·                    Phase Four:      Implementation of the ADC Simulation Interface

·                    Phase Five:       Usability Analysis of ADC Simulation Interface

·                    Phase Six:         Redesign/Modification of ADC Simulation Interface

B.        ucd process phase one:  problem statement

1.         Problem Statement

The goal of this multi-agent system is to develop an autonomous agent-based artificial intelligence simulation of an AEGIS cruiser performing Battle Group Air-Defense Commander duties.

2.         Activity/Utility to Users

The resultant simulation will be used to gain insight and understanding into the numerous factors that influence (positively or negatively) the effective performance of both the CIC ADC team collectively and watchstation personnel individually.  Additionally, the simulation will allow for the exploration of team and individual watchstation performance during abnormal or high intensity/stress situations to determine the role of skill proficiency levels in the effective execution of ADC duties.  Furthermore, this simulation will give naval war-fighters at the unit (ship) level the ability to experiment with various modifications to ADC tactical doctrine and organization to gain insight into the potential effects of those changes on CIC team performance before implementing them.  Lastly, this simulation will serve as a proof of concept for the usefulness of similar simulations in training ship personnel on various team-oriented missions/duties and CIC operations.

3.         Users

The potential users of this simulation will be training and doctrine-formulation commands, waterfront training teams, and individual combat units (ships).

4.         Criteria for Judgment

The primary criterion for judgment will be the usefulness of the simulation to the potential users.  This criterion includes the ease of setup, modification, and execution of the simulation for the desired output.

C.        ucd process phase two:  requirements gathering

1.         Needs Analysis

a.         Situation of Concern

Air Warfare is the most rapid, intense, and devastating type of warfare that the U.S. Navy currently trains for, and battle group operations are primarily focused on gaining proficiency in this mission area.  Due to the fast-paced, uncertain, and dangerous nature of air warfare, the battle group commander’s Air-Defense Team (the AEGIS cruiser CIC Team) must be trained extensively in the fundamental tenets of these operations to effectively protect the aircraft carrier, high-value units, and other naval ships in the vicinity.  Given the immense range of duties and responsibilities, there are multitudes of individual watchstation and collective skill sets that must be mastered to effectively perform the ADC duties. 

b.         Need/Utility of System

(1)        Current State.  The waterfront training teams (AEGIS Training and Readiness Center (ATRC) detachments) are charged by the fleet type commanders with providing Air-Defense Commander (ADC) training to the cruisers, and the quality of the training they provide is typically outstanding.  However, ADC operations are considerably complex, and the waterfront training teams are limited by the available training time as well as the scope of the training attempted.  Furthermore, the interactions (watchstander to watchstander, ship to ship, ship to aircraft, watchstander to equipment, etc.) that are part of daily operations are numerous, and potential ADC team performance deficiencies may not be noticed during the limited training periods. 
(2)        Need.  The limitations of human comprehension of ADC operations due to the countless interactions place a barrier on the level, type, and quality of training that can be accomplished.  Because there are many different variables to account for in these operations, the training teams and ships can only rely on their collective past experiences as the basis for producing effective training.  This limits the potential gain of the training because training teams and ships are formulating ADC scenarios for the future based on experiences from the past.  To surpass this limitation, both groups require a system that will enable them to build scenarios, based on the current skill and training levels of the ADC team as well as the environment they will face, that will allow them to train towards more realistic threats.
(3)        Solution.  The ADC Simulation will provide a solution to the problems discussed above.  After an initial assessment of the training, experience, and equipment readiness levels of a specific ship, the initial settings for the ADC team and environment can be entered into the system.  Upon completion of the setup, the program will allow the training teams (as well as the ships) to create simulations based upon the ship’s potential operational scenarios to discover performance deficiencies.  The training teams and ships will use the results from the simulation to provide more focused training in the areas where deficiencies were noted.  Also, the program can be employed to validate the usefulness of future scenarios intended for the training of the ships.  For the doctrine-formulation commands, this simulation will give them the opportunity to evaluate the validity of theoretical changes to ADC and AEGIS doctrine before implementing them in the fleet.

c.         Features of System

The ADC Simulation will allow for the observation and collection of data in three main categories:  individual watchstander performance, CIC team performance, and overall simulation performance.

Individual Watchstanders:

 

Determine the effect of varying…

 

·                    Skill levels on a single watchstander’s performance.

·                    Experience levels on a single watchstander’s performance.

·                    Type of decision-maker (F-TAO/F-AAWC) on a watchstander’s performance.

·                    Fatigue levels on a single watchstander’s performance.

·                    Equipment operational level on a single watchstander’s performance.

·                    Contact density on a single watchstander’s performance.

·                    Contact type (hostile, unknown, etc.) on a single watchstander’s performance.

·                    Atmospheric conditions on a single watchstander’s performance.

·                    Record watchstander decisions for post-simulation review (Log).

 

CIC Team:

 

Determine the effect of varying…

 

·                    Skill levels on collective CIC team performance.

·                    Experience levels on collective CIC team performance.

·                    Type of decision-maker (F-TAO/F-AAWC) on collective CIC team performance.

·                    Fatigue levels on collective CIC team performance.

·                    Equipment operational level on collective CIC team performance.

·                    Contact density on collective CIC team performance.

·                    Contact type (hostile, unknown, etc.) on collective CIC team performance.

·                    Atmospheric conditions on collective CIC team performance.

·                    AEGIS doctrine on CIC team performance.

·                    ADC Battle Doctrine on CIC team performance.

 

Simulation:

 

·                    Run simulations over user-determined period of time (time compression available).

·                    Allow user to view CIC team’s contact ID process (and engagement process if applicable).

·                    Allow users to see errors made by CIC team as they happen.

·                    Allow users to interact with the CIC team agents to view current decision logs and modify various attributes.

 

2.         User Analysis

a.         Utility of the Simulation

Since this program is designed to simulate ADC operations, the pool of users will probably be restricted to the following three groups:  AEGIS waterfront training commands (ATRC detachments), AEGIS ships (ADC personnel), and AEGIS/ADC doctrine-formulation commands.  For these users, the simulation offers two significant benefits:

·                    The training commands and ships will employ the simulation to provide some foresight into the future performance of shipboard watch-teams under various scenarios.  The information/results gained from running these simulations will assist them in providing more focused and effective training for these watch-teams. 

·                    The doctrine formulation commands will use the simulation to conduct evaluations on potentially new/theoretical AEGIS and ADC doctrine changes to provide some data on the performance of those modifications.  This data could then be analyzed and reviewed before moving to the field-testing phase of the implementation.

b.         Collective Team Skills and Experience Required (User Characteristics)

Although a single user highly experienced in ADC operations could effectively use the simulation, it is more likely that a team of users representing the various skills and watchstation backgrounds will be employed to initially set up and use the program.  The following is a list of the qualifications, skills, and experience a team that plans on using the simulation should possess:

 

·                    Naval officers with 5 or more years of fleet experience

·                    Senior enlisted personnel (E-6 and above) with 10 or more years of experience

·                    All personnel familiar with Battle Group ADC operations

·                    Personnel familiar with the performance/conduct of the following watchstations and their requisite skills:  F-TAO, F-AAWC, S-TAO, S-AAWC, Red Crown, TIC, IDS, RSC, CSC, MSS, EWCO

·                    Personnel familiar with carrier launch & recovery air operations

·                    Personnel familiar with aircraft, flight intercept, and control operations

·                    Personnel familiar with AEGIS Core Tactical Doctrine

·                    Personnel who understand the basic operation of personal computers including Windows programs

c.         Frequency of Simulation Use

The simulation program usage will probably vary depending upon where a ship is in the training/work-up cycle.  If it is somewhere in the middle of the training cycle, it will probably be used (by the waterfront training teams and ships) fairly often (3-5 times a week) to provide information to guide the ship’s training plan.  However, if the ship has completed the training cycle and is deployed, it may be used less frequently (1-2 times a month).

3.         Task Analysis

During the preliminary design of the ADC Simulation interface, four primary tasks were identified along with several associated subtasks for each task.

 

Primary Task 1: Input Watchstander Attributes:

      Subtask 1.A:  Set Skill levels

Subtask 1.B:  Set Experience levels

Subtask 1.C:  Set Fatigue levels

Subtask 1.D:  Set Decision-maker Type levels

 

Primary Task 2:  Input Equipment Setup:

      Subtask 2.A:  Set Equipment Readiness levels

      Subtask 2.B:  Input Equipment Setup (Radar, Data Links)

 

Primary Task 3:  Input Scenario Setup

      Subtask 3.A:  Set Atmospheric Conditions

      Subtask 3.B:  Set Contact Density

      Subtask 3.C:  Set Scenario Threat level

 

Primary Task 4:  Input Doctrine Setup

Subtask 4.A:  Set ADC Battle Doctrine

Subtask 4.B:  Set AEGIS Doctrine

 

Although many of these subtasks are listed individually, it is very likely that upon implementation some of them will be consolidated into one interface window.  For example, the subtasks in Primary Task #1 could be combined into one input window for each watchstander to simplify the interface (increase the ease-of-use) for the user.
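
As an illustration of that consolidation, the Swing fragment below sketches one hypothetical input panel gathering all four Primary Task 1 subtasks (skill, experience, fatigue, decision-maker type) for a single watchstander; the widget layout and labels are assumptions, not the ADC Simulation's actual interface code.

import javax.swing.*;
import java.awt.GridLayout;

// Hypothetical sketch of a single input window combining the Primary Task 1
// subtasks (skill, experience, fatigue, decision-maker type) for one
// watchstander; not the ADC Simulation's actual interface code.
public class WatchstanderSetupPanelSketch {

    public static void main(String[] args) {
        String[] skill = {"Expert", "Experienced", "Basic"};
        String[] experience = {"Expert", "Experienced", "Newly Qualified"};
        String[] fatigue = {"Rested/Alert", "Tired", "Exhausted"};
        String[] decisionType = {"Aggressive", "Balanced", "Reserved"};

        JPanel panel = new JPanel(new GridLayout(4, 2, 4, 4));
        panel.add(new JLabel("Skill level:"));
        panel.add(new JComboBox<String>(skill));
        panel.add(new JLabel("Experience level:"));
        panel.add(new JComboBox<String>(experience));
        panel.add(new JLabel("Fatigue level:"));
        panel.add(new JComboBox<String>(fatigue));
        panel.add(new JLabel("Decision-maker type:"));
        panel.add(new JComboBox<String>(decisionType));

        // One dialog per watchstander keeps all Task 1 subtasks in a single window.
        JOptionPane.showConfirmDialog(null, panel, "F-TAO Attributes",
                JOptionPane.OK_CANCEL_OPTION, JOptionPane.PLAIN_MESSAGE);
    }
}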

d.        ucd process phase three:  conceptual design of adc simulation program

1.         Conceptual Design Introduction

Phase Three of the UCD Process commenced the actual definition and categorization of the critical components that comprise the ADC Simulation and was completed in the following four steps.  First, the team conducted comprehensive interviews with experienced air-defense Subject Matter Experts (SME) from the AEGIS Training and Readiness Center (ATRC) Detachment in San Diego, California and the Fleet Technical Support Center Pacific (FTSCPAC) to collect data about battle group air-defense operations onboard an AEGIS cruiser.  These personnel possessed between five and fifteen years of naval air-defense experience, and all of them were considered experts in this field.  The interviews covered the following topics:

·                    Air-Defense Identification & Threat Assessment Process

·                    Battle Group Air Defense & Aircraft Operations

·                    Collective Skills Required for Effective ADC Team Performance

·                    Individual Watchstander Skills Required for Effective Performance

·                    Differences Between Skill & Experience

·                    Measures of Effectiveness (Successful Task Performance) of Skill

·                    Measures of Effectiveness of Experience

·                    Effect of Fatigue on Individual Watchstander Performance

·                    Effect of Fatigue Collectively on ADC Team Performance

·                    Effect of Individual Watchstander Performance on ADC Team Performance

·                    Effect of CIC Equipment Readiness on Individual and ADC Team Performance

·                    Effect of External Environment on Individual and ADC Team Performance

·                    Classification of Different Types of Decision-makers for F-TAO & F-AAWC

·                    Effect of Different Types of Decision-makers on ADC Team Performance

·                    Effect of Watchstander Mistakes on ADC Team Performance

·                    Effect of Different Levels of Individual Watchstander Skill & Experience Proficiencies on ADC Team Performance

·                    Classifying Different Levels of Skill, Experience, and Fatigue

Following this research collection effort, the Subject Matter Experts’ data was analyzed and used to develop the conceptual foundation and structure for the design of the simulation.

Second, the fundamental components of the simulation were determined, which in this case were the agents (watchstanders) and the objects (various items in both the interface and the simulation itself).  Upon completion of this step, the attributes of the agents and the objects were ascertained and listed.  Third, the relationships between each agent and the other agents and objects in the simulation were explicitly defined, and the same task was performed for each object.  Lastly, utilizing the information from these relationship definitions, all of the actions (for each agent and object) were defined.  When this process was finished, the team had generated a well-defined, comprehensive, high-level view of the interrelationships, interactions, and processes that would occur in the ADC Simulation, which simplified the development of the actual prototype discussed in Section D, UCD Process Phase Four, and helped to reduce a number of potential user-interface errors.  Following is the conceptual design of the simulation that was used to produce the first program interface (Section E).  After the Phase Five Usability Analysis was completed, minor adjustments were made to the conceptual design, which are reflected in the subsequent program interface displayed in Section G.

2.         Conceptual Design

a.         Agents

·                    Force Tactical Action Officer (F-TAO)

·                    Ship Tactical Action Officer (TAO)

·                    Force Anti-Air Warfare Coordinator (F-AAWC)

·                    Ship Anti-Air Warfare Coordinator (AAWC)

·                    Combat Systems Coordinator (CSC)

·                    Radar Systems Coordinator (RSC)

·                    Missile Systems Supervisor (MSS)

·                    Red Crown Watchstander (RC)

·                    Electronic Warfare Control Officer (EWCO)

·                    Tactical Information Coordinator (TIC)

·                    Identification Supervisor (IDS)

b.         Objects

·                    Simulation Scenario

·                    Simulation Interface:  Shortcut Control Buttons

·                    Simulation Interface:  Tactical Display

·                    Simulation Interface:  Tactical Display Contact Icons  (Air, Surface)

·                    Simulation Interface:  Contact Display

·                    Simulation Interface:  CIC Agent Display

·                    Simulation Interface:  CIC Agent Display Icons

·                    Simulation Interface:  Agent Attributes Display

·                    Simulation Interface:  Menu Bar

·                    Simulation Interface:  CIC Equipment Display Icons

·                    Simulation Interface:  CIC Equipment Pop-up Menu

·                    Simulation Interface:  Contact Pop-up Menu

·                    CIC Equipment (various types)

·                    Simulation Interface:  Agent Pop-up Menu

·                    Agent Decision History Log (one for each agent)

·                    Equipment Status Log (one for each piece of equipment)

·                    Scenario Event Log (one for each scenario executed)

·                    Contacts (Air, Surface)

 

 

 

 

 

c.         Necessary Attributes of Agents

Attribute Sets                        Proficiency Levels for Attributes
Type of Decision-Maker                Aggressive, Balanced, Reserved
Situation Assessment                  Expert, Experienced, Basic
Tactical Situation Maintenance        Expert, Experienced, Basic
Communication                         Expert, Experienced, Basic
Information Management                Expert, Experienced, Basic
AD Battle Doctrine                    Expert, Experienced, Basic
Combat Leadership                     Expert, Experienced, Basic
Platform Knowledge                    Expert, Experienced, Basic
Watch Experience Level                Expert, Experienced, Newly Qualified
Fatigue Level                         Rested/Alert, Tired, Exhausted

Table 1.    F-TAO.

 

Attribute Sets                        Proficiency Levels for Attributes
Situation Assessment                  Expert, Experienced, Basic
Tactical Situation Maintenance        Expert, Experienced, Basic
Communication                         Expert, Experienced, Basic
Information Management                Expert, Experienced, Basic
AD Battle Doctrine                    Expert, Experienced, Basic
Combat Leadership                     Expert, Experienced, Basic
Platform Knowledge                    Expert, Experienced, Basic
Watch Experience Level                Expert, Experienced, Newly Qualified
Fatigue Level                         Rested/Alert, Tired, Exhausted

Table 2.    TAO.

 

Attribute Sets                        Proficiency Levels for Attributes
Type of Decision-Maker                Aggressive, Balanced, Reserved
Situation Assessment                  Expert, Experienced, Basic
Tactical Situation Maintenance        Expert, Experienced, Basic
Communication                         Expert, Experienced, Basic
Information Management                Expert, Experienced, Basic
AD Battle Doctrine                    Expert, Experienced, Basic
Combat Leadership                     Expert, Experienced, Basic
Platform Knowledge                    Expert, Experienced, Basic
Watch Experience Level                Expert, Experienced, Newly Qualified
Fatigue Level                         Rested/Alert, Tired, Exhausted

Table 3.    F-AAWC.

 

Attribute Sets                        Proficiency Levels for Attributes
Situation Assessment                  Expert, Experienced, Basic
Tactical Situation Maintenance        Expert, Experienced, Basic
Communication                         Expert, Experienced, Basic
Information Management                Expert, Experienced, Basic
AD Battle Doctrine                    Expert, Experienced, Basic
Combat Leadership                     Expert, Experienced, Basic
Platform Knowledge                    Expert, Experienced, Basic
Watch Experience Level                Expert, Experienced, Newly Qualified
Fatigue Level                         Rested/Alert, Tired, Exhausted

Table 4.    AAWC.

 

Attribute Sets                        Proficiency Levels for Attributes
Situation Assessment                  Expert, Experienced, Basic
Tactical Situation Maintenance        Expert, Experienced, Basic
Communication                         Expert, Experienced, Basic
Information Management                Expert, Experienced, Basic
Systems Troubleshooting               Expert, Experienced, Basic
AEGIS Doctrine Employment             Expert, Experienced, Basic
Platform Knowledge                    Expert, Experienced, Basic
Watch Experience Level                Expert, Experienced, Newly Qualified
Fatigue Level                         Rested/Alert, Tired, Exhausted

Table 5.    CSC.

 

Attribute Sets                        Proficiency Levels for Attributes
Situation Assessment                  Expert, Experienced, Basic
Tactical Situation Maintenance        Expert, Experienced, Basic
Radar EM Fundamentals                 Expert, Experienced, Basic
Atmospheric/Environmental             Expert, Experienced, Basic
Radar Sensitivity Calibration         Expert, Experienced, Basic
Radar Power Level Calibration         Expert, Experienced, Basic
Radar System Troubleshooting          Expert, Experienced, Basic
Communication                         Expert, Experienced, Basic
Radar Jamming Evaluation              Expert, Experienced, Basic
Radar Land/Sea Interface Cal.         Expert, Experienced, Basic
AEGIS Core Doctrine                   Expert, Experienced, Basic
Watch Experience Level                Expert, Experienced, Newly Qualified
Fatigue Level                         Rested/Alert, Tired, Exhausted

Table 6.    RSC.

 

 

 

 

 

 

Attribute Sets                        Proficiency Levels for Attributes
Missile Systems Employment            Expert, Experienced, Basic
Situation Assessment                  Expert, Experienced, Basic
CIWS Employment                       Expert, Experienced, Basic
Missile/CIWS Troubleshooting          Expert, Experienced, Basic
Communication                         Expert, Experienced, Basic
Watch Experience Level                Expert, Experienced, Newly Qualified
Fatigue Level                         Rested/Alert, Tired, Exhausted

Table 7.    MSS.

 

Attribute Sets                        Proficiency Levels for Attributes
Communication                         Expert, Experienced, Basic
Aircraft Control                      Expert, Experienced, Basic
Carrier Operations                    Expert, Experienced, Basic
IFF System Operation                  Expert, Experienced, Basic
Watch Experience Level                Expert, Experienced, Newly Qualified
Fatigue Level                         Rested/Alert, Tired, Exhausted

Table 8.    Red Crown.

 

Attribute Sets                        Proficiency Levels for Attributes
Situation Assessment                  Expert, Experienced, Basic
Tactical Situation Maintenance        Expert, Experienced, Basic
Radar EM Fundamentals                 Expert, Experienced, Basic
Atmospheric/Environmental             Expert, Experienced, Basic
ES Equipment Operation                Expert, Experienced, Basic
ES Analysis/Classification            Expert, Experienced, Basic
Equipment Troubleshooting             Expert, Experienced, Basic
Communications                        Expert, Experienced, Basic
Watch Experience Level                Expert, Experienced, Newly Qualified
Fatigue Level                         Rested/Alert, Tired, Exhausted

Table 9.    EWCO.

 

Attribute Sets                        Proficiency Levels for Attributes
Link Equipment Operation              Expert, Experienced, Basic
B.G. Link Equip Knowledge             Expert, Experienced, Basic
Link Communication                    Expert, Experienced, Basic
Link Coordination                     Expert, Experienced, Basic
Link Resolution                       Expert, Experienced, Basic
Watch Experience Level                Expert, Experienced, Newly Qualified
Fatigue Level                         Rested/Alert, Tired, Exhausted

Table 10.    TIC.

Attribute Sets                        Proficiency Levels for Attributes
Information Input                     Expert, Experienced, Basic
IFF Challenge                         Expert, Experienced, Basic
Query & Warning Evaluation            Expert, Experienced, Basic
Communications                        Expert, Experienced, Basic
Watch Experience Level                Expert, Experienced, Newly Qualified
Fatigue Level                         Rested/Alert, Tired, Exhausted

Table 11.    IDS.
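
The tables above share a small set of discrete levels, which suggests a straightforward encoding in the simulation's implementation language.  The Java sketch below illustrates one such encoding; the class and field names are assumptions for illustration rather than the program's actual design.

import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative sketch of how the watchstander attribute tables might be
// encoded; the class and field names are assumptions, not the ADC
// Simulation's actual design.
public class WatchstanderAttributesSketch {

    enum Proficiency { EXPERT, EXPERIENCED, BASIC }
    enum Experience  { EXPERT, EXPERIENCED, NEWLY_QUALIFIED }
    enum Fatigue     { RESTED_ALERT, TIRED, EXHAUSTED }
    enum DecisionMakerType { AGGRESSIVE, BALANCED, RESERVED }   // F-TAO / F-AAWC only

    static class Watchstander {
        final String station;
        final Map<String, Proficiency> skills = new LinkedHashMap<String, Proficiency>();
        Experience experience = Experience.NEWLY_QUALIFIED;
        Fatigue fatigue = Fatigue.RESTED_ALERT;
        DecisionMakerType decisionType;  // null for stations without one

        Watchstander(String station) { this.station = station; }
    }

    public static void main(String[] args) {
        Watchstander ftao = new Watchstander("F-TAO");
        ftao.decisionType = DecisionMakerType.BALANCED;
        ftao.skills.put("Situation Assessment", Proficiency.EXPERT);
        ftao.skills.put("AD Battle Doctrine", Proficiency.EXPERIENCED);
        ftao.experience = Experience.EXPERIENCED;
        ftao.fatigue = Fatigue.TIRED;
        System.out.println(ftao.station + " skills: " + ftao.skills);
    }
}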

 

d.         Necessary Attributes of Objects

(1)        Simulation Scenario.

·                    Atmospheric Conditions (weather conditions, temperature)

·                    Contact Density

·                    Scenario Threat Level

·                    Contact Arrival Rate

·                    Hostile/Unknown Contact Aggressiveness Level

·                    AEGIS Doctrine

·                    Air-Defense Doctrine

 

(2)        Simulation Interface:  Shortcut Control Buttons Display.

·                    Start/Continue Simulation Button

·                    Pause Simulation Button

·                    Stop Simulation Button

·                    Increase Time Compression

·                    Decrease Time Compression

 

(3)        Simulation Interface:  Tactical Display.

·                    Air, Surface Contacts (clickable)

 

(4)        Simulation Interface:  Tactical Display Contact Icons.

·                    Contact Attributes (specific to the contact)

 

 

 

 

 

(5)        Simulation Interface:  Contact Data Display.

·                    Contact Data Display Window

·                    Data for this display via left mouse button click on a Tactical Display Contact Icon

 

(6)        Simulation Interface:  CIC Agent Display

·                    CIC Agent Icons (clickable)

·                    CIC Equipment Icons (clickable)

 

(7)        Simulation Interface:  CIC Agent Display Icons (Agents).

·                    Agent Attributes (specific to the agent)

 

(8)        Simulation Interface:  Agent Attributes Display.

·                    Agent Attribute Display Window

·                    Data for this display via left mouse button click on an Agent Display Icon

 

(9)        Simulation Interface:  Menu Bar

·                    Scenario Utilities

·                    Watchstander Attributes

·                    CIC Equipment Setup

·                    Scenario External Attributes

·                    Doctrine Setup

 

(10)      Simulation Interface:  CIC Equipment Display Icons (Equipment)

 

·                    Equipment Status (specific to the equipment)

 

(11)      Simulation Interface:  Agent Pop-up Menu (mouse right button click)

 

·                    Display Agent Decision History Log

·                    Modify Agent Attributes

 

 

(12)      Simulation Interface:  CIC Equipment Pop-up Menu (mouse right button click)

 

·                    Display CIC Equipment History Log

·                    Modify CIC Equipment Attributes/Status

 

(13)      Simulation Interface:  Contact Pop-up Menu (mouse right button click)

 

·                    Modify Contact Type/Attributes

 

(14)      CIC Equipment (various types)

·                    Types of Equipment

·                    SPY-1B Radar

·                    Link 16 Tactical Data System

·                    Link 11 Tactical Data System

·                    SLQ-32 System, OJ-451 CIC Consoles

·                    Readiness Levels (for each type)

·                    Fully Operational

·                    Partially Degraded

·                    Severely Degraded

·                    Non-operational
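
A minimal sketch of how the equipment types and readiness levels listed above might be represented, and how readiness could scale the quality of the data a piece of equipment provides, follows; the names and the scaling factors are assumptions for illustration.

// Illustrative sketch of CIC equipment readiness states implied by the list
// above; the class and method names, and the scaling values, are assumptions.
public class CicEquipmentSketch {

    enum Readiness { FULLY_OPERATIONAL, PARTIALLY_DEGRADED, SEVERELY_DEGRADED, NON_OPERATIONAL }

    static class Equipment {
        final String name;
        Readiness readiness = Readiness.FULLY_OPERATIONAL;

        Equipment(String name) { this.name = name; }

        // Assumed scaling: degraded equipment reduces the quality of the data
        // it feeds to the watchstander agents.
        double effectiveness() {
            switch (readiness) {
                case FULLY_OPERATIONAL:  return 1.0;
                case PARTIALLY_DEGRADED: return 0.7;
                case SEVERELY_DEGRADED:  return 0.3;
                default:                 return 0.0;
            }
        }
    }

    public static void main(String[] args) {
        Equipment spy = new Equipment("SPY-1B Radar");
        spy.readiness = Readiness.PARTIALLY_DEGRADED;
        System.out.println(spy.name + " effectiveness: " + spy.effectiveness());
    }
}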

 

(15)      Agent Decision History Log (one for each agent)

·                    History Log

 

(16)      Equipment Status Log

·                    Status/History Log

 

(17)      Scenario Event Log

·                    Log of major events in scenario

 

 

 

 

 

 

(18)      Contacts

·                    ***CIC-perceived/assigned Data***

·                    Contact # - The simulation assigned index number for the contact

·                    Track # - The CIC/Agent assigned index number

·                    Classification - Hostile, Suspect, Unknown, Neutral, Friendly

·                    Speed - Measured in Nautical miles per hour

·                    Course - Measured in degrees true (0-359)

·                    Bearing - Measured in degrees true (0-359)

·                    Altitude - Measured in feet above sea level

·                    ES Emissions - specific electronic equipment signal emissions

·                    Type of Contact - (air, surface)

·                    Specific platform - (Mig-27, F-14, patrol boat, destroyer)

·                    ***Actual Data***

·                    Contact # - The simulation assigned index number for the contact

·                    Track # - The CIC/Agent assigned index number

·                    Classification - Hostile, Suspect, Unknown, Neutral, Friendly

·                    Speed - Measured in Nautical miles per hour

·                    Course - Measured in degrees true (0-359)

·                    Bearing - Measured in degrees true (0-359)

·                    Altitude - Measured in feet above sea level

·                    ES Emissions - specific electronic equipment signal emissions

·                    Type of Contact - (air, surface)

·                    Specific platform - (Mig-27, F-14, patrol boat, destroyer)
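
Because each contact carries both a CIC-perceived record and an actual (ground-truth) record with identical fields, one natural encoding is a single track-data structure held twice per contact.  The Java sketch below illustrates this arrangement; apart from the field names taken from the list above, everything is an assumption for illustration.

// Illustrative sketch of a contact that carries both the CIC-perceived track
// data and the ground-truth data with the same fields.
public class ContactSketch {

    static class TrackData {
        int trackNumber;
        String classification;   // Hostile, Suspect, Unknown, Neutral, Friendly
        double speedKnots;
        double courseDegTrue;    // 0-359
        double bearingDegTrue;   // 0-359
        double altitudeFeet;
        String esEmissions;
        String contactType;      // air, surface
        String platform;         // e.g., Mig-27, F-14, patrol boat, destroyer
    }

    final int contactNumber;                       // simulation-assigned index
    final TrackData perceived = new TrackData();   // what the CIC team believes
    final TrackData actual = new TrackData();      // ground truth

    ContactSketch(int contactNumber) { this.contactNumber = contactNumber; }

    // Classification mismatches are exactly what the ground-truth display and
    // the logs are meant to expose.
    boolean misclassified() {
        return !actual.classification.equals(perceived.classification);
    }

    public static void main(String[] args) {
        ContactSketch c = new ContactSketch(12);
        c.actual.classification = "Hostile";
        c.perceived.classification = "Unknown";
        System.out.println("Contact 12 misclassified: " + c.misclassified());
    }
}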

e.         Agent Relationships

·                    Each agent has a set of watchstander attributes that can be set/modified in the Watchstander Attributes menu.

·                    Each agent has a one-to-one relation with the Agent Icons in the Agent Attribute Display and CIC Agent Display.

·                    Each agent has a one-to-one relation with one Decision History Log.

·                    Each agent has a zero-to-many relation with contacts (processing contacts).

·                    Each agent’s decision history log has one associated pop-up menu to display the log.

·                    Each agent has a one-to-many relation with the CIC equipment.

·                    Each agent has a one-to-many relation with other agents (communications).

·                    Each agent has a one-to-one relation with an associated pop-up menu.

·                    Each Simulation Scenario contains a set of CIC watchstander agents.

f.          Object Relationships

·                    Each contact has a one-to-one relation with the Contact Data Display.

·                    Each contact has a one-to-one relation with the Tactical Display.

·                    Each contact has a one-to-many relation with agents (processed by agents).

·                    Each contact has a one-to-one relation with the Tactical Display Icons.

·                    Each contact has a one-to-many relation with CIC equipment (processed by equipment).

·                    Each piece of CIC equipment has a one-to-one relation with CIC Agent Display.

·                    Each piece of CIC equipment has a one-to-one relation with a CIC Agent Display Equipment Icon.

·                    Each piece of CIC equipment has a one-to-one relation with the Equipment Status Log.

·                    Each Equipment Status Log has a one-to-one relation with an associated pop-up menu.

·                    Scenario Log has a one-to-one relation with an associated scenario.

·                    Each Simulation Scenario Object contains the following objects:

·                    Shortcut Control Buttons Display

·                    Tactical Display

·                    Contact Data Display

·                    CIC Agent Display

·                    Agent Attributes Display

·                    Menu Bar

·                    CIC Agent Display:  Equipment Icons (one for each watchstation)

·                    CIC Agent Display:  Agent Icons (one for each agent)

·                    Tactical Display:  Contact Icons (one for each contact object)

·                    Agent Decision Log (one for each agent)

·                    CIC Equipment Status Log (one for each piece of equipment)

·                    Scenario Event Log (one for each Scenario executed)

·                    Contacts (multiple numbers)

·                    Agent Pop-up Menu (associated with a selected CIC agent)

·                    Contact Pop-up Menu (associated with a selected contact)

·                    CIC Equipment Pop-up Menu (associated with a specific piece of equipment)
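
As an illustration of this containment, the following minimal Java sketch collects the objects listed above into a single scenario class.  The names are assumptions for illustration and are not the simulation’s actual implementation.

import java.util.Vector;

// Minimal sketch of the Simulation Scenario containment listed above.
// Field and type names are illustrative assumptions.
public class SimulationScenario {
    private Object shortcutControlButtonsDisplay;
    private Object tacticalDisplay;
    private Object contactDataDisplay;
    private Object cicAgentDisplay;
    private Object agentAttributesDisplay;
    private Object menuBar;

    private Vector equipmentIcons;       // one per watchstation
    private Vector agentIcons;           // one per agent
    private Vector contactIcons;         // one per contact object
    private Vector agentDecisionLogs;    // one per agent
    private Vector equipmentStatusLogs;  // one per piece of equipment
    private Object scenarioEventLog;     // one per scenario executed
    private Vector contacts;             // multiple contacts
    // Pop-up menus are created for the selected agent, contact, or piece of equipment.
}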

g.         Actions on Agents and Objects

·                    Agents:  Change Attributes

·                    Shortcut Control Button Display:  Select/Deselect Start/Continue Sim., Pause Sim., Stop Sim., Increase Time Compression, Decrease Time Compression buttons

·                    Tactical Display:  Display, Move, & Delete Contact(s)

·                    Agent Attributes Display:  Display Agent Attribute Data

·                    Agent Pop-up Menu:  Display Agent Decision History Log, Change Agent Attributes

·                    CIC Equipment Pop-up Menu:  Change Setup, Display CIC Equipment Status Log

·                    CIC Equipment:  Change equipment readiness & setup (via CIC Equipment Pop-up Menu)

·                    Contact Pop-up Menu:  Display/Change Contact Attributes

·                    Contacts:  Change Attributes (via Contact Pop-up Menu)

·                    Menu Bar:  Open, Close, Create, Save, Start/Continue, Pause, Stop Scenarios, Increase/Decrease Scenario Time Compression

·                    Agent Decision History Log:  Append Decision Data, Delete Log

·                    CIC Equipment Status Log:  Append Status Data, Delete Log

·                    Scenario Event Log:  Append Scenario Event Data, Delete Log
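
The append and delete actions on the three logs suggest a simple append-only structure.  The sketch below is a hypothetical illustration (the class name and methods are assumed), not the simulation’s actual logging code.

import java.util.Vector;

// Minimal sketch of the log actions listed above (append data, delete log).
public class ScenarioEventLog {

    private Vector entries = new Vector();

    /** Append a time-stamped event to the log. */
    public void append(String event) {
        entries.addElement(System.currentTimeMillis() + "  " + event);
    }

    /** Delete the contents of the log. */
    public void delete() {
        entries.removeAllElements();
    }
}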

3.         Visual Design

During the Phase Three development of the ADC Simulation, preliminary pencil sketches of the simulation interface were created.  These sketches covered the initial design of the main simulation interface window as well as all of the expected menus and pop-up input menus that would appear from the selection of menu items.  “The idea here is to get something visible early.  Sketches, of both screens and of task flows, are useful as a first step for getting quick feedback.”[39]  These designs were presented to personnel experienced with combat information center air-defense operations to collect feedback early in the simulation development process.  Using pencil drawings signals to the reviewers that the interface design is still in the preliminary stages, so constructive comments are more easily obtained (reviewers are less reluctant to criticize for fear of insulting the developer).  Six sketch designs were created; one is shown below, and the other five drawings are displayed in Appendix A.  The results of the initial design review are provided in the following section.

 

                                                                       Figure 9.                       Preliminary Conceptual Sketches of ADC Simulation GUI.

 

4.         Early Analysis

Two experienced reviewers were selected to review the preliminary sketches and provide feedback regarding the effectiveness of the ADC Simulation interface design.  Their comments are listed below.

 

 

a.         Reviewer #1 Comments

(1)        Recommendations.

·                    Tactical Display:  Highlight a contact when it has been selected with the mouse.

·                    CIC Watchstander Display:  Highlight agent or watchstation/equipment when it has been selected with the mouse.

·                    Since the Tactical Display is the centerpiece of the simulation, prevent pop-up windows from displaying on top of it.

·                    Employ auditory cues to alert the user when unusual or anomalous events occur (i.e. misidentified contact, cruiser shoots a missile) to ensure the user’s attention is focused on the associated situation.

·                    Implement a sub-window at the bottom of the simulation screen that displays the top three most important contacts of interest.

·                    Use tool-tips to display the contact’s track number and actual (as opposed to the CIC perceived) basic data such as altitude, speed, and course.

(2)        Comments on Recommendations.

·                    Will implement.

·                    Will implement.

·                    Will attempt to implement.  Controlling the location of a pop-up window is not always possible.

·                    Will implement.

·                    Most likely will not implement in this form.  The symbols used in the simulation distinguish among hostile, friendly, and unknown contacts as well as whether they are surface or air contacts.  Additionally, the hostile, unknown, and friendly contact types will be distinguished by color: red, yellow, and blue, respectively (a minimal color-mapping sketch appears after this list).  If the recommended feature is implemented, it will probably be used to list the hostile contacts for ease of reference.

·                    Will implement.
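
The classification-to-color convention described above can be expressed as a small lookup.  The fragment below is a minimal sketch with assumed names; it illustrates the convention (hostile = red, unknown = yellow, friendly = blue) rather than the simulation’s actual drawing code.

import java.awt.Color;

// Minimal sketch of the classification-to-color mapping described above.
public class ContactColors {
    public static Color colorFor(String classification) {
        if ("Hostile".equals(classification))  return Color.red;
        if ("Unknown".equals(classification))  return Color.yellow;
        if ("Friendly".equals(classification)) return Color.blue;
        return Color.white;  // default for other classifications
    }
}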

b.         Reviewer #2 Comments

(1)        Recommendations.

·                    If appropriate, change the “Scenario Utilities” menu name to “File” since most of the menu’s actions are similar to options found in most Microsoft “File” menus.  This will enhance the understandability and learnability of the simulation.

·                    Shortcut Control Button Display:  Add a display to show the current time compression ratio.

·                    When the simulation program is initially loaded, implement a default setting for all of the watchstander attributes, CIC equipment settings, scenario external attributes, and doctrine setup so the user can run the program immediately.  (Currently, the design is for the user to manually configure all of these features before running the simulation; otherwise, the user receives an error prompting completion of the setup.)

·                    Use the Java Help Set API to organize help information throughout the simulation.

(2)        Comments on Recommendations.

·                    Will take this under consideration.  Although this is an appealing recommendation, there are some options under that menu that do not lend themselves to the typical “File” menu actions.

·                    Will implement with one possible modification.  The “Time Compression Ratio” display may be placed underneath the simulation “time elapsed” display to allow the user one place to look for time-related information.

·                    Will implement with two modifications.  First, upon attempting to run the simulation for the first time (with the default settings), the user will be asked whether he or she would like to re-configure the scenario settings.  The second modification will be to allow the user to use a setup wizard to configure the scenario to the desired settings.

·                    Will implement.  This recommendation originated from a discussion involving the implementation of a “Help/Amplification” button on the Watchstander Attributes, CIC Equipment Setup, Scenario External Attributes, and Doctrine Setup menu-item pop-up input windows to provide the user some amplifying information concerning the various setting options (i.e. Basic, Experienced, Expert).  We intended to include this capability in the simulation, and Reviewer #2 recommended that this capability could be further organized in a larger simulation “Help Feature” utilizing the Java Help Set API.
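
For reference, the Java Help Set API locates its topics through a helpset file and can attach context-sensitive help to a button.  The fragment below is a minimal sketch only; the helpset file name (SimulationHelp.hs) and the button are hypothetical, and the code is not the simulation’s actual help implementation.

import java.net.URL;
import javax.help.CSH;
import javax.help.HelpBroker;
import javax.help.HelpSet;
import javax.swing.JButton;

// Minimal sketch of wiring a Help/Amplification button to the Java Help Set API.
public class HelpSetup {
    public static void attachHelp(JButton helpButton) throws Exception {
        ClassLoader loader = HelpSetup.class.getClassLoader();
        URL url = HelpSet.findHelpSet(loader, "SimulationHelp.hs");  // hypothetical helpset file
        HelpSet helpSet = new HelpSet(loader, url);
        HelpBroker broker = helpSet.createHelpBroker();
        // Show the associated help topic when the button is pressed.
        helpButton.addActionListener(new CSH.DisplayHelpFromSource(broker));
    }
}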

E.         UCD PROCESS PHASE FOUR:  ADC SIMULATION INTERFACE IMPLEMENTATION

During Phase Four, we used the design sketches to implement a working prototype of the ADC Simulation interface.  The prototype was developed using the Java language, and was a key initial component in the building of the entire simulation program.

 

                                                 Figure 10.                     Early Implementation of ADC Simulation GUI before Usability Analysis.

 

F.         UCD PROCESS PHASE FIVE:  USABILITY ANALYSIS OF ADC SIMULATION INTERFACE

1.         Usability Analysis Introduction

The Usability Analysis phase is an essential (and often ignored) portion of a software or hardware system evaluation; as discussed earlier, neglecting it can lead to more profound problems later for the users of the system.  A working, interactive prototype of the ADC Simulation interface (see Figure 10) was developed to support the evaluation of the system, for several reasons.

[First,] building the prototype forces critical thinking about details of the interface, bringing to the surface issues that are not obvious when looking at static screens.  [Second,] live demos of the prototype are important for getting buy-in for your design…[Third,] the prototype can also, of course, provide valuable usability data to feed the iterative design process.  Finally, the prototype itself . . . becomes part of your user interface requirements.[40]

Subject Matter Experts (SME) from the AEGIS Training and Readiness Center (ATRC) Detachment in San Diego, California, were selected to evaluate the ADC Simulation interface thoroughly. 

Before the evaluation occurred, we generated a comprehensive list of common tasks that a potential user of the system would need to perform to test the usability of the interface, and these tasks were then evaluated by the team.  This pre-test was conducted to calculate preliminary performance data that was used to construct the task list data-recording sheet.  Two types of task attributes were analyzed for the evaluators:  (1) initial performance of certain tasks and (2) the learnability of the system.  The Initial Performance attribute tested the evaluators’ ability to perform a task based on the intuitiveness of the interface, its familiarity relative to previously experienced (and possibly similar) interfaces, and what generally “seems like the logical action to take” to complete the task.  The Learnability attribute examined how easy or difficult the interface was to learn.  This attribute was measured by prompting the evaluators to perform tasks either similar or identical to tasks performed earlier in the session.  Two metrics were selected to capture the evaluators’ performance on a task:  (1) total time to complete the task and (2) number of errors committed while performing the task.  Prior to commencing a trial, each SME was informed that the objective of the evaluation was to test the overall usability of the ADC Simulation and that his or her performance (whether positive or negative) was indicative of the system’s “user-friendliness” (or lack thereof), not a measure of personal skill or intelligence.  This statement was given primarily to set the evaluators at ease so they would provide the maximum amount of feedback concerning the interface.

2.         Task List Overview

The following set of tasks was part of the evaluation of the simulation.  A majority of the tasks require the user to set various attributes of the simulation program via either the menu bar options or the icons and buttons in the GUI.  These tasks are representative of the majority of the tasks that will be performed by the user when running the fully operational simulation.  Following their evaluation of the ADC Simulation, the SMEs were given a survey to rate the usability of the interface and participated in a post-testing feedback session to discuss the design of the interface with the developer.  The results of the usability analysis are listed below and in Appendix B (detailed data and comparison charts/graphs).

TASK #     TASK NAME

1          Open Scenario Menu
2          Open Watchstander Attributes Menu
3          Open CIC Equipment Setup Menu
4          Open Scenario Doctrine Setup Menu
5          Open Scenario External Attributes Menu
6          Open Simulation Logs Menu
7          Change the Maximum time it takes a Watchstander to complete a Task
8          Select a contact to display data in the Contact Data Display Window
9          Select the F-TAO watchstander to display data in the Agent Attributes Window
10         Open a contact’s Pop-up Options Window
11         Open the F-TAO Pop-up Options Window
12         Increase the Time Compression of the Simulation
13         Pause the Simulation
14         Pause the Simulation (Alternate method)
15         Set the Situation Assessment Skill Level to Expert for the Force TAO (F-TAO)
16         Set the Fatigue Level to Exhausted for the RSC
17         Set the SPY-1B Radar Equipment Readiness Level to Non-Operational
18         Set the ADC Doctrine Query Range to 30 NM & Warning Range to 20 NM
19         Set the Scenario Threat Level to Red
20         Open the Scenario Event Log
21         Open the SLQ-32 System Status Log
22         Set the Performance Probabilities Watchstander Fatigue levels to (0.5, 0.7, 0.9)
23         Change the Maximum time for the F-TAO Watchstander to complete a task
24         Change the speed of the Hostile Air contact to 500 KTS
25         Change the F-AAWC Experience Attribute to Expert
26         Change the Link Equipment Status to Partially Degraded

 

                                                                                                                    Table 12.          List of Tasks.

 

3.         Subject Profile

The subjects for this study came from the AEGIS Training & Readiness Center (ATRC) Detachment, San Diego, CA.  The evaluation of the AEGIS Cruiser Combat Information Center (CIC) Air-Defense Simulation was conducted on 12-13 September 2002 at the ATRC Detachment.  The subjects’ air-defense experience ranged from 10 to 20 years, and their ranks spanned E-7 (Chief Petty Officer) to O-3 (Lieutenant).

·                    Subject #1:  Chief Petty Officer/E-7 (Operations Specialist)

·                    Subject #2:  Senior Chief Petty Officer/E-8 (Operations Specialist)

·                    Subject #3:  Senior Chief Petty Officer/E-8 (Fire Control Technician)

·                    Subject #4:  Chief Petty Officer/E-7 (Operations Specialist)

·                    Subject #5:  Lieutenant/O-3 (Surface Warfare Officer/Prior Enlisted)

4.         Data Collection

A set of twenty-six tasks was formulated as part of the evaluation of the CIC Air-Defense Simulation GUI.  These tasks ensured the subjects interacted with all of the major aspects of the simulation to collect a comprehensive set of data about user performance.  Two performance metrics were recorded during the evaluation process:  number of errors committed while performing the task and total time to successfully complete the task.

TASK #     USABILITY ATTRIBUTE      VALUE(S) TO MEASURE

1          Initial Performance      # of Errors; Length of time to successfully complete task
2          Initial Performance      Length of time to successfully complete task
3          Initial Performance      Length of time to successfully complete task
4          Initial Performance      # of Errors
5          Initial Performance      Length of time to successfully complete task
6          Initial Performance      # of Errors
7          Initial Performance      # of Errors; Length of time to successfully complete task
8          Initial Performance      Length of time to successfully complete task
9          Initial Performance      # of Errors
10         Initial Performance      # of Errors
11         Initial Performance      Length of time to successfully complete task
12         Initial Performance      # of Errors; Length of time to successfully complete task
13         Initial Performance      # of Errors; Length of time to successfully complete task
14         Initial Performance      # of Errors; Length of time to successfully complete task
15         Learnability             Length of time to successfully complete task
16         Learnability             # of Errors
17         Learnability             # of Errors
18         Learnability             Length of time to successfully complete task
19         Learnability             Length of time to successfully complete task
20         Learnability             # of Errors
21         Learnability             Length of time to successfully complete task
22         Learnability             # of Errors; Length of time to successfully complete task
23         Learnability             # of Errors
24         Learnability             # of Errors; Length of time to successfully complete task
25         Learnability             # of Errors; Length of time to successfully complete task
26         Learnability             # of Errors; Length of time to successfully complete task

 

                                                                                                                    Table 13.          Usability Analysis Attributes.

5.         Analysis of Task Data

For each task, either a primary and a secondary measurement value or two primary measurement values are provided.  In the former case, the primary value includes the best case, worst case, and target level for that measurement along with its average value.  In the latter case, both values include their best cases, worst cases, and target levels.  Following each summary table are comment blocks for noteworthy errors and memorability/learnability issues that were encountered during the evaluations.  The best case, worst case, and target levels for number of errors and times to complete tasks were determined during the initial development of the task list.  Appendix B lists a summary breakdown of the key data collected from the five subjects’ evaluations of each task they were requested to perform.

6.         Analysis of Subject Evaluation Surveys

After each session, the subject was given a survey to record his or her evaluation of the usability of the simulation.  Based on the results of the surveys, the subjects generally evaluated the simulation’s interface favorably.  The survey was divided into the following four categories:

·                    Screen Layout

·                    Overall Display Layout relative to menu bars and pop-up menus

·                    Menu Location & Wording

·                    Task Completion

a.         Screen Layout           

The average survey scores ranged between 3.8 and 4.4 on a scale of 5, which indicated the subjects generally felt the simulation’s Screen Layout was between “Acceptable” and “Best Possible.”  The individual subjects’ breakouts are displayed in Figure 18; of the forty possible selections for this category (eight per subject for 5 subjects), thirteen (13) were rated with a score of 5.0, sixteen (16) with a score of 4.0, and eleven (11) with a score of 3.0.  There were no areas rated below 3.0.

b.         Overall Display Layout Relative to Menu Bars and Pop-Up Menus

The average survey scores ranged between 4.0 and 4.4 on a scale of 5, which indicated the subjects generally felt the simulation’s Overall Display Layout was near “Best Possible.”  The individual subjects’ breakouts are displayed in Figure 20; of the forty possible selections for this category (eight per subject for 5 subjects), fifteen (15) were rated with a score of 5.0, fifteen (15) with a score of 4.0, and ten (10) with a score of 3.0.  There were no areas rated below 3.0.

c.         Menu Location and Wording

The average survey score was 3.8 on a scale of 5, which indicated the subjects generally felt the simulation’s Menu Location & Wording was between “Acceptable” and “Best Possible,” but closer to the “Acceptable” middle value.  The individual subjects’ breakouts are displayed in Figure 22; of the fifteen possible selections for this category (three per subject for 5 subjects), six (6) were rated with a score of 5.0 and nine (9) with a score of 3.0.  There were no areas rated below 3.0.  The lower scores in this category are possibly due to the difficulty a couple of the subjects encountered when trying to perform tasks involving the selection of menus (regular & pop-up) that were not intuitive for them.  Some of these difficulties are discussed in the “Analysis of Task Data” and “Overall Simulation Analysis” sections in Appendix B, and their remedies are provided in the “Recommendations” section below.

d.         Ease of Performance of the Task Completion List

The average survey scores ranged between 3.6 and 4.2 on a scale of 5, which indicated the subjects generally felt the ease of performing the simulation’s Task Completion list was between “Acceptable” and “Best Possible.”  The individual subjects’ breakouts are displayed in Figure 24; of the twenty-five possible selections for this category (five per subject for 5 subjects), eight (8) were rated with a score of 5.0, seven (7) with a score of 4.0, and ten (10) with a score of 3.0.  There were no areas rated below 3.0.  Again, the lower scores in this category are possibly due to the difficulty a couple of the subjects encountered when trying to perform tasks involving the selection of menus (regular & pop-up) that were not intuitive for them.

 

 

 

7.         Recommendations

After each evaluation session, the events of the session were reviewed with the participant, and recommendations for improving the usability of the simulation were solicited.  The subjects provided the following recommendations:

a.         Subject #1

·                    Place information sub-window displays (i.e. Contact Data & Watchstander Attribute Displays) on one side of the screen and action/interface displays (i.e. CIC Agent & Shortcut Control Button Displays) on the other side.

·                    Change the “Watchstander Tasks & Skills” menu to another name (“Task & Skill Modifiers” recommended) to prevent confusion with the “Watchstander Attributes” menu.

·                    Upgrade the CIC Agent Display icons to have all of the “Watchstander Attribute” options from the menu bar in the icon’s pop-up menu.

b.         Subject #2

·                    On the Shortcut Control Button Display and “File” menu, change the usage of the term “Simulations” to “Scenario” to promote increased familiarity.  This term is more recognizable/understandable to the potential users of the system.

·                    Upgrade the CIC Agent Display icons to have all of the “Watchstander Attribute” options from the menu bar in the icon’s pop-up menu.

·                    Implement a “Zoom In/Out” feature for the map in the Tactical Display.

c.         Subject #3

·                    Implement a “Zoom In/Out” feature for the map in the Tactical Display.

·                    Increase the font size in the Simulation Interface.

·                    Rename the “Start/Continue Simulation” button on the Shortcut Control Button Display and in the “File” menu to “Run/Continue” to prevent confusion with “Open Scenario.”

·                    Upgrade the CIC Agent Display icons to have all of the “Watchstander Attribute” options from the menu bar in the icon’s pop-up menu.

d.         Subject #4

·                    Rename the “Watchstander Tasks & Skills” menu to another name (“Probabilities & Tasks” recommended) to prevent confusion with the “Watchstander Attributes” menu.

·                    Implement an optional “Simulation Setup Wizard” feature to assist with the configuration of scenarios.

 

 

e.         Subject #5

·                    Rename the “Watchstander Tasks & Skills” menu to another name (“Probabilities & Tasks” recommended) to prevent confusion with the “Watchstander Attributes” menu.

·                    Upgrade the CIC Agent Display icons to have all of the “Watchstander Attribute” options from the menu bar in the icon’s pop-up menu.

G.        UCD PROCESS PHASE SIX:  INTERFACE MODIFICATION/REDESIGN

Phase Six of the UCD process involved modifying the Air-Defense Simulation interface to implement user-recommended design alterations.  Eligible modifications were drawn from the quantitative data (charts and graphs) derived from the usability analysis as well as from the qualitative comments provided by the Subject Matter Experts.  The figure below displays the updated program interface following the changes.

 

                                                                    Figure 11.                     Updated ADC Simulation GUI following Usability Analysis.

 

IV.     DESCRIPTION OF THE ADC SIMULATION PROGRAM DESIGN AND STRUCTURE

A.        PROGRAM LANGUAGE AND SYSTEM REQUIREMENTS FOR ADC SIMULATION

The ADC Simulation was written in the Java Language (Java Development Kit Version 1.3.1) and was developed using the JBuilder 5© Application Development Environment.  The simulation was designed to run on a system with the following requirements:

·                    Pentium III processor or equivalent, or higher.

·                    Minimum 256 megabytes of RAM (512 megabytes preferred).

·                    A system with Java Development Kit Version 1.3.1 or higher installed.

·                    Screen display of 1600 x 1200 pixels.

The processing power and memory requirements are emphasized because the ADC Simulation is a multithreaded program, which places substantial demands on the computer system.  Multithreading is a feature of the Java language that allows the various components of a program (in this case the ADC Simulation) to timeshare the computing system’s resources (a single processor and memory) and thereby perform multiple tasks in simulated parallelism.  This capability of the Java language was essential to the development of the ADC Simulation because the program attempts to emulate human activities and processes that occur in parallel.
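
As a simple illustration of this point, the fragment below shows how several watchstander components might each run on a separate thread and timeshare the processor.  The class and method names are assumptions for illustration; this is a sketch of the threading idea, not the simulation’s actual agent code.

// Minimal sketch: each watchstander runs on its own thread so that all of them
// appear to act in parallel on a single processor.
public class WatchstanderThreadDemo implements Runnable {

    private final String watchstation;
    private volatile boolean running = true;

    public WatchstanderThreadDemo(String watchstation) {
        this.watchstation = watchstation;
    }

    public void run() {
        while (running) {
            // Hypothetical work cycle: perceive, decide, act.
            System.out.println(watchstation + " processing contacts...");
            try {
                Thread.sleep(1000);   // pace the simulated work cycle
            } catch (InterruptedException e) {
                running = false;      // stop cleanly if interrupted
            }
        }
    }

    public static void main(String[] args) {
        new Thread(new WatchstanderThreadDemo("Force TAO")).start();
        new Thread(new WatchstanderThreadDemo("Force AAWC")).start();
        new Thread(new WatchstanderThreadDemo("RSC")).start();
    }
}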

B.        DISCUSSION ABOUT MULTI-AGENT SYSTEMS

The ADC Simulation watchstanders were implemented using multi-agent system (MAS) technology, in which each watchstander was designed as an “agent”.  Within the context of this simulation, an agent is a component of software that:

·                    Is capable of acting in an environment;

·                    Can communicate directly with other agents;

·                    Is driven by tendencies (in the form of individual objectives or of a satisfaction/survival function which it tries to optimize);

·                    Possesses resources of its own;

·                    Is capable of perceiving its environment (but to a limited extent);

·                    Has only a partial representation of this environment;

·                    Possesses skills and can offer services;

·                    Has behavior that tends toward satisfying its objectives, taking account of the resources and skills available to it and depending on its perception, its representations and the communications it receives.[41]

The watchstander agents in the ADC Simulation contain intent and objectives (performing their assigned duties), communicate with each other to achieve their objectives, and possess resources (skill, experience, fatigue, and decision-maker type attributes as well as combat systems equipment).  They perceive their environment to a limited extent, since each watchstander agent receives information via combat systems sensor equipment, verbal communications from other watchstander agents, or CIC watchstation information display systems.  The watchstander agents offer services to each other by disseminating information vital to their performance of air-defense duties and the operation of the CIC, and they influence the environment (and other agents) through their actions (i.e. the Force TAO classifying an aircraft as Hostile).
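
These properties can be summarized as a small Java interface.  The sketch below uses assumed names (AdcAgent, perceive, communicate, act) and is an illustration of the agent definition above, not the simulation’s actual types.

// Minimal sketch of the agent properties listed above.
public interface AdcAgent {

    /** Limited, partial perception of the environment (sensors, displays, communications). */
    void perceive(Environment environment);

    /** Direct communication with another agent (e.g. a verbal report). */
    void communicate(AdcAgent recipient, String message);

    /** Behavior directed toward the agent's objectives, given its resources and skills. */
    void act();
}

/** Placeholder for the shared environment the agents perceive and act upon. */
interface Environment { }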

MAS technology is a blending of the cognitive/social sciences (psychology, ethology, sociology, philosophy), the natural sciences (ecology, biology), and the computer sciences, since multi-agent systems simultaneously model, explain, and simulate natural phenomena (in this case human behavior in the ADC Simulation) and provide models for self-organization.[42]  Traditional programming is often very mechanistic, hierarchical, and modular and, consequently, does not lend itself well to simulating the often surprising (whether organized or chaotic) behavior of interactive human and environmental systems.  MAS technology, however, is less restrictive in its design, and this produces simulation behavior often more akin to that observed in the real world.  The term “multi-agent system” is applied to a system comprising the following elements:

·                    An environment, E, that is, a space which generally has a volume.