List of Figures

Figure 2.1      Illustration showing that Human Factors effort is better placed in the early stages of the design process

Figure 2.2      Application of Human Factors methods by phase of the design process

Figure 3.1      Battle Group Headquarters

Figure 3.2      Planning timeline on a flipchart

Figure 3.3      Threat integration on map and overlay

Figure 3.4      Mission Analysis on a whiteboard

Figure 3.5      Effects Schematic drawn on a flipchart and laid on the map

Figure 3.6      COAs developed on a flipchart

Figure 3.7      DSO on map and overlay

Figure 3.8      DSOM on a flipchart

Figure 3.9      Coordination of force elements on map and overlay via a wargame

Figure 3.10    Coordination Measures captured on a whiteboard

Figure 3.11    Fire control lines on map and overlay, also recorded in staff officer’s notebook

Figure 3.12    Relationships between the cells in Battle Group Headquarters during mission planning

Figure 4.1      Abstraction Hierarchy for the command process

Figure 4.2      Subjective opinion of the digital system in work domain terms

Figure 4.3      Concordance of positive ratings between levels

Figure 4.4      Concordance of negative ratings between levels

Figure 5.1      Hierarchical Task Analysis procedure

Figure 5.2      Combat Estimate Seven Questions task model

Figure 5.3      Question One HTA extract (1)

Figure 5.4      Question One HTA extract (2)

Figure 5.5      Question One HTA extract (3)

Figure 5.6      Question Two Mission Analysis HTA

Figure 5.7      Question Three HTA

Figure 5.8      Questions Four-Seven HTA

Figure 5.9      DSO construction HTA

Figure 5.10    Synchronisation matrix construction HTA

Figure 5.11    SHERPA EEM taxonomy

Figure 5.12    SHERPA flowchart

Figure 5.13    Advantages and disadvantages of each planning process

Figure 6.1      Propositional network example

Figure 6.2      Bde/BG HQ layout showing component cells

Figure 6.3(a)  Question 1 SA requirements

Figure 6.3(b)  Question 2 SA requirements

Figure 6.4       Combat Estimate task model

Figure 6.5       Question one propositional network

Figure 6.6       Question two propositional network

Figure 6.7       Question three propositional network

Figure 6.8       Question four propositional network

Figure 6.9       Question five propositional network

Figure 6.10     Question six propositional network

Figure 6.11     Question seven propositional network

Figure 6.12     Ops table layout

Figure 6.13     Battle Execution task model

Figure 6.14     Harry & Scabbers propositional network

Figure 6.15     Voldemort propositional network

Figure 6.16     Hagrid propositional network

Figure 6.17     Dobby propositional network

Figure 6.18     Dumbledore propositional network

Figure 6.19     Hedwig propositional network

Figure 6.20     Engineer versus intelligence components’ differing views on enemy and ground information elements

Figure 6.21     Inaccurate information elements

Figure 6.22     Untimely information elements

Figure 6.23     Lack of trust in information elements

Figure 7.1       Illustration of archetypal networks. Associated with each is empirical evidence concerning its performance on simple and complex tasks

Figure 7.2       Pie chart showing the type of data communications transmitted

Figure 7.3       Pie chart showing the type of data communications received

Figure 7.4       Pie chart showing the content of voice communications transmitted according to Bowers et al.’s (1998) taxonomy

Figure 7.5       Pie chart showing the type of voice communications received according to Bowers et al.’s (1998) taxonomy

Figure 7.6       Illustration of the 34 separate social network analyses plotted into the NATO SAS-050 Approach Space to show how the configuration of digitally mediated communications changes over time (grey numbered spots)

Figure 7.7       Periodogram illustrating the presence of periodic changes in network density

Figure 7.8       Spectral analysis graph illustrating the presence of periodic changes in network diameter

Figure 7.9       Illustration of the 34 separate social network analyses plotted into the NATO SAS-050 Approach Space to show how the configuration of voice mediated communications changes over time (grey numbered spots)

Figure 7.10     Spectral analysis graph illustrating the presence of periodic changes in network density

Figure 7.11     Spectral analysis graph illustrating the presence of periodic changes in high-status nodes

Figure 8.1       Diagram showing the main system screen (Local Operational Picture; LOP)

Figure 8.2       Diagram showing the obscuration of LOP window by the Operational Plan window

Figure 8.3       Diagram showing the LOP window and a planning window

Figure 8.4       Diagram showing the ability to resize windows

Figure 8.5       Diagram showing overlapping resizable windows and multiple windows displayed on a single screen

Figure 8.6       Diagram showing how to navigate to the favourites palette

Figure 8.7       Operational record within the Watch Keeper log

Figure 8.8       Diagram displaying a number of icons on top of one another

Figure 8.9       Diagram showing the LOP with the e-map turned off

Figure 8.10     Diagram displaying the ability to hide all icons except the user’s own

Figure 8.11     Diagram showing the purple colour coding of certain buttons

Figure 9.1       Overall median values for Visual Clarity

Figure 9.2       Comparison of median values for Visual Clarity by group

Figure 9.3       Overall median values for Consistency

Figure 9.4       Comparison of median values for Consistency by group

Figure 9.5       Overall median values for Compatibility

Figure 9.6       Comparison of median values for Compatibility by group

Figure 9.7       Overall median values for Informative Feedback

Figure 9.8       Comparison of median values for Informative Feedback by group

Figure 9.9       Overall median values for Explicitness

Figure 9.10     Comparison of median values for Explicitness by group

Figure 9.11     Overall median values for Appropriate Functionality

Figure 9.12     Comparison of median values for Appropriate Functionality by group

Figure 9.13     Overall median values for Flexibility and Control

Figure 9.14     Comparison of median values for Flexibility and Control by group

Figure 9.15     Overall median values for Error Prevention and Correction

Figure 9.16     Comparison of median values for Error Prevention and Correction by group

Figure 9.17     Overall median values for User Guidance and Support

Figure 9.18     Comparison of median values for User Guidance and Support by group

Figure 9.19     Overall median values for System Usability Problems

Figure 9.20     Comparison of median values for System Usability Problems by group

Figure 9.21     Overall median values for categories 1 to 9

Figure 9.22     Comparison of median values for categories 1 to 9 by group

Figure 10.1     Graph showing how PMV values map onto the predicted percentage of people thermally dissatisfied

Figure 10.2     Longitudinal overview of the thermal environment extant in Bde HQ

Figure 10.3     Longitudinal overview of the thermal environment extant in BG HQ

Figure 10.4     Longitudinal overview of relative humidity extant in Bde HQ

Figure 10.5     Longitudinal overview of relative humidity extant in BG HQ

Figure 10.6     Noise levels measured in dB(A) at Bde HQ during the CPX

Figure 10.7     Noise levels measured in dB(A) at BG HQ during the CPX

Figure 10.8     The Cornell Office Environment Survey

Figure 10.9     BG and Bde responses to questions about environmental conditions

Figure 10.10   BG and Bde responses to questions about physical symptoms

Figure 10.11   Bar chart showing the extent of non-compliance with environmental guidelines

Figure 11.1     Key enablers to enhance performance