Chapter 5
Hierarchical Task Analysis

Introduction to Hierarchical Task Analysis

Hierarchical Task Analysis (HTA; Annett et al., 1971) is a normative task analysis approach used to describe systems in terms of goals, sub-goals and operations. The ‘task’ in HTA is therefore something of a misnomer: HTA does not in fact analyse tasks but is concerned with goals (an objective or end state), which are hierarchically decomposed. HTA works by decomposing activities into a hierarchy of goals, subordinate goals, operations and plans, which allows systems to be described exhaustively; it focuses on ‘what an operator…is required to do, in terms of actions and/or cognitive processes to achieve a system goal’ (Kirwan & Ainsworth, 1992, p. 1). HTA outputs therefore specify the overall goal of a particular system, the sub-goals to be undertaken to achieve this goal, the operations required to achieve each of the sub-goals specified and the plans, which specify the sequence in which, and the conditions under which, the sub-goals have to be achieved in order to satisfy the superordinate goal.
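To make this goal–plan–operation structure concrete, the sketch below models an HTA as a simple tree in Python; the goal names, numbering and plan wording are illustrative assumptions only and are not drawn from any of the analyses reported in this chapter.

```python
from dataclasses import dataclass, field

@dataclass
class Goal:
    """A node in an HTA: a goal, its plan and its subordinate goals.

    A goal with no sub-goals is treated as a bottom-level operation.
    """
    number: str                       # hierarchical index, e.g. '1.2.3'
    description: str
    plan: str = ""                    # e.g. 'Do 1 then 2'
    sub_goals: list["Goal"] = field(default_factory=list)

    def operations(self):
        """Yield the bottom-level task steps beneath (or at) this goal."""
        if not self.sub_goals:
            yield self
        for sub in self.sub_goals:
            yield from sub.operations()

# Illustrative fragment only: the goals, numbering and plan are hypothetical.
hta = Goal("0", "Analyse battlefield area", plan="Do 1 then 2",
           sub_goals=[Goal("1", "Open terrain overlay"),
                      Goal("2", "Mark avenues of approach")])

for op in hta.operations():
    print(op.number, op.description)
```

Bottom-level operations are exactly the steps that later feed the error analysis described at the end of this chapter.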

Despite the vast range of Human Factors and Ergonomics methods available, the popularity of the HTA methodology is unparalleled: it is the most commonly used not only of the task analysis methods but of all Ergonomics methods (Annett, 2004; Kirwan & Ainsworth, 1992; Stanton, 2006). HTA has now been applied for over 40 years in all manner of domains and its heavy use shows no signs of abating, certainly not within Human Factors and Ergonomics circles. Although the process of constructing an HTA is enlightening in itself (that is, the analyst’s understanding of the domain and system under analysis increases significantly), HTA’s popularity is largely due to the flexibility and utility of its output. In addition to the goal-based description provided, HTA outputs are used to inform various additional Human Factors analyses; indeed, many Human Factors methods require an initial HTA as part of their data input (Stanton et al., 2005a). Indicative of its flexibility, Stanton (2006) describes a range of additional applications to which HTA has been put, including interface design and analysis, job design, training programme design and evaluation, human error prediction and analysis, team task analysis, allocation of functions analysis, workload assessment, system design and procedure design. The HTA process is rigorous, exhaustive and thorough, involving collecting data about the task or system under analysis (through techniques such as observation, questionnaires, interviews with Subject Matter Experts (SMEs), walkthroughs, user trials and documentation review) and then using these data to decompose and describe the goals, sub-goals and tasks involved. The HTA procedure is presented in Figure 5.1.

The HTA analysis undertaken during the exercise involved the construction of HTAs for each component of the seven questions planning process. The HTAs were developed from data collected during our observations of the activities undertaken during the exercise, supplemented by data taken from Standard Operating Instructions (SOIs) and from interviews and discussions with SMEs.

image

Figure 5.1 Hierarchical Task Analysis procedure

Digital Mission Planning and Battlespace Management Seven Questions Analysis

HTAs were constructed for the digital Mission Planning and Battlespace Management (MP/BM) component of each question during the seven questions planning process. The HTAs developed represent how each of the seven questions would be conducted using only the digital MP/BM system (as opposed to the combination of paper map process and digital MP/BM process that was observed during the exercise activities). Each HTA was then used to identify any design and usability issues relating to the digital MP/BM system and also for error identification purposes. The HTAs for each question are discussed in more detail overleaf.

Seven Questions Task Model

A high-level task model depicting the key activities undertaken was developed based on the seven questions digital MP/BM HTA. The task model also represents those seven questions products that can be constructed using the digital MP/BM system. The task model is presented in Figure 5.2.

image

Figure 5.2 Combat Estimate Seven Questions task model

The task model shows, albeit at a high level, the critical activities that the Brigade (Bde) and Battle Group (BG) were engaged in during the seven questions planning process. Additionally, the task model shows the products that are produced at the conclusion of each question.

The digital MP/BM’s role in the seven questions planning process is therefore to provide planners with the information that they require to undertake each component and also with the tools required to develop the planning products and distribute them accordingly. This analysis therefore focuses on the design and usability (that is, ease of use, potential for error, time required, user frustration and so on) of the different tools within digital MP/BM and also the provision of the information required for the planning process.

Question One

The process begins with question one, which involves the conduct of the Battlefield Area Evaluation (BAE), the threat evaluation and the threat integration. The BAE involves conducting a terrain analysis using maps of the battlefield area and entails an assessment of the effects of the battlespace on enemy and friendly operations. It also involves the identification of likely mobility corridors, Avenues of Approach (AoAs) and manoeuvre areas. For the terrain analysis phase, the mnemonic ‘OCOKA’ is used, which comprises the following aspects of the terrain (CAST, 2007):

• Observation

• Cover and Concealment

• Obstacles

• Key terrain

• AoA.

Other key factors analysed during the terrain analysis component include the weather, restricted areas and potential choke points.

The threat evaluation phase involves identifying the enemy’s likely modus operandi by analysing their tactical doctrine, past operations and their strengths and weaknesses. The end state of the threat evaluation phase is to ‘visualise how the enemy normally executes operations and how the actions of the past shape what they are capable of in the current situation’ (CAST, 2007, p. 12). Key aspects of the enemy that are investigated here include their strengths and weaknesses, their organisation and combat effectiveness, their equipment and doctrine and also their tactics and preparedness. The outputs of the threat evaluation phase are a series of doctrinal overlays which portray the enemy in terms of these characteristics.

The threat integration phase then involves combining the BAE and threat evaluation outputs in order to determine the enemy’s intent and how they are likely to operate. Key elements identified during this phase include the Named Areas of Interest (NAIs) and the likely enemy Courses of Action (CoAs). The outputs of the threat integration phase are the enemy Effects Schematic, which depicts the enemy’s mission in terms of effects and intent; situation overlays, which depict the likely and most dangerous enemy CoAs; and an event overlay, which depicts when and where key tactical events are likely to occur.

The output derived from question one is therefore an understanding of the battlespace and its effects on how the enemy (and friendly forces) are likely to operate. The question one HTA is presented in Figure 5.3, Figure 5.4 and Figure 5.5.

Based on the HTA and also the evidence gathered during the exercise, we can conclude that undertaking question one with the digital MP/BM system is a difficult, error prone and overly time-consuming process. The process of analysing the battlefield area is made particularly difficult by problems with the mapping, screen size and screen resolution on the digital MP/BM system. Firstly, the mapping and screen size capability during the exercise was such that the battlefield area could not be viewed in its entirety on one screen. This meant that users had to continually zoom in and out of the battlefield area, a process which resulted in them losing context in terms of which area of the battlefield they were actually looking at.

Secondly, problems with the screen resolution and available mapping meant that users could not see specific areas (such as towns) in the level of detail required. It was difficult for users to establish an appropriate level of screen resolution in order to identify key areas (for example, towns, rivers, roads and so on) on the map. Thirdly and finally, producing overlays (for example, the terrain analysis and event overlays) on the digital MP/BM system appeared to be time consuming and error prone. The drawing tools offered by the digital MP/BM system are convoluted and counter-intuitive (for example, having to left-click on the map and then right-click on a drawing object whilst holding Control in order to place objects on the map), and as a result the process of drawing on the map using the digital MP/BM system is difficult, time consuming and error prone.

As a result of these problems, planners used a combination of the old paper map process and the new digital MP/BM process. As certain aspects of the digital MP/BM system were too time consuming and deemed unusable, some of the activities were undertaken using paper maps, acetates and traditional drawing or marking-up tools (for example, pens, stickies and so on). Although this appeared to work well, there ultimately comes a point when the end planning product has to be produced on the digital MP/BM system so that it can be distributed around the planning system. This leads to a doubling of the process (that is, the process is undertaken on paper and then again on the digital MP/BM system), which lengthens the overall planning process and ultimately reduces operational tempo.

The question therefore remains as to whether the process of analysing the battlefield area (that is, question one) can be facilitated in any way via digitisation. Since it is mainly a cognitive process, it is questionable whether digitising this aspect of the planning process is beneficial in any way. The traditional paper maps certainly suffice in terms of providing an accurate representation of the battlefield area, and planners can ‘zoom in and out’ of the battlefield area using maps of different scales. Drawing on acetates is also much easier and more user friendly than drawing on the digital MP/BM system. The outputs of question one consist of various overlays and textual descriptions, and it is here that perhaps the only real benefit of digitising the process emerges, since electronic outputs can be communicated more rapidly and to a wider audience. However, this could easily be achieved via the use of scanners or smartboards.

Question Two

Question two is known as the Mission Analysis and asks the question ‘what have I been told to do and why?’ Of specific interest during question two are the specified and implied tasks and the freedoms and constraints associated with the mission. Undertaking the Mission Analysis involves completing a Mission Analysis record, which requires a statement of the mission intent both 2 up (that is, 2 echelons up the command chain) and 1 up (1 echelon up the command chain), a statement of the main effort, specification of the specified and implied tasks and their deductions, Requests for Information (RFIs) and Commander’s Critical Information Requirements (CCIRs) and, finally, the freedoms and constraints associated with the mission. Specified tasks are typically found in the mission statement, the coordination instructions, the Decision Support Overlay (DSO), the intelligence collection plan and the Combat Service Support for Operations (CSSO) (CAST, 2007). The output derived from question two is a completed Mission Analysis record detailing the mission, the main effort, the specified and implied tasks, any RFIs and the CCIRs.

image

Figure 5.3 Question One HTA extract (1)

image

Figure 5.4 Question One HTA extract (2)

image

Figure 5.5 Question One HTA extract (3)

The HTA for the digital MP/BM question two Mission Analysis process is presented in Figure 5.6. Completing the Mission Analysis component in digital MP/BM entails manually entering a description of the mission, the specified and implied tasks, any mission constraints and any additional information using the Mission Analysis tool. Following this, CCIRs and RFIs are entered into the RFI section in digital MP/BM. The Mission Analysis component of digital MP/BM uses a simple text box and keyboard entry system.

Completing the Mission Analysis record within digital MP/BM is a straightforward process and has the added benefit of allowing planners to review and refine the Mission Analysis details as they enter them. Again, the key benefit here is that the Mission Analysis record can be quickly disseminated to other agents using the system’s messaging functionality. The only problematic aspect revealed by the HTA is the lack of a free text entry function for the specified and implied tasks section. Currently the user has to use an ‘add specified/implied task’ button, which opens a new window in which the details are subsequently entered. This is convoluted; allowing the user to enter the text directly into the specified/implied task text boxes would be more appropriate. Additionally (although this is a problem throughout the system), the use of a ‘close’ (X) icon, rather than an ‘OK’ button, to exit completed data entry windows is problematic and counter-intuitive.

Question Three

Question three involves the Commander specifying the effects that they wish to have on the enemy (CAST, 2007), referred to as their battle-winning idea, or ‘that battlefield activity or technique which would most directly accomplish the mission’ (CAST, 2007, p. 23). Based on the information derived from questions one and two, the Commander should now understand the battlespace area and the aims of the friendly forces involved, and should also comprehend how the enemy are likely to operate. Using this understanding, the Commander then identifies the effects required to achieve the mission and to prevent the enemy from achieving theirs. The Commander specifies these effects using an Effects Schematic and gives the purpose and direction to the staff for each of the effects described. Additionally, the Commander specifies what the main effort is likely to be and their desired end state. Further direction designed to focus the planning effort is also given at this stage. This might include guidance on the use of applicable functions in digital MP/BM, principles of war and principles of the operation (CAST, 2007). Finally, the Commander confirms their CCIRs and RFIs.

During the exercise the completion of question three using the digital MP/BM system was not observed; an HTA was therefore developed based on the analyst’s understanding of how question three should be undertaken using the digital MP/BM system. The HTA for question three is presented in Figure 5.7. The process involves the use of the user-defined overlay and the digital MP/BM drawing tools to construct the Commander’s Effects Schematic. The tasks involved in opening a new user-defined overlay and using the drawing tools to construct an overlay are described in the other HTAs in this section. We can therefore conclude that the same issues are present here, that is, the various usability problems associated with the drawing tools.

image

Figure 5.6 Question Two Mission Analysis HTA

image

Figure 5.7 Question Three HTA

Questions Four, Five, Six and Seven

Questions four, five, six and seven are primarily concerned with the development of the CoAs required to achieve the Commander’s desired end state. Since these questions are typically undertaken together or in parallel, the digital MP/BM system contains a Q4–7 CoA development tool. The HTA for questions 4–7 is presented in Figure 5.8.

Question four involves identifying where each of the actions and effects specified by the Commander is likely to be best achieved in the present battlespace area, and involves placing the Commander’s effects, NAIs, Target Areas of Interest (TAIs) and Decision Points (DPs) on the map. Although some of the effects are likely to be dictated by the Commander and the ground, others, such as STRIKE and DEFEAT effects, can potentially take place in a number of areas depending on a variety of factors such as enemy location, terrain and friendly force capability. The output of question four is the draft DSO, which contains the Commander’s effects, NAIs, TAIs and DPs for the mission. An HTA for the construction of the DSO is presented in Figure 5.9.

Undertaking question four in digital MP/BM involves manually adding the Commander’s effects, NAIs, TAIs and DPs to the battlefield area and then entering these features textually using the DSO tool. Again, due to the usability issues associated with the digital MP/BM system’s drawing tools, producing the DSO in digital MP/BM is both time consuming and error prone. Observation of one instance of this process indicated that producing the DSO using digital MP/BM took approximately four times longer than using paper maps and acetates (60 minutes, even after significant practice).

Question Five

Question five involves specifying resources for each of the Commander’s effects, NAIs, TAIs and DPs. This involves considering the effects required and then the mission, combat power, type, size and strength of the enemy at each NAI and TAI. Much of this information can be derived from the assessment of the enemy’s strengths and weaknesses made during question one as part of the threat evaluation. The output of question five is a series of potential CoAs for each effect, NAI and TAI and a Decision Support Overlay Matrix (DSOM). The Commander then makes a decision on how each effect, NAI and TAI is to be resourced, which leads to the production of the final DSOM. The DSOM is produced semi-automatically in digital MP/BM; however, some portions of it still require manual completion, namely the purpose, assets and remarks sections. The lack of explicit links between the NAIs, TAIs and DPs within the digital MP/BM DSOM is a problem: users cannot easily discern the relationships between them. There is also some concern over the automatic production of the DSOM; the lack of user involvement in this process may increase error potential, since the user does not review and refine the details as the DSOM is completed.
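As an illustration of the kind of explicit linking that is missing, the sketch below shows a hypothetical DSOM row in which each DP is tied directly to the NAI and TAI it depends on. The field names follow the sections described above (purpose, assets, remarks), but the structure and example values are assumptions for illustration, not the digital MP/BM data model.

```python
from dataclasses import dataclass

@dataclass
class DsomRow:
    """One hypothetical DSOM row with its NAI-TAI-DP links made explicit."""
    dp: str            # Decision Point the row supports, e.g. 'DP 2'
    nai: str           # NAI whose activity triggers the decision
    tai: str           # TAI to be resourced if the decision is taken
    purpose: str       # completed manually in digital MP/BM
    assets: str        # completed manually in digital MP/BM
    remarks: str = ""  # completed manually in digital MP/BM

# Invented example values, not taken from the exercise.
row = DsomRow(dp="DP 2", nai="NAI 4", tai="TAI 1",
              purpose="Confirm enemy axis before committing the reserve",
              assets="UAV; 2 x recce sections")
```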

Question Six

Question six focuses on the time and location of each CoA, that is, when and where do the actions take place in relation to one another? To determine this, a synchronisation matrix is produced, which includes a statement of the overall mission and the concept of operations, and then a breakdown of events related to time, including enemy actions, friendly force components’ activities and DPs. The output of question six is a draft synchronisation matrix and a decision support matrix.

Question six is completed within digital MP/BM using the synchronisation matrix (synch matrix) tool (manually, the synch matrix is constructed on a flipchart). The digital MP/BM synch matrix tool was found to be problematic during the exercise. For example, during the planning phase in the BG the synch matrix was constructed on a digital MP/BM terminal in the plans cell. An HTA of the synch matrix construction task is presented in Figure 5.10.

image

Figure 5.8 Questions Four-Seven HTA

image

Figure 5.9 DSO construction HTA

Due to problems with the synch matrix tool, the synch matrix product took around six hours to complete. The process of constructing the synch matrix within digital MP/BM appeared to be unintuitive, error prone and overly time consuming. The old process involved manually drawing a synch matrix on a whiteboard and populating it as appropriate. The digital MP/BM process involves first constructing a synch matrix template and then populating it by adding different action groups (for example, enemy and friendly forces), actions (for example, recce, arty and so on), events and timings. The synch matrix is then distributed via publish and subscribe and printed out.

The user in this case made many errors and had to consult various other users on how to undertake the process correctly. The menu icons were also found to be unintuitive and the user had to continually hover the mouse over them in order to see what they actually were. The errors made ranged from failing to enter data (for example, failing to enter an action group name), failing to select data (for example, failing to select an action group or action), entering the same data twice (for example, entering an action name twice) and entering the wrong data (for example, selecting the wrong item from a drop-down menu, entering the wrong name), to constructing the overlay incorrectly (the user put all of the actions under the enemy rather than the friendly action group). This final error could not be recovered from (although it would be simple to make it recoverable) and, almost an hour after beginning, the user had to abandon the attempt and restart the entire process.
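The fatal error in this vignette, placing every action under the enemy rather than the friendly action group, is structurally just a re-parenting operation. A minimal sketch of a synch matrix data model that would make it recoverable in one step is shown below; the class and field names are hypothetical and are not taken from the digital MP/BM software.

```python
from dataclasses import dataclass, field

@dataclass
class Action:
    name: str                                               # e.g. 'recce', 'arty'
    timings: dict[str, str] = field(default_factory=dict)   # time -> event

@dataclass
class ActionGroup:
    name: str                                               # e.g. 'Enemy', 'Friendly'
    actions: list[Action] = field(default_factory=list)

def move_action(name: str, src: ActionGroup, dst: ActionGroup) -> None:
    """Re-parent an action, making the 'wrong action group' error recoverable."""
    for action in src.actions:
        if action.name == name:
            src.actions.remove(action)
            dst.actions.append(action)
            return
    raise ValueError(f"{name!r} not found in action group {src.name!r}")

enemy = ActionGroup("Enemy", [Action("arty")])   # action placed under the wrong group
friendly = ActionGroup("Friendly")
move_action("arty", enemy, friendly)             # a one-step recovery
```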

image

Figure 5.10 Synchronisation matrix construction HTA

Also evident during this vignette was a lack of awareness of what the digital MP/BM system is actually capable of. At one point a colleague pointed out that the synch matrix can be automatically populated (a process that was causing the user great difficulty) via the TASKORG product. Another colleague agreed that this could be done, whereas the user constructing the synch matrix was not sure. However, none of the three knew how to populate the synch matrix with the TASKORG. Printing the synch matrix product was also problematic, as the printer used was incapable of printing the entire synch matrix on one page, which meant that the whole matrix could not be viewed at once. As a result of the problems identified above, the synch matrix was not ready in time and the orders were sent without it.

It is apparent that the synch matrix construction process could be made much simpler and more intuitive. For example, populating the Action group and Actions columns of the table could involve simply clicking on the column and typing the action group or action name. Currently the user can right-click on this column to modify or move existing action groups and actions but cannot add new ones in this manner.

Question Seven

Finally, question seven involves identifying any control measures that are required for the CoAs specified. Control measures are the means by which activities are coordinated and controlled. Control measures include phase lines, boundaries, fire support coordination measures and lines, assembly areas and rules of engagement. Within the digital MP/BM system control measures are added to the map using the drawing tools and the details are entered textually within the Q4–7 CoA development window.

Human Error Analysis

The HTA outputs were used to inform the conduct of a Human Error Identification (HEI) analysis of the digital MP/BM software tool. HEI techniques offer a proactive strategy for investigating human error in complex socio-technical systems. HEI works on the premise that an understanding of an employee’s work task and the characteristics of the technology being used allows us to predict the potential errors that may arise from the resulting interaction (Baber and Stanton, 1996). Following a number of high-profile, human error-related catastrophes in the late 1970s and 1980s, such as the Three Mile Island, Bhopal and Chernobyl disasters, the use of HEI techniques has become widespread, with applications in a wide range of domains. HEI analyses typically offer descriptions of potential errors, their causal factors and consequences, and proposed remedial measures designed to reduce the potential for the identified errors to occur.

Systematic Human Error Reduction and Prediction Approach

Of the many HEI approaches available, the Systematic Human Error Reduction and Prediction Approach (SHERPA; Embrey, 1986) is the most commonly used and most successful. Most of the approaches available are domain specific (that is, developed for a specific application within a specific domain); the SHERPA approach, however, uses a generic error taxonomy and so can easily be applied in new domains. SHERPA was developed originally for use in the nuclear reprocessing industry but has since been applied across many other industries.

SHERPA uses a generic External Error Mode (EEM) taxonomy linked to a behavioural taxonomy and is applied to an HTA of the task or scenario under analysis. The behavioural and EEM taxonomies are used to identify credible errors that are likely to occur during each step in the HTA. First, each bottom-level task step from the HTA is classified as one of the following five behaviour types from the SHERPA behaviour taxonomy:

• Action – for example, pressing a button, typing in data and so on.

• Check – for example, making a procedural check.

• Information Retrieval – for example, retrieving information from a display or document.

• Information Communication – for example, talking to a colleague or co-worker.

• Selection – for example, selecting one alternative over another.

Each SHERPA behaviour classification has a set of associated EEMs. The SHERPA EEM taxonomy is presented in Figure 5.11.
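To make the link between the two taxonomies concrete, the sketch below encodes an abridged behaviour-to-EEM mapping in Python. The mode codes follow the standard SHERPA taxonomy (the full set appears in Figure 5.11), but the lists here are deliberately incomplete and serve only as an illustration.

```python
# Abridged mapping from SHERPA behaviour types to their External Error
# Modes (EEMs); codes follow the standard SHERPA taxonomy but each list
# here is incomplete - the full taxonomy appears in Figure 5.11.
EEM_TAXONOMY = {
    "Action": {
        "A4": "Operation too little/too much",
        "A5": "Misalign",
        "A6": "Right operation on wrong object",
        "A7": "Wrong operation on right object",
        "A8": "Operation omitted",
    },
    "Check": {"C1": "Check omitted", "C2": "Check incomplete"},
    "Information Retrieval": {"R2": "Wrong information obtained"},
    "Information Communication": {"I1": "Information not communicated"},
    "Selection": {"S2": "Wrong selection made"},
}

def credible_eems(behaviour: str) -> dict[str, str]:
    """Return the candidate error modes for a task step's behaviour type."""
    return EEM_TAXONOMY.get(behaviour, {})
```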

The EEM taxonomy and domain expertise are then used to identify, based on the analyst’s subjective judgement, any credible error modes for the task step in question. For each credible error identified, a description of the form that the error would take is provided, such as ‘pilot dials in wrong airspeed’ or ‘pilot fails to check current flap setting’. Next, the analyst describes any consequences associated with the error and any recovery steps that would need to be taken in the event of the error. Ordinal ratings of probability (Low, Medium or High) and criticality (Low, Medium or High) are then provided. The final step involves specifying potential design remedies (that is, how the interface or device can be modified in order to remove or reduce the chances of the error occurring) for each of the errors identified. A flowchart depicting the SHERPA procedure is presented in Figure 5.12.
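Putting these steps together, each credible error yields one row of the SHERPA output table. The sketch below captures those columns as a simple record; the example values are invented for illustration and are not taken from the analysis reported in this chapter.

```python
from dataclasses import dataclass

@dataclass
class SherpaEntry:
    """One row of a SHERPA output table, mirroring the steps described above."""
    task_step: str     # bottom-level HTA step, e.g. '1.3.2'
    behaviour: str     # one of the five SHERPA behaviour types
    error_mode: str    # EEM code, e.g. 'A6'
    description: str   # the form the error would take
    consequence: str
    recovery: str      # step at which the error can be recovered, if any
    probability: str   # ordinal: 'L', 'M' or 'H'
    criticality: str   # ordinal: 'L', 'M' or 'H'
    remedy: str        # proposed design remedy

# Invented example for illustration; not an entry from Table 5.1.
entry = SherpaEntry(
    task_step="1.3.2",
    behaviour="Action",
    error_mode="A6",
    description="User selects the wrong drawing object from the toolbar",
    consequence="Wrong symbol is placed on the overlay",
    recovery="Immediate, by deleting the misplaced object",
    probability="M",
    criticality="L",
    remedy="Use standard, clearly labelled toolbar icons",
)
```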

A SHERPA analysis was conducted using the HTAs presented previously. This involved one analyst predicting errors for each of the bottom level task steps contained in the seven questions planning HTAs. For example purposes, an extract of the SHERPA analysis for the question one HTA is presented in Table 5.1.

Conclusions

HTA Analysis

The HTA analysis indicated that there are significant problems associated with the seven questions-related planning tools that the digital MP/BM system offers. In the main, these tools are counter-intuitive, difficult and time consuming to use, and error prone. Furthermore, when compared with their paper map counterparts, the digitised versions appear to offer no real benefit (the only significant benefit being the ability to disseminate planning products more quickly, further and to a wider audience; however, this is tempered by the additional time taken to complete the planning products using the digital system). A summary of the main findings in relation to the planning products and tools used during the seven questions process is presented in Table 5.2.

image

Figure 5.11 SHERPA EEM taxonomy

The HTA analyses therefore indicate that there are significant problems associated with the user interface and tools within the digital MP/BM system. The more pertinent issues are summarised below:

Lack of standardised conventions. The most striking finding to emerge from the HTA analysis is the general lack of standardised user conventions within the digital MP/BM system. The designers of the GUI have failed to exploit standardised conventions from existing (and well-used) systems, such as Windows XP, Microsoft Visio (drawing package) and Microsoft Word. Examples of these omissions include the digital MP/BM copy process (which involves holding Control on the keyboard and right-clicking the mouse to copy, and then left-clicking to paste), having to click on X in some windows upon completion of a task and the lack of a drag-and-drop drawing function. These are all instances where standardised conventions have been overlooked, and the resulting processes are thus often alien to new users. In addition, the icons used (for example, for toolbar buttons) are typically non-standard and often confusing.

image

Figure 5.12 SHERPA flowchart

Convoluted processes. A number of the processes involved when using the digital MP/BM tool are convoluted. In particular, the marking-up of maps and the development of overlays are two processes that are overly complicated and require intricate task steps. Both represent instances where the designers have failed to exploit the user’s mental model of the current process and standardised conventions.

Table 5.1 Question One SHERPA analysis extract

image

Table 5.2 HTA analysis product construction findings summary

image

Oversized menus. Some of the menus contained within the digital MP/BM system are overly large and contain too many items to be usable. This increases user interaction time and also the potential for error. For example, the create/modify menu contains 38 different menu items.

Drawing tools unintuitive and overly complex. The drawing system offered by the digital MP/BM system is especially problematic. The system is unintuitive, difficult to use and heavily prone to error. As a consequence, drawing processes (for example, the marking-up of maps and overlay construction) are time consuming and complex; so much so that the paper map drawing system was the preferred option during the exercise activities observed.

Lack of consistency. In terms of user interface design, consistency refers to ‘common action sequences, terms, units, layouts, colour, typography and so on within an application program’ (Shneiderman, 1998, p. 13). Shneiderman (1998) suggests that consistency is a strong determinant of the success of systems and includes it in his eight golden rules of interface design. At times there appears to be a lack of consistency between different aspects of the digital MP/BM system’s interfaces. For example, in some cases the user can click on an ‘OK’ button to finish a process, whereas in other instances there is no ‘OK’ button and the user has to click on ‘X’ to signal that they have completed a process.

Inappropriate use, or lack of use, of automation. Currently the system does not appropriately exploit the potential for automating certain aspects of the planning process. Processes such as the loading of the system and the Mission Analysis could easily be improved through appropriately designed automation.

Interface clarity. The current GUI is insufficient to support quick, intuitive and error-free performance. Certain aspects of the interface are not sufficiently clear or prominent and, as a result, it may take new users a considerable amount of time to find them or to determine what they actually are.

To summarise the main findings derived from the HTA assessment in relation to the overall digital MP/BM system, the advantages and disadvantages of both the paper map and the digital MP/BM planning process can be compared at a high level. This is represented in Figure 5.13.

Systematic Human Error Reduction and Prediction Approach Analysis

The SHERPA analysis highlighted a number of different errors that are likely to occur as a result of user-digital MP/BM interactions. A summary of the different error types is given below:

A4 Operation too little/too much errors – mainly involved not scrolling enough or scrolling too far down the digital MP/BM menus. This resulted in the user selecting the wrong item from the menu.

A5 Misalign errors – mainly involved the user selecting the wrong area on the map display when marking-up maps, selecting the wrong item from menus or pressing the wrong button or command on the interface.

A6 Right operation on wrong object errors – there were many A6 errors identified. These included the user selecting the wrong function from the toolbar or drop down menus, clicking on the wrong item/button/command on the interface (due to inadequate design of icons), entering the wrong data or pressing the wrong key.

A7 Wrong operation on right object errors – these errors mainly included incorrect data entry errors.

A8 Operation omitted errors – these errors involved the user failing to perform a task such as failing to save the current file, failing to enter data, failing to select required functions and failing to locate tools and functions.

C1 Check omitted errors – these errors involved the user failing to check data of some sort, including failing to check data entered, failing to check data presented by digital MP/BM and failing to check that data had been copied successfully.

R2 Wrong information obtained errors – these were mainly misread errors, where the user misread data presented by the system.

image

Figure 5.13 Advantages and disadvantages of each planning process

S2 Wrong selection errors – involved the user making a wrong selection of some sort, such as selecting the wrong function from the toolbar or drop-down menus or selecting inappropriate positions.

A series of remedial measures were proposed in order to eradicate or reduce the likelihood of the identified errors occurring.