User interface design

48.1 Purpose

A user interface is a point in the system where a human being interacts with a computer. This chapter discusses several different types of direct (human/computer) interfaces and the interface design process.

48.2 Strengths, weaknesses, and limitations

This chapter introduces some important principles of user interface design. The strengths, weaknesses, and limitations associated with specific interfaces or interface design techniques are discussed in context.

48.3 Inputs and related ideas

Before designing a user interface, the analyst or designer must first know the user and understand the task to be performed. Much of the necessary information is collected during the problem definition and information gathering (Part II), analysis (Part IV) and high-level design (Part V) stages of the system development life cycle. On a data flow diagram (Chapter 24), processes and data flows from sources and to destinations might suggest a need for user interfaces. The data elements that are input by or output to users are typically documented in the data dictionary (Chapter 25). The requirements specification (Chapter 35) identifies user needs, user characteristics (skill, training, etc.), and task requirements. At the high-level physical design stage, symbols on the system flowchart (Chapter 37) identify necessary reports, screens, forms, and keyboard operations. Prototyping (Chapter 31) and rapid application design (Chapter 32) are useful tools for designing a user interface.

This chapter focuses on certain general principles associated with direct user interface design. The contents of the screens that make up a direct user interface are discussed in Chapter 46 (screen and forms design), Chapter 49 (dialogue design), and Chapter 50 (windows design). Related concepts include report design (Chapter 47), web page design (Chapter 51), and natural language processing (Chapter 68).

48.4 Concepts

A user interface is a point in the system where a human being interacts with a computer. The interface can incorporate hardware, software, procedures, and data. The interaction can be direct; for example, a user might access a computer through a screen and a keyboard. Printed reports and forms designed to capture data for subsequent input are indirect user interfaces. This chapter focuses on direct computer interfaces.

48.4.1 The end user

Generally, the purpose of any information system is to provide the right data and information to the right person at the right time. That “person” is an end user.

An end user is any person who needs the output generated by the computer and/or any person who interacts with the computer at an operational level. Examples include a manager reading a report, a clerk entering data, an engineer using a CAD program to prepare a technical diagram, a production supervisor using software to plan a work schedule, and a technical writer using a word processor to prepare a manual. The end user communicates with the system through the user interface.

48.4.2 Types of user interfaces

There are several different types of direct user interfaces.

48.4.2.1 Command interfaces

Some user interfaces rely on abbreviated commands or acronyms. MS-DOS line commands, the single-letter slash commands used by early spreadsheet programs, and the keyboard shortcuts and function key commands available on many word processors are good examples.

Such cryptic commands save a sophisticated user the time that might otherwise be spent traversing menus or windows. Using cryptic commands also reduces the time needed to design the menus and the screens. Command-based interfaces require considerable user training, however, and it is unreasonable to expect users to memorize all the commands without referencing a command template.
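As a rough illustration (not taken from the chapter), the sketch below shows how a command-based interface might dispatch terse, memorized commands directly to actions. The command names and handlers are hypothetical.

```python
# Minimal sketch of a command-based interface: short, cryptic commands are
# mapped directly to actions. Command names and handlers are hypothetical.

def copy_file(args):
    print("copying", args)

def delete_file(args):
    print("deleting", args)

COMMANDS = {
    "cp": copy_file,    # experienced users type terse commands from memory
    "del": delete_file,
}

def command_loop():
    while True:
        line = input("> ").strip()
        if not line:
            continue
        verb, *args = line.split()
        if verb == "exit":
            break
        action = COMMANDS.get(verb)
        if action is None:
            print("Unknown command:", verb)   # no menu to fall back on
        else:
            action(args)
```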

48.4.2.2 Menu interfaces

A menu consists of a list of the options available to the user. Typically, the user selects the desired option by typing the option’s letter or number, highlighting the option and pressing enter, or pointing to the option and clicking a mouse button. Often, selecting a given option leads to a second menu listing suboptions, so a set of related screens and windows must be designed and implemented to support a menu-driven interface.
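The following sketch, with invented menu titles and options, illustrates the basic mechanics described above: the user picks an option by number, and some options lead to a second list of suboptions.

```python
# Minimal sketch of a menu-driven interface (titles and options are invented).
# Selecting an option from the main menu leads to a submenu of suboptions.

MAIN_MENU = {
    "1": ("Customer maintenance", ["Add customer", "Change customer", "Delete customer"]),
    "2": ("Order entry", ["New order", "Cancel order"]),
}

def show_menu():
    while True:
        print("\nMain menu")
        for key, (title, _) in MAIN_MENU.items():
            print(f"  {key}. {title}")
        print("  9. Exit")
        choice = input("Select an option: ").strip()
        if choice == "9":
            break
        if choice in MAIN_MENU:
            title, suboptions = MAIN_MENU[choice]
            print(f"\n{title}")
            for number, suboption in enumerate(suboptions, start=1):
                print(f"  {number}. {suboption}")
            input("Select a suboption (Enter to return): ")
        else:
            print("Invalid selection, please try again.")
```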

On a well-designed traditional interface, the relevant commands, subcommands, and/or menus should be logically grouped, and the design of the hierarchy should be intuitive to the user. Hidden commands or menus should be avoided. The commands and menus (as well as any related windows or screens) should be easy to access and to terminate.

Compared to cryptic commands, menus are more flexible, easier to use, and easier to learn. Traversing multiple menus can be time consuming, however, and creating a set of linked menus adds to system development time. Also, on large or complex systems, the menus occupy a great deal of random access memory (RAM).

48.4.2.3 Object-oriented interfaces

Object-oriented interfaces, also called icon-based interfaces or graphic user interfaces (GUIs), have become increasingly common since the introduction of the Apple Macintosh and Microsoft’s Windows operating systems. Windows, icons (graphic symbols that represent processing options, files, or executable routines), menus, and pointers are the key elements of an object-oriented interface. (Consequently, they are sometimes called WIMP interfaces.) Generally, the user points to the desired element and clicks a mouse button to trigger the associated action.

On a well-designed object-oriented interface, the meaning of each icon is apparent (almost intuitively obvious) to the user. Embedded or linked objects are clearly defined in the icon’s menu structure. Finally, each icon has a single entry and a single exit.

Object-oriented interfaces are easy to understand, learn, and use, and because all the available choices are displayed on the screen, there is no need for the user to memorize anything. They are also easy to maintain because each icon (or window, or menu) is implemented as an independent module. The windows, icons, and menus and the pointer logic consume considerable processor time and a great deal of memory, however.
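As a minimal illustration of the WIMP idea (not a design from the chapter), the sketch below uses Python's standard tkinter toolkit. Each on-screen object is an independent widget wired to its own handler, and pointing and clicking triggers the associated action; the labels and handlers are invented.

```python
# Minimal WIMP sketch using tkinter (Python's standard GUI toolkit).
# Each "icon" (here, a button) is an independent widget with its own handler.

import tkinter as tk

def open_document():
    status.config(text="Open document selected")

def print_document():
    status.config(text="Print document selected")

root = tk.Tk()
root.title("WIMP sketch")

# Pointing and clicking a widget triggers the associated action.
tk.Button(root, text="Open", command=open_document).pack(side="left", padx=5, pady=5)
tk.Button(root, text="Print", command=print_document).pack(side="left", padx=5, pady=5)

status = tk.Label(root, text="Point and click an option")
status.pack(side="bottom", fill="x")

root.mainloop()
```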

48.4.2.4 Expert system interfaces

Expert system interfaces utilize natural language processing (NLP) (Chapter 68). Key elements include the ability to parse and comprehend human sentences and paragraphs, voice recognition, and voice data entry. Such hardware as keyboards, pointing devices, and microphones might be used for input. Speakers provide audio output. Natural language processing requires a very powerful computer with a great deal of memory and a fast processor.

48.4.2.5 Web-form interfaces

Web-form interfaces (Chapter 51) follow the metaphor established by the Internet and the World Wide Web. Files and executable routines are viewed as hyperlinked pages. Some of those pages are designed to resemble forms that users either fill in directly or complete by selecting answers from a default list.

On a well-designed web-form interface, the layout of all forms is clear (almost intuitively obvious) and data entry is always verified. Additionally, the data entry process is supported by appropriate and meaningful prompts.
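The sketch below suggests one way the verification behind a web form might work; the field names, rules, and prompt wording are assumptions introduced purely for illustration.

```python
# Illustrative sketch of web-form verification: each submitted field is
# checked, and a meaningful prompt is returned for anything missing or
# invalid. Field names and validation rules are assumptions.

def validate_order_form(form):
    errors = {}
    if not form.get("customer_name", "").strip():
        errors["customer_name"] = "Please enter the customer's name."
    quantity = form.get("quantity", "")
    if not quantity.isdigit() or int(quantity) < 1:
        errors["quantity"] = "Quantity must be a whole number of 1 or more."
    if form.get("ship_method") not in ("ground", "air"):   # answers from a default list
        errors["ship_method"] = "Choose a shipping method from the list."
    return errors

# Example: an incomplete submission produces one prompt per problem field.
print(validate_order_form({"customer_name": "", "quantity": "0", "ship_method": "sea"}))
```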

48.4.3 User interface design criteria

In the past, when computers were expensive and people were (relatively) cheap, users were expected to interact with the computer on the machine’s terms, but that is no longer true. Given today’s technology, a user interface must be designed to allow the user to perform his or her job as effectively as possible. Machine efficiency should, of course, be considered, but only if it does not conflict with the primary objective.

Generally, a good interface is easy to use, easy to maintain, easy to learn, and incorporates readily available on-line help. Also, a good interface never leaves the user hanging, providing (as a minimum) a clear exit path from any operation.

48.4.3.1 System type

The precise nature of the user interface is a function of the type of system to be developed. For example, a typical management information system (MIS) incorporates numerous forms, reports, and access controls. A decision support system (DSS) emphasizes dialogues, windows, and interfaces between a database, a model base, a graph base, and/or a text base. An expert system needs interfaces between a rule base, a database, and/or a natural language processing (NLP) facility. Group decision support systems (GDSS) and/or electronic meeting systems (EMS) need interfaces with facilities that transmit and/or share data, such as the network.

48.4.3.2 The mental model

People tend to form their own mental model of a system. For example, a video card game player visualizes cards on a table, a video golf game player can imagine actually playing golf, and a flight simulator gives the user a realistic sense of flying an airplane. The mental model helps the user understand how the system works. A good mental model allows the user to predict the system’s response to a given stimulus, and the more accurate those predictions, the more intuitive the system appears. When the user understands the system at an intuitive level, the need for training declines, the error rate improves, and the user becomes more efficient.

When designing a user interface, the designer should try to select a mental model that makes sense to the user. For example, if the user filled out a paper form in the old system, that form might be simulated on the screen. If the mental model cannot be based on the user’s experience, the user must be trained to understand the new mental model and the designer must be prepared to adjust the model if the user has trouble understanding it. A good approach is to adopt a known metaphor such as the Microsoft Windows desktop. There is no point reinventing the wheel, and time spent on Windows training might simplify training for future applications.

48.4.3.3 Environmental issues

The system environment represents a potential constraint on interface design. For example, it is unreasonable to expect an automobile mechanic whose hands are covered with grease to enter data directly into a computer, and people whose work takes them away from sources of electricity or a system access port require special equipment to capture data electronically. Consider the nature of the end user, too. Such variables as education, training, skill, and handicaps serve to limit what a given person can reasonably be expected to do. The system must fit the user.

Other environmental factors have legal, moral, and ethical (as well as financial) implications. For example, over the past several years researchers have identified a variety of problems associated with video terminal use ranging from repetitive stress injuries, to eyestrain, to the possibility that exposure to low-level radiation might represent a hazard for pregnant women. Such factors must be taken into account when the user interface is designed. Many organizations have adopted explicit ergonomic standards for user interfaces.

Finally, consider legal and auditing requirements. For example, if a company’s auditing rules specify that a physical (printed) copy of each sales receipt be retained for a period of one full year, the ability to create, store, and retrieve those copies must be built into the system, perhaps through the user interface. If state law requires that all documents used to compute an employee’s pay be retained until six months after the fiscal year ends, and pay is computed in part as a commission on sales, then there is a legal reason to maintain a file of sales receipts. Such details can make the difference between a successful system and an embarrassing failure.

48.4.4 The user interface design process

End user involvement is valuable throughout the system development life cycle, but it is essential during interface design. By definition, supporting the end user is the ultimate objective of any information system. To the end user, the user interface is the system. Consequently, user interface design must be user-centered.

Interface design can be viewed as a complete system analysis and design project in its own right, but given the need for user involvement, prototyping (Chapter 31) and rapid application design (RAD, Chapter 32) are highly recommended. The basic idea is to gradually enhance an initial set of generalized, but inefficient (soft) capabilities until an easy-to-use, efficient, user-friendly system evolves. This soft capability approach allows the interface designer to build on a relatively small set of requirements and contributes to a more bug-free conceptual design. Also, as the system evolves, the soft capabilities are easily replaced by newly available technology, leading to an advanced interface design with greater power at a lower cost.

Figure 48.1 outlines the steps in the interface design process.

48.4.4.1 Overview and define

The first step is to identify and define the interface requirements (including the criteria described in Section 48.4.3) in enough detail to begin building a prototype. The nature of the proposed system must be known, and any important environmental factors must be identified. The nature and characteristics of the user tasks supported by the interface, the user needs implied by the interface, and any links between the interface and the rest of the system must also be known.

Figure 48.1  The steps in the interface design process.

This information can usually be obtained from the documentation developed during the problem definition and information gathering (Part II), analysis (Part IV), and high-level design (Part V) stages of the system development life cycle. For example, a properly drawn system flowchart (Chapter 37) shows all the system elements (manual procedures, input documents, output documents, display screens, etc.) that call for user interfaces and the links between those elements and the rest of the system.

48.4.4.2 Design and construct

The key objective of this stage is to construct a prototype based on the available information. Often, the first step in the process is to construct a hierarchy chart that shows the required windows or screens and the paths or links between them. For example, the hierarchy chart in Figure 48.2 shows the relationships between a menu screen and its immediate subscreens. By convention, control flows from top to bottom and back again, and the user can exit the system only from the top.

A hierarchy chart is an excellent tool for evaluating and planning the value, path, and destination associated with each user choice. The value or response associated with a given choice is an input value that activates the choice; for example, a four-choice menu might recognize as valid only choices 1, 2, 3, or 4. Choice 1 follows a path to a single destination (a single lower-level window or a single subscreen). Choice 2 follows a different path to a different destination, and so on.

Consistency is a particularly important design criterion. Values must be consistent; for example, a given set of related menus might use alphabetic characters or digits, but not both. Also, certain options (0 to return to the main menu, 9 to exit the system) might appear on all menus to provide a consistent set of basic navigation rules. Many organizations impose user interface “syntax” rules, often in the form of a standard set of user interface objects, to encourage consistency from application to application and code reusability.
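As an illustration of these ideas, the sketch below represents a hypothetical menu hierarchy as a table that maps each valid value (choice) to its destination, and checks a few of the consistency rules just described (digits only, 0 to return to the main menu, 9 to exit). The screen names and rules are invented, not part of any particular standard.

```python
# Illustrative sketch: the menu hierarchy as a table of screens, where each
# valid value (choice) maps to a destination. Screen names are made up.
# The reserved codes 0 (return to main menu) and 9 (exit) appear on every
# menu to keep navigation consistent.

HIERARCHY = {
    "main":      {"1": "orders", "2": "customers", "9": "exit"},
    "orders":    {"1": "new_order", "2": "cancel_order", "0": "main", "9": "exit"},
    "customers": {"1": "add_customer", "2": "change_customer", "0": "main", "9": "exit"},
}

def check_consistency(hierarchy):
    """Flag menus that mix digits with other characters or omit the reserved codes."""
    problems = []
    for screen, choices in hierarchy.items():
        if not all(value.isdigit() for value in choices):
            problems.append(f"{screen}: mixes digits with other characters")
        if screen != "main" and "0" not in choices:
            problems.append(f"{screen}: missing 0 (return to main menu)")
        if "9" not in choices:
            problems.append(f"{screen}: missing 9 (exit)")
    return problems

print(check_consistency(HIERARCHY))   # an empty list means the rules hold
```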

Figure 48.2  This hierarchy chart shows the relationships between a menu screen and its immediate subscreens.

Once the screens and menus are designed, the associated dialogues (Chapter 49) are coded and the prototype is constructed.

48.4.4.3 Test and evaluate

Before the prototype is turned over to the user, its stability must be tested. The basic idea is to trace the top-down and bottom-up links between the various screens and windows to ensure that elements are called in the proper sequence and that the necessary parameters are passed between levels. This process helps to correct any inconsistent paths (such as an exit from anything but the top level), eliminate infinite loops, and fix other problems that might cause the user to lose navigational control. Several CASE tools, fourth-generation languages (4GLs), and object-oriented tools contain embedded routines, modules, or functions that allow the designer to run the prototype and test for stability. For example, FOCUS has a feature called window paint that tests the prototype’s links and connections.
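The sketch below suggests what such a stability check might look like in principle (it is not a feature of any particular CASE tool): it traces the links between screens, flags screens that cannot be reached from the top, and flags any exit offered below the top level. The screen names are invented.

```python
# Illustrative stability check: walk the links between screens, flag screens
# that cannot be reached from the top, and flag exits below the top level.
# Screen names are made up; this is not a specific tool's algorithm.

from collections import deque

LINKS = {
    "main":      {"1": "orders", "9": "exit"},
    "orders":    {"1": "new_order", "0": "main", "9": "exit"},   # faulty: exit below top
    "new_order": {"0": "main"},
    "orphan":    {"0": "main"},                                  # faulty: never called
}

def check_stability(links, top="main"):
    problems = []
    reached, queue = {top}, deque([top])
    while queue:                                   # trace the links top down
        screen = queue.popleft()
        for destination in links.get(screen, {}).values():
            if destination == "exit":
                if screen != top:
                    problems.append(f"{screen}: exit offered below the top level")
            elif destination not in reached:
                reached.add(destination)
                queue.append(destination)
    for screen in links:                           # anything not reached is unreachable
        if screen not in reached:
            problems.append(f"{screen}: unreachable from {top}")
    return problems

print(check_stability(LINKS))
```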

Once the prototype is stable, the user begins to exercise it. The combination of user feedback and designer observation helps to identify unclear or inconsistent elements in the prototype design. Other test criteria include the objectives of the interface, the requirements of any procedures that access or rely on the interface, and such performance factors as error or failure rates, stability, linking sequence, and related systems performance. Essentially, the prototype is compared to the desired and/or expected results. If the prototype interface is acceptable, the next phase is skipped and the interface design is completed.

48.4.4.4 Feedback and refine

If user feedback or other test results suggest a need for reconceptualization, the design process returns to the first phase (overview and define) and the nature of the proposed interface is reevaluated. If user feedback or other test results suggest a need for redevelopment to correct design defects, the design process returns to the second phase (design and construct) and the prototype is modified. If the problems are concerned with the test criteria, test procedures, or test data, the design process returns to the third phase (test and evaluate) for retesting.

48.5 Key terms
Command-based interface —
A user interface that relies on cryptic commands and/or specific keystrokes to identify the desired action.
Direct user interface —
A user interface through which a user directly accesses a computer (for example, via a screen and a keyboard).
End user —
Any person who needs the output generated by the computer and/or who interacts with the computer at an operational level.
Ergonomics —
The study of the relationship between human beings and their workplaces.
Expert system interface —
A user interface that utilizes natural language processing.
Graphic user interface (GUI) —
A user interface that features windows, icons, menus, and pointers; generally, the user points to the desired element and clicks a mouse button to trigger the associated action. The Apple Macintosh and Microsoft Windows interfaces are common examples.
Icon —
A graphic symbol that represents a processing option, a file, or an executable routine.
Indirect user interface —
A user interface that does not involve direct computer access; for example, a printed report or a form designed to capture data for subsequent input.
Menu interface —
A user interface in which the list of the options available to the user is displayed in a table or menu.
Natural language processing —
Hardware and/or software that allows people to communicate with a computer in much the same way they communicate with each other; voice recognition is an example.
Object-oriented interface —
A user interface that features windows, icons, menus, and pointers; generally, the user points to the desired element and clicks a mouse button to trigger the associated action; also called an icon-based interface, a graphic user interface, or a WIMP interface.
Prototype —
A working physical model of a system or a subsystem.
User interface —
A point in the system where a human being interacts with a computer.
Web-form interface —
A user interface that follows the metaphor established by the Internet and the World Wide Web.
48.6 Software

Many CASE products support prototyping. Screen painters, menu builders, report generators, fourth-generation languages, executable specification languages, spreadsheets, and database management programs are popular prototyping tools.

The Apple Macintosh operating system and Microsoft Windows are examples of graphic user interfaces. Dragon Systems’ Naturally Speaking and IBM’s ViaVoice Gold are voice recognition software packages that might be used to support an expert system user interface. Netscape and Microsoft’s Internet Explorer define the metaphor for web-form interfaces.

