Introduction to Transaction Processing: Computer-Based Accounting Systems
Computer-Based Accounting Systems
The final section in this chapter examines alternative computer-based transaction processing models. Computer-based accounting systems fall into two broad classes: batch systems and real-time systems. A number of alternative configurations exist within each of these classes. Systems designers base their
configuration choices on a variety of considerations. Table 2-1 summarizes some of the distinguishing characteristics of batch and real-time processing that feature prominently in these decisions.
DIFFERENCES BETWEEN BATCH AND REAL-TIME SYSTEMS
Information Time Frame
Batch systems assemble transactions into groups for processing. Under this approach, there is always a time lag between the point at which an economic event occurs and the point at which it is reflected in the firm’s accounts. The amount of lag depends on the frequency of batch processing. Time lags can range from minutes to weeks. Payroll processing is an example of a typical batch system. The economic events—the application of employee labor—occur continuously throughout the pay period. At the end of the period, the paychecks for all employees are prepared together as a batch.
Real-time systems process transactions individually at the moment the event occurs. Because records are not grouped into batches, there are no time lags between occurrence and recording. An example of real-time processing is an airline reservations system, which processes requests for services from one traveler at a time while he or she waits.
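To make the distinction in information time frame concrete, the following minimal sketch (not from the text; all names and values are hypothetical) contrasts the two approaches: batched events accumulate and are posted together at period end, while real-time events are posted the moment they occur.

```python
from datetime import datetime

# Hypothetical illustration of the two processing modes.

def post_to_accounts(txn):
    # Stand-in for updating the firm's accounts.
    print(f"{datetime.now().isoformat()} posted {txn}")

# Batch approach: events accumulate and are processed together later,
# so the records lag behind the economic events (e.g., payroll at period end).
batch = []

def record_event_batch(txn):
    batch.append(txn)          # nothing is posted yet -- the time lag begins here

def run_payroll_batch():
    for txn in batch:
        post_to_accounts(txn)  # all events posted at once, at the end of the period
    batch.clear()

# Real-time approach: each event is posted the moment it occurs,
# so there is no lag between occurrence and recording.
def record_event_real_time(txn):
    post_to_accounts(txn)      # posted immediately, like an airline reservation request
```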
Resources
Generally, batch systems demand fewer organizational resources (such as programming costs, computer time, and user training) than real-time systems. For example, batch systems can use sequential files stored on magnetic tape. Real-time systems use direct access files that require more expensive storage devices, such as magnetic disks. In practice, however, these cost differentials are disappearing. As a result, business organizations typically use magnetic disks for both batch and real-time processing.
The most significant resource differentials are in the areas of systems development (programming) and computer operations. As batch systems are generally simpler than their real-time counterparts, they
tend to have shorter development periods and are easier for programmers to maintain. On the other hand, as much as 50 percent of the total programming costs for real-time systems are incurred in designing the user interfaces. Real-time systems must be friendly, forgiving, and easy to work with. Pop-up menus, online tutorials, and special help features require additional programming and add greatly to the cost of the system.
Finally, real-time systems require dedicated processing capacity. Real-time systems must deal with transactions as they occur. Some types of systems must be available 24 hours a day whether they are being used or not. The computer capacity dedicated to such systems cannot be used for other purposes. Thus, implementing a real-time system may require either the purchase of a dedicated computer or an investment in additional computer capacity. In contrast, batch systems use computer capacity only when the program is being run. When the batch job completes processing, the freed capacity can be reallocated to other applications.
Operational Efficiency
Real-time processing in systems that handle large volumes of transactions each day can create operational inefficiencies. A single transaction may affect several different accounts. Some of these accounts, however, may not need to be updated in real time. In fact, the task of doing so takes time that, when multiplied by hundreds or thousands of transactions, can cause significant processing delays. Batch processing of noncritical accounts, however, improves operational efficiency by eliminating unnecessary activities at critical points in the process. This is illustrated with an example later in the chapter.
Efficiency Versus Effectiveness
In selecting a data processing mode, the designer must consider the trade-off between efficiency and effectiveness. For example, users of an airline reservations system cannot wait until 100 passengers (an efficient batch size) assemble in the travel agent’s office before their transactions are processed. When immediate access to current information is critical to the user’s needs, real-time processing is the logical choice. When time lags in information have no detrimental effects on the user’s performance and operational efficiencies can be achieved by processing data in batches, batch processing is probably the superior choice.
ALTERNATIVE DATA PROCESSING APPROACHES
Legacy Systems Versus Modern Systems
Not all modern organizations use entirely modern information systems. Some firms employ legacy systems for certain aspects of their data processing. When legacy systems are used to process financially significant transactions, auditors need to know how to evaluate and test them. We saw in Chapter 1 that legacy systems tend to have the following distinguishing features: they are mainframe-based applications; they tend to be batch oriented; early legacy systems use flat files for data storage, but hierarchical and network databases are often associated with later-era legacy systems. These highly structured and inflexible storage systems promote a single-user environment that discourages information integration within business organizations.
Modern systems tend to be client-server (network)–based and process transactions in real time. Although this is the trend in most organizations, please note that many modern systems are mainframe-based and use batch processing. Unlike their predecessors, modern systems store transactions and master files in relational database tables. A major advantage of database storage is the degree of process integration and data sharing that can be achieved.
Although legacy system configurations no longer constitute the defining features of accounting information systems (AIS), they are still of marginal importance to accountants. Therefore, for those who seek further understanding of legacy system issues, detailed material on transaction processing techniques using flat-file structures is provided in Section B of the Appendix to this chapter.
The remainder of the chapter focuses on modern system technologies used for processing accounting transactions. Some systems employ a combination of batch and real-time processing, while others are purely real-time systems. In several chapters that follow, we will examine how these approaches are configured to support specific functions such as sales order processing, purchasing, and payroll.
Updating Master Files from Transactions
Whether batch or real-time processing is being used, updating a master file record involves changing the value of one or more of its variable fields to reflect the effects of a transaction. Figure 2-28 presents record structures for a sales order transaction file and two associated master files, AR and inventory. The primary key (PK)—the unique identifier—for the inventory file is INVENTORY NUMBER. The primary key for AR is ACCOUNT NUMBER. Notice that the record structure for the sales order file contains a primary key (SALES ORDER NUMBER) and two secondary key (SK) fields, ACCOUNT NUMBER and INVENTORY NUMBER. These secondary keys are used for locating the corresponding records in the master files. To simplify the example, we assume that each sale is for a single item of inventory. Chapter 9 examines database structures in detail; there we study the database complexities associated with more realistic business transactions.
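The record structures in Figure 2-28 can be sketched as simple data types. The field names below follow the text; the Python layout itself is an assumption made for illustration only.

```python
from dataclasses import dataclass

# A minimal sketch of the record structures described in Figure 2-28.
# Field names follow the text; the data types are assumptions.

@dataclass
class SalesOrderRecord:                 # transaction file
    sales_order_number: str             # primary key (PK)
    account_number: str                 # secondary key (SK) -> AR master file
    inventory_number: str               # secondary key (SK) -> inventory master file
    quantity_sold: int
    invoice_amount: float

@dataclass
class ARRecord:                         # accounts receivable (AR) master file
    account_number: str                 # primary key (PK)
    current_balance: float

@dataclass
class InventoryRecord:                  # inventory master file
    inventory_number: str               # primary key (PK)
    quantity_on_hand: int
```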
The update procedure in this example involves the following steps (a brief code sketch follows the list):
1. A sales order record is read by the system.
2. ACCOUNT NUMBER is used to search the AR master file and retrieve the corresponding AR record.
3. The AR update procedure calculates the new customer balance by adding the value stored in the INVOICE AMOUNT field of the sales order record to the CURRENT BALANCE field value in the AR master record.
4. Next, INVENTORY NUMBER is used to search for the corresponding record in the inventory master file.
5. The inventory update program reduces inventory levels by deducting the QUANTITY SOLD value in a transaction record from the QUANTITY ON HAND field value in the inventory record.
6. A new sales order record is read, and the process is repeated.
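The following is a minimal sketch of this six-step loop. In-memory dictionaries keyed on the primary keys stand in for the master files, and all data values are hypothetical.

```python
# A minimal sketch of the update procedure described in steps 1-6.
# Dictionaries keyed on the primary keys stand in for the master files;
# all account numbers, item numbers, and amounts are hypothetical.

ar_master = {"C-100": {"current_balance": 100.00}}
inventory_master = {"I-200": {"quantity_on_hand": 50}}

sales_orders = [  # transaction file: one inventory item per sale, as assumed in the text
    {"sales_order_number": "SO-1", "account_number": "C-100",
     "inventory_number": "I-200", "quantity_sold": 2, "invoice_amount": 50.00},
]

for order in sales_orders:                                  # step 1: read a sales order record
    ar_rec = ar_master[order["account_number"]]              # step 2: locate the AR record via its SK
    ar_rec["current_balance"] += order["invoice_amount"]     # step 3: add invoice amount to the balance

    inv_rec = inventory_master[order["inventory_number"]]    # step 4: locate the inventory record via its SK
    inv_rec["quantity_on_hand"] -= order["quantity_sold"]    # step 5: deduct quantity sold
    # step 6: the loop reads the next sales order record and repeats

print(ar_master, inventory_master)
```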
Database Backup Procedures
Each record in a database file is assigned a unique disk location or address (see Section A of the chapter Appendix) that is determined by its primary key value. Because only a single valid location exists for each record, updating the record must occur in place. Figure 2-29 shows this technique.
In this example, an AR record with a $100 current balance is being updated by a $50 sale transaction.
The master file record is permanently stored at a disk address designated Location A. The update program reads both the transaction record and the master file record into memory. The receivable is updated to reflect the new current balance of $150 and then returned to Location A. The original current balance value of $100 is destroyed when it is replaced by the new value of $150. This technique is called destructive update.
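The sketch below illustrates destructive, in-place update of a fixed-length record. The record layout and the helper function are assumptions, not a prescribed file format.

```python
import struct

# A sketch of destructive (in-place) update of a fixed-length AR record.
# The record layout (10-byte account number + 8-byte balance) is an assumption.

RECORD_FORMAT = "10s d"
RECORD_SIZE = struct.calcsize(RECORD_FORMAT)

def update_in_place(path, record_position, sale_amount):
    """Overwrite the master record at its single valid disk location ("Location A")."""
    with open(path, "r+b") as master:
        master.seek(record_position * RECORD_SIZE)           # go to the record's only valid location
        account, balance = struct.unpack(RECORD_FORMAT, master.read(RECORD_SIZE))
        new_balance = balance + sale_amount                   # e.g., 100.00 + 50.00 = 150.00
        master.seek(record_position * RECORD_SIZE)
        master.write(struct.pack(RECORD_FORMAT, account, new_balance))
        # The original balance is now destroyed -- no backup copy remains in the file.
```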
The destructive update approach leaves no backup copy of the original master file. Only the current value is available to the user. To preserve adequate accounting records in case the current master becomes damaged or corrupted, separate backup procedures, such as those shown in Figure 2-30, must be implemented.
Prior to each batch update or periodically (for example, every 15 minutes), the master file being updated is copied to create a backup version of the original file. Should the current master be destroyed after the update process, reconstruction is possible in two stages. First, a special recovery program uses the backup file to create a pre-update version of the master file. Second, the file update process is repeated using the previous batch of transactions to restore the master to its current condition. Because of the potential risk to accounting records, accountants are naturally concerned about the adequacy of all backup procedures. In Chapter 15 we examine many issues related to file backup.
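A minimal sketch of this backup-and-recovery sequence follows. The file names and the update routine passed in are assumptions; the point is the order of operations: copy the master before updating, then restore and reprocess if the current master is lost.

```python
import shutil

# A sketch of the backup procedure in Figure 2-30: copy the master file
# before each batch update, and recover in two stages if the current
# master is destroyed. File paths and the update routine are assumptions.

def backup_then_update(master_path, backup_path, transactions, apply_update):
    shutil.copyfile(master_path, backup_path)   # 1. copy the master to create the backup version
    apply_update(master_path, transactions)     # 2. run the (destructive) batch update

def recover(master_path, backup_path, transactions, apply_update):
    # Stage 1: a recovery step restores the pre-update version of the master from the backup.
    shutil.copyfile(backup_path, master_path)
    # Stage 2: the previous batch of transactions is reprocessed to bring the
    # master back to its current condition.
    apply_update(master_path, transactions)
```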
BATCH PROCESSING USING REAL-TIME DATA COLLECTION
A popular data processing approach, particularly for large operations, is to electronically capture transaction data at the source as they occur. By distributing data input capability to users, certain transaction errors can be prevented or detected and corrected at their source. The result is a transaction file that is free from most of the errors that plague older legacy systems. The transaction file is later processed in batch mode to achieve operational efficiency. Figure 2-31 illustrates this
approach with a simplified sales order system such as that used in a department store. Key steps in the process are listed below (a code sketch follows the list):
• The sales department clerk captures customer sales data pertaining to the item(s) being purchased and the customer’s account.
• The system then checks the customer’s credit limit against data in the customer record (accounts receivable subsidiary file) and updates his or her account balance to reflect the amount of the sale.
• Next the system updates the quantity-on-hand field in the inventory record (inventory subsidiary file) to reflect the reduction in inventory. This provides up-to-date information to other clerks as to inventory availability.
• A record of the sale is then added to the sales order file (transaction file), which is processed in batch mode at the end of the business day. This batch process records each transaction in the sales journal and updates the affected general ledger accounts.
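The following sketch, under assumed names and values, shows the split: the credit check and subsidiary updates happen in real time as the sale is captured, while journal and general ledger postings are deferred to an end-of-day batch run.

```python
# A sketch of the approach in Figure 2-31: subsidiary records are updated in
# real time at the sales terminal, while journal and general ledger postings
# are deferred to an end-of-day batch run. All names, prices, and limits are assumptions.

customers = {"C-100": {"credit_limit": 1000.00, "balance": 100.00}}
inventory = {"I-200": {"quantity_on_hand": 50, "unit_price": 25.00}}
sales_order_file = []   # transaction file, processed in batch at the end of the day
general_ledger = {"sales": 0.0, "ar_control": 0.0}

def capture_sale(account, item, quantity):
    """Real-time portion: credit check and subsidiary updates at the sales terminal."""
    amount = quantity * inventory[item]["unit_price"]
    if customers[account]["balance"] + amount > customers[account]["credit_limit"]:
        raise ValueError("credit limit exceeded")            # error caught at its source
    customers[account]["balance"] += amount                   # AR subsidiary updated now
    inventory[item]["quantity_on_hand"] -= quantity           # inventory subsidiary updated now
    sales_order_file.append({"account": account, "item": item, "amount": amount})

def end_of_day_batch():
    """Batch portion: post the day's sales to the journal and general ledger accounts."""
    for sale in sales_order_file:
        general_ledger["sales"] += sale["amount"]
        general_ledger["ar_control"] += sale["amount"]
    sales_order_file.clear()
```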
You may be wondering at this point why the sales journal and general ledger accounts are being processed in batch mode. Why not update them in real time along with the subsidiary accounts? The answer is to achieve operational efficiency. We now examine what that means.
Let’s assume that the organization using the sales order system configuration illustrated in Figure 2-31 is large and capable of serving hundreds of customers concurrently. Also assume that 500 sales terminals are distributed throughout its many large departments.
Each customer sale affects the following six accounting records:
• Customer account receivable (Subsidiary—unique)
• Inventory item (Subsidiary—almost unique)
• Inventory control (GL—common)
• Accounts receivable control (GL—common)
• Sales (GL—common)
• Cost of goods sold (GL—common)
To maintain the integrity of accounting data, once a record has been accessed for processing, it is locked by the system and made unavailable to other users until its processing is complete. Using the affected records noted here as an example, consider the implications that this data-locking rule has on the users of the system.
When processing a customer account receivable subsidiary record, the rule has no implications for other users of the system. Each user accesses only his or her unique record. For example, accessing John
Smith’s account does not prevent Mary Jones from accessing her account. The inventory subsidiary record, however, is only almost unique to a transaction. Because it is possible that both Mary Jones and John Smith are independently purchasing the same item at the same time, Mary Jones may be kept waiting a few seconds until John Smith’s transaction releases the lock on the inventory record. This will be a relatively rare event, and any such conflict will cause customers little inconvenience. As a general rule, therefore, master file records that are unique to a transaction, such as customer accounts and individual inventory records, can be updated in real time without causing operational delays.
Updating the records in the general ledger is a different matter. All of the general ledger accounts listed previously need to be updated by every sales transaction. If the processing of John Smith’s transaction begins before that of Mary Jones, then she must wait until all six records have been updated before her transaction can proceed. The 20- or 30-second delay brought about by this conflict will probably not inconvenience Mary Jones; the problem becomes manifest, however, as transaction volumes increase. A 20-second delay in each of 500 customer transactions would create operational inefficiency on a chaotic level. Each of the 500 customers must wait until the customers ahead of them in the queue have completed their transactions. The last person in the queue would experience a delay of 500 × 20 seconds = 10,000 seconds, or roughly 2¾ hours.
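A quick back-of-the-envelope check of this figure (the 20-second delay and the 500 terminals come from the text; the uniform-delay assumption is a simplification):

```python
# Back-of-the-envelope check of the queuing delay described above.
delay_per_transaction = 20          # seconds spent holding locks on the common GL accounts
customers_in_queue = 500

total_wait = delay_per_transaction * customers_in_queue   # 10,000 seconds
print(total_wait / 3600)            # ~2.78 hours, i.e., roughly 2 3/4 hours for the last customer
```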
REAL-TIME PROCESSING
Real-time systems process the entire transaction as it occurs. For example, a sales order processed by the system in Figure 2-32 can be captured, filled, and shipped the same day. Such a system has many potential benefits, including improved productivity, reduced inventory, increased inventory turnover, decreased lags in customer billing, and enhanced customer satisfaction. Because transaction information is transmitted electronically, physical source documents can be eliminated or greatly reduced.
Real-time processing is well suited to systems that process lower transaction volumes and those that do not share common records. These systems make extensive use of local area network and wide area network technology. Terminals at distributed sites throughout the organization are used for receiving, processing, and sending information about current transactions. These must be linked in a network arrangement so users can communicate. The operational characteristics of networks are examined in Chapter 12.