Disadvantages of a conventional file-processing system

Keeping organizational information in a conventional file-processing system has a number of major disadvantages:

1. Data redundancy and inconsistency: Since the files and application programs are created by different programmers over a long period, the various files are likely to have different formats and the programs may be written in several programming languages. Moreover, the same information may be duplicated in several places; for example, the address of a customer may appear in both the savings-account file and the checking-account file. This redundancy leads to higher storage and access costs. In addition, it may lead to data inconsistency; that is, the various copies of the same data may no longer agree (sketch 1 after this list).

2. Difficulty in accessing data: Suppose one of the bank officers needs to find the names of all customers who live within zip code 630001. The officer asks the data-processing department to generate such a list. Because this request was not anticipated when the original system was designed, there is no application program on hand to meet it; a new one must be written for this single question (sketch 2 below). The point here is that conventional file-processing environments do not allow needed data to be retrieved in a convenient and efficient manner. More responsive data-retrieval systems must be developed for general use.

3. Data isolation: Because data are scattered in various files, and those files may be in different formats, it is difficult to write new application programs to retrieve the appropriate data (sketch 3 below).

4. Integrity problems: The data values stored in the files must satisfy certain types of consistency constraints. For example, the balance of a bank account may never fall below a prescribed amount. Developers enforce these constraints in the system by adding appropriate code to the various application programs (sketch 4 below). However, when new constraints are added, it is difficult to change all of the programs to enforce them.

5. Atomicity problems: A computer system, like any other mechanical or electrical device, is subject to failure. In many applications, it is crucial to ensure that, once a failure has occurred and has been detected, the data are restored to the consistent state that existed prior to the failure. Consider a program that transfers funds from one account to another: the transfer must be atomic; that is, it must happen in its entirety or not at all. If the system fails after the debit but before the credit, money simply disappears (sketch 5 below). It is difficult to ensure this property in a conventional file-processing system.

6. Concurrent-access anomalies: To improve the overall performance of the system and obtain a faster response time, many systems allow multiple users to update the data simultaneously. In such an environment, the interaction of concurrent updates may result in inconsistent data; for example, two programs withdrawing from the same account at the same time may each read the old balance, so that one withdrawal is lost (sketch 6 below). To guard against this possibility, the system must maintain some form of supervision. Because data may be accessed by many different application programs that have not been coordinated previously, supervision is difficult to provide.

7. Security problems: Not every user of the system should be able to access all the data. For example, in a banking system, payroll personnel need to see only that part of the data that has information about the various bank employees; they do not need access to information about customer accounts. Since application programs are added to the system in an ad hoc manner, it is difficult to enforce such security constraints (sketch 7 below).
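
Sketch 1 (data redundancy and inconsistency). A minimal Python sketch, assuming hypothetical files savings.csv and checking.csv that each keep their own copy of a customer's address; updating one copy and not the other leaves the two files in disagreement.

```python
import csv

# Hypothetical flat files: each application keeps its own copy of the
# customer's address, so the same fact is stored in two places.
with open("savings.csv", "w", newline="") as f:
    csv.writer(f).writerow(["C-101", "Ravi", "12 Main St", "630001"])
with open("checking.csv", "w", newline="") as f:
    csv.writer(f).writerow(["C-101", "Ravi", "12 Main St", "630001"])

# The savings application updates the address when the customer moves...
with open("savings.csv", newline="") as f:
    rows = list(csv.reader(f))
rows[0][2] = "45 Lake Rd"
with open("savings.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)

# ...but nothing forces the checking application to follow suit, so the
# two copies of the same data no longer agree.
with open("savings.csv", newline="") as f:
    print(next(csv.reader(f))[2])   # 45 Lake Rd
with open("checking.csv", newline="") as f:
    print(next(csv.reader(f))[2])   # 12 Main St
```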
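
Sketch 2 (difficulty in accessing data). A sketch of the one-off program the data-processing department would have to write just to answer the zip-code question; the file name customers.csv and its layout are assumptions for illustration.

```python
import csv

# Hypothetical flat file of (name, zip code) records.
with open("customers.csv", "w", newline="") as f:
    csv.writer(f).writerows([("Ravi", "630001"), ("Meena", "600042")])

def customers_in_zip(path, zip_code):
    # Every unanticipated request needs a hand-written scan like this one;
    # a database system answers it with a single declarative query instead.
    with open(path, newline="") as f:
        return [name for name, zipc in csv.reader(f) if zipc == zip_code]

print(customers_in_zip("customers.csv", "630001"))  # ['Ravi']
```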
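
Sketch 3 (data isolation). Two hypothetical layouts for the same kind of record, one comma-separated and one fixed-width; a new report program must carry a separate parser for every format it touches.

```python
def parse_csv_record(line):
    # Layout used by one application: "account,balance".
    account, balance = line.strip().split(",")
    return account, float(balance)

def parse_fixed_width_record(line):
    # Layout used by another: columns 0-9 account id, 10-19 balance.
    return line[:10].strip(), float(line[10:20])

print(parse_csv_record("A-217,350.00"))
print(parse_fixed_width_record("A-305     0000750.25"))
```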
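
Sketch 4 (integrity problems). A sketch of the minimum-balance rule enforced as hand-written application code; the constant and function names are hypothetical. Every program that touches balances needs its own copy of this check, and a new constraint means hunting down and editing all of them.

```python
MINIMUM_BALANCE = 100.0  # the prescribed amount, duplicated in each program

def withdraw(balance, amount):
    # The constraint lives in application code rather than in the data
    # store itself; another program may omit it or use a stale value.
    if balance - amount < MINIMUM_BALANCE:
        raise ValueError("balance would fall below the prescribed amount")
    return balance - amount

print(withdraw(500.0, 150.0))   # 350.0
# withdraw(500.0, 450.0) would raise ValueError.
```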
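
Sketch 5 (atomicity problems). A funds transfer done as two independent updates, with a simulated failure between the debit and the credit; afterwards the money has left one account without arriving in the other, and plain files offer no built-in rollback.

```python
balances = {"A-101": 500.0, "B-202": 200.0}  # stand-ins for two account files

def transfer(src, dst, amount, crash_midway=False):
    balances[src] -= amount                        # first update (debit)
    if crash_midway:
        raise RuntimeError("simulated power failure")
    balances[dst] += amount                        # second update (credit)

try:
    transfer("A-101", "B-202", 50.0, crash_midway=True)
except RuntimeError:
    pass

# The debit happened but the credit did not: the totals no longer add up.
print(balances)  # {'A-101': 450.0, 'B-202': 200.0}
```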
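
Sketch 6 (concurrent-access anomalies). A classic lost update: two threads each read the same balance, compute a new value, and write it back with no supervision. The sleep only widens the race window so the anomaly shows up reliably.

```python
import threading
import time

balance = 500.0  # shared data, standing in for a record in an account file

def withdraw(amount):
    global balance
    snapshot = balance            # read
    time.sleep(0.01)              # widen the race window for the demo
    balance = snapshot - amount   # write back, clobbering the other update

t1 = threading.Thread(target=withdraw, args=(50.0,))
t2 = threading.Thread(target=withdraw, args=(100.0,))
t1.start(); t2.start()
t1.join(); t2.join()

# Should be 350.0, but one withdrawal is lost: 450.0 or 400.0.
print(balance)
```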
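
Sketch 7 (security problems). Operating-system file permissions are all-or-nothing per file: once the hypothetical payroll program can open the shared file at all, it can read every record in it, including the customer accounts it has no business seeing.

```python
import csv

# Hypothetical shared file mixing employee and customer records.
with open("bank.csv", "w", newline="") as f:
    csv.writer(f).writerows([
        ("employee", "E-9", "Lakshmi", "salary=42000"),
        ("customer", "C-101", "Ravi", "balance=350.00"),
    ])

# Payroll only needs employee rows, but filtering is voluntary: the file
# grants access to everything or to nothing, never to just one part.
with open("bank.csv", newline="") as f:
    for record in csv.reader(f):
        print(record)   # customer balances are visible to payroll code too
```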
