Are computer-based health record systems usable?

Computer-based health records are becoming the norm; this website presents successful case studies, key considerations, and guidance on using electronic dental health records effectively. But how usable is this software? Healthcare software often lags behind other market segments in usability. Three authors evaluated the usability of the computer-based records in four practice management systems: Dentrix, EagleSoft, SoftDent and Practice Works. Dentists who were novices with the software completed a variety of common clinical tasks to assess the efficacy and usability of each product.

Unsuccessful attempts at a task, incorrect completion, giving up on a task, general negativity about the task or software, and suggestions about the design were all counted as usability failures: “The range for correctly completed tasks was 16 to 64 percent, for incorrectly completed tasks 18 to 38 percent and for incomplete tasks 9 to 47 percent. The authors identified 286 usability problems.” These difficulties fell into a few key categories: unintuitive ordering of steps, poor guidance from labels and definitions, poorly organized control and function buttons, and users lacking a clear mental model of the data behind the task steps.

The authors sum up their findings as follows:

“The usability of dental CPRs must be improved to increase the adoption of CPR systems in dental practices. A second significant finding was the high frequency of task failures. Study participants failed to complete 28 percent of the tasks, and they made errors in completing 30 percent of the tasks. While it is difficult to infer error rates in daily practice from a laboratory study, these findings strongly suggest that there is a need to examine the incidence of documentation errors in practices that use CPR systems.”

To explain and contextualize their findings, the authors comment on the nature of the tasks and the likely root of the common errors:

“Few of the tasks in our study can be considered hard to perform. Most of them consisted of entering a single finding, diagnosis or planned procedure for a single anatomical location. What seemed to make performing the tasks a challenge for most users was that the user interface provided the capability to enter hundreds, if not thousands, of data with a few mouse clicks or keystrokes. The visual and functional complexity of the software applications seemed to overwhelm most users and appeared to be responsible for a large number of task failures.”

Lessons Learned:

  • Computerized patient record systems are very difficult to use without training, even for simple clinical tasks.
  • The high frequency of errors found across all four software platforms suggests that all electronic patient records (or EHRs) need better design and workflow.
  • Deep understanding of the data model behind computerized records is a potential avenue to reduce errors, but review of clinical documentation will also be necessary to catch data quality control problems.

Thyvalikakath TP, Schleyer TK, Monaco V (2007). Heuristic evaluation of clinical functions in four practice management systems: a pilot study. Journal of the American Dental Association 138(2): 209-10, 212-8.