Immunization-Related Function and Usability

The usability of electronic health records (EHRs) and other clinical software systems receives considerable attention from health care stakeholders and in the lay press. While usability is a fundamental issue for any EHR’s functionality, no existing program or set of standards addresses usability with a specific focus on immunization management. The purpose of this section is to provide a path to usability evaluation for EHRs and other clinical software systems from an immunization perspective.

Software product usability and related topics such as user experience (UX), user-centered design (UCD), user-friendliness, human factors, and human-system interaction have been studied by standards organizations and academia for many years. The International Organization for Standardization (ISO) defines usability as “the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use.”[1] This ISO 9241 standard has been maintained and updated many times.

Many other initiatives and organizations have crafted and asserted their points of view on the “open field” of usability and EHR certification.[2],[3],[4],[5] EHR vendors and others have raised concerns that usability evaluation should not be prescriptive or limit innovation in software design.[6] This section describes approaches to evaluating usability for EHR systems relative to immunization and patient safety.

Usability Definition and Background

Getting stakeholders to agree on how to evaluate usability for the purposes of EHR certification has always been difficult.  Even establishing a common definition for the term “usability” has been a serious challenge.  A different definition may be used depending on the organization or the focus of the study conducted.  This difference has served to support healthy debate among members of the health IT community, and each variation has represented a step toward the eventual specification and standardization of usability for EHRs.

Building upon established literature, we propose the following definition of usability:

  • The ability of the user to safely and efficiently obtain what is needed from the system to assure patient safety; and
  • The ability of the system to provide information and functionality in a way that allows the user to make the most informed clinical decisions in a safe and effective manner.

For the purposes of this document, the following informs the overall EHR immunization usability definition:

  • Effectiveness: Ability to achieve an intended goal/outcome,
  • Efficiency: Ability to achieve an intended goal/outcome within appropriate time and resource constraints, and
  • Satisfaction: Ability to achieve an intended goal/outcome in a way that delights the user.

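As a minimal sketch of how these three dimensions could be quantified in a usability test harness (all names, fields, and thresholds here are hypothetical, not part of any NIST or ISO specification): effectiveness as task completion rate, efficiency as a benchmark-to-observed time ratio, and satisfaction as a questionnaire score.

```python
from dataclasses import dataclass

@dataclass
class UsabilityMeasure:
    """Hypothetical summary of one immunization-workflow usability test."""
    tasks_attempted: int
    tasks_completed: int        # completed without critical error
    total_seconds: float        # observed time on task
    benchmark_seconds: float    # expert/reference time for the same tasks
    satisfaction_score: float   # e.g., a 0-100 questionnaire score

    @property
    def effectiveness(self) -> float:
        # Ability to achieve the intended goal: task completion rate.
        return self.tasks_completed / self.tasks_attempted

    @property
    def efficiency(self) -> float:
        # Goal achievement within time constraints: benchmark / observed.
        return self.benchmark_seconds / self.total_seconds

m = UsabilityMeasure(tasks_attempted=10, tasks_completed=9,
                     total_seconds=600.0, benchmark_seconds=480.0,
                     satisfaction_score=82.5)
print(round(m.effectiveness, 2), round(m.efficiency, 2))  # 0.9 0.8
```

In this sketch a score near 1.0 on each dimension indicates the user achieved the intended immunization-workflow goal quickly and completely; satisfaction would be collected separately via questionnaire.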
NIST uses the ISO 9241 standard to define usability as “the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use [ISO 9241].”  Immunization-related EHR usability specifies immunization workflow as the context of use and adds to it the dimension of patient safety.

This definition informs the suggested approach to immunization-related usability guidance, testing, and evaluation.

Guidance, Testing, and Evaluation

There are multiple ways to improve the usability, and thereby the workflow and safety outcomes, of EHR immunization-related capabilities. The approach recommended here builds upon industry best practices and standards, notably Jakob Nielsen’s “10 Usability Heuristics for User Interface Design,”[7] and on NIST’s EHR-specific extension of Nielsen’s work in “NISTIR 7804: Technical Evaluation, Testing, and Validation of the Usability of Electronic Health Records.”[8] The intent is to create guidance that protects patient safety by advising how to accommodate usability throughout the workflow of all providers involved in the immunization process. The challenge is to provide such assistance without being overly prescriptive about system design.

The Exhibit below presents the scope of the current effort to address usability in the context of immunization-related EHR evaluation. The effort focuses on usability dimensions that are more quantifiable and objective. Usability also encompasses user experience (UX), which is valid and important but more subjective and therefore more challenging to evaluate as part of a certification process. Utility, a term describing the functionality present in the EHR system that enables a user to complete a needed or desired task, is a foundation for usability but does not alone qualify as usability.

[Exhibit: Usability Picture]

Illustrating this distinction between utility and usability, a 2012 NIST report identified a set of critical user interactions that should be addressed to support EHR usability in pediatric patient care.[9] That report listed five specific interactions that directly or indirectly affect immunizations:

  1. Allow ordering vaccination via reminder,
  2. Allow data entry for vaccinations given at other institutions,
  3. Support display and tracking of components of combination vaccines,
  4. Display the days prior vaccinations were given and support alerts for recommended minimum, ideal, and maximum intervals between vaccinations, and
  5. Allow sorting of vaccination data by multiple fields.
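Item 4 above is concrete enough to sketch in code. The following is a minimal illustration, not an implementation of any published schedule: the vaccine name and interval values are hypothetical placeholders, and real minimum/maximum intervals would come from the applicable clinical guidelines.

```python
from datetime import date

# Hypothetical interval rules, in days, between doses of a vaccine series.
# Real values must come from the relevant immunization schedule.
MIN_INTERVAL_DAYS = {"HepB": 28}
MAX_INTERVAL_DAYS = {"HepB": 365}

def interval_alerts(vaccine, prior_dose, proposed_dose):
    """Return alert strings for a proposed dose relative to the prior dose,
    supporting display of elapsed days and min/max interval alerts (item 4)."""
    days = (proposed_dose - prior_dose).days
    alerts = []
    if days < MIN_INTERVAL_DAYS.get(vaccine, 0):
        alerts.append(f"{vaccine}: only {days} days since prior dose "
                      f"(minimum {MIN_INTERVAL_DAYS[vaccine]})")
    if days > MAX_INTERVAL_DAYS.get(vaccine, float("inf")):
        alerts.append(f"{vaccine}: {days} days since prior dose "
                      f"(maximum {MAX_INTERVAL_DAYS[vaccine]})")
    return alerts

# A dose proposed 14 days after the prior one trips the minimum-interval alert.
print(interval_alerts("HepB", date(2014, 1, 1), date(2014, 1, 15)))
```

Item 5 (sorting vaccination data by multiple fields) would reduce, in this style, to an ordinary multi-key sort over the stored dose records.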

Implementing each of these critical functions addresses utility; safety and efficient workflow are the desired outcomes of improving the software’s usability (effectiveness, efficiency, and satisfaction).

Usability Criteria (Heuristics) for Evaluation

The NIST EHR Usability Protocol (EUP)[10] defines the following criteria (heuristics) to measure usability:

  1. Error Handling and Prevention – Even better than good error messages is careful design that prevents a problem from occurring in the first place.
  2. Patient identification error – Actions are performed for one patient or documented in one patient’s record that were intended for another patient.
  3. Mode error – Actions are performed in one mode that were intended for another mode (e.g., medication ordering by direct dose vs. weight dose, same units, same measurement system, etc.).
  4. Data accuracy error – Displayed data are not accurate.
  5. Data availability error – Decisions are based on incomplete information because related information requires additional navigation, access to another provider’s note, taking actions to update the status, or is not updated within a reasonable time.
  6. Interpretation error – Differences in measurement systems, conventions, and terms contribute to erroneous assumptions about the meaning of information.
  7. Recall error – Decisions are based on incorrect assumptions because appropriate actions require users to remember information rather than recognize it.
  8. Feedback error – Decisions are based on insufficient information because lack of system feedback about automated actions makes it difficult to identify when the actions are not appropriate for the context.
  9. Data integrity error – Decisions are based on stored data that are corrupted or deleted.
  10. Visibility of System Status – The system should always keep the user informed about what is going on, through appropriate feedback within reasonable time.
  11. Match Between System and the Real World – The system should speak the users’ language, with words, phrases, and concepts familiar to the user, rather than system-oriented terms. The system also should follow real-world conventions, making information appear in a natural and logical order.
  12. User Control and Freedom – Users often choose system functions by mistake and will need a clearly marked “emergency exit” to leave the unwanted state without having to go through an extended dialogue. The system should provide undo and redo functionality.
  13. Consistency and Standards – Users should not have to wonder whether different words, situations, or actions mean the same thing. The system should follow platform conventions.
  14. Help Users Recognize, Diagnose and Recover from Errors – Error messages should be expressed in plain language and not use confusing or ambiguous code.
  15. Recognition Rather than Recall – Minimize the user’s memory load by making objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.
  16. Aesthetic and Minimalist Design – Dialogues should not contain information that is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility.
  17. Help and Documentation – Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user’s task, list concrete steps to be carried out, and not be too large.
  18. Pleasurable and Respectful Interaction with the User – The user’s interactions with the system should enhance the quality of her or his work life. The user should be treated with respect. The design should be aesthetically pleasing, with artistic as well as functional value.
  19. Privacy – The system should help the user to protect personal or private information belonging to the user or his/her patients.
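Several of these criteria describe errors that careful design can rule out rather than merely report. As a minimal sketch of that principle (heuristic 1), the unit set and function names below are hypothetical: rejecting ambiguous dose entries at the point of capture guards against mode errors (3) and interpretation errors (6).

```python
# Hypothetical set of dose units this system accepts. Requiring an explicit,
# recognized unit prevents erroneous assumptions about what a number means.
ALLOWED_DOSE_UNITS = {"mL", "mg", "mcg"}

def record_dose(amount, unit):
    """Refuse ambiguous or unrecognized dose entries instead of guessing,
    an instance of error prevention rather than error messaging."""
    if amount <= 0:
        raise ValueError("Dose amount must be positive")
    if unit not in ALLOWED_DOSE_UNITS:
        raise ValueError(f"Unrecognized dose unit {unit!r}; "
                         f"expected one of {sorted(ALLOWED_DOSE_UNITS)}")
    return f"{amount} {unit}"

print(record_dose(0.5, "mL"))  # 0.5 mL
```

The same pattern generalizes to the other error types: for example, a patient-identification guard (2) would refuse to document against any record other than the active patient’s chart.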

[1] Lowry SZ, Quinn MT, Ramaiah M, et al. Human Factors Guide to Enhance EHR Usability of Critical User Interactions when Supporting Pediatric Patient Care. (NISTIR 7865, June 2012). Available at: Accessed 29 September 2014.

[2] Lowry SZ, Quinn MT, Ramaiah M. Technical Evaluation, Testing and Validation of the Usability of Electronic Health Records. National Institute of Standards and Technology. (NISTIR 7804). February 2012. Available at: Accessed 30 November 2014.

[3] International Organization for Standardization. ISO 9241-210:2010: Ergonomics of human-system interaction — Part 210: Human-centred design for interactive systems. Preview available at: Accessed 30 September 2014.

[4] Belden J, Grayson R, Barnes J. Defining and Testing EMR Usability: Principles and Proposed Methods of EMR Usability Evaluation and Rating. Healthcare Information Management and Systems Society Electronic Health Record Usability Task Force. Available at: Accessed 29 September 2014.

[5] McDonnell C, Werner K, Wendel L. Electronic Health Record Usability: Vendor Practices and Perspectives. AHRQ Publication No. 09(10)-0091-3-EF. Rockville, MD: Agency for Healthcare Research and Quality. May 2010. Available at: Accessed 29 September 2014.

[6] Lowry SZ, Ramaiah M, Patterson ES, et al. Integrating Electronic Health Records into Clinical Workflow: An Application of Human Factors Modeling Methods to Ambulatory Care. NISTIR 7988. March 2014. National Institute of Standards and Technology. Available at: Accessed 29 September 2014.

[7] Zhang J, Walji M. TURF: Toward a unified framework of EHR usability. Journal of Biomedical Informatics. 2011;44(6):1056-67.

[8] Armijo D, McDonnell C, Werner K. Electronic Health Record Usability: Interface Design Considerations. AHRQ Publication No. 09(10)-0091-2-EF. Rockville, MD: Agency for Healthcare Research and Quality. October 2009. Available at: Accessed 29 September 2014.

[9] Nielsen J. 10 Usability Heuristics for User Interface Design. January 1, 1995. Available at: Accessed 30 January 2014.

[10] Lowry SZ, Quinn MT, Ramaiah M. Technical Evaluation, Testing and Validation of the Usability of Electronic Health Records. (NISTIR 7804). February 2012. Available at: Accessed 30 January 2014.