Suggested revisions to JCAHO equipment management standards seek to allow flexibility while ensuring reliable performance.

When the Joint Commission on Accreditation of Healthcare Organizations (JCAHO) released fairly extensive revisions to its medical equipment management standards last year, the clinical engineering and biomedical community understandably looked to the commission’s new accreditation process, “Shared Visions—New Pathways,” for guidance. After studying and attempting to apply the new standards, six professionals in the field decided to form a study group to develop recommendations that would help the Joint Commission and health care organizations improve the application of the standards, drawing on the knowledge and experience accumulated by the clinical engineering community.

“It was natural because we’ve all known each other for a long time,” says study group member Binseng Wang, ScD, CCE, senior director, program support and quality assurance, ARAMARK Healthcare Management Services, Clinical Technology Services, Charlotte, NC. “We meet once or twice a year to talk about issues and concerns. Individually, we started reading [the new guidelines] and asking each other questions, and we decided to get together and see what we could do jointly to suggest some improvements.”

The group also included Ted Cohen, MS, CCE, manager, clinical engineering, University of California Davis Health System; Emanuel Furst, PhD, CCE, president, Improvement Technologies LLC, in Tucson, Ariz; Ode Keil, MS, MBA, CCE, director of quality management and performance improvement, Provena Mercy Center, Aurora, Ill; Malcolm Ridgway, PhD, CCE, senior vice president, Masterplan Inc, Chatsworth, Calif; and Robert Stiefel, MS, CCE, director of clinical engineering, University of Maryland Medical System.

In developing new standards, the group tried to ensure the changes would help improve patient safety; enhance the efficacy, reliability, and availability of equipment; be implementable honestly, without manipulation or adjustment; be simple and easy to understand, with little or no need for consultants or training courses; and be flexible enough to adapt to the unique characteristics of each organization.

The group submitted a letter to JCAHO in April 2004 with the proposed changes to specific elements of performance (EPs) designated in the Environment of Care (EC) chapter of the new JCAHO standards. The letter commended the Joint Commission for its “Shared Visions—New Pathways” accreditation process, the reduction in the number of standards, clearer specification of EPs, and increased consistency across different types of health care organizations. It also offered comments on “issues related to specific EPs in EC.6.10 and EC.6.20.” The chart below outlines the study group’s recommendations:

Suggested Improvements to Equipment Management Standards

EC.6.10.3

Original JCAHO text (Category B): The organization establishes and uses risk criteria for identifying, evaluating, and creating an inventory of equipment to be included in the medical equipment management plan before the equipment is used. These criteria address the following:

• Equipment function (diagnosis, care, treatment, and monitoring)
• Physical risks associated with use
• Equipment incident history

Note: The hospital may choose not to use risk criteria to limit the types of equipment to be included in the medical equipment management plan, but rather include all medical equipment.

Suggested improvement (Category B): The organization establishes and uses criteria for identifying, evaluating, and creating an inventory of equipment to be included in the medical equipment management plan before the equipment is used. The inventory may include all medical equipment or use criteria such as the following:

• The equipment’s role and importance within the organization’s mission (ie, how critical it is for patient care)
• The severity, frequency, and detectability of physical risks associated with use
• Reliability
• Availability of equipment and of spares or backup
• Equipment incident, hazard notice, and recall history
• Inspection and/or preventive maintenance needs

Note: The organization may use failure modes and effects analysis (FMEA) to establish the inventory.

Rationale: In addition to risk, our experience has shown that several other criteria should also be considered. Also, each organization should be given the flexibility of selecting the appropriate criteria to fit its unique characteristics. The inventory may include all equipment, as indicated in the original Note. FMEA, a widely adopted method for measuring risks, is a good tool for determining which equipment to include in the inventory.

EC.6.10.4

Original JCAHO text (Category B): The organization identifies appropriate strategies for all equipment on the inventory for achieving effective, safe, and reliable operation of all equipment in the inventory.

Note: Organizations may use different strategies as appropriate. For example, strategies such as predictive maintenance, interval-based inspections, corrective maintenance, or metered maintenance may be selected to ensure reliable performance.

Suggested improvement (Category B): The organization identifies appropriate inspection and maintenance strategies for all equipment on the inventory for achieving effective, safe, and reliable operation of all equipment in the inventory, and defines criteria for measuring the performance of the inspection and maintenance program.

Note: Organizations may use different strategies for different items as appropriate. For example, strategies such as predictive maintenance, interval-based inspections, statistical sampling, corrective maintenance, or metered maintenance may be selected to ensure reliable performance. Organizations may use different performance measurements for the inspection and maintenance of different groups of equipment.

Rationale: Allowing each organization to define its own criteria for performance measurement will provide flexibility to focus clinical engineering attention on equipment that is most critical for achieving the organization’s mission.

EC.6.20.3

Original JCAHO text (Category A): The organization documents maintenance of equipment used for life support that is consistent with maintenance strategies to minimize clinical and physical risks identified in the equipment management plan (see standard EC.6.10).

Suggested improvement (not applicable): Delete this EP.

Rationale: We believe there is no need to segregate life-support equipment from the rest, as life support is only one of the inclusion criteria listed in EC.6.10.3. EP 4 covers both life-support and nonlife-support equipment.

EC.6.20.4

Original JCAHO text (Category C): The organization documents maintenance of nonlife-support equipment on the inventory that is consistent with maintenance strategies to minimize clinical and physical risks identified in the equipment management plan (see standard EC.6.10).

Suggested improvement (Category B): The organization documents inspection and maintenance of equipment on the inventory that is consistent with the maintenance strategies and the inspection and maintenance performance measurement criteria identified in the equipment management plan (see EC.6.10).

Rationale: The change from Category C to Category B scoring reflects the changes recommended in EP 4 of EC.6.20.
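The revised note to EC.6.10.3 above points to failure modes and effects analysis (FMEA) as one way to establish the inventory. As a rough, purely illustrative Python sketch of how that might look in practice, the snippet below scores each device with a classical FMEA risk priority number (severity × occurrence × detectability) and includes everything at or above a cutoff; the device names, 1-to-10 scales, and threshold are assumptions for the example, not values taken from the standards or the study group’s letter.

from dataclasses import dataclass

@dataclass
class Device:
    name: str
    severity: int       # 1 (negligible harm) to 10 (potentially fatal)
    occurrence: int     # 1 (failure very unlikely) to 10 (failure frequent)
    detectability: int  # 1 (failure obvious to the user) to 10 (failure hidden)

def risk_priority_number(d: Device) -> int:
    # Classical FMEA score: higher numbers mean higher risk.
    return d.severity * d.occurrence * d.detectability

def build_inventory(devices: list[Device], threshold: int = 100) -> list[str]:
    # Include any device whose RPN meets or exceeds the cutoff (hypothetical rule).
    return [d.name for d in devices if risk_priority_number(d) >= threshold]

fleet = [
    Device("infusion pump", severity=8, occurrence=3, detectability=6),  # RPN 144
    Device("exam light", severity=2, occurrence=2, detectability=2),     # RPN 8
]
print(build_inventory(fleet))  # ['infusion pump']

An organization could, of course, skip the scoring entirely and list all medical equipment, as the original note already allows.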

After submitting the suggested improvements, the group had an encouraging conference call with JCAHO. “Our initial contact with [JCAHO] gave us an indication that we could sway them with more detail,” Stiefel says.

“They asked a lot of questions, and we explained our reasoning,” Wang says. “They eventually adopted a few of our suggested editorial changes, which were published in their update for the third quarter. Most substantive issues are still on the table for [JCAHO] to decide whether they can be considered as alternative ways to implement the current standards or if they needed to be revised. Some of those decisions may depend on how successful we are at presenting our case.”

Key Concerns
Though the group began the process with the idea of reviewing all of the standards that apply uniquely and specifically to medical equipment management, it quickly became clear that reaching agreement on multiple changes would take a long time, even among only six people.

“There were, however, three items we agreed upon—not only what was wrong, but also what was right about them,” Stiefel says. “So it made sense to try to get some changes accomplished with these as quickly as possible and, then, see if that might ease the way to a longer-term or larger-scale approach to standards revisions.”

Those three areas were: dropping JCAHO’s new category of life-support equipment with its 100% inspection-compliance requirement; identifying a more effective measure of the performance of a medical equipment management program than preventive maintenance (PM) completion; and offering alternatives to risk assessment as the only means of defining the inventory of equipment included in the program.

“JCAHO has long used the completion rate for scheduled inspections as virtually the only measure for performance of a hospital’s medical equipment management program,” Stiefel says. “While that made sense 20 or 30 years ago, clinical engineering departments are now more sophisticated. In addition, equipment is so dependable that simply completing inspections is no longer an indicator of a program’s contribution to patient safety, which is the first priority for health care and for clinical engineering. There are now dozens of ways that clinical engineering can contribute that are not even considered in the standards, much less in their measures.”

According to Wang, such a limited performance evaluation is analogous to a school system that evaluates student learning based strictly on attendance. “There could be people daydreaming in the classroom every day, as well as people who miss a lot of classes but are learning outside of class by other methods,” he says. “In the same way, measuring performance solely by completion of the PM or inspection of equipment on schedule is an incomplete measure of effectiveness.”

“We believe that we should assess the outcome of the program just like clinicians do: by looking at how well the health care system is taking care of patients,” Wang continues. “For clinicians, it is not a matter of the number of surgeries a patient undergoes, but how well they are doing after receiving the care. Biomeds do not yet have a universally accepted set of metrics for measuring outcomes, but we need to start working on them instead of hanging onto the PM completion rate.”

Performance Criteria
The group decided to begin by defining six criteria for selecting equipment to be included in the management plan: 1) the equipment’s role and importance within the organization’s mission (ie, how critical it is for patient care); 2) whether the failure modes are obvious or subtle, how frequently they may happen, and how they affect the patient; 3) reliability; 4) the availability of backup or alternative devices; 5) equipment incident, hazard notice, and recall history; and 6) inspection and/or preventive maintenance needs.
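To make the list concrete, here is a minimal, hypothetical Python sketch of how those six criteria might be recorded for each device and turned into an inclusion decision; the field names, the simple any-one-criterion rule, and the threshold are illustrative assumptions rather than a method proposed by the group (the FMEA-style score sketched earlier could supply the risk component).

from dataclasses import dataclass

@dataclass
class EquipmentProfile:
    name: str
    mission_critical: bool    # 1) role and importance in the organization's mission
    risk_score: int           # 2) severity, frequency, and detectability (eg, an FMEA RPN)
    reliable: bool            # 3) reliability in service
    backup_available: bool    # 4) spares, backups, or alternative devices on hand
    incident_count: int       # 5) incidents, hazard notices, and recalls on record
    needs_scheduled_pm: bool  # 6) inspection and/or preventive maintenance needs

def include_in_plan(e: EquipmentProfile, risk_threshold: int = 100) -> bool:
    # Illustrative rule: any one strong criterion is enough to include the device.
    return (
        e.mission_critical
        or e.risk_score >= risk_threshold
        or e.incident_count > 0
        or e.needs_scheduled_pm
        or (not e.reliable and not e.backup_available)
    )

Under a rule like this, a one-of-a-kind scanner would be caught by the mission-criticality test even if its risk score were modest, which matches the point Wang makes later about equipment with no backup.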

“Reliability in this context is a question of whether or not the scheduled maintenance process includes something that would prevent failures,” Furst says. “It doesn’t do much good to look at something once a week if such an inspection does not lead you to identify a failure before the user finds it, or permit you to do something to prevent a failure (eg, provide lubrication). For instance, clinical lab equipment users are highly trained and specialized, and they will pick up a problem before one of our technicians does. So we have a spectrum—if we look at outcomes, we might take an entirely different view of how we manage certain equipment. Getting away from the rigidity of these concepts could lead to creative approaches that are more effective and a better use of time, resources, and finances.”

For instance, Furst points out that the failure of even a minor piece of equipment can shut down an emergency room and put it on bypass, which has a significant financial impact on the institution in addition to potentially serious consequences for patients.

“That is why our changes allow for measuring scheduled maintenance performance based on outcomes (eg, uptime rates of the most important pieces of equipment, and annual failure rates for others), with inspection priorities set by attributes such as reliability,” he says. “A piece of equipment might be highly reliable, and might have self-check features and an obvious failure mode, and thus doesn’t need to be inspected as regularly as something with a subtle failure mode that can be detected only by inspections.”
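As a purely illustrative picture of the outcome measures Furst mentions, the short Python sketch below computes an uptime rate from logged downtime and an annual failure rate from repair records for a group of similar devices; the figures and record formats are assumptions for the example, not metrics endorsed by JCAHO or the study group.

HOURS_PER_YEAR = 365 * 24

def uptime_rate(downtime_hours: float, period_hours: float = HOURS_PER_YEAR) -> float:
    # Fraction of the period the device was available for use.
    return 1.0 - downtime_hours / period_hours

def annual_failure_rate(failures: int, device_count: int) -> float:
    # Failures per device per year across a group of similar devices.
    return failures / device_count

# Example: a sole MRI down 30 hours this year, and 12 failures across 200 infusion pumps.
print(f"MRI uptime: {uptime_rate(30):.3%}")                                   # 99.658%
print(f"Pump failures: {annual_failure_rate(12, 200):.2f} per device-year")   # 0.06

Measures of this kind shift attention from whether an inspection was performed on schedule to whether the equipment was actually available and working when clinicians needed it.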

With regard to the availability of backup or alternative devices, Wang notes that hospitals have emergency backup plans in place for vital units, such as life-support equipment, but not necessarily for other pieces of equipment for which there are no spares.

“The irony here is that hospitals may have equipment that is not necessarily considered life support, but is big, expensive, and one of a kind, like MRIs or CTs. For those pieces of equipment, we can’t afford to have backups and duplicates,” Wang says. “Nonetheless, if the sole MRI or CT goes down, you have a very serious problem because many patients will go without proper diagnosis or monitoring of their progress. If you can’t diagnose their problems, you can’t treat them.”

“So sometimes priority is not dictated by the nature of equipment, but by the mission of that piece of equipment in the big picture. We need to look at what it contributes to the overall care a hospital provides,” he continues, alluding to the criterion that rates equipment according to its role in the health care facility’s mission.

Progress
During their discussions with JCAHO, it became clear to the group that JCAHO believes much of what has been proposed can be accommodated within the existing standards, without making revisions. While this is true for some of the proposed changes, the study group is concerned that most health care organizations are not aware of this flexibility and, thus, would be unwilling to adopt the proposed changes on their own. While JCAHO is still deciding what it will do with the proposals, Wang says the study group is drafting two papers with detailed explanations of its recommendations to be presented to JCAHO for review and possible publication in the Joint Commission’s Environment of Care News.

“We started to draft a second round of changes too, but that will be put on hold until we see what reaction we get from the first round,” he says. “For now, we want to limit [our suggestions] to a few issues and then revisit others later. There is not enough time to analyze everything, and I think this process will help the community understand the issues and hopefully discuss or find even better solutions than those that we are proposing.” The group presented its first draft at the 2004 AAMI meeting in Boston and plans to present more details and solicit comments at the upcoming AAMI meeting in Tampa.

The group also recognizes that progress will be slow, as JCAHO has an extremely difficult job in trying to develop standards that are applicable to the wide variety of health care organizations.

“It’s definitely not easy to write a standard that applies to more than 10,000 organizations across the country with different sizes, conditions, and objectives,” Wang says. “We are a small group of people who are mostly based in fairly well-funded and well-managed organizations, so perhaps we are not exactly the best representatives of the entire spectrum of facilities. Frankly, JCAHO has a very tough role here. They have to define elements of performance that serve as official ‘pass/fail’ cutoff lines, which may make some organizations ineligible to receive Medicare/Medicaid and/or private insurance funding.”

Preventing Train Wrecks
“JCAHO’s stance with the latest changes was that they were trying to identify the programs they called ‘train wrecks,’ by which they meant departments that were disastrously far from meeting the standards,” Stiefel says. “I suppose there are probably such programs, but I did not understand how creating a new category of equipment called ‘life-support equipment’ that had to have 100% scheduled inspection compliance would prevent a so-called train wreck. Instead, it implies that if we can’t get a medical equipment management program to comply with standards that have been developed over decades, then we should tighten the standards.”

And while the study group members appreciate the desire to “stop a train wreck,” they feel that in doing so, JCAHO penalizes everyone else by creating more prescriptive standards and constrains those who are trying to improve their own programs.

“Ideally, we would like each hospital to be able to measure its own performance and improve from there, but again, creating a standard that allows people to be creative and productive is a tough challenge,” Wang says. “Some of these standards have been on the books for more than 20 years, and JCAHO has steadily improved them. We are just impatient and want to push the envelope a little more to make improvements quickly. Hopefully, we all can come together with something a little better. We are just trying to advocate for institutions and patients, and we want to be able to use our limited resources in the best, most efficient way we can.” 24×7

Liz Finch is a contributing writer for 24×7.