Fellowship Programs Benefit from QOPI® Participation
Frances A. Collichio, MD, and James A. Stewart, MD
Despite the Accreditation Council for Graduate Medical Education (ACGME) competencies (e.g., systems-based practice, practice-based learning) having been infused into internal medicine residencies for several years now, the average fellow remains ill informed about how health care programs measure and ensure quality. The ability of an organization to assess, monitor, and improve its processes and outcomes is increasingly important. After fellowship training, fellows will be expected to participate in, if not lead, quality initiatives. The Quality Oncology Practice Initiative (QOPI®) is a web-based tool that tracks objective measurements of quality of care as assessed by chart review.1,2 The data collected include core measures of care for patients with cancer such as pain control, management of chemotherapy and biologic agents, consent processes, and clarity of staging. Fellows will learn the importance of all of these aspects of care by participating in the QOPI process. They also learn the importance of accurate and timely medical record documentation as part of a high-quality program. Fellows must appreciate that QOPI or similar tools will likely be used as reference measures to demonstrate delivery of quality care and may affect reimbursement levels.3 Educators who help their fellows with QOPI may use the performance reports to complete the self-directed practice-improvement module for their own Maintenance of Certification credit. The ACGME Program Information Form, which is used for site visits, has several questions about how the program assesses quality improvement. QOPI can go a long way in meeting this requirement.
QOPI Requirements for Fellows
QOPI is an ASCO membership benefit, so it is free for practices or fellowship programs with at least one member. QOPI requires that the ASCO member register in QOPI, enter data about the organization, and select the modules that will be used for data abstraction. QOPI sets the number of charts that should be reviewed. There are dates defining a 5-week window during which QOPI is open for data entry. QOPI provides a chart-sampling methodology, but each site decides how it will identify appropriate charts and who will do the abstracting and data entry. Once the program or practice has completed its required charts, the data will be collated by QOPI. Approximately 4 weeks later, the program receives a report that documents how well the program met measures of health care quality and how it compares with other similar programs. There are two major differences between the QOPI process for practices and the process for fellowship programs. All practices are required to respond to the core measures plus at least two modules of their choosing, whereas a fellowship program must respond to the core measures and at least one module of its choosing. In addition, the number of charts that practice staff review is based on the number of full-time physicians in the practice.1,2 For fellowship programs, however, ASCO's Oncology Training Program Subcommittee recommends that each fellow review 10 charts. A program with 10 fellows would review 100 charts for core measures and respond to questions for those same 100 charts for at least one module. Alternatively, a program could choose to review a domain (e.g., symptom and toxicity management, or care at the end of life) and a disease (e.g., breast cancer, non-Hodgkin lymphoma [NHL], colon/rectal cancer, or non-small cell lung cancer). Each fellow would then review five domain and five disease cases.
There also are other minor differences that we have detected through experience. Although there is a fall and a spring collection period, we recommend that fellowship programs participate only in the spring so that first-year fellows will have a chance to accumulate cases and all fellows will have had a chance to settle into their academic year before working on QOPI. We have found that fellows should abstract data themselves rather than have a nonphysician do so, because this function is a key part of the learning process. Fellows gain a hands-on appreciation of the key measures, and they learn how medical records should be written to clearly document these important measures. Although data abstraction can be done over a 5-week period, we have found that it is best to set up one or two dedicated time periods to complete the process. After the first chart, each subsequent review takes 10 to 15 minutes; with 10 charts, the review can be completed in 1.5 to 2.5 hours. Programs can make this a "fun" event. It could be planned in the evening with pizza, before the busy workday gets started, or as an excused event from usual service responsibilities.
Perceived Barriers to Fellowship Program Participation
“The program cannot afford to pay someone to do the data abstraction.”
It is part of the learning process for the fellows to do the data abstractions themselves. By answering the questions themselves, they learn what data are important to include in the medical record. More importantly, fellows can reflect on the care they provide as it relates to the measures assessed in QOPI.
"In the course of busy fellowship training, there is not enough time to do QOPI."
Set up a special time to go over the charts, and the whole process can be done in several hours per fellow. Ask yourself how important this competency is for training. Surely it is less time consuming than having the fellow invent and carry out a faculty-supervised, individual practice-improvement project.
“The patients have to come from the fellows’ current patient roster.”
Ideally, fellows would review fellows' cases, but this is not practical in many academic institutions. Even if it cannot be done that way, the fellows still learn the process of quality assessment and improvement.
“We cannot do it with electronic medical records (EMR).”
We agree that an EMR can make it a bit confusing to find the right patient for the right module. One way to do this is to access charts by International Classification of Disease (ICD) codes. Another very practical solution is to ask the fellows to keep a list of the patients they see throughout the year by diagnosis. The program administrator can then sort and scramble the list so that each fellow does not review his or her own cases.
Personal Experiences with Implementation
At the University of North Carolina Chapel Hill (UNC), QOPI started in the spring session of 2010. A didactic session to explain the QOPI process was held 2 weeks before the chart review session. Because UNC uses EMR, we reserved the computer lab for 90 minutes at breakfast time. If the fellows did not complete their charts in 90 minutes, they did the remainder on their own. In 2010, fellows participated in the breast cancer and the NHL modules. Charts were assessed by ICD codes. Many of the charts did not meet requirements when we assessed them this way; we changed the system in 2011 to have the fellows keep a list of the patients whom they had seen, and the program administrator then assigned cases by medical record number.
In 2011, UNC assessed charts in the core and NHL modules. The data from 2010 showed that we should do a better job documenting consent for chemotherapy. A didactic session included the process of how to document the consent. From 2010 to 2011 we saw an improvement in this measure. The biggest improvement we had was when we included granulocyte colony-stimulating factor in our order set for patients older than 65 who were receiving cyclophosphamide, doxorubicin, vincristine, and prednisone (CHOP) combination chemotherapy. Once the granulocyte colony-stimulating factor was included, the compliance in that measurement improved from 63% to 100%.
At Baystate Medical Center, the 2.5-year fellowship includes a weekly half-day continuity hematology clinic and a separate oncology clinic. Thus, fellows have a substantial number of patients who are considered "theirs." The research staff identifies appropriate patients associated with the core or disease designation and with the fellow, which does require a time investment. The fellows abstract their colleagues' charts, not their own, with help from the research staff. The research staff enter the data into the computer at a later time.
QOPI results have led to development of a templated process for dictating treatment summary plans, as well as to a new consent form that “forces” the fellow or attending physician to designate a goal of therapy (cure vs. control of disease vs. reduction of symptoms). The fellows are learning that it is indeed possible to collect meaningful data and to compare their practices with other sites. Of note, our research staff is convinced that fellow dictations and overall documentation have improved since the fellows began the QOPI reviews.
Collectively, we believe that QOPI is a fellow-friendly path to practice improvement. Measurement of quality with benchmarking against national data will eventually be a part of everyone's practice, and it is essential that our graduates enter the post-training workforce skilled in this area. Yes, the process does take time, but it is efficient overall and certainly educational.
About the Authors: Dr. Collichio is a clinical professor in the department of medicine at UNC. She also is the current Chair of the ASCO In-Training Exam, and she received a teaching award in 2004 in hematology/oncology. She has been an ASCO member since 1995. Dr. Stewart is chief of the hematology and oncology division at Baystate Medical Center, as well as the fellowship director. He has been an ASCO member for almost 30 years.